00:00:00.000 Started by upstream project "autotest-per-patch" build number 126230
00:00:00.000 originally caused by:
00:00:00.001 Started by user sys_sgci
00:00:00.009 Checking out git https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool into /var/jenkins_home/workspace/crypto-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4 to read jbp/jenkins/jjb-config/jobs/autotest-downstream/autotest-phy.groovy
00:00:00.009 The recommended git tool is: git
00:00:00.010 using credential 00000000-0000-0000-0000-000000000002
00:00:00.012 > git rev-parse --resolve-git-dir /var/jenkins_home/workspace/crypto-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4/jbp/.git # timeout=10
00:00:00.027 Fetching changes from the remote Git repository
00:00:00.031 > git config remote.origin.url https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool # timeout=10
00:00:00.045 Using shallow fetch with depth 1
00:00:00.045 Fetching upstream changes from https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool
00:00:00.045 > git --version # timeout=10
00:00:00.061 > git --version # 'git version 2.39.2'
00:00:00.062 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials
00:00:00.094 Setting http proxy: proxy-dmz.intel.com:911
00:00:00.094 > git fetch --tags --force --progress --depth=1 -- https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/heads/master # timeout=5
00:00:02.301 > git rev-parse origin/FETCH_HEAD^{commit} # timeout=10
00:00:02.313 > git rev-parse FETCH_HEAD^{commit} # timeout=10
00:00:02.326 Checking out Revision 7caca6989ac753a10259529aadac5754060382af (FETCH_HEAD)
00:00:02.326 > git config core.sparsecheckout # timeout=10
00:00:02.336 > git read-tree -mu HEAD # timeout=10
00:00:02.352 > git checkout -f 7caca6989ac753a10259529aadac5754060382af # timeout=5
00:00:02.373 Commit message: "jenkins/jjb-config: Purge centos leftovers"
00:00:02.373 > git rev-list --no-walk 7caca6989ac753a10259529aadac5754060382af # timeout=10
00:00:02.519 [Pipeline] Start of Pipeline
00:00:02.535 [Pipeline] library
00:00:02.537 Loading library shm_lib@master
00:00:02.537 Library shm_lib@master is cached. Copying from home.
00:00:02.562 [Pipeline] node
00:00:02.580 Running on WFP50 in /var/jenkins/workspace/crypto-phy-autotest
00:00:02.583 [Pipeline] {
00:00:02.595 [Pipeline] catchError
00:00:02.596 [Pipeline] {
00:00:02.609 [Pipeline] wrap
00:00:02.617 [Pipeline] {
00:00:02.624 [Pipeline] stage
00:00:02.626 [Pipeline] { (Prologue)
00:00:02.799 [Pipeline] sh
00:00:03.085 + logger -p user.info -t JENKINS-CI
00:00:03.100 [Pipeline] echo
00:00:03.101 Node: WFP50
00:00:03.110 [Pipeline] sh
00:00:03.404 [Pipeline] setCustomBuildProperty
00:00:03.416 [Pipeline] echo
00:00:03.417 Cleanup processes
00:00:03.423 [Pipeline] sh
00:00:03.703 + sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:00:03.703 1186931 sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:00:03.737 [Pipeline] sh
00:00:04.032 ++ sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:00:04.032 ++ grep -v 'sudo pgrep'
00:00:04.032 ++ awk '{print $1}'
00:00:04.032 + sudo kill -9
00:00:04.032 + true
00:00:04.047 [Pipeline] cleanWs
00:00:04.056 [WS-CLEANUP] Deleting project workspace...
00:00:04.056 [WS-CLEANUP] Deferred wipeout is used...
00:00:04.062 [WS-CLEANUP] done
00:00:04.065 [Pipeline] setCustomBuildProperty
00:00:04.077 [Pipeline] sh
00:00:04.356 + sudo git config --global --replace-all safe.directory '*'
00:00:04.437 [Pipeline] httpRequest
00:00:04.452 [Pipeline] echo
00:00:04.454 Sorcerer 10.211.164.101 is alive
00:00:04.460 [Pipeline] httpRequest
00:00:04.464 HttpMethod: GET
00:00:04.464 URL: http://10.211.164.101/packages/jbp_7caca6989ac753a10259529aadac5754060382af.tar.gz
00:00:04.464 Sending request to url: http://10.211.164.101/packages/jbp_7caca6989ac753a10259529aadac5754060382af.tar.gz
00:00:04.467 Response Code: HTTP/1.1 200 OK
00:00:04.467 Success: Status code 200 is in the accepted range: 200,404
00:00:04.467 Saving response body to /var/jenkins/workspace/crypto-phy-autotest/jbp_7caca6989ac753a10259529aadac5754060382af.tar.gz
00:00:05.164 [Pipeline] sh
00:00:05.444 + tar --no-same-owner -xf jbp_7caca6989ac753a10259529aadac5754060382af.tar.gz
00:00:05.717 [Pipeline] httpRequest
00:00:05.729 [Pipeline] echo
00:00:05.730 Sorcerer 10.211.164.101 is alive
00:00:05.736 [Pipeline] httpRequest
00:00:05.740 HttpMethod: GET
00:00:05.740 URL: http://10.211.164.101/packages/spdk_6c0846996bb393be04189626d69239816f169775.tar.gz
00:00:05.740 Sending request to url: http://10.211.164.101/packages/spdk_6c0846996bb393be04189626d69239816f169775.tar.gz
00:00:05.753 Response Code: HTTP/1.1 200 OK
00:00:05.753 Success: Status code 200 is in the accepted range: 200,404
00:00:05.754 Saving response body to /var/jenkins/workspace/crypto-phy-autotest/spdk_6c0846996bb393be04189626d69239816f169775.tar.gz
00:00:53.268 [Pipeline] sh
00:00:53.552 + tar --no-same-owner -xf spdk_6c0846996bb393be04189626d69239816f169775.tar.gz
00:00:57.769 [Pipeline] sh
00:00:58.058 + git -C spdk log --oneline -n5
00:00:58.058 6c0846996 module/bdev/nvme: add detach-monitor poller
00:00:58.058 70e80ba15 lib/nvme: add scan attached
00:00:58.058 455fda465 nvme_pci: ctrlr_scan_attached callback
00:00:58.058 a732bf2a5 nvme_transport: optional callback to scan attached
00:00:58.058 2728651ee accel: adjust task per ch define name
00:00:58.075 [Pipeline] }
00:00:58.095 [Pipeline] // stage
00:00:58.105 [Pipeline] stage
00:00:58.108 [Pipeline] { (Prepare)
00:00:58.129 [Pipeline] writeFile
00:00:58.151 [Pipeline] sh
00:00:58.438 + logger -p user.info -t JENKINS-CI
00:00:58.452 [Pipeline] sh
00:00:58.737 + logger -p user.info -t JENKINS-CI
00:00:58.752 [Pipeline] sh
00:00:59.038 + cat autorun-spdk.conf
00:00:59.038 SPDK_RUN_FUNCTIONAL_TEST=1
00:00:59.038 SPDK_TEST_BLOCKDEV=1
00:00:59.038 SPDK_TEST_ISAL=1
00:00:59.038 SPDK_TEST_CRYPTO=1
00:00:59.038 SPDK_TEST_REDUCE=1
00:00:59.038 SPDK_TEST_VBDEV_COMPRESS=1
00:00:59.038 SPDK_RUN_UBSAN=1
00:00:59.046 RUN_NIGHTLY=0
00:00:59.052 [Pipeline] readFile
00:00:59.083 [Pipeline] withEnv
00:00:59.086 [Pipeline] {
00:00:59.101 [Pipeline] sh
00:00:59.391 + set -ex
00:00:59.391 + [[ -f /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf ]]
00:00:59.391 + source /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf
00:00:59.391 ++ SPDK_RUN_FUNCTIONAL_TEST=1
00:00:59.391 ++ SPDK_TEST_BLOCKDEV=1
00:00:59.391 ++ SPDK_TEST_ISAL=1
00:00:59.391 ++ SPDK_TEST_CRYPTO=1
00:00:59.391 ++ SPDK_TEST_REDUCE=1
00:00:59.391 ++ SPDK_TEST_VBDEV_COMPRESS=1
00:00:59.391 ++ SPDK_RUN_UBSAN=1
00:00:59.391 ++ RUN_NIGHTLY=0
00:00:59.391 + case $SPDK_TEST_NVMF_NICS in
00:00:59.391 + DRIVERS=
00:00:59.391 + [[ -n '' ]]
00:00:59.391 + exit 0
00:00:59.402 [Pipeline] }
00:00:59.427 [Pipeline] // withEnv
00:00:59.434 [Pipeline] }
00:00:59.456 [Pipeline] // stage
00:00:59.468 [Pipeline] catchError
00:00:59.470 [Pipeline] {
00:00:59.490 [Pipeline] timeout
00:00:59.490 Timeout set to expire in 40 min
00:00:59.493 [Pipeline] {
00:00:59.514 [Pipeline] stage
00:00:59.517 [Pipeline] { (Tests)
00:00:59.540 [Pipeline] sh
00:00:59.826 + jbp/jenkins/jjb-config/jobs/scripts/autoruner.sh /var/jenkins/workspace/crypto-phy-autotest
00:00:59.826 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest
00:00:59.826 + DIR_ROOT=/var/jenkins/workspace/crypto-phy-autotest
00:00:59.826 + [[ -n /var/jenkins/workspace/crypto-phy-autotest ]]
00:00:59.826 + DIR_SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
00:00:59.826 + DIR_OUTPUT=/var/jenkins/workspace/crypto-phy-autotest/output
00:00:59.826 + [[ -d /var/jenkins/workspace/crypto-phy-autotest/spdk ]]
00:00:59.826 + [[ ! -d /var/jenkins/workspace/crypto-phy-autotest/output ]]
00:00:59.826 + mkdir -p /var/jenkins/workspace/crypto-phy-autotest/output
00:00:59.826 + [[ -d /var/jenkins/workspace/crypto-phy-autotest/output ]]
00:00:59.826 + [[ crypto-phy-autotest == pkgdep-* ]]
00:00:59.827 + cd /var/jenkins/workspace/crypto-phy-autotest
00:00:59.827 + source /etc/os-release
00:00:59.827 ++ NAME='Fedora Linux'
00:00:59.827 ++ VERSION='38 (Cloud Edition)'
00:00:59.827 ++ ID=fedora
00:00:59.827 ++ VERSION_ID=38
00:00:59.827 ++ VERSION_CODENAME=
00:00:59.827 ++ PLATFORM_ID=platform:f38
00:00:59.827 ++ PRETTY_NAME='Fedora Linux 38 (Cloud Edition)'
00:00:59.827 ++ ANSI_COLOR='0;38;2;60;110;180'
00:00:59.827 ++ LOGO=fedora-logo-icon
00:00:59.827 ++ CPE_NAME=cpe:/o:fedoraproject:fedora:38
00:00:59.827 ++ HOME_URL=https://fedoraproject.org/
00:00:59.827 ++ DOCUMENTATION_URL=https://docs.fedoraproject.org/en-US/fedora/f38/system-administrators-guide/
00:00:59.827 ++ SUPPORT_URL=https://ask.fedoraproject.org/
00:00:59.827 ++ BUG_REPORT_URL=https://bugzilla.redhat.com/
00:00:59.827 ++ REDHAT_BUGZILLA_PRODUCT=Fedora
00:00:59.827 ++ REDHAT_BUGZILLA_PRODUCT_VERSION=38
00:00:59.827 ++ REDHAT_SUPPORT_PRODUCT=Fedora
00:00:59.827 ++ REDHAT_SUPPORT_PRODUCT_VERSION=38
00:00:59.827 ++ SUPPORT_END=2024-05-14
00:00:59.827 ++ VARIANT='Cloud Edition'
00:00:59.827 ++ VARIANT_ID=cloud
00:00:59.827 + uname -a
00:00:59.827 Linux spdk-wfp-50 6.7.0-68.fc38.x86_64 #1 SMP PREEMPT_DYNAMIC Mon Jan 15 00:59:40 UTC 2024 x86_64 GNU/Linux
00:00:59.827 + sudo /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status
00:01:03.125 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5
00:01:03.125 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5
00:01:03.125 Hugepages
00:01:03.125 node hugesize free / total
00:01:03.125 node0 1048576kB 0 / 0
00:01:03.125 node0 2048kB 0 / 0
00:01:03.125 node1 1048576kB 0 / 0
00:01:03.125 node1 2048kB 0 / 0
00:01:03.125
00:01:03.125 Type BDF Vendor Device NUMA Driver Device Block devices
00:01:03.125 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - -
00:01:03.125 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - -
00:01:03.125 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - -
00:01:03.125 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - -
00:01:03.125 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - -
00:01:03.125 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - -
00:01:03.125 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - -
00:01:03.125 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - -
00:01:03.125 NVMe 0000:5e:00.0 8086 0b60 0 nvme nvme0 nvme0n1
00:01:03.125 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - -
00:01:03.125 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - -
00:01:03.125 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - -
00:01:03.125 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - -
00:01:03.125 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - -
00:01:03.125 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - -
00:01:03.125 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - -
00:01:03.125 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - -
00:01:03.125 VMD 0000:85:05.5 8086 201d 1 vfio-pci - -
00:01:03.125 VMD 0000:d7:05.5 8086 201d 1 vfio-pci - -
00:01:03.385 + rm -f /tmp/spdk-ld-path
00:01:03.385 + source autorun-spdk.conf
00:01:03.385 ++ SPDK_RUN_FUNCTIONAL_TEST=1
00:01:03.385 ++ SPDK_TEST_BLOCKDEV=1
00:01:03.385 ++ SPDK_TEST_ISAL=1
00:01:03.385 ++ SPDK_TEST_CRYPTO=1
00:01:03.385 ++ SPDK_TEST_REDUCE=1
00:01:03.385 ++ SPDK_TEST_VBDEV_COMPRESS=1
00:01:03.385 ++ SPDK_RUN_UBSAN=1
00:01:03.385 ++ RUN_NIGHTLY=0
00:01:03.385 + (( SPDK_TEST_NVME_CMB == 1 || SPDK_TEST_NVME_PMR == 1 ))
00:01:03.385 + [[ -n '' ]]
00:01:03.385 + sudo git config --global --add safe.directory /var/jenkins/workspace/crypto-phy-autotest/spdk
00:01:03.385 + for M in /var/spdk/build-*-manifest.txt
00:01:03.385 + [[ -f /var/spdk/build-pkg-manifest.txt ]]
00:01:03.385 + cp /var/spdk/build-pkg-manifest.txt /var/jenkins/workspace/crypto-phy-autotest/output/
00:01:03.385 + for M in /var/spdk/build-*-manifest.txt
00:01:03.385 + [[ -f /var/spdk/build-repo-manifest.txt ]]
00:01:03.385 + cp /var/spdk/build-repo-manifest.txt /var/jenkins/workspace/crypto-phy-autotest/output/
00:01:03.386 ++ uname
00:01:03.386 + [[ Linux == \L\i\n\u\x ]]
00:01:03.386 + sudo dmesg -T
00:01:03.386 + sudo dmesg --clear
00:01:03.386 + dmesg_pid=1187903
00:01:03.386 + [[ Fedora Linux == FreeBSD ]]
00:01:03.386 + export UNBIND_ENTIRE_IOMMU_GROUP=yes
00:01:03.386 + UNBIND_ENTIRE_IOMMU_GROUP=yes
00:01:03.386 + [[ -e /var/spdk/dependencies/vhost/spdk_test_image.qcow2 ]]
00:01:03.386 + [[ -x /usr/src/fio-static/fio ]]
00:01:03.386 + export FIO_BIN=/usr/src/fio-static/fio
00:01:03.386 + FIO_BIN=/usr/src/fio-static/fio
00:01:03.386 + [[ '' == \/\v\a\r\/\j\e\n\k\i\n\s\/\w\o\r\k\s\p\a\c\e\/\c\r\y\p\t\o\-\p\h\y\-\a\u\t\o\t\e\s\t\/\q\e\m\u\_\v\f\i\o\/* ]]
00:01:03.386 + [[ ! -v VFIO_QEMU_BIN ]]
00:01:03.386 + [[ -e /usr/local/qemu/vfio-user-latest ]]
00:01:03.386 + export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64
00:01:03.386 + VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64
00:01:03.386 + [[ -e /usr/local/qemu/vanilla-latest ]]
00:01:03.386 + export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64
00:01:03.386 + QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64
00:01:03.386 + spdk/autorun.sh /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf
00:01:03.386 + sudo dmesg -Tw
00:01:03.386 Test configuration:
00:01:03.386 SPDK_RUN_FUNCTIONAL_TEST=1
00:01:03.386 SPDK_TEST_BLOCKDEV=1
00:01:03.386 SPDK_TEST_ISAL=1
00:01:03.386 SPDK_TEST_CRYPTO=1
00:01:03.386 SPDK_TEST_REDUCE=1
00:01:03.386 SPDK_TEST_VBDEV_COMPRESS=1
00:01:03.386 SPDK_RUN_UBSAN=1
00:01:03.386 RUN_NIGHTLY=0
20:14:55 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh
20:14:55 -- scripts/common.sh@508 -- $ [[ -e /bin/wpdk_common.sh ]]
20:14:55 -- scripts/common.sh@516 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]]
20:14:55 -- scripts/common.sh@517 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh
20:14:55 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
20:14:55 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
20:14:55 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
20:14:55 -- paths/export.sh@5 -- $ export PATH
20:14:55 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
20:14:55 -- common/autobuild_common.sh@443 -- $ out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output
20:14:55 -- common/autobuild_common.sh@444 -- $ date +%s
20:14:55 -- common/autobuild_common.sh@444 -- $ mktemp -dt spdk_1721067295.XXXXXX
20:14:55 -- common/autobuild_common.sh@444 -- $ SPDK_WORKSPACE=/tmp/spdk_1721067295.n2MtV8
20:14:55 -- common/autobuild_common.sh@446 -- $ [[ -n '' ]]
20:14:55 -- common/autobuild_common.sh@450 -- $ '[' -n '' ']'
20:14:55 -- common/autobuild_common.sh@453 -- $ scanbuild_exclude='--exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/'
00:01:03.647 20:14:55 -- common/autobuild_common.sh@457 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp'
00:01:03.647 20:14:55 -- common/autobuild_common.sh@459 -- $ scanbuild='scan-build -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/ --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs'
00:01:03.647 20:14:55 -- common/autobuild_common.sh@460 -- $ get_config_params
00:01:03.647 20:14:55 -- common/autotest_common.sh@396 -- $ xtrace_disable
00:01:03.647 20:14:55 -- common/autotest_common.sh@10 -- $ set +x
00:01:03.647 20:14:55 -- common/autobuild_common.sh@460 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk'
00:01:03.647 20:14:55 -- common/autobuild_common.sh@462 -- $ start_monitor_resources
00:01:03.647 20:14:55 -- pm/common@17 -- $ local monitor
00:01:03.647 20:14:55 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:01:03.647 20:14:55 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:01:03.647 20:14:55 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:01:03.647 20:14:55 -- pm/common@21 -- $ date +%s
00:01:03.647 20:14:55 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:01:03.647 20:14:55 -- pm/common@21 -- $ date +%s
00:01:03.647 20:14:55 -- pm/common@21 -- $ date +%s
00:01:03.647 20:14:55 -- pm/common@25 -- $ sleep 1
00:01:03.647 20:14:55 -- pm/common@21 -- $ date +%s
00:01:03.647 20:14:55 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1721067295
00:01:03.647 20:14:55 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1721067295
00:01:03.647 20:14:55 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1721067295
00:01:03.647 20:14:55 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1721067295
00:01:03.647 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1721067295_collect-vmstat.pm.log
00:01:03.647 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1721067295_collect-cpu-load.pm.log
00:01:03.647 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1721067295_collect-cpu-temp.pm.log
00:01:03.647 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1721067295_collect-bmc-pm.bmc.pm.log
00:01:04.588 20:14:56 -- common/autobuild_common.sh@463 -- $ trap stop_monitor_resources EXIT
00:01:04.588 20:14:56 -- spdk/autobuild.sh@11 -- $ SPDK_TEST_AUTOBUILD=
00:01:04.588 20:14:56 -- spdk/autobuild.sh@12 -- $ umask 022
00:01:04.588 20:14:56 -- spdk/autobuild.sh@13 -- $ cd /var/jenkins/workspace/crypto-phy-autotest/spdk
00:01:04.588 20:14:56 -- spdk/autobuild.sh@16 -- $ date -u
00:01:04.588 Mon Jul 15 06:14:56 PM UTC 2024
00:01:04.589 20:14:56 -- spdk/autobuild.sh@17 -- $ git describe --tags
00:01:04.589 v24.09-pre-210-g6c0846996
00:01:04.589 20:14:56 -- spdk/autobuild.sh@19 -- $ '[' 0 -eq 1 ']'
00:01:04.589 20:14:56 -- spdk/autobuild.sh@23 -- $ '[' 1 -eq 1 ']'
00:01:04.589 20:14:56 -- spdk/autobuild.sh@24 -- $ run_test ubsan echo 'using ubsan'
00:01:04.589 20:14:56 -- common/autotest_common.sh@1099 -- $ '[' 3 -le 1 ']'
00:01:04.589 20:14:56 -- common/autotest_common.sh@1105 -- $ xtrace_disable
00:01:04.589 20:14:56 -- common/autotest_common.sh@10 -- $ set +x
00:01:04.589 ************************************
00:01:04.589 START TEST ubsan
00:01:04.589 ************************************
00:01:04.589 20:14:56 ubsan -- common/autotest_common.sh@1123 -- $ echo 'using ubsan'
00:01:04.589 using ubsan
00:01:04.589
00:01:04.589 real 0m0.001s
00:01:04.589 user 0m0.001s
00:01:04.589 sys 0m0.000s
00:01:04.589 20:14:56 ubsan -- common/autotest_common.sh@1124 -- $ xtrace_disable
00:01:04.589 20:14:56 ubsan -- common/autotest_common.sh@10 -- $ set +x
00:01:04.589 ************************************
00:01:04.589 END TEST ubsan
00:01:04.589 ************************************
00:01:04.589 20:14:56 -- common/autotest_common.sh@1142 -- $ return 0
00:01:04.589 20:14:56 -- spdk/autobuild.sh@27 -- $ '[' -n '' ']'
00:01:04.589 20:14:56 -- spdk/autobuild.sh@31 -- $ case "$SPDK_TEST_AUTOBUILD" in
00:01:04.589 20:14:56 -- spdk/autobuild.sh@47 -- $ [[ 0 -eq 1 ]]
00:01:04.589 20:14:56 -- spdk/autobuild.sh@51 -- $ [[ 0 -eq 1 ]]
00:01:04.589 20:14:56 -- spdk/autobuild.sh@55 -- $ [[ -n '' ]]
00:01:04.589 20:14:56 -- spdk/autobuild.sh@57 -- $ [[ 0 -eq 1 ]]
00:01:04.589 20:14:56 -- spdk/autobuild.sh@59 -- $ [[ 0 -eq 1 ]]
00:01:04.589 20:14:56 -- spdk/autobuild.sh@62 -- $ [[ 0 -eq 1 ]]
00:01:04.589 20:14:56 -- spdk/autobuild.sh@67 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk --with-shared
00:01:04.848 Using default SPDK env in /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk
00:01:04.848 Using default DPDK in /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build
00:01:05.419 Using 'verbs' RDMA provider
00:01:21.752 Configuring ISA-L (logfile: /var/jenkins/workspace/crypto-phy-autotest/spdk/.spdk-isal.log)...done.
00:01:36.635 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/crypto-phy-autotest/spdk/.spdk-isal-crypto.log)...done.
00:01:36.635 Creating mk/config.mk...done.
00:01:36.635 Creating mk/cc.flags.mk...done.
00:01:36.635 Type 'make' to build.
00:01:36.635 20:15:28 -- spdk/autobuild.sh@69 -- $ run_test make make -j72
00:01:36.635 20:15:28 -- common/autotest_common.sh@1099 -- $ '[' 3 -le 1 ']'
00:01:36.635 20:15:28 -- common/autotest_common.sh@1105 -- $ xtrace_disable
00:01:36.635 20:15:28 -- common/autotest_common.sh@10 -- $ set +x
00:01:36.635 ************************************
00:01:36.635 START TEST make
00:01:36.635 ************************************
00:01:36.635 20:15:28 make -- common/autotest_common.sh@1123 -- $ make -j72
00:01:36.635 make[1]: Nothing to be done for 'all'.
00:02:15.387 The Meson build system
00:02:15.387 Version: 1.3.1
00:02:15.387 Source dir: /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk
00:02:15.387 Build dir: /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp
00:02:15.387 Build type: native build
00:02:15.387 Program cat found: YES (/usr/bin/cat)
00:02:15.387 Project name: DPDK
00:02:15.387 Project version: 24.03.0
00:02:15.387 C compiler for the host machine: cc (gcc 13.2.1 "cc (GCC) 13.2.1 20231011 (Red Hat 13.2.1-4)")
00:02:15.387 C linker for the host machine: cc ld.bfd 2.39-16
00:02:15.387 Host machine cpu family: x86_64
00:02:15.387 Host machine cpu: x86_64
00:02:15.387 Message: ## Building in Developer Mode ##
00:02:15.387 Program pkg-config found: YES (/usr/bin/pkg-config)
00:02:15.387 Program check-symbols.sh found: YES (/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/buildtools/check-symbols.sh)
00:02:15.387 Program options-ibverbs-static.sh found: YES (/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/buildtools/options-ibverbs-static.sh)
00:02:15.387 Program python3 found: YES (/usr/bin/python3)
00:02:15.387 Program cat found: YES (/usr/bin/cat)
00:02:15.387 Compiler for C supports arguments -march=native: YES
00:02:15.387 Checking for size of "void *" : 8
00:02:15.387 Checking for size of "void *" : 8 (cached)
00:02:15.387 Compiler for C supports link arguments -Wl,--undefined-version: NO
00:02:15.387 Library m found: YES
00:02:15.387 Library numa found: YES
00:02:15.387 Has header "numaif.h" : YES
00:02:15.387 Library fdt found: NO
00:02:15.387 Library execinfo found: NO
00:02:15.387 Has header "execinfo.h" : YES
00:02:15.387 Found pkg-config: YES (/usr/bin/pkg-config) 1.8.0
00:02:15.387 Run-time dependency libarchive found: NO (tried pkgconfig)
00:02:15.387 Run-time dependency libbsd found: NO (tried pkgconfig)
00:02:15.387 Run-time dependency jansson found: NO (tried pkgconfig)
00:02:15.387 Run-time dependency openssl found: YES 3.0.9
00:02:15.387 Run-time dependency libpcap found: YES 1.10.4
00:02:15.387 Has header "pcap.h" with dependency libpcap: YES
00:02:15.387 Compiler for C supports arguments -Wcast-qual: YES
00:02:15.387 Compiler for C supports arguments -Wdeprecated: YES
00:02:15.387 Compiler for C supports arguments -Wformat: YES
00:02:15.387 Compiler for C supports arguments -Wformat-nonliteral: NO
00:02:15.387 Compiler for C supports arguments -Wformat-security: NO
00:02:15.387 Compiler for C supports arguments -Wmissing-declarations: YES
00:02:15.387 Compiler for C supports arguments -Wmissing-prototypes: YES
00:02:15.387 Compiler for C supports arguments -Wnested-externs: YES
00:02:15.387 Compiler for C supports arguments -Wold-style-definition: YES
00:02:15.387 Compiler for C supports arguments -Wpointer-arith: YES
00:02:15.387 Compiler for C supports arguments -Wsign-compare: YES
00:02:15.387 Compiler for C supports arguments -Wstrict-prototypes: YES
00:02:15.387 Compiler for C supports arguments -Wundef: YES
00:02:15.387 Compiler for C supports arguments -Wwrite-strings: YES
00:02:15.387 Compiler for C supports arguments -Wno-address-of-packed-member: YES
00:02:15.387 Compiler for C supports arguments -Wno-packed-not-aligned: YES
00:02:15.387 Compiler for C supports arguments -Wno-missing-field-initializers: YES
00:02:15.387 Compiler for C supports arguments -Wno-zero-length-bounds: YES
00:02:15.387 Program objdump found: YES (/usr/bin/objdump)
00:02:15.387 Compiler for C supports arguments -mavx512f: YES
00:02:15.387 Checking if "AVX512 checking" compiles: YES
00:02:15.387 Fetching value of define "__SSE4_2__" : 1
00:02:15.387 Fetching value of define "__AES__" : 1
00:02:15.387 Fetching value of define "__AVX__" : 1
00:02:15.387 Fetching value of define "__AVX2__" : 1
00:02:15.387 Fetching value of define "__AVX512BW__" : 1
00:02:15.387 Fetching value of define "__AVX512CD__" : 1
00:02:15.387 Fetching value of define "__AVX512DQ__" : 1
00:02:15.387 Fetching value of define "__AVX512F__" : 1
00:02:15.387 Fetching value of define "__AVX512VL__" : 1
00:02:15.387 Fetching value of define "__PCLMUL__" : 1
00:02:15.387 Fetching value of define "__RDRND__" : 1
00:02:15.387 Fetching value of define "__RDSEED__" : 1
00:02:15.387 Fetching value of define "__VPCLMULQDQ__" : (undefined)
00:02:15.387 Fetching value of define "__znver1__" : (undefined)
00:02:15.387 Fetching value of define "__znver2__" : (undefined)
00:02:15.387 Fetching value of define "__znver3__" : (undefined)
00:02:15.387 Fetching value of define "__znver4__" : (undefined)
00:02:15.387 Compiler for C supports arguments -Wno-format-truncation: YES
00:02:15.387 Message: lib/log: Defining dependency "log"
00:02:15.387 Message: lib/kvargs: Defining dependency "kvargs"
00:02:15.387 Message: lib/telemetry: Defining dependency "telemetry"
00:02:15.387 Checking for function "getentropy" : NO
00:02:15.387 Message: lib/eal: Defining dependency "eal"
00:02:15.387 Message: lib/ring: Defining dependency "ring"
00:02:15.387 Message: lib/rcu: Defining dependency "rcu"
00:02:15.387 Message: lib/mempool: Defining dependency "mempool"
00:02:15.387 Message: lib/mbuf: Defining dependency "mbuf"
00:02:15.387 Fetching value of define "__PCLMUL__" : 1 (cached)
00:02:15.387 Fetching value of define "__AVX512F__" : 1 (cached)
00:02:15.387 Fetching value of define "__AVX512BW__" : 1 (cached)
00:02:15.387 Fetching value of define "__AVX512DQ__" : 1 (cached)
00:02:15.387 Fetching value of define "__AVX512VL__" : 1 (cached)
00:02:15.387 Fetching value of define "__VPCLMULQDQ__" : (undefined) (cached)
00:02:15.387 Compiler for C supports arguments -mpclmul: YES
00:02:15.387 Compiler for C supports arguments -maes: YES
00:02:15.387 Compiler for C supports arguments -mavx512f: YES (cached)
00:02:15.387 Compiler for C supports arguments -mavx512bw: YES
00:02:15.387 Compiler for C supports arguments -mavx512dq: YES
00:02:15.387 Compiler for C supports arguments -mavx512vl: YES
00:02:15.387 Compiler for C supports arguments -mvpclmulqdq: YES
00:02:15.387 Compiler for C supports arguments -mavx2: YES
00:02:15.387 Compiler for C supports arguments -mavx: YES
00:02:15.387 Message: lib/net: Defining dependency "net"
00:02:15.387 Message: lib/meter: Defining dependency "meter"
00:02:15.387 Message: lib/ethdev: Defining dependency "ethdev"
00:02:15.387 Message: lib/pci: Defining dependency "pci"
00:02:15.387 Message: lib/cmdline: Defining dependency "cmdline"
00:02:15.387 Message: lib/hash: Defining dependency "hash"
00:02:15.387 Message: lib/timer: Defining dependency "timer"
00:02:15.387 Message: lib/compressdev: Defining dependency "compressdev"
00:02:15.387 Message: lib/cryptodev: Defining dependency "cryptodev"
00:02:15.387 Message: lib/dmadev: Defining dependency "dmadev"
00:02:15.387 Compiler for C supports arguments -Wno-cast-qual: YES
00:02:15.387 Message: lib/power: Defining dependency "power"
00:02:15.387 Message: lib/reorder: Defining dependency "reorder"
00:02:15.387 Message: lib/security: Defining dependency "security"
00:02:15.387 Has header "linux/userfaultfd.h" : YES
00:02:15.387 Has header "linux/vduse.h" : YES
00:02:15.388 Message: lib/vhost: Defining dependency "vhost"
00:02:15.388 Compiler for C supports arguments -Wno-format-truncation: YES (cached)
00:02:15.388 Message: drivers/bus/auxiliary: Defining dependency "bus_auxiliary"
00:02:15.388 Message: drivers/bus/pci: Defining dependency "bus_pci"
00:02:15.388 Message: drivers/bus/vdev: Defining dependency "bus_vdev"
00:02:15.388 Compiler for C supports arguments -std=c11: YES
00:02:15.388 Compiler for C supports arguments -Wno-strict-prototypes: YES
00:02:15.388 Compiler for C supports arguments -D_BSD_SOURCE: YES
00:02:15.388 Compiler for C supports arguments -D_DEFAULT_SOURCE: YES
00:02:15.388 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES
00:02:15.388 Run-time dependency libmlx5 found: YES 1.24.44.0
00:02:15.388 Run-time dependency libibverbs found: YES 1.14.44.0
00:02:15.388 Library mtcr_ul found: NO
00:02:15.388 Header "infiniband/verbs.h" has symbol "IBV_FLOW_SPEC_ESP" with dependencies libmlx5, libibverbs: YES
00:02:15.388 Header "infiniband/verbs.h" has symbol "IBV_RX_HASH_IPSEC_SPI" with dependencies libmlx5, libibverbs: YES
00:02:15.388 Header "infiniband/verbs.h" has symbol "IBV_ACCESS_RELAXED_ORDERING " with dependencies libmlx5, libibverbs: YES
00:02:15.388 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CQE_RES_FORMAT_CSUM_STRIDX" with dependencies libmlx5, libibverbs: YES
00:02:15.388 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_MASK_TUNNEL_OFFLOADS" with dependencies libmlx5, libibverbs: YES
00:02:15.388 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_FLAGS_MPW_ALLOWED" with dependencies libmlx5, libibverbs: YES
00:02:15.388 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_FLAGS_CQE_128B_COMP" with dependencies libmlx5, libibverbs: YES
00:02:15.388 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CQ_INIT_ATTR_FLAGS_CQE_PAD" with dependencies libmlx5, libibverbs: YES
00:02:15.388 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_create_flow_action_packet_reformat" with dependencies libmlx5, libibverbs: YES
00:02:15.388 Header "infiniband/verbs.h" has symbol "IBV_FLOW_SPEC_MPLS" with dependencies libmlx5, libibverbs: YES
00:02:15.388 Header "infiniband/verbs.h" has symbol "IBV_WQ_FLAGS_PCI_WRITE_END_PADDING" with dependencies libmlx5, libibverbs: YES
00:02:15.388 Header "infiniband/verbs.h" has symbol "IBV_WQ_FLAG_RX_END_PADDING" with dependencies libmlx5, libibverbs: NO
00:02:15.388 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_query_devx_port" with dependencies libmlx5, libibverbs: NO
00:02:15.388 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_query_port" with dependencies libmlx5, libibverbs: YES
00:02:15.388 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_ib_port" with dependencies libmlx5, libibverbs: YES
00:02:19.583 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_obj_create" with dependencies libmlx5, libibverbs: YES
00:02:19.583 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_FLOW_ACTION_COUNTERS_DEVX" with dependencies libmlx5, libibverbs: YES
00:02:19.583 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_FLOW_ACTION_DEFAULT_MISS" with dependencies libmlx5, libibverbs: YES
00:02:19.583 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_obj_query_async" with dependencies libmlx5, libibverbs: YES
00:02:19.583 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_qp_query" with dependencies libmlx5, libibverbs: YES
00:02:19.583 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_pp_alloc" with dependencies libmlx5, libibverbs: YES
00:02:19.583 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_devx_tir" with dependencies libmlx5, libibverbs: YES
00:02:19.583 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_get_event" with dependencies libmlx5, libibverbs: YES
00:02:19.583 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_flow_meter" with dependencies libmlx5, libibverbs: YES
00:02:19.583 Header "infiniband/mlx5dv.h" has symbol "MLX5_MMAP_GET_NC_PAGES_CMD" with dependencies libmlx5, libibverbs: YES 00:02:19.583 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_DOMAIN_TYPE_NIC_RX" with dependencies libmlx5, libibverbs: YES 00:02:19.583 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_DOMAIN_TYPE_FDB" with dependencies libmlx5, libibverbs: YES 00:02:19.583 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_push_vlan" with dependencies libmlx5, libibverbs: YES 00:02:19.583 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_alloc_var" with dependencies libmlx5, libibverbs: YES 00:02:19.583 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_ENHANCED_MPSW" with dependencies libmlx5, libibverbs: NO 00:02:19.583 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_SEND_EN" with dependencies libmlx5, libibverbs: NO 00:02:19.583 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_WAIT" with dependencies libmlx5, libibverbs: NO 00:02:19.584 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_ACCESS_ASO" with dependencies libmlx5, libibverbs: NO 00:02:19.584 Header "linux/if_link.h" has symbol "IFLA_NUM_VF" with dependencies libmlx5, libibverbs: YES 00:02:19.584 Header "linux/if_link.h" has symbol "IFLA_EXT_MASK" with dependencies libmlx5, libibverbs: YES 00:02:19.584 Header "linux/if_link.h" has symbol "IFLA_PHYS_SWITCH_ID" with dependencies libmlx5, libibverbs: YES 00:02:19.584 Header "linux/if_link.h" has symbol "IFLA_PHYS_PORT_NAME" with dependencies libmlx5, libibverbs: YES 00:02:19.584 Header "rdma/rdma_netlink.h" has symbol "RDMA_NL_NLDEV" with dependencies libmlx5, libibverbs: YES 00:02:19.584 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_CMD_GET" with dependencies libmlx5, libibverbs: YES 00:02:19.584 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_CMD_PORT_GET" with dependencies libmlx5, libibverbs: YES 00:02:19.584 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_DEV_INDEX" with dependencies libmlx5, 
libibverbs: YES 00:02:19.584 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_DEV_NAME" with dependencies libmlx5, libibverbs: YES 00:02:19.584 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_PORT_INDEX" with dependencies libmlx5, libibverbs: YES 00:02:19.584 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_PORT_STATE" with dependencies libmlx5, libibverbs: YES 00:02:19.584 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_NDEV_INDEX" with dependencies libmlx5, libibverbs: YES 00:02:19.584 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dump_dr_domain" with dependencies libmlx5, libibverbs: YES 00:02:19.584 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_flow_sampler" with dependencies libmlx5, libibverbs: YES 00:02:19.584 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_domain_set_reclaim_device_memory" with dependencies libmlx5, libibverbs: YES 00:02:19.584 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_array" with dependencies libmlx5, libibverbs: YES 00:02:19.584 Header "linux/devlink.h" has symbol "DEVLINK_GENL_NAME" with dependencies libmlx5, libibverbs: YES 00:02:19.584 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_aso" with dependencies libmlx5, libibverbs: YES 00:02:19.584 Header "infiniband/verbs.h" has symbol "INFINIBAND_VERBS_H" with dependencies libmlx5, libibverbs: YES 00:02:19.584 Header "infiniband/mlx5dv.h" has symbol "MLX5_WQE_UMR_CTRL_FLAG_INLINE" with dependencies libmlx5, libibverbs: YES 00:02:19.584 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dump_dr_rule" with dependencies libmlx5, libibverbs: YES 00:02:19.584 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_ACTION_FLAGS_ASO_CT_DIRECTION_INITIATOR" with dependencies libmlx5, libibverbs: YES 00:02:19.584 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_domain_allow_duplicate_rules" with dependencies libmlx5, libibverbs: YES 00:02:19.584 Header "infiniband/verbs.h" has symbol "ibv_reg_mr_iova" 
with dependencies libmlx5, libibverbs: YES 00:02:19.584 Header "infiniband/verbs.h" has symbol "ibv_import_device" with dependencies libmlx5, libibverbs: YES 00:02:19.584 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_root_table" with dependencies libmlx5, libibverbs: YES 00:02:19.584 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_create_steering_anchor" with dependencies libmlx5, libibverbs: YES 00:02:19.584 Header "infiniband/verbs.h" has symbol "ibv_is_fork_initialized" with dependencies libmlx5, libibverbs: YES 00:02:19.584 Checking whether type "struct mlx5dv_sw_parsing_caps" has member "sw_parsing_offloads" with dependencies libmlx5, libibverbs: YES 00:02:19.584 Checking whether type "struct ibv_counter_set_init_attr" has member "counter_set_id" with dependencies libmlx5, libibverbs: NO 00:02:19.584 Checking whether type "struct ibv_counters_init_attr" has member "comp_mask" with dependencies libmlx5, libibverbs: YES 00:02:19.584 Checking whether type "struct mlx5dv_devx_uar" has member "mmap_off" with dependencies libmlx5, libibverbs: YES 00:02:19.584 Checking whether type "struct mlx5dv_flow_matcher_attr" has member "ft_type" with dependencies libmlx5, libibverbs: YES 00:02:19.584 Configuring mlx5_autoconf.h using configuration 00:02:19.584 Message: drivers/common/mlx5: Defining dependency "common_mlx5" 00:02:19.584 Run-time dependency libcrypto found: YES 3.0.9 00:02:19.584 Library IPSec_MB found: YES 00:02:19.584 Fetching value of define "IMB_VERSION_STR" : "1.5.0" 00:02:19.584 Message: drivers/common/qat: Defining dependency "common_qat" 00:02:19.584 Message: drivers/mempool/ring: Defining dependency "mempool_ring" 00:02:19.584 Message: Disabling raw/* drivers: missing internal dependency "rawdev" 00:02:19.584 Library IPSec_MB found: YES 00:02:19.584 Fetching value of define "IMB_VERSION_STR" : "1.5.0" (cached) 00:02:19.584 Message: drivers/crypto/ipsec_mb: Defining dependency "crypto_ipsec_mb" 00:02:19.584 Compiler for C supports 
arguments -std=c11: YES (cached) 00:02:19.584 Compiler for C supports arguments -Wno-strict-prototypes: YES (cached) 00:02:19.584 Compiler for C supports arguments -D_BSD_SOURCE: YES (cached) 00:02:19.584 Compiler for C supports arguments -D_DEFAULT_SOURCE: YES (cached) 00:02:19.584 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES (cached) 00:02:19.584 Message: drivers/crypto/mlx5: Defining dependency "crypto_mlx5" 00:02:19.584 Run-time dependency libisal found: NO (tried pkgconfig) 00:02:19.584 Library libisal found: NO 00:02:19.584 Message: drivers/compress/isal: Defining dependency "compress_isal" 00:02:19.584 Compiler for C supports arguments -std=c11: YES (cached) 00:02:19.584 Compiler for C supports arguments -Wno-strict-prototypes: YES (cached) 00:02:19.584 Compiler for C supports arguments -D_BSD_SOURCE: YES (cached) 00:02:19.584 Compiler for C supports arguments -D_DEFAULT_SOURCE: YES (cached) 00:02:19.584 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES (cached) 00:02:19.584 Message: drivers/compress/mlx5: Defining dependency "compress_mlx5" 00:02:19.584 Message: Disabling regex/* drivers: missing internal dependency "regexdev" 00:02:19.584 Message: Disabling ml/* drivers: missing internal dependency "mldev" 00:02:19.584 Message: Disabling event/* drivers: missing internal dependency "eventdev" 00:02:19.584 Message: Disabling baseband/* drivers: missing internal dependency "bbdev" 00:02:19.584 Message: Disabling gpu/* drivers: missing internal dependency "gpudev" 00:02:19.584 Program doxygen found: YES (/usr/bin/doxygen) 00:02:19.584 Configuring doxy-api-html.conf using configuration 00:02:19.584 Configuring doxy-api-man.conf using configuration 00:02:19.584 Program mandb found: YES (/usr/bin/mandb) 00:02:19.584 Program sphinx-build found: NO 00:02:19.584 Configuring rte_build_config.h using configuration 00:02:19.584 Message: 00:02:19.584 ================= 00:02:19.584 Applications Enabled 00:02:19.584 ================= 00:02:19.584 
00:02:19.584 apps: 00:02:19.584 00:02:19.584 00:02:19.584 Message: 00:02:19.584 ================= 00:02:19.584 Libraries Enabled 00:02:19.584 ================= 00:02:19.584 00:02:19.584 libs: 00:02:19.584 log, kvargs, telemetry, eal, ring, rcu, mempool, mbuf, 00:02:19.584 net, meter, ethdev, pci, cmdline, hash, timer, compressdev, 00:02:19.584 cryptodev, dmadev, power, reorder, security, vhost, 00:02:19.584 00:02:19.584 Message: 00:02:19.584 =============== 00:02:19.584 Drivers Enabled 00:02:19.584 =============== 00:02:19.584 00:02:19.584 common: 00:02:19.584 mlx5, qat, 00:02:19.584 bus: 00:02:19.584 auxiliary, pci, vdev, 00:02:19.584 mempool: 00:02:19.584 ring, 00:02:19.584 dma: 00:02:19.584 00:02:19.584 net: 00:02:19.584 00:02:19.584 crypto: 00:02:19.584 ipsec_mb, mlx5, 00:02:19.584 compress: 00:02:19.584 isal, mlx5, 00:02:19.584 vdpa: 00:02:19.584 00:02:19.584 00:02:19.584 Message: 00:02:19.584 ================= 00:02:19.584 Content Skipped 00:02:19.584 ================= 00:02:19.584 00:02:19.584 apps: 00:02:19.584 dumpcap: explicitly disabled via build config 00:02:19.584 graph: explicitly disabled via build config 00:02:19.584 pdump: explicitly disabled via build config 00:02:19.584 proc-info: explicitly disabled via build config 00:02:19.584 test-acl: explicitly disabled via build config 00:02:19.584 test-bbdev: explicitly disabled via build config 00:02:19.584 test-cmdline: explicitly disabled via build config 00:02:19.584 test-compress-perf: explicitly disabled via build config 00:02:19.585 test-crypto-perf: explicitly disabled via build config 00:02:19.585 test-dma-perf: explicitly disabled via build config 00:02:19.585 test-eventdev: explicitly disabled via build config 00:02:19.585 test-fib: explicitly disabled via build config 00:02:19.585 test-flow-perf: explicitly disabled via build config 00:02:19.585 test-gpudev: explicitly disabled via build config 00:02:19.585 test-mldev: explicitly disabled via build config 00:02:19.585 test-pipeline: explicitly 
disabled via build config 00:02:19.585 test-pmd: explicitly disabled via build config 00:02:19.585 test-regex: explicitly disabled via build config 00:02:19.585 test-sad: explicitly disabled via build config 00:02:19.585 test-security-perf: explicitly disabled via build config 00:02:19.585 00:02:19.585 libs: 00:02:19.585 argparse: explicitly disabled via build config 00:02:19.585 metrics: explicitly disabled via build config 00:02:19.585 acl: explicitly disabled via build config 00:02:19.585 bbdev: explicitly disabled via build config 00:02:19.585 bitratestats: explicitly disabled via build config 00:02:19.585 bpf: explicitly disabled via build config 00:02:19.585 cfgfile: explicitly disabled via build config 00:02:19.585 distributor: explicitly disabled via build config 00:02:19.585 efd: explicitly disabled via build config 00:02:19.585 eventdev: explicitly disabled via build config 00:02:19.585 dispatcher: explicitly disabled via build config 00:02:19.585 gpudev: explicitly disabled via build config 00:02:19.585 gro: explicitly disabled via build config 00:02:19.585 gso: explicitly disabled via build config 00:02:19.585 ip_frag: explicitly disabled via build config 00:02:19.585 jobstats: explicitly disabled via build config 00:02:19.585 latencystats: explicitly disabled via build config 00:02:19.585 lpm: explicitly disabled via build config 00:02:19.585 member: explicitly disabled via build config 00:02:19.585 pcapng: explicitly disabled via build config 00:02:19.585 rawdev: explicitly disabled via build config 00:02:19.585 regexdev: explicitly disabled via build config 00:02:19.585 mldev: explicitly disabled via build config 00:02:19.585 rib: explicitly disabled via build config 00:02:19.585 sched: explicitly disabled via build config 00:02:19.585 stack: explicitly disabled via build config 00:02:19.585 ipsec: explicitly disabled via build config 00:02:19.585 pdcp: explicitly disabled via build config 00:02:19.585 fib: explicitly disabled via build config 
00:02:19.585 port: explicitly disabled via build config 00:02:19.585 pdump: explicitly disabled via build config 00:02:19.585 table: explicitly disabled via build config 00:02:19.585 pipeline: explicitly disabled via build config 00:02:19.585 graph: explicitly disabled via build config 00:02:19.585 node: explicitly disabled via build config 00:02:19.585 00:02:19.585 drivers: 00:02:19.585 common/cpt: not in enabled drivers build config 00:02:19.585 common/dpaax: not in enabled drivers build config 00:02:19.585 common/iavf: not in enabled drivers build config 00:02:19.585 common/idpf: not in enabled drivers build config 00:02:19.585 common/ionic: not in enabled drivers build config 00:02:19.585 common/mvep: not in enabled drivers build config 00:02:19.585 common/octeontx: not in enabled drivers build config 00:02:19.585 bus/cdx: not in enabled drivers build config 00:02:19.585 bus/dpaa: not in enabled drivers build config 00:02:19.585 bus/fslmc: not in enabled drivers build config 00:02:19.585 bus/ifpga: not in enabled drivers build config 00:02:19.585 bus/platform: not in enabled drivers build config 00:02:19.585 bus/uacce: not in enabled drivers build config 00:02:19.585 bus/vmbus: not in enabled drivers build config 00:02:19.585 common/cnxk: not in enabled drivers build config 00:02:19.585 common/nfp: not in enabled drivers build config 00:02:19.585 common/nitrox: not in enabled drivers build config 00:02:19.585 common/sfc_efx: not in enabled drivers build config 00:02:19.585 mempool/bucket: not in enabled drivers build config 00:02:19.585 mempool/cnxk: not in enabled drivers build config 00:02:19.585 mempool/dpaa: not in enabled drivers build config 00:02:19.585 mempool/dpaa2: not in enabled drivers build config 00:02:19.585 mempool/octeontx: not in enabled drivers build config 00:02:19.585 mempool/stack: not in enabled drivers build config 00:02:19.585 dma/cnxk: not in enabled drivers build config 00:02:19.585 dma/dpaa: not in enabled drivers build config 
00:02:19.585 dma/dpaa2: not in enabled drivers build config 00:02:19.585 dma/hisilicon: not in enabled drivers build config 00:02:19.585 dma/idxd: not in enabled drivers build config 00:02:19.585 dma/ioat: not in enabled drivers build config 00:02:19.585 dma/skeleton: not in enabled drivers build config 00:02:19.585 net/af_packet: not in enabled drivers build config 00:02:19.585 net/af_xdp: not in enabled drivers build config 00:02:19.585 net/ark: not in enabled drivers build config 00:02:19.585 net/atlantic: not in enabled drivers build config 00:02:19.585 net/avp: not in enabled drivers build config 00:02:19.585 net/axgbe: not in enabled drivers build config 00:02:19.585 net/bnx2x: not in enabled drivers build config 00:02:19.585 net/bnxt: not in enabled drivers build config 00:02:19.585 net/bonding: not in enabled drivers build config 00:02:19.585 net/cnxk: not in enabled drivers build config 00:02:19.585 net/cpfl: not in enabled drivers build config 00:02:19.585 net/cxgbe: not in enabled drivers build config 00:02:19.585 net/dpaa: not in enabled drivers build config 00:02:19.585 net/dpaa2: not in enabled drivers build config 00:02:19.585 net/e1000: not in enabled drivers build config 00:02:19.585 net/ena: not in enabled drivers build config 00:02:19.585 net/enetc: not in enabled drivers build config 00:02:19.585 net/enetfec: not in enabled drivers build config 00:02:19.585 net/enic: not in enabled drivers build config 00:02:19.585 net/failsafe: not in enabled drivers build config 00:02:19.585 net/fm10k: not in enabled drivers build config 00:02:19.585 net/gve: not in enabled drivers build config 00:02:19.585 net/hinic: not in enabled drivers build config 00:02:19.585 net/hns3: not in enabled drivers build config 00:02:19.585 net/i40e: not in enabled drivers build config 00:02:19.585 net/iavf: not in enabled drivers build config 00:02:19.585 net/ice: not in enabled drivers build config 00:02:19.585 net/idpf: not in enabled drivers build config 00:02:19.585 
net/igc: not in enabled drivers build config 00:02:19.585 net/ionic: not in enabled drivers build config 00:02:19.585 net/ipn3ke: not in enabled drivers build config 00:02:19.585 net/ixgbe: not in enabled drivers build config 00:02:19.585 net/mana: not in enabled drivers build config 00:02:19.585 net/memif: not in enabled drivers build config 00:02:19.585 net/mlx4: not in enabled drivers build config 00:02:19.585 net/mlx5: not in enabled drivers build config 00:02:19.585 net/mvneta: not in enabled drivers build config 00:02:19.585 net/mvpp2: not in enabled drivers build config 00:02:19.585 net/netvsc: not in enabled drivers build config 00:02:19.585 net/nfb: not in enabled drivers build config 00:02:19.585 net/nfp: not in enabled drivers build config 00:02:19.585 net/ngbe: not in enabled drivers build config 00:02:19.585 net/null: not in enabled drivers build config 00:02:19.585 net/octeontx: not in enabled drivers build config 00:02:19.585 net/octeon_ep: not in enabled drivers build config 00:02:19.585 net/pcap: not in enabled drivers build config 00:02:19.585 net/pfe: not in enabled drivers build config 00:02:19.585 net/qede: not in enabled drivers build config 00:02:19.585 net/ring: not in enabled drivers build config 00:02:19.585 net/sfc: not in enabled drivers build config 00:02:19.585 net/softnic: not in enabled drivers build config 00:02:19.585 net/tap: not in enabled drivers build config 00:02:19.585 net/thunderx: not in enabled drivers build config 00:02:19.585 net/txgbe: not in enabled drivers build config 00:02:19.585 net/vdev_netvsc: not in enabled drivers build config 00:02:19.585 net/vhost: not in enabled drivers build config 00:02:19.585 net/virtio: not in enabled drivers build config 00:02:19.585 net/vmxnet3: not in enabled drivers build config 00:02:19.585 raw/*: missing internal dependency, "rawdev" 00:02:19.585 crypto/armv8: not in enabled drivers build config 00:02:19.585 crypto/bcmfs: not in enabled drivers build config 00:02:19.586 
crypto/caam_jr: not in enabled drivers build config 00:02:19.586 crypto/ccp: not in enabled drivers build config 00:02:19.586 crypto/cnxk: not in enabled drivers build config 00:02:19.586 crypto/dpaa_sec: not in enabled drivers build config 00:02:19.586 crypto/dpaa2_sec: not in enabled drivers build config 00:02:19.586 crypto/mvsam: not in enabled drivers build config 00:02:19.586 crypto/nitrox: not in enabled drivers build config 00:02:19.586 crypto/null: not in enabled drivers build config 00:02:19.586 crypto/octeontx: not in enabled drivers build config 00:02:19.586 crypto/openssl: not in enabled drivers build config 00:02:19.586 crypto/scheduler: not in enabled drivers build config 00:02:19.586 crypto/uadk: not in enabled drivers build config 00:02:19.586 crypto/virtio: not in enabled drivers build config 00:02:19.586 compress/nitrox: not in enabled drivers build config 00:02:19.586 compress/octeontx: not in enabled drivers build config 00:02:19.586 compress/zlib: not in enabled drivers build config 00:02:19.586 regex/*: missing internal dependency, "regexdev" 00:02:19.586 ml/*: missing internal dependency, "mldev" 00:02:19.586 vdpa/ifc: not in enabled drivers build config 00:02:19.586 vdpa/mlx5: not in enabled drivers build config 00:02:19.586 vdpa/nfp: not in enabled drivers build config 00:02:19.586 vdpa/sfc: not in enabled drivers build config 00:02:19.586 event/*: missing internal dependency, "eventdev" 00:02:19.586 baseband/*: missing internal dependency, "bbdev" 00:02:19.586 gpu/*: missing internal dependency, "gpudev" 00:02:19.586 00:02:19.586 00:02:19.586 Build targets in project: 115 00:02:19.586 00:02:19.586 DPDK 24.03.0 00:02:19.586 00:02:19.586 User defined options 00:02:19.586 buildtype : debug 00:02:19.586 default_library : shared 00:02:19.586 libdir : lib 00:02:19.586 prefix : /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:02:19.586 c_args : -Wno-stringop-overflow -fcommon -Wno-stringop-overread -Wno-array-bounds 
-I/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib -DNO_COMPAT_IMB_API_053 -I/var/jenkins/workspace/crypto-phy-autotest/spdk/isa-l -I/var/jenkins/workspace/crypto-phy-autotest/spdk/isalbuild -fPIC -Werror 00:02:19.586 c_link_args : -L/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib -L/var/jenkins/workspace/crypto-phy-autotest/spdk/isa-l/.libs -lisal 00:02:19.586 cpu_instruction_set: native 00:02:19.586 disable_apps : test-sad,test-acl,test-dma-perf,test-pipeline,test-compress-perf,test-fib,test-flow-perf,test-crypto-perf,test-bbdev,test-eventdev,pdump,test-mldev,test-cmdline,graph,test-security-perf,test-pmd,test,proc-info,test-regex,dumpcap,test-gpudev 00:02:19.586 disable_libs : port,sched,rib,node,ipsec,distributor,gro,eventdev,pdcp,acl,member,latencystats,efd,stack,regexdev,rawdev,bpf,metrics,gpudev,pipeline,pdump,table,fib,dispatcher,mldev,gso,cfgfile,bitratestats,ip_frag,graph,lpm,jobstats,argparse,pcapng,bbdev 00:02:19.586 enable_docs : false 00:02:19.586 enable_drivers : bus,bus/pci,bus/vdev,mempool/ring,crypto/qat,compress/qat,common/qat,common/mlx5,bus/auxiliary,crypto,crypto/aesni_mb,crypto/mlx5,crypto/ipsec_mb,compress,compress/isal,compress/mlx5 00:02:19.586 enable_kmods : false 00:02:19.586 max_lcores : 128 00:02:19.586 tests : false 00:02:19.586 00:02:19.586 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:02:20.155 ninja: Entering directory `/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp' 00:02:20.155 [1/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hypervisor.c.o 00:02:20.155 [2/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_version.c.o 00:02:20.155 [3/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_cpuflags.c.o 00:02:20.155 [4/378] Compiling C object lib/librte_log.a.p/log_log_linux.c.o 00:02:20.155 [5/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_string_fns.c.o 00:02:20.155 [6/378] Compiling C object 
lib/librte_eal.a.p/eal_common_rte_reciprocal.c.o 00:02:20.155 [7/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_spinlock.c.o 00:02:20.155 [8/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hexdump.c.o 00:02:20.155 [9/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_hypervisor.c.o 00:02:20.155 [10/378] Compiling C object lib/librte_kvargs.a.p/kvargs_rte_kvargs.c.o 00:02:20.155 [11/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_errno.c.o 00:02:20.414 [12/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_cpuflags.c.o 00:02:20.414 [13/378] Compiling C object lib/librte_log.a.p/log_log.c.o 00:02:20.414 [14/378] Linking static target lib/librte_kvargs.a 00:02:20.414 [15/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_uuid.c.o 00:02:20.414 [16/378] Linking static target lib/librte_log.a 00:02:20.414 [17/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cpuflags.c.o 00:02:20.414 [18/378] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_data.c.o 00:02:20.414 [19/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_debug.c.o 00:02:20.686 [20/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_tailqs.c.o 00:02:20.686 [21/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_firmware.c.o 00:02:20.686 [22/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_lcore.c.o 00:02:20.686 [23/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_power_intrinsics.c.o 00:02:20.686 [24/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_debug.c.o 00:02:20.686 [25/378] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry.c.o 00:02:20.686 [26/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_class.c.o 00:02:20.686 [27/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_keepalive.c.o 00:02:20.686 [28/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_launch.c.o 00:02:20.686 [29/378] Compiling C object 
lib/librte_eal.a.p/eal_unix_eal_file.c.o 00:02:20.686 [30/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_service.c.o 00:02:20.686 [31/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_timer.c.o 00:02:20.686 [32/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memalloc.c.o 00:02:20.686 [33/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_thread.c.o 00:02:20.686 [34/378] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops_default.c.o 00:02:20.686 [35/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_ctf.c.o 00:02:20.686 [36/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_timer.c.o 00:02:20.948 [37/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_devargs.c.o 00:02:20.948 [38/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio_mp_sync.c.o 00:02:20.948 [39/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_config.c.o 00:02:20.948 [40/378] Generating lib/kvargs.sym_chk with a custom command (wrapped by meson to capture output) 00:02:20.948 [41/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dynmem.c.o 00:02:20.948 [42/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_timer.c.o 00:02:20.948 [43/378] Compiling C object lib/librte_eal.a.p/eal_common_malloc_elem.c.o 00:02:20.948 [44/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_random.c.o 00:02:20.948 [45/378] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_legacy.c.o 00:02:20.948 [46/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_bus.c.o 00:02:20.948 [47/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_alarm.c.o 00:02:20.948 [48/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_mcfg.c.o 00:02:20.948 [49/378] Compiling C object lib/librte_eal.a.p/eal_common_hotplug_mp.c.o 00:02:20.948 [50/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_malloc.c.o 00:02:20.948 [51/378] Compiling C 
object lib/librte_eal.a.p/eal_common_eal_common_thread.c.o 00:02:20.948 [52/378] Linking static target lib/librte_telemetry.a 00:02:20.948 [53/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memzone.c.o 00:02:20.948 [54/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_memory.c.o 00:02:20.948 [55/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_thread.c.o 00:02:20.948 [56/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cycles.c.o 00:02:20.948 [57/378] Compiling C object lib/librte_eal.a.p/eal_unix_rte_thread.c.o 00:02:20.948 [58/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_lcore.c.o 00:02:20.948 [59/378] Compiling C object lib/librte_ring.a.p/ring_rte_ring.c.o 00:02:20.948 [60/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_fbarray.c.o 00:02:20.948 [61/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace.c.o 00:02:20.948 [62/378] Compiling C object lib/librte_eal.a.p/eal_common_malloc_heap.c.o 00:02:20.948 [63/378] Compiling C object lib/librte_eal.a.p/eal_common_malloc_mp.c.o 00:02:20.948 [64/378] Linking static target lib/librte_ring.a 00:02:20.948 [65/378] Compiling C object lib/net/libnet_crc_avx512_lib.a.p/net_crc_avx512.c.o 00:02:20.948 [66/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_vt100.c.o 00:02:20.948 [67/378] Compiling C object lib/librte_net.a.p/net_rte_net_crc.c.o 00:02:20.948 [68/378] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops.c.o 00:02:20.948 [69/378] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_ptype.c.o 00:02:20.948 [70/378] Linking static target lib/net/libnet_crc_avx512_lib.a 00:02:20.948 [71/378] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_pool_ops.c.o 00:02:20.948 [72/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_string.c.o 00:02:20.948 [73/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_hugepage_info.c.o 00:02:20.948 [74/378] Compiling C object 
lib/librte_cmdline.a.p/cmdline_cmdline_os_unix.c.o 00:02:20.948 [75/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_num.c.o 00:02:20.948 [76/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_socket.c.o 00:02:20.948 [77/378] Compiling C object lib/librte_mempool.a.p/mempool_mempool_trace_points.c.o 00:02:20.948 [78/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_cirbuf.c.o 00:02:20.948 [79/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_portlist.c.o 00:02:20.948 [80/378] Compiling C object lib/librte_hash.a.p/hash_rte_hash_crc.c.o 00:02:20.948 [81/378] Compiling C object lib/librte_pci.a.p/pci_rte_pci.c.o 00:02:20.948 [82/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse.c.o 00:02:20.948 [83/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline.c.o 00:02:20.948 [84/378] Linking static target lib/librte_pci.a 00:02:20.948 [85/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memory.c.o 00:02:20.948 [86/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_proc.c.o 00:02:20.948 [87/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_interrupts.c.o 00:02:20.948 [88/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal.c.o 00:02:20.948 [89/378] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_dyn.c.o 00:02:20.948 [90/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memory.c.o 00:02:20.948 [91/378] Compiling C object lib/librte_net.a.p/net_rte_ether.c.o 00:02:20.948 [92/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_rdline.c.o 00:02:20.948 [93/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_interrupts.c.o 00:02:21.208 [94/378] Compiling C object lib/librte_power.a.p/power_power_common.c.o 00:02:21.208 [95/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio.c.o 00:02:21.208 [96/378] Compiling C object lib/librte_power.a.p/power_power_kvm_vm.c.o 00:02:21.208 [97/378] Compiling C object 
lib/librte_mempool.a.p/mempool_rte_mempool.c.o 00:02:21.208 [98/378] Compiling C object lib/librte_net.a.p/net_rte_net.c.o 00:02:21.208 [99/378] Compiling C object lib/librte_power.a.p/power_guest_channel.c.o 00:02:21.208 [100/378] Compiling C object lib/librte_rcu.a.p/rcu_rte_rcu_qsbr.c.o 00:02:21.208 [101/378] Linking static target lib/librte_mempool.a 00:02:21.208 [102/378] Linking static target lib/librte_rcu.a 00:02:21.208 [103/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_options.c.o 00:02:21.208 [104/378] Compiling C object lib/librte_vhost.a.p/vhost_fd_man.c.o 00:02:21.208 [105/378] Compiling C object lib/librte_meter.a.p/meter_rte_meter.c.o 00:02:21.208 [106/378] Linking static target lib/librte_meter.a 00:02:21.208 [107/378] Compiling C object lib/librte_net.a.p/net_rte_arp.c.o 00:02:21.208 [108/378] Compiling C object lib/librte_net.a.p/net_net_crc_sse.c.o 00:02:21.208 [109/378] Linking static target lib/librte_net.a 00:02:21.208 [110/378] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_auxiliary_params.c.o 00:02:21.208 [111/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memalloc.c.o 00:02:21.208 [112/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_params.c.o 00:02:21.208 [113/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_profile.c.o 00:02:21.208 [114/378] Generating lib/log.sym_chk with a custom command (wrapped by meson to capture output) 00:02:21.469 [115/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_private.c.o 00:02:21.469 [116/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_driver.c.o 00:02:21.469 [117/378] Linking target lib/librte_log.so.24.1 00:02:21.469 [118/378] Generating lib/pci.sym_chk with a custom command (wrapped by meson to capture output) 00:02:21.469 [119/378] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev_params.c.o 00:02:21.469 [120/378] Compiling C object lib/librte_hash.a.p/hash_rte_thash_gfni.c.o 
00:02:21.469 [121/378] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf.c.o 00:02:21.469 [122/378] Generating lib/ring.sym_chk with a custom command (wrapped by meson to capture output) 00:02:21.469 [123/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_etheraddr.c.o 00:02:21.469 [124/378] Linking static target lib/librte_mbuf.a 00:02:21.469 [125/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_class_eth.c.o 00:02:21.469 [126/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_linux_ethtool.c.o 00:02:21.469 [127/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_ipaddr.c.o 00:02:21.469 [128/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8472.c.o 00:02:21.469 [129/378] Compiling C object lib/librte_hash.a.p/hash_rte_fbk_hash.c.o 00:02:21.469 [130/378] Linking static target lib/librte_cmdline.a 00:02:21.469 [131/378] Generating symbol file lib/librte_log.so.24.1.p/librte_log.so.24.1.symbols 00:02:21.469 [132/378] Generating lib/telemetry.sym_chk with a custom command (wrapped by meson to capture output) 00:02:21.755 [133/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_telemetry.c.o 00:02:21.755 [134/378] Compiling C object lib/librte_timer.a.p/timer_rte_timer.c.o 00:02:21.755 [135/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dev.c.o 00:02:21.755 [136/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_points.c.o 00:02:21.755 [137/378] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev_trace_points.c.o 00:02:21.755 [138/378] Compiling C object lib/librte_power.a.p/power_rte_power_uncore.c.o 00:02:21.755 [139/378] Linking static target lib/librte_timer.a 00:02:21.755 [140/378] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev_pmd.c.o 00:02:21.755 [141/378] Compiling C object lib/librte_power.a.p/power_rte_power.c.o 00:02:21.755 [142/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_utils.c.o 00:02:21.755 
[143/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_common.c.o 00:02:21.755 [144/378] Linking target lib/librte_kvargs.so.24.1 00:02:21.755 [145/378] Linking target lib/librte_telemetry.so.24.1 00:02:21.755 [146/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_filesystem.c.o 00:02:21.755 [147/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8079.c.o 00:02:21.755 [148/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_dev.c.o 00:02:21.756 [149/378] Generating lib/rcu.sym_chk with a custom command (wrapped by meson to capture output) 00:02:21.756 [150/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_cman.c.o 00:02:21.756 [151/378] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev.c.o 00:02:21.756 [152/378] Compiling C object lib/librte_power.a.p/power_power_intel_uncore.c.o 00:02:21.756 [153/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_tm.c.o 00:02:21.756 [154/378] Compiling C object lib/librte_power.a.p/power_power_cppc_cpufreq.c.o 00:02:21.756 [155/378] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_pmd.c.o 00:02:21.756 [156/378] Generating lib/net.sym_chk with a custom command (wrapped by meson to capture output) 00:02:21.756 [157/378] Generating lib/meter.sym_chk with a custom command (wrapped by meson to capture output) 00:02:21.756 [158/378] Compiling C object lib/librte_hash.a.p/hash_rte_thash.c.o 00:02:21.756 [159/378] Compiling C object lib/librte_power.a.p/power_power_amd_pstate_cpufreq.c.o 00:02:21.756 [160/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_telemetry.c.o 00:02:21.756 [161/378] Linking static target lib/librte_eal.a 00:02:21.756 [162/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8636.c.o 00:02:21.756 [163/378] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_linux_auxiliary.c.o 00:02:21.756 [164/378] Compiling C object lib/librte_power.a.p/power_power_acpi_cpufreq.c.o 00:02:21.756 [165/378] 
Compiling C object lib/librte_compressdev.a.p/compressdev_rte_comp.c.o 00:02:21.756 [166/378] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_auxiliary_common.c.o 00:02:21.756 [167/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_logs.c.o 00:02:21.756 [168/378] Compiling C object lib/librte_power.a.p/power_power_pstate_cpufreq.c.o 00:02:21.756 [169/378] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev.c.o 00:02:21.756 [170/378] Linking static target lib/librte_compressdev.a 00:02:21.756 [171/378] Linking static target drivers/libtmp_rte_bus_auxiliary.a 00:02:21.756 [172/378] Linking static target lib/librte_dmadev.a 00:02:21.756 [173/378] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_trace_points.c.o 00:02:21.756 [174/378] Compiling C object lib/librte_vhost.a.p/vhost_vdpa.c.o 00:02:21.756 [175/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_mtr.c.o 00:02:21.756 [176/378] Compiling C object lib/librte_power.a.p/power_rte_power_pmd_mgmt.c.o 00:02:21.756 [177/378] Compiling C object lib/librte_vhost.a.p/vhost_iotlb.c.o 00:02:21.756 [178/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_trace_points.c.o 00:02:21.756 [179/378] Linking static target lib/librte_power.a 00:02:22.014 [180/378] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net_ctrl.c.o 00:02:22.014 [181/378] Compiling C object lib/librte_reorder.a.p/reorder_rte_reorder.c.o 00:02:22.014 [182/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common_uio.c.o 00:02:22.014 [183/378] Generating symbol file lib/librte_kvargs.so.24.1.p/librte_kvargs.so.24.1.symbols 00:02:22.014 [184/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common.c.o 00:02:22.014 [185/378] Linking static target lib/librte_reorder.a 00:02:22.014 [186/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_vfio.c.o 00:02:22.014 [187/378] Generating symbol file 
lib/librte_telemetry.so.24.1.p/librte_telemetry.so.24.1.symbols 00:02:22.014 [188/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci.c.o 00:02:22.014 [189/378] Compiling C object lib/librte_vhost.a.p/vhost_socket.c.o 00:02:22.014 [190/378] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev.c.o 00:02:22.014 [191/378] Linking static target drivers/libtmp_rte_bus_vdev.a 00:02:22.014 [192/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_glue.c.o 00:02:22.014 [193/378] Compiling C object lib/librte_security.a.p/security_rte_security.c.o 00:02:22.014 [194/378] Linking static target lib/librte_security.a 00:02:22.014 [195/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_uio.c.o 00:02:22.014 [196/378] Linking static target drivers/libtmp_rte_bus_pci.a 00:02:22.277 [197/378] Compiling C object lib/librte_vhost.a.p/vhost_vduse.c.o 00:02:22.277 [198/378] Generating drivers/rte_bus_auxiliary.pmd.c with a custom command 00:02:22.277 [199/378] Compiling C object lib/librte_vhost.a.p/vhost_vhost.c.o 00:02:22.277 [200/378] Compiling C object drivers/librte_bus_auxiliary.a.p/meson-generated_.._rte_bus_auxiliary.pmd.c.o 00:02:22.277 [201/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_mp.c.o 00:02:22.277 [202/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_pf2vf.c.o 00:02:22.277 [203/378] Linking static target drivers/librte_bus_auxiliary.a 00:02:22.277 [204/378] Compiling C object lib/librte_vhost.a.p/vhost_vhost_user.c.o 00:02:22.277 [205/378] Compiling C object drivers/librte_bus_auxiliary.so.24.1.p/meson-generated_.._rte_bus_auxiliary.pmd.c.o 00:02:22.277 [206/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_utils.c.o 00:02:22.277 [207/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_malloc.c.o 00:02:22.277 [208/378] Generating drivers/rte_bus_vdev.pmd.c with a custom 
command 00:02:22.277 [209/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_pci.c.o 00:02:22.277 [210/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_common.c.o 00:02:22.277 [211/378] Compiling C object lib/librte_hash.a.p/hash_rte_cuckoo_hash.c.o 00:02:22.277 [212/378] Compiling C object lib/librte_cryptodev.a.p/cryptodev_rte_cryptodev.c.o 00:02:22.277 [213/378] Compiling C object drivers/librte_bus_vdev.a.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:02:22.277 [214/378] Linking static target lib/librte_hash.a 00:02:22.277 [215/378] Compiling C object drivers/librte_bus_vdev.so.24.1.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:02:22.277 [216/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen3.c.o 00:02:22.277 [217/378] Linking static target drivers/librte_bus_vdev.a 00:02:22.277 [218/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen1.c.o 00:02:22.277 [219/378] Linking static target lib/librte_cryptodev.a 00:02:22.538 [220/378] Generating lib/timer.sym_chk with a custom command (wrapped by meson to capture output) 00:02:22.538 [221/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common.c.o 00:02:22.538 [222/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen2.c.o 00:02:22.538 [223/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen5.c.o 00:02:22.538 [224/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen_lce.c.o 00:02:22.538 [225/378] Generating lib/mempool.sym_chk with a custom command (wrapped by meson to capture output) 00:02:22.538 [226/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_devx.c.o 00:02:22.538 [227/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen5.c.o 00:02:22.538 [228/378] Compiling C object 
drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen2.c.o 00:02:22.538 [229/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_auxiliary.c.o 00:02:22.538 [230/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen4.c.o 00:02:22.538 [231/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen1.c.o 00:02:22.538 [232/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen4.c.o 00:02:22.538 [233/378] Generating drivers/rte_bus_pci.pmd.c with a custom command 00:02:22.538 [234/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_verbs.c.o 00:02:22.538 [235/378] Generating lib/reorder.sym_chk with a custom command (wrapped by meson to capture output) 00:02:22.538 [236/378] Compiling C object drivers/librte_bus_pci.so.24.1.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:02:22.538 [237/378] Compiling C object drivers/librte_bus_pci.a.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:02:22.538 [238/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_device.c.o 00:02:22.538 [239/378] Linking static target drivers/librte_bus_pci.a 00:02:22.538 [240/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_sym.c.o 00:02:22.538 [241/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_crypto.c.o 00:02:22.538 [242/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_qat_comp_pmd.c.o 00:02:22.538 [243/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_asym_pmd_gen1.c.o 00:02:22.538 [244/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_flow.c.o 00:02:22.538 [245/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_nl.c.o 00:02:22.538 [246/378] Generating drivers/rte_bus_auxiliary.sym_chk with a custom command (wrapped by meson to capture output) 00:02:22.538 [247/378] 
Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_ipsec_mb_private.c.o 00:02:22.538 [248/378] Generating lib/dmadev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:22.538 [249/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen2.c.o 00:02:22.538 [250/378] Generating lib/compressdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:22.538 [251/378] Generating lib/mbuf.sym_chk with a custom command (wrapped by meson to capture output) 00:02:22.538 [252/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_mr.c.o 00:02:22.538 [253/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_ipsec_mb_ops.c.o 00:02:22.538 [254/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen_lce.c.o 00:02:22.797 [255/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen5.c.o 00:02:22.797 [256/378] Generating lib/security.sym_chk with a custom command (wrapped by meson to capture output) 00:02:22.797 [257/378] Compiling C object drivers/libtmp_rte_mempool_ring.a.p/mempool_ring_rte_mempool_ring.c.o 00:02:22.797 [258/378] Linking static target drivers/libtmp_rte_mempool_ring.a 00:02:22.797 [259/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_qp.c.o 00:02:22.797 [260/378] Generating drivers/rte_bus_vdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:22.797 [261/378] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto.c.o 00:02:22.797 [262/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_devx_cmds.c.o 00:02:22.797 [263/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_kasumi.c.o 00:02:22.797 [264/378] Generating lib/power.sym_chk with a custom command (wrapped by meson to capture output) 00:02:22.797 [265/378] Compiling C 
object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen4.c.o 00:02:22.797 [266/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_chacha_poly.c.o 00:02:22.797 [267/378] Compiling C object drivers/libtmp_rte_compress_mlx5.a.p/compress_mlx5_mlx5_compress.c.o 00:02:22.797 [268/378] Linking static target drivers/libtmp_rte_compress_mlx5.a 00:02:23.056 [269/378] Compiling C object drivers/libtmp_rte_compress_isal.a.p/compress_isal_isal_compress_pmd_ops.c.o 00:02:23.056 [270/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_zuc.c.o 00:02:23.056 [271/378] Generating drivers/rte_mempool_ring.pmd.c with a custom command 00:02:23.056 [272/378] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_xts.c.o 00:02:23.056 [273/378] Compiling C object drivers/librte_mempool_ring.so.24.1.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:02:23.056 [274/378] Compiling C object drivers/librte_mempool_ring.a.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:02:23.056 [275/378] Linking static target drivers/librte_mempool_ring.a 00:02:23.056 [276/378] Generating lib/cmdline.sym_chk with a custom command (wrapped by meson to capture output) 00:02:23.056 [277/378] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_dek.c.o 00:02:23.056 [278/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen3.c.o 00:02:23.056 [279/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_aesni_gcm.c.o 00:02:23.056 [280/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_sym_session.c.o 00:02:23.056 [281/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_snow3g.c.o 00:02:23.056 [282/378] Compiling C object lib/librte_vhost.a.p/vhost_vhost_crypto.c.o 00:02:23.056 [283/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev.c.o 00:02:23.056 [284/378] 
Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_sym_pmd_gen1.c.o 00:02:23.056 [285/378] Linking static target lib/librte_ethdev.a 00:02:23.056 [286/378] Generating drivers/rte_compress_mlx5.pmd.c with a custom command 00:02:23.056 [287/378] Compiling C object drivers/libtmp_rte_compress_isal.a.p/compress_isal_isal_compress_pmd.c.o 00:02:23.056 [288/378] Linking static target drivers/libtmp_rte_compress_isal.a 00:02:23.056 [289/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_aesni_mb.c.o 00:02:23.056 [290/378] Compiling C object drivers/librte_compress_mlx5.a.p/meson-generated_.._rte_compress_mlx5.pmd.c.o 00:02:23.056 [291/378] Compiling C object drivers/librte_compress_mlx5.so.24.1.p/meson-generated_.._rte_compress_mlx5.pmd.c.o 00:02:23.056 [292/378] Linking static target drivers/librte_compress_mlx5.a 00:02:23.056 [293/378] Linking static target drivers/libtmp_rte_crypto_ipsec_mb.a 00:02:23.315 [294/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_qat_comp.c.o 00:02:23.315 [295/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen3.c.o 00:02:23.315 [296/378] Generating lib/hash.sym_chk with a custom command (wrapped by meson to capture output) 00:02:23.315 [297/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_os.c.o 00:02:23.315 [298/378] Generating drivers/rte_bus_pci.sym_chk with a custom command (wrapped by meson to capture output) 00:02:23.315 [299/378] Linking static target drivers/libtmp_rte_common_mlx5.a 00:02:23.315 [300/378] Generating drivers/rte_compress_isal.pmd.c with a custom command 00:02:23.315 [301/378] Compiling C object drivers/librte_compress_isal.a.p/meson-generated_.._rte_compress_isal.pmd.c.o 00:02:23.315 [302/378] Compiling C object drivers/librte_compress_isal.so.24.1.p/meson-generated_.._rte_compress_isal.pmd.c.o 00:02:23.315 [303/378] Linking static target 
drivers/librte_compress_isal.a 00:02:23.575 [304/378] Generating drivers/rte_crypto_ipsec_mb.pmd.c with a custom command 00:02:23.575 [305/378] Compiling C object drivers/librte_crypto_ipsec_mb.a.p/meson-generated_.._rte_crypto_ipsec_mb.pmd.c.o 00:02:23.575 [306/378] Compiling C object drivers/librte_crypto_ipsec_mb.so.24.1.p/meson-generated_.._rte_crypto_ipsec_mb.pmd.c.o 00:02:23.575 [307/378] Linking static target drivers/librte_crypto_ipsec_mb.a 00:02:23.575 [308/378] Generating drivers/rte_common_mlx5.pmd.c with a custom command 00:02:23.575 [309/378] Compiling C object drivers/librte_common_mlx5.so.24.1.p/meson-generated_.._rte_common_mlx5.pmd.c.o 00:02:23.575 [310/378] Compiling C object drivers/librte_common_mlx5.a.p/meson-generated_.._rte_common_mlx5.pmd.c.o 00:02:23.575 [311/378] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_gcm.c.o 00:02:23.575 [312/378] Linking static target drivers/libtmp_rte_crypto_mlx5.a 00:02:23.575 [313/378] Linking static target drivers/librte_common_mlx5.a 00:02:23.833 [314/378] Generating drivers/rte_crypto_mlx5.pmd.c with a custom command 00:02:23.833 [315/378] Compiling C object drivers/librte_crypto_mlx5.a.p/meson-generated_.._rte_crypto_mlx5.pmd.c.o 00:02:23.833 [316/378] Compiling C object drivers/librte_crypto_mlx5.so.24.1.p/meson-generated_.._rte_crypto_mlx5.pmd.c.o 00:02:23.833 [317/378] Linking static target drivers/librte_crypto_mlx5.a 00:02:24.091 [318/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_asym.c.o 00:02:24.091 [319/378] Linking static target drivers/libtmp_rte_common_qat.a 00:02:24.351 [320/378] Generating drivers/rte_common_qat.pmd.c with a custom command 00:02:24.351 [321/378] Compiling C object drivers/librte_common_qat.so.24.1.p/meson-generated_.._rte_common_qat.pmd.c.o 00:02:24.351 [322/378] Compiling C object drivers/librte_common_qat.a.p/meson-generated_.._rte_common_qat.pmd.c.o 00:02:24.351 [323/378] Linking static target 
drivers/librte_common_qat.a 00:02:24.610 [324/378] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net.c.o 00:02:24.610 [325/378] Generating lib/cryptodev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:24.610 [326/378] Linking static target lib/librte_vhost.a 00:02:27.187 [327/378] Generating lib/vhost.sym_chk with a custom command (wrapped by meson to capture output) 00:02:29.723 [328/378] Generating drivers/rte_common_mlx5.sym_chk with a custom command (wrapped by meson to capture output) 00:02:33.916 [329/378] Generating lib/ethdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:34.852 [330/378] Generating lib/eal.sym_chk with a custom command (wrapped by meson to capture output) 00:02:35.109 [331/378] Linking target lib/librte_eal.so.24.1 00:02:35.109 [332/378] Generating symbol file lib/librte_eal.so.24.1.p/librte_eal.so.24.1.symbols 00:02:35.367 [333/378] Linking target lib/librte_timer.so.24.1 00:02:35.367 [334/378] Linking target lib/librte_meter.so.24.1 00:02:35.367 [335/378] Linking target lib/librte_pci.so.24.1 00:02:35.367 [336/378] Linking target lib/librte_dmadev.so.24.1 00:02:35.367 [337/378] Linking target lib/librte_ring.so.24.1 00:02:35.367 [338/378] Linking target drivers/librte_bus_vdev.so.24.1 00:02:35.367 [339/378] Linking target drivers/librte_bus_auxiliary.so.24.1 00:02:35.367 [340/378] Generating symbol file lib/librte_meter.so.24.1.p/librte_meter.so.24.1.symbols 00:02:35.367 [341/378] Generating symbol file lib/librte_ring.so.24.1.p/librte_ring.so.24.1.symbols 00:02:35.367 [342/378] Generating symbol file drivers/librte_bus_vdev.so.24.1.p/librte_bus_vdev.so.24.1.symbols 00:02:35.367 [343/378] Generating symbol file lib/librte_dmadev.so.24.1.p/librte_dmadev.so.24.1.symbols 00:02:35.367 [344/378] Generating symbol file lib/librte_timer.so.24.1.p/librte_timer.so.24.1.symbols 00:02:35.367 [345/378] Linking target lib/librte_mempool.so.24.1 00:02:35.625 [346/378] Linking target 
lib/librte_rcu.so.24.1 00:02:35.625 [347/378] Generating symbol file lib/librte_pci.so.24.1.p/librte_pci.so.24.1.symbols 00:02:35.625 [348/378] Generating symbol file drivers/librte_bus_auxiliary.so.24.1.p/librte_bus_auxiliary.so.24.1.symbols 00:02:35.625 [349/378] Linking target drivers/librte_bus_pci.so.24.1 00:02:35.625 [350/378] Generating symbol file lib/librte_rcu.so.24.1.p/librte_rcu.so.24.1.symbols 00:02:35.625 [351/378] Generating symbol file lib/librte_mempool.so.24.1.p/librte_mempool.so.24.1.symbols 00:02:35.625 [352/378] Linking target drivers/librte_mempool_ring.so.24.1 00:02:35.625 [353/378] Linking target lib/librte_mbuf.so.24.1 00:02:35.884 [354/378] Generating symbol file drivers/librte_bus_pci.so.24.1.p/librte_bus_pci.so.24.1.symbols 00:02:35.884 [355/378] Generating symbol file lib/librte_mbuf.so.24.1.p/librte_mbuf.so.24.1.symbols 00:02:35.884 [356/378] Linking target lib/librte_reorder.so.24.1 00:02:35.884 [357/378] Linking target lib/librte_compressdev.so.24.1 00:02:35.884 [358/378] Linking target lib/librte_net.so.24.1 00:02:35.884 [359/378] Linking target lib/librte_cryptodev.so.24.1 00:02:36.142 [360/378] Generating symbol file lib/librte_compressdev.so.24.1.p/librte_compressdev.so.24.1.symbols 00:02:36.142 [361/378] Generating symbol file lib/librte_net.so.24.1.p/librte_net.so.24.1.symbols 00:02:36.142 [362/378] Generating symbol file lib/librte_cryptodev.so.24.1.p/librte_cryptodev.so.24.1.symbols 00:02:36.142 [363/378] Linking target lib/librte_cmdline.so.24.1 00:02:36.142 [364/378] Linking target lib/librte_security.so.24.1 00:02:36.142 [365/378] Linking target lib/librte_hash.so.24.1 00:02:36.142 [366/378] Linking target drivers/librte_compress_isal.so.24.1 00:02:36.401 [367/378] Linking target lib/librte_ethdev.so.24.1 00:02:36.401 [368/378] Generating symbol file lib/librte_security.so.24.1.p/librte_security.so.24.1.symbols 00:02:36.401 [369/378] Generating symbol file lib/librte_hash.so.24.1.p/librte_hash.so.24.1.symbols 00:02:36.401 
[370/378] Linking target drivers/librte_common_mlx5.so.24.1 00:02:36.401 [371/378] Generating symbol file lib/librte_ethdev.so.24.1.p/librte_ethdev.so.24.1.symbols 00:02:36.659 [372/378] Linking target lib/librte_vhost.so.24.1 00:02:36.659 [373/378] Linking target lib/librte_power.so.24.1 00:02:36.659 [374/378] Generating symbol file drivers/librte_common_mlx5.so.24.1.p/librte_common_mlx5.so.24.1.symbols 00:02:36.659 [375/378] Linking target drivers/librte_crypto_ipsec_mb.so.24.1 00:02:36.659 [376/378] Linking target drivers/librte_compress_mlx5.so.24.1 00:02:36.659 [377/378] Linking target drivers/librte_common_qat.so.24.1 00:02:36.659 [378/378] Linking target drivers/librte_crypto_mlx5.so.24.1 00:02:36.659 INFO: autodetecting backend as ninja 00:02:36.659 INFO: calculating backend command to run: /usr/local/bin/ninja -C /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp -j 72 00:02:38.033 CC lib/ut/ut.o 00:02:38.033 CC lib/ut_mock/mock.o 00:02:38.033 CC lib/log/log.o 00:02:38.033 CC lib/log/log_flags.o 00:02:38.033 CC lib/log/log_deprecated.o 00:02:38.291 LIB libspdk_ut.a 00:02:38.291 LIB libspdk_log.a 00:02:38.291 SO libspdk_ut.so.2.0 00:02:38.291 SO libspdk_log.so.7.0 00:02:38.291 SYMLINK libspdk_ut.so 00:02:38.291 SYMLINK libspdk_log.so 00:02:38.291 LIB libspdk_ut_mock.a 00:02:38.551 SO libspdk_ut_mock.so.6.0 00:02:38.551 SYMLINK libspdk_ut_mock.so 00:02:38.808 CXX lib/trace_parser/trace.o 00:02:38.808 CC lib/ioat/ioat.o 00:02:38.808 CC lib/util/base64.o 00:02:38.808 CC lib/util/bit_array.o 00:02:38.808 CC lib/util/cpuset.o 00:02:38.808 CC lib/dma/dma.o 00:02:38.808 CC lib/util/crc16.o 00:02:38.808 CC lib/util/crc32.o 00:02:38.808 CC lib/util/crc32c.o 00:02:38.808 CC lib/util/crc32_ieee.o 00:02:38.808 CC lib/util/crc64.o 00:02:38.808 CC lib/util/dif.o 00:02:38.808 CC lib/util/fd.o 00:02:38.808 CC lib/util/file.o 00:02:38.808 CC lib/util/hexlify.o 00:02:38.808 CC lib/util/iov.o 00:02:38.808 CC lib/util/math.o 00:02:38.808 CC lib/util/pipe.o 
00:02:38.808 CC lib/util/strerror_tls.o 00:02:38.808 CC lib/util/string.o 00:02:38.808 CC lib/util/uuid.o 00:02:38.808 CC lib/util/fd_group.o 00:02:38.808 CC lib/util/xor.o 00:02:38.808 CC lib/util/zipf.o 00:02:38.808 CC lib/vfio_user/host/vfio_user_pci.o 00:02:38.808 CC lib/vfio_user/host/vfio_user.o 00:02:39.067 LIB libspdk_ioat.a 00:02:39.067 SO libspdk_ioat.so.7.0 00:02:39.067 SYMLINK libspdk_ioat.so 00:02:39.067 LIB libspdk_vfio_user.a 00:02:39.067 LIB libspdk_dma.a 00:02:39.325 SO libspdk_vfio_user.so.5.0 00:02:39.325 SO libspdk_dma.so.4.0 00:02:39.325 SYMLINK libspdk_vfio_user.so 00:02:39.325 SYMLINK libspdk_dma.so 00:02:39.325 LIB libspdk_util.a 00:02:39.325 SO libspdk_util.so.9.1 00:02:39.584 SYMLINK libspdk_util.so 00:02:39.584 LIB libspdk_trace_parser.a 00:02:39.584 SO libspdk_trace_parser.so.5.0 00:02:39.842 SYMLINK libspdk_trace_parser.so 00:02:39.842 CC lib/vmd/vmd.o 00:02:39.842 CC lib/vmd/led.o 00:02:39.842 CC lib/rdma_utils/rdma_utils.o 00:02:39.842 CC lib/rdma_provider/common.o 00:02:39.842 CC lib/rdma_provider/rdma_provider_verbs.o 00:02:39.842 CC lib/reduce/reduce.o 00:02:39.842 CC lib/json/json_parse.o 00:02:39.842 CC lib/json/json_util.o 00:02:39.842 CC lib/idxd/idxd_user.o 00:02:39.842 CC lib/idxd/idxd.o 00:02:39.842 CC lib/json/json_write.o 00:02:39.842 CC lib/conf/conf.o 00:02:39.842 CC lib/idxd/idxd_kernel.o 00:02:40.100 CC lib/env_dpdk/env.o 00:02:40.100 CC lib/env_dpdk/pci.o 00:02:40.100 CC lib/env_dpdk/memory.o 00:02:40.100 CC lib/env_dpdk/init.o 00:02:40.100 CC lib/env_dpdk/threads.o 00:02:40.100 CC lib/env_dpdk/pci_ioat.o 00:02:40.100 CC lib/env_dpdk/pci_virtio.o 00:02:40.100 CC lib/env_dpdk/pci_vmd.o 00:02:40.100 CC lib/env_dpdk/pci_idxd.o 00:02:40.100 CC lib/env_dpdk/pci_event.o 00:02:40.100 CC lib/env_dpdk/sigbus_handler.o 00:02:40.100 CC lib/env_dpdk/pci_dpdk_2207.o 00:02:40.100 CC lib/env_dpdk/pci_dpdk.o 00:02:40.100 CC lib/env_dpdk/pci_dpdk_2211.o 00:02:40.100 LIB libspdk_rdma_provider.a 00:02:40.358 LIB libspdk_conf.a 
00:02:40.358 SO libspdk_rdma_provider.so.6.0 00:02:40.358 LIB libspdk_rdma_utils.a 00:02:40.358 SO libspdk_conf.so.6.0 00:02:40.358 LIB libspdk_json.a 00:02:40.358 SO libspdk_rdma_utils.so.1.0 00:02:40.358 SYMLINK libspdk_rdma_provider.so 00:02:40.358 SO libspdk_json.so.6.0 00:02:40.358 SYMLINK libspdk_conf.so 00:02:40.358 SYMLINK libspdk_rdma_utils.so 00:02:40.358 SYMLINK libspdk_json.so 00:02:40.617 LIB libspdk_idxd.a 00:02:40.617 SO libspdk_idxd.so.12.0 00:02:40.617 LIB libspdk_reduce.a 00:02:40.617 LIB libspdk_vmd.a 00:02:40.617 SO libspdk_reduce.so.6.0 00:02:40.617 SO libspdk_vmd.so.6.0 00:02:40.617 SYMLINK libspdk_idxd.so 00:02:40.876 SYMLINK libspdk_reduce.so 00:02:40.876 SYMLINK libspdk_vmd.so 00:02:40.876 CC lib/jsonrpc/jsonrpc_server.o 00:02:40.876 CC lib/jsonrpc/jsonrpc_server_tcp.o 00:02:40.876 CC lib/jsonrpc/jsonrpc_client.o 00:02:40.876 CC lib/jsonrpc/jsonrpc_client_tcp.o 00:02:41.134 LIB libspdk_jsonrpc.a 00:02:41.134 SO libspdk_jsonrpc.so.6.0 00:02:41.393 SYMLINK libspdk_jsonrpc.so 00:02:41.393 LIB libspdk_env_dpdk.a 00:02:41.653 SO libspdk_env_dpdk.so.14.1 00:02:41.653 CC lib/rpc/rpc.o 00:02:41.653 SYMLINK libspdk_env_dpdk.so 00:02:41.912 LIB libspdk_rpc.a 00:02:41.912 SO libspdk_rpc.so.6.0 00:02:41.912 SYMLINK libspdk_rpc.so 00:02:42.481 CC lib/trace/trace.o 00:02:42.481 CC lib/trace/trace_flags.o 00:02:42.481 CC lib/trace/trace_rpc.o 00:02:42.481 CC lib/keyring/keyring.o 00:02:42.481 CC lib/notify/notify.o 00:02:42.481 CC lib/keyring/keyring_rpc.o 00:02:42.481 CC lib/notify/notify_rpc.o 00:02:42.481 LIB libspdk_notify.a 00:02:42.481 SO libspdk_notify.so.6.0 00:02:42.741 LIB libspdk_keyring.a 00:02:42.741 SYMLINK libspdk_notify.so 00:02:42.741 SO libspdk_keyring.so.1.0 00:02:42.741 SYMLINK libspdk_keyring.so 00:02:42.741 LIB libspdk_trace.a 00:02:43.000 SO libspdk_trace.so.10.0 00:02:43.000 SYMLINK libspdk_trace.so 00:02:43.260 CC lib/thread/thread.o 00:02:43.260 CC lib/thread/iobuf.o 00:02:43.260 CC lib/sock/sock.o 00:02:43.260 CC 
lib/sock/sock_rpc.o 00:02:43.829 LIB libspdk_sock.a 00:02:43.829 SO libspdk_sock.so.10.0 00:02:43.829 SYMLINK libspdk_sock.so 00:02:44.398 CC lib/nvme/nvme_ctrlr_cmd.o 00:02:44.398 CC lib/nvme/nvme_ctrlr.o 00:02:44.398 CC lib/nvme/nvme_fabric.o 00:02:44.398 CC lib/nvme/nvme_ns_cmd.o 00:02:44.398 CC lib/nvme/nvme_ns.o 00:02:44.398 CC lib/nvme/nvme_pcie_common.o 00:02:44.398 CC lib/nvme/nvme_pcie.o 00:02:44.398 CC lib/nvme/nvme_qpair.o 00:02:44.398 CC lib/nvme/nvme.o 00:02:44.398 CC lib/nvme/nvme_quirks.o 00:02:44.398 CC lib/nvme/nvme_discovery.o 00:02:44.398 CC lib/nvme/nvme_transport.o 00:02:44.398 CC lib/nvme/nvme_ctrlr_ocssd_cmd.o 00:02:44.398 CC lib/nvme/nvme_ns_ocssd_cmd.o 00:02:44.398 CC lib/nvme/nvme_tcp.o 00:02:44.398 CC lib/nvme/nvme_opal.o 00:02:44.398 CC lib/nvme/nvme_io_msg.o 00:02:44.398 CC lib/nvme/nvme_poll_group.o 00:02:44.398 CC lib/nvme/nvme_zns.o 00:02:44.398 CC lib/nvme/nvme_stubs.o 00:02:44.398 CC lib/nvme/nvme_auth.o 00:02:44.398 CC lib/nvme/nvme_cuse.o 00:02:44.398 CC lib/nvme/nvme_rdma.o 00:02:44.966 LIB libspdk_thread.a 00:02:44.966 SO libspdk_thread.so.10.1 00:02:45.281 SYMLINK libspdk_thread.so 00:02:45.567 CC lib/virtio/virtio_vhost_user.o 00:02:45.567 CC lib/virtio/virtio.o 00:02:45.567 CC lib/virtio/virtio_pci.o 00:02:45.567 CC lib/virtio/virtio_vfio_user.o 00:02:45.567 CC lib/blob/blobstore.o 00:02:45.567 CC lib/blob/request.o 00:02:45.567 CC lib/blob/zeroes.o 00:02:45.567 CC lib/blob/blob_bs_dev.o 00:02:45.567 CC lib/accel/accel.o 00:02:45.567 CC lib/accel/accel_rpc.o 00:02:45.567 CC lib/init/json_config.o 00:02:45.567 CC lib/accel/accel_sw.o 00:02:45.567 CC lib/init/subsystem.o 00:02:45.567 CC lib/init/subsystem_rpc.o 00:02:45.567 CC lib/init/rpc.o 00:02:45.826 LIB libspdk_init.a 00:02:45.826 LIB libspdk_virtio.a 00:02:45.826 SO libspdk_init.so.5.0 00:02:45.826 SO libspdk_virtio.so.7.0 00:02:46.085 SYMLINK libspdk_init.so 00:02:46.085 SYMLINK libspdk_virtio.so 00:02:46.344 LIB libspdk_nvme.a 00:02:46.344 CC lib/event/app.o 
00:02:46.344 CC lib/event/log_rpc.o 00:02:46.344 CC lib/event/reactor.o 00:02:46.344 CC lib/event/app_rpc.o 00:02:46.344 CC lib/event/scheduler_static.o 00:02:46.344 SO libspdk_nvme.so.13.1 00:02:46.603 LIB libspdk_accel.a 00:02:46.603 SO libspdk_accel.so.15.1 00:02:46.603 SYMLINK libspdk_accel.so 00:02:46.861 SYMLINK libspdk_nvme.so 00:02:46.861 LIB libspdk_event.a 00:02:46.861 SO libspdk_event.so.14.0 00:02:46.861 SYMLINK libspdk_event.so 00:02:47.120 CC lib/bdev/bdev.o 00:02:47.120 CC lib/bdev/bdev_rpc.o 00:02:47.120 CC lib/bdev/bdev_zone.o 00:02:47.120 CC lib/bdev/part.o 00:02:47.120 CC lib/bdev/scsi_nvme.o 00:02:48.498 LIB libspdk_blob.a 00:02:48.498 SO libspdk_blob.so.11.0 00:02:48.758 SYMLINK libspdk_blob.so 00:02:49.017 CC lib/lvol/lvol.o 00:02:49.017 CC lib/blobfs/blobfs.o 00:02:49.017 CC lib/blobfs/tree.o 00:02:49.585 LIB libspdk_bdev.a 00:02:49.585 SO libspdk_bdev.so.15.1 00:02:49.846 SYMLINK libspdk_bdev.so 00:02:49.846 LIB libspdk_blobfs.a 00:02:50.135 SO libspdk_blobfs.so.10.0 00:02:50.135 LIB libspdk_lvol.a 00:02:50.135 SYMLINK libspdk_blobfs.so 00:02:50.135 SO libspdk_lvol.so.10.0 00:02:50.135 SYMLINK libspdk_lvol.so 00:02:50.135 CC lib/nvmf/ctrlr_discovery.o 00:02:50.135 CC lib/nvmf/ctrlr.o 00:02:50.135 CC lib/scsi/dev.o 00:02:50.135 CC lib/scsi/lun.o 00:02:50.135 CC lib/nvmf/ctrlr_bdev.o 00:02:50.135 CC lib/nvmf/subsystem.o 00:02:50.135 CC lib/nvmf/nvmf.o 00:02:50.135 CC lib/nvmf/nvmf_rpc.o 00:02:50.135 CC lib/scsi/scsi.o 00:02:50.135 CC lib/scsi/port.o 00:02:50.135 CC lib/nvmf/tcp.o 00:02:50.135 CC lib/nvmf/transport.o 00:02:50.135 CC lib/ftl/ftl_core.o 00:02:50.135 CC lib/scsi/scsi_bdev.o 00:02:50.135 CC lib/ftl/ftl_init.o 00:02:50.135 CC lib/scsi/scsi_pr.o 00:02:50.135 CC lib/nvmf/stubs.o 00:02:50.135 CC lib/ftl/ftl_layout.o 00:02:50.135 CC lib/ftl/ftl_io.o 00:02:50.135 CC lib/scsi/scsi_rpc.o 00:02:50.135 CC lib/nvmf/mdns_server.o 00:02:50.135 CC lib/scsi/task.o 00:02:50.135 CC lib/ftl/ftl_debug.o 00:02:50.135 CC lib/nbd/nbd.o 00:02:50.135 CC 
lib/ublk/ublk.o 00:02:50.135 CC lib/nvmf/rdma.o 00:02:50.135 CC lib/ublk/ublk_rpc.o 00:02:50.135 CC lib/nbd/nbd_rpc.o 00:02:50.135 CC lib/nvmf/auth.o 00:02:50.135 CC lib/ftl/ftl_sb.o 00:02:50.135 CC lib/ftl/ftl_l2p.o 00:02:50.135 CC lib/ftl/ftl_l2p_flat.o 00:02:50.135 CC lib/ftl/ftl_nv_cache.o 00:02:50.135 CC lib/ftl/ftl_band.o 00:02:50.135 CC lib/ftl/ftl_band_ops.o 00:02:50.135 CC lib/ftl/ftl_writer.o 00:02:50.135 CC lib/ftl/ftl_rq.o 00:02:50.135 CC lib/ftl/ftl_reloc.o 00:02:50.135 CC lib/ftl/ftl_l2p_cache.o 00:02:50.135 CC lib/ftl/mngt/ftl_mngt.o 00:02:50.135 CC lib/ftl/ftl_p2l.o 00:02:50.135 CC lib/ftl/mngt/ftl_mngt_bdev.o 00:02:50.135 CC lib/ftl/mngt/ftl_mngt_shutdown.o 00:02:50.135 CC lib/ftl/mngt/ftl_mngt_startup.o 00:02:50.135 CC lib/ftl/mngt/ftl_mngt_md.o 00:02:50.135 CC lib/ftl/mngt/ftl_mngt_misc.o 00:02:50.135 CC lib/ftl/mngt/ftl_mngt_ioch.o 00:02:50.135 CC lib/ftl/mngt/ftl_mngt_l2p.o 00:02:50.135 CC lib/ftl/mngt/ftl_mngt_band.o 00:02:50.135 CC lib/ftl/mngt/ftl_mngt_self_test.o 00:02:50.135 CC lib/ftl/mngt/ftl_mngt_p2l.o 00:02:50.135 CC lib/ftl/mngt/ftl_mngt_recovery.o 00:02:50.135 CC lib/ftl/mngt/ftl_mngt_upgrade.o 00:02:50.135 CC lib/ftl/utils/ftl_conf.o 00:02:50.135 CC lib/ftl/utils/ftl_md.o 00:02:50.135 CC lib/ftl/utils/ftl_bitmap.o 00:02:50.135 CC lib/ftl/utils/ftl_mempool.o 00:02:50.135 CC lib/ftl/utils/ftl_property.o 00:02:50.135 CC lib/ftl/utils/ftl_layout_tracker_bdev.o 00:02:50.135 CC lib/ftl/upgrade/ftl_layout_upgrade.o 00:02:50.135 CC lib/ftl/upgrade/ftl_sb_upgrade.o 00:02:50.135 CC lib/ftl/upgrade/ftl_p2l_upgrade.o 00:02:50.135 CC lib/ftl/upgrade/ftl_band_upgrade.o 00:02:50.135 CC lib/ftl/upgrade/ftl_chunk_upgrade.o 00:02:50.135 CC lib/ftl/upgrade/ftl_trim_upgrade.o 00:02:50.135 CC lib/ftl/upgrade/ftl_sb_v3.o 00:02:50.135 CC lib/ftl/upgrade/ftl_sb_v5.o 00:02:50.135 CC lib/ftl/nvc/ftl_nvc_dev.o 00:02:50.135 CC lib/ftl/nvc/ftl_nvc_bdev_vss.o 00:02:50.135 CC lib/ftl/base/ftl_base_dev.o 00:02:50.135 CC lib/ftl/base/ftl_base_bdev.o 00:02:50.135 CC 
lib/ftl/ftl_trace.o 00:02:51.074 LIB libspdk_scsi.a 00:02:51.074 LIB libspdk_nbd.a 00:02:51.074 SO libspdk_nbd.so.7.0 00:02:51.074 SO libspdk_scsi.so.9.0 00:02:51.074 SYMLINK libspdk_nbd.so 00:02:51.074 SYMLINK libspdk_scsi.so 00:02:51.074 LIB libspdk_ublk.a 00:02:51.074 SO libspdk_ublk.so.3.0 00:02:51.332 SYMLINK libspdk_ublk.so 00:02:51.332 LIB libspdk_ftl.a 00:02:51.332 CC lib/iscsi/conn.o 00:02:51.332 CC lib/iscsi/init_grp.o 00:02:51.332 CC lib/iscsi/iscsi.o 00:02:51.332 CC lib/iscsi/md5.o 00:02:51.332 CC lib/iscsi/param.o 00:02:51.332 CC lib/iscsi/portal_grp.o 00:02:51.332 SO libspdk_ftl.so.9.0 00:02:51.332 CC lib/vhost/vhost.o 00:02:51.332 CC lib/iscsi/tgt_node.o 00:02:51.332 CC lib/iscsi/iscsi_subsystem.o 00:02:51.332 CC lib/vhost/vhost_rpc.o 00:02:51.332 CC lib/vhost/vhost_scsi.o 00:02:51.332 CC lib/iscsi/iscsi_rpc.o 00:02:51.332 CC lib/iscsi/task.o 00:02:51.332 CC lib/vhost/vhost_blk.o 00:02:51.332 CC lib/vhost/rte_vhost_user.o 00:02:51.901 SYMLINK libspdk_ftl.so 00:02:52.470 LIB libspdk_vhost.a 00:02:52.470 LIB libspdk_nvmf.a 00:02:52.729 SO libspdk_vhost.so.8.0 00:02:52.729 SO libspdk_nvmf.so.18.1 00:02:52.730 SYMLINK libspdk_vhost.so 00:02:52.730 LIB libspdk_iscsi.a 00:02:52.990 SO libspdk_iscsi.so.8.0 00:02:53.249 SYMLINK libspdk_nvmf.so 00:02:53.249 SYMLINK libspdk_iscsi.so 00:02:53.817 CC module/env_dpdk/env_dpdk_rpc.o 00:02:53.817 CC module/accel/error/accel_error.o 00:02:53.817 CC module/accel/error/accel_error_rpc.o 00:02:53.817 LIB libspdk_env_dpdk_rpc.a 00:02:53.817 CC module/accel/dsa/accel_dsa.o 00:02:53.817 CC module/accel/dsa/accel_dsa_rpc.o 00:02:53.817 CC module/keyring/linux/keyring.o 00:02:53.817 CC module/keyring/linux/keyring_rpc.o 00:02:53.817 CC module/keyring/file/keyring.o 00:02:53.817 CC module/accel/iaa/accel_iaa.o 00:02:53.818 CC module/accel/dpdk_compressdev/accel_dpdk_compressdev.o 00:02:53.818 CC module/scheduler/gscheduler/gscheduler.o 00:02:53.818 CC module/accel/dpdk_compressdev/accel_dpdk_compressdev_rpc.o 00:02:53.818 CC 
module/accel/iaa/accel_iaa_rpc.o 00:02:53.818 CC module/keyring/file/keyring_rpc.o 00:02:53.818 CC module/scheduler/dpdk_governor/dpdk_governor.o 00:02:53.818 CC module/scheduler/dynamic/scheduler_dynamic.o 00:02:53.818 CC module/sock/posix/posix.o 00:02:53.818 CC module/accel/dpdk_cryptodev/accel_dpdk_cryptodev.o 00:02:53.818 CC module/accel/ioat/accel_ioat.o 00:02:53.818 CC module/accel/ioat/accel_ioat_rpc.o 00:02:53.818 CC module/accel/dpdk_cryptodev/accel_dpdk_cryptodev_rpc.o 00:02:53.818 CC module/blob/bdev/blob_bdev.o 00:02:53.818 SO libspdk_env_dpdk_rpc.so.6.0 00:02:54.076 SYMLINK libspdk_env_dpdk_rpc.so 00:02:54.076 LIB libspdk_keyring_linux.a 00:02:54.076 LIB libspdk_scheduler_dpdk_governor.a 00:02:54.076 LIB libspdk_accel_iaa.a 00:02:54.076 LIB libspdk_scheduler_gscheduler.a 00:02:54.076 SO libspdk_keyring_linux.so.1.0 00:02:54.076 SO libspdk_scheduler_dpdk_governor.so.4.0 00:02:54.076 SO libspdk_scheduler_gscheduler.so.4.0 00:02:54.076 SO libspdk_accel_iaa.so.3.0 00:02:54.076 LIB libspdk_accel_ioat.a 00:02:54.076 LIB libspdk_scheduler_dynamic.a 00:02:54.076 LIB libspdk_accel_dsa.a 00:02:54.076 SO libspdk_accel_ioat.so.6.0 00:02:54.076 SO libspdk_scheduler_dynamic.so.4.0 00:02:54.335 SYMLINK libspdk_keyring_linux.so 00:02:54.335 LIB libspdk_accel_error.a 00:02:54.335 SYMLINK libspdk_scheduler_gscheduler.so 00:02:54.335 LIB libspdk_keyring_file.a 00:02:54.335 LIB libspdk_blob_bdev.a 00:02:54.335 SO libspdk_accel_dsa.so.5.0 00:02:54.335 SYMLINK libspdk_accel_iaa.so 00:02:54.335 SYMLINK libspdk_scheduler_dpdk_governor.so 00:02:54.335 SYMLINK libspdk_accel_ioat.so 00:02:54.335 SYMLINK libspdk_scheduler_dynamic.so 00:02:54.335 SO libspdk_accel_error.so.2.0 00:02:54.335 SO libspdk_blob_bdev.so.11.0 00:02:54.335 SO libspdk_keyring_file.so.1.0 00:02:54.335 SYMLINK libspdk_accel_dsa.so 00:02:54.335 SYMLINK libspdk_accel_error.so 00:02:54.335 SYMLINK libspdk_blob_bdev.so 00:02:54.335 SYMLINK libspdk_keyring_file.so 00:02:54.594 LIB libspdk_sock_posix.a 00:02:54.854 
SO libspdk_sock_posix.so.6.0 00:02:54.854 CC module/bdev/aio/bdev_aio_rpc.o 00:02:54.854 CC module/bdev/aio/bdev_aio.o 00:02:54.854 SYMLINK libspdk_sock_posix.so 00:02:54.854 CC module/bdev/virtio/bdev_virtio_scsi.o 00:02:54.854 CC module/bdev/gpt/gpt.o 00:02:54.854 CC module/bdev/gpt/vbdev_gpt.o 00:02:54.854 CC module/bdev/virtio/bdev_virtio_rpc.o 00:02:54.854 CC module/bdev/virtio/bdev_virtio_blk.o 00:02:54.854 CC module/bdev/lvol/vbdev_lvol.o 00:02:54.854 CC module/bdev/lvol/vbdev_lvol_rpc.o 00:02:54.854 CC module/bdev/error/vbdev_error.o 00:02:54.854 CC module/bdev/error/vbdev_error_rpc.o 00:02:54.854 CC module/bdev/delay/vbdev_delay.o 00:02:54.854 CC module/bdev/passthru/vbdev_passthru.o 00:02:54.854 CC module/bdev/delay/vbdev_delay_rpc.o 00:02:54.854 CC module/bdev/nvme/bdev_nvme_rpc.o 00:02:54.854 CC module/bdev/null/bdev_null.o 00:02:54.854 CC module/bdev/malloc/bdev_malloc_rpc.o 00:02:54.854 CC module/bdev/nvme/bdev_nvme.o 00:02:54.854 CC module/bdev/passthru/vbdev_passthru_rpc.o 00:02:54.854 CC module/bdev/malloc/bdev_malloc.o 00:02:54.854 CC module/bdev/null/bdev_null_rpc.o 00:02:54.854 CC module/bdev/nvme/nvme_rpc.o 00:02:54.854 CC module/bdev/nvme/vbdev_opal.o 00:02:54.854 CC module/bdev/nvme/bdev_mdns_client.o 00:02:54.854 CC module/bdev/nvme/bdev_nvme_cuse_rpc.o 00:02:54.854 CC module/bdev/zone_block/vbdev_zone_block.o 00:02:54.854 CC module/bdev/nvme/vbdev_opal_rpc.o 00:02:54.854 CC module/bdev/iscsi/bdev_iscsi.o 00:02:54.854 CC module/bdev/zone_block/vbdev_zone_block_rpc.o 00:02:54.854 CC module/blobfs/bdev/blobfs_bdev.o 00:02:54.854 CC module/bdev/crypto/vbdev_crypto_rpc.o 00:02:54.854 CC module/bdev/iscsi/bdev_iscsi_rpc.o 00:02:54.854 CC module/bdev/crypto/vbdev_crypto.o 00:02:54.854 CC module/blobfs/bdev/blobfs_bdev_rpc.o 00:02:54.854 CC module/bdev/split/vbdev_split_rpc.o 00:02:54.854 CC module/bdev/split/vbdev_split.o 00:02:54.854 CC module/bdev/compress/vbdev_compress.o 00:02:54.854 CC module/bdev/compress/vbdev_compress_rpc.o 00:02:54.854 CC 
module/bdev/ftl/bdev_ftl.o 00:02:54.854 CC module/bdev/ftl/bdev_ftl_rpc.o 00:02:54.854 CC module/bdev/raid/bdev_raid_rpc.o 00:02:54.854 CC module/bdev/raid/bdev_raid.o 00:02:54.854 CC module/bdev/raid/raid0.o 00:02:54.854 CC module/bdev/raid/bdev_raid_sb.o 00:02:54.854 CC module/bdev/raid/raid1.o 00:02:54.854 CC module/bdev/raid/concat.o 00:02:55.114 LIB libspdk_blobfs_bdev.a 00:02:55.114 SO libspdk_blobfs_bdev.so.6.0 00:02:55.114 LIB libspdk_bdev_null.a 00:02:55.114 LIB libspdk_bdev_delay.a 00:02:55.114 SYMLINK libspdk_blobfs_bdev.so 00:02:55.114 LIB libspdk_bdev_gpt.a 00:02:55.114 SO libspdk_bdev_delay.so.6.0 00:02:55.114 SO libspdk_bdev_null.so.6.0 00:02:55.114 LIB libspdk_bdev_aio.a 00:02:55.374 SO libspdk_bdev_gpt.so.6.0 00:02:55.374 LIB libspdk_bdev_split.a 00:02:55.374 LIB libspdk_accel_dpdk_compressdev.a 00:02:55.374 LIB libspdk_bdev_malloc.a 00:02:55.374 LIB libspdk_bdev_iscsi.a 00:02:55.374 SO libspdk_bdev_aio.so.6.0 00:02:55.374 SYMLINK libspdk_bdev_delay.so 00:02:55.374 SO libspdk_bdev_split.so.6.0 00:02:55.374 SO libspdk_accel_dpdk_compressdev.so.3.0 00:02:55.374 SYMLINK libspdk_bdev_null.so 00:02:55.374 SO libspdk_bdev_malloc.so.6.0 00:02:55.374 SO libspdk_bdev_iscsi.so.6.0 00:02:55.374 SYMLINK libspdk_bdev_gpt.so 00:02:55.374 LIB libspdk_accel_dpdk_cryptodev.a 00:02:55.374 LIB libspdk_bdev_passthru.a 00:02:55.374 LIB libspdk_bdev_ftl.a 00:02:55.374 SYMLINK libspdk_bdev_aio.so 00:02:55.374 LIB libspdk_bdev_zone_block.a 00:02:55.374 SO libspdk_accel_dpdk_cryptodev.so.3.0 00:02:55.374 SYMLINK libspdk_bdev_split.so 00:02:55.374 SO libspdk_bdev_passthru.so.6.0 00:02:55.374 SYMLINK libspdk_accel_dpdk_compressdev.so 00:02:55.374 SO libspdk_bdev_ftl.so.6.0 00:02:55.374 SYMLINK libspdk_bdev_iscsi.so 00:02:55.374 SYMLINK libspdk_bdev_malloc.so 00:02:55.374 LIB libspdk_bdev_lvol.a 00:02:55.374 SO libspdk_bdev_zone_block.so.6.0 00:02:55.374 LIB libspdk_bdev_compress.a 00:02:55.374 LIB libspdk_bdev_error.a 00:02:55.374 LIB libspdk_bdev_virtio.a 00:02:55.374 
SYMLINK libspdk_bdev_passthru.so 00:02:55.374 SO libspdk_bdev_lvol.so.6.0 00:02:55.374 SYMLINK libspdk_accel_dpdk_cryptodev.so 00:02:55.374 SYMLINK libspdk_bdev_ftl.so 00:02:55.374 SO libspdk_bdev_error.so.6.0 00:02:55.374 SO libspdk_bdev_compress.so.6.0 00:02:55.374 SYMLINK libspdk_bdev_zone_block.so 00:02:55.633 SO libspdk_bdev_virtio.so.6.0 00:02:55.633 SYMLINK libspdk_bdev_lvol.so 00:02:55.633 SYMLINK libspdk_bdev_error.so 00:02:55.633 LIB libspdk_bdev_crypto.a 00:02:55.633 SYMLINK libspdk_bdev_compress.so 00:02:55.633 SYMLINK libspdk_bdev_virtio.so 00:02:55.633 SO libspdk_bdev_crypto.so.6.0 00:02:55.633 SYMLINK libspdk_bdev_crypto.so 00:02:56.201 LIB libspdk_bdev_raid.a 00:02:56.201 SO libspdk_bdev_raid.so.6.0 00:02:56.201 SYMLINK libspdk_bdev_raid.so 00:02:57.139 LIB libspdk_bdev_nvme.a 00:02:57.139 SO libspdk_bdev_nvme.so.7.0 00:02:57.398 SYMLINK libspdk_bdev_nvme.so 00:02:58.335 CC module/event/subsystems/iobuf/iobuf.o 00:02:58.335 CC module/event/subsystems/iobuf/iobuf_rpc.o 00:02:58.335 CC module/event/subsystems/scheduler/scheduler.o 00:02:58.335 CC module/event/subsystems/vhost_blk/vhost_blk.o 00:02:58.335 CC module/event/subsystems/sock/sock.o 00:02:58.335 CC module/event/subsystems/vmd/vmd.o 00:02:58.335 CC module/event/subsystems/vmd/vmd_rpc.o 00:02:58.335 CC module/event/subsystems/keyring/keyring.o 00:02:58.335 LIB libspdk_event_vhost_blk.a 00:02:58.335 LIB libspdk_event_sock.a 00:02:58.335 LIB libspdk_event_scheduler.a 00:02:58.335 LIB libspdk_event_vmd.a 00:02:58.335 LIB libspdk_event_iobuf.a 00:02:58.335 SO libspdk_event_vhost_blk.so.3.0 00:02:58.335 SO libspdk_event_sock.so.5.0 00:02:58.335 SO libspdk_event_scheduler.so.4.0 00:02:58.335 SO libspdk_event_iobuf.so.3.0 00:02:58.335 SO libspdk_event_vmd.so.6.0 00:02:58.335 SYMLINK libspdk_event_sock.so 00:02:58.335 SYMLINK libspdk_event_scheduler.so 00:02:58.335 LIB libspdk_event_keyring.a 00:02:58.335 SYMLINK libspdk_event_vmd.so 00:02:58.335 SYMLINK libspdk_event_iobuf.so 00:02:58.594 SYMLINK 
libspdk_event_vhost_blk.so 00:02:58.594 SO libspdk_event_keyring.so.1.0 00:02:58.594 SYMLINK libspdk_event_keyring.so 00:02:58.853 CC module/event/subsystems/accel/accel.o 00:02:58.853 LIB libspdk_event_accel.a 00:02:59.112 SO libspdk_event_accel.so.6.0 00:02:59.112 SYMLINK libspdk_event_accel.so 00:02:59.370 CC module/event/subsystems/bdev/bdev.o 00:02:59.629 LIB libspdk_event_bdev.a 00:02:59.629 SO libspdk_event_bdev.so.6.0 00:02:59.629 SYMLINK libspdk_event_bdev.so 00:03:00.196 CC module/event/subsystems/scsi/scsi.o 00:03:00.196 CC module/event/subsystems/nvmf/nvmf_rpc.o 00:03:00.196 CC module/event/subsystems/nvmf/nvmf_tgt.o 00:03:00.196 CC module/event/subsystems/nbd/nbd.o 00:03:00.196 CC module/event/subsystems/ublk/ublk.o 00:03:00.196 LIB libspdk_event_nbd.a 00:03:00.196 LIB libspdk_event_scsi.a 00:03:00.196 LIB libspdk_event_ublk.a 00:03:00.455 SO libspdk_event_nbd.so.6.0 00:03:00.455 LIB libspdk_event_nvmf.a 00:03:00.455 SO libspdk_event_scsi.so.6.0 00:03:00.455 SO libspdk_event_ublk.so.3.0 00:03:00.455 SO libspdk_event_nvmf.so.6.0 00:03:00.455 SYMLINK libspdk_event_nbd.so 00:03:00.455 SYMLINK libspdk_event_scsi.so 00:03:00.455 SYMLINK libspdk_event_ublk.so 00:03:00.455 SYMLINK libspdk_event_nvmf.so 00:03:00.714 CC module/event/subsystems/vhost_scsi/vhost_scsi.o 00:03:00.714 CC module/event/subsystems/iscsi/iscsi.o 00:03:00.974 LIB libspdk_event_vhost_scsi.a 00:03:00.974 SO libspdk_event_vhost_scsi.so.3.0 00:03:00.974 LIB libspdk_event_iscsi.a 00:03:00.974 SO libspdk_event_iscsi.so.6.0 00:03:00.974 SYMLINK libspdk_event_vhost_scsi.so 00:03:01.233 SYMLINK libspdk_event_iscsi.so 00:03:01.233 SO libspdk.so.6.0 00:03:01.233 SYMLINK libspdk.so 00:03:01.805 CC app/spdk_nvme_identify/identify.o 00:03:01.805 CC app/spdk_nvme_discover/discovery_aer.o 00:03:01.805 CC app/spdk_top/spdk_top.o 00:03:01.805 CC app/spdk_lspci/spdk_lspci.o 00:03:01.805 CC test/rpc_client/rpc_client_test.o 00:03:01.805 CC app/trace_record/trace_record.o 00:03:01.805 TEST_HEADER 
include/spdk/accel.h 00:03:01.805 TEST_HEADER include/spdk/accel_module.h 00:03:01.805 TEST_HEADER include/spdk/assert.h 00:03:01.805 TEST_HEADER include/spdk/barrier.h 00:03:01.805 TEST_HEADER include/spdk/base64.h 00:03:01.805 TEST_HEADER include/spdk/bdev.h 00:03:01.805 TEST_HEADER include/spdk/bdev_module.h 00:03:01.805 TEST_HEADER include/spdk/bdev_zone.h 00:03:01.805 TEST_HEADER include/spdk/bit_array.h 00:03:01.805 CXX app/trace/trace.o 00:03:01.805 TEST_HEADER include/spdk/bit_pool.h 00:03:01.805 TEST_HEADER include/spdk/blob_bdev.h 00:03:01.805 CC app/spdk_nvme_perf/perf.o 00:03:01.805 TEST_HEADER include/spdk/blobfs_bdev.h 00:03:01.805 TEST_HEADER include/spdk/blobfs.h 00:03:01.805 TEST_HEADER include/spdk/blob.h 00:03:01.805 TEST_HEADER include/spdk/conf.h 00:03:01.805 TEST_HEADER include/spdk/config.h 00:03:01.805 TEST_HEADER include/spdk/cpuset.h 00:03:01.805 TEST_HEADER include/spdk/crc16.h 00:03:01.805 TEST_HEADER include/spdk/crc64.h 00:03:01.805 TEST_HEADER include/spdk/dif.h 00:03:01.805 TEST_HEADER include/spdk/crc32.h 00:03:01.805 TEST_HEADER include/spdk/dma.h 00:03:01.805 TEST_HEADER include/spdk/endian.h 00:03:01.805 TEST_HEADER include/spdk/env_dpdk.h 00:03:01.805 TEST_HEADER include/spdk/env.h 00:03:01.805 TEST_HEADER include/spdk/event.h 00:03:01.805 TEST_HEADER include/spdk/fd_group.h 00:03:01.805 TEST_HEADER include/spdk/fd.h 00:03:01.805 TEST_HEADER include/spdk/file.h 00:03:01.805 TEST_HEADER include/spdk/ftl.h 00:03:01.805 TEST_HEADER include/spdk/gpt_spec.h 00:03:01.805 TEST_HEADER include/spdk/hexlify.h 00:03:01.805 TEST_HEADER include/spdk/histogram_data.h 00:03:01.805 TEST_HEADER include/spdk/idxd.h 00:03:01.805 TEST_HEADER include/spdk/idxd_spec.h 00:03:01.805 TEST_HEADER include/spdk/init.h 00:03:01.805 TEST_HEADER include/spdk/ioat.h 00:03:01.805 TEST_HEADER include/spdk/ioat_spec.h 00:03:01.805 TEST_HEADER include/spdk/iscsi_spec.h 00:03:01.805 TEST_HEADER include/spdk/json.h 00:03:01.805 TEST_HEADER include/spdk/jsonrpc.h 
00:03:01.805 TEST_HEADER include/spdk/keyring.h 00:03:01.805 TEST_HEADER include/spdk/keyring_module.h 00:03:01.805 TEST_HEADER include/spdk/likely.h 00:03:01.805 TEST_HEADER include/spdk/log.h 00:03:01.805 TEST_HEADER include/spdk/lvol.h 00:03:01.805 TEST_HEADER include/spdk/memory.h 00:03:01.805 TEST_HEADER include/spdk/mmio.h 00:03:01.805 TEST_HEADER include/spdk/nbd.h 00:03:01.805 TEST_HEADER include/spdk/notify.h 00:03:01.805 TEST_HEADER include/spdk/nvme_intel.h 00:03:01.805 TEST_HEADER include/spdk/nvme.h 00:03:01.805 TEST_HEADER include/spdk/nvme_ocssd.h 00:03:01.805 TEST_HEADER include/spdk/nvme_ocssd_spec.h 00:03:01.805 TEST_HEADER include/spdk/nvme_spec.h 00:03:01.805 TEST_HEADER include/spdk/nvmf_cmd.h 00:03:01.805 TEST_HEADER include/spdk/nvme_zns.h 00:03:01.805 TEST_HEADER include/spdk/nvmf_fc_spec.h 00:03:01.805 TEST_HEADER include/spdk/nvmf.h 00:03:01.805 TEST_HEADER include/spdk/nvmf_spec.h 00:03:01.805 TEST_HEADER include/spdk/nvmf_transport.h 00:03:01.805 TEST_HEADER include/spdk/opal.h 00:03:01.805 TEST_HEADER include/spdk/opal_spec.h 00:03:01.805 TEST_HEADER include/spdk/pipe.h 00:03:01.805 TEST_HEADER include/spdk/pci_ids.h 00:03:01.805 TEST_HEADER include/spdk/queue.h 00:03:01.805 TEST_HEADER include/spdk/reduce.h 00:03:01.805 TEST_HEADER include/spdk/rpc.h 00:03:01.805 TEST_HEADER include/spdk/scheduler.h 00:03:01.805 TEST_HEADER include/spdk/scsi.h 00:03:01.805 TEST_HEADER include/spdk/scsi_spec.h 00:03:01.805 CC app/nvmf_tgt/nvmf_main.o 00:03:01.805 TEST_HEADER include/spdk/sock.h 00:03:01.805 TEST_HEADER include/spdk/stdinc.h 00:03:01.805 TEST_HEADER include/spdk/string.h 00:03:01.805 TEST_HEADER include/spdk/thread.h 00:03:01.805 TEST_HEADER include/spdk/trace.h 00:03:01.805 TEST_HEADER include/spdk/trace_parser.h 00:03:01.805 TEST_HEADER include/spdk/tree.h 00:03:01.805 TEST_HEADER include/spdk/ublk.h 00:03:01.805 TEST_HEADER include/spdk/util.h 00:03:01.805 TEST_HEADER include/spdk/uuid.h 00:03:01.806 TEST_HEADER include/spdk/version.h 
00:03:01.806 TEST_HEADER include/spdk/vfio_user_pci.h 00:03:01.806 TEST_HEADER include/spdk/vfio_user_spec.h 00:03:01.806 TEST_HEADER include/spdk/vhost.h 00:03:01.806 TEST_HEADER include/spdk/vmd.h 00:03:01.806 TEST_HEADER include/spdk/xor.h 00:03:01.806 TEST_HEADER include/spdk/zipf.h 00:03:01.806 CC examples/interrupt_tgt/interrupt_tgt.o 00:03:01.806 CXX test/cpp_headers/accel.o 00:03:01.806 CXX test/cpp_headers/accel_module.o 00:03:01.806 CXX test/cpp_headers/assert.o 00:03:01.806 CXX test/cpp_headers/barrier.o 00:03:01.806 CXX test/cpp_headers/base64.o 00:03:01.806 CXX test/cpp_headers/bdev.o 00:03:01.806 CXX test/cpp_headers/bdev_module.o 00:03:01.806 CXX test/cpp_headers/bdev_zone.o 00:03:01.806 CXX test/cpp_headers/bit_array.o 00:03:01.806 CXX test/cpp_headers/blob_bdev.o 00:03:01.806 CXX test/cpp_headers/blobfs_bdev.o 00:03:01.806 CXX test/cpp_headers/bit_pool.o 00:03:01.806 CXX test/cpp_headers/blobfs.o 00:03:01.806 CXX test/cpp_headers/blob.o 00:03:01.806 CXX test/cpp_headers/config.o 00:03:01.806 CXX test/cpp_headers/cpuset.o 00:03:01.806 CXX test/cpp_headers/conf.o 00:03:01.806 CXX test/cpp_headers/crc32.o 00:03:01.806 CXX test/cpp_headers/crc16.o 00:03:01.806 CXX test/cpp_headers/crc64.o 00:03:01.806 CXX test/cpp_headers/dma.o 00:03:01.806 CXX test/cpp_headers/dif.o 00:03:01.806 CC app/iscsi_tgt/iscsi_tgt.o 00:03:01.806 CXX test/cpp_headers/env_dpdk.o 00:03:01.806 CXX test/cpp_headers/endian.o 00:03:01.806 CXX test/cpp_headers/env.o 00:03:01.806 CXX test/cpp_headers/event.o 00:03:01.806 CXX test/cpp_headers/fd_group.o 00:03:01.806 CXX test/cpp_headers/fd.o 00:03:01.806 CXX test/cpp_headers/ftl.o 00:03:01.806 CXX test/cpp_headers/file.o 00:03:01.806 CXX test/cpp_headers/hexlify.o 00:03:01.806 CXX test/cpp_headers/gpt_spec.o 00:03:01.806 CXX test/cpp_headers/histogram_data.o 00:03:01.806 CXX test/cpp_headers/idxd.o 00:03:01.806 CXX test/cpp_headers/idxd_spec.o 00:03:01.806 CC examples/util/zipf/zipf.o 00:03:01.806 CXX test/cpp_headers/init.o 
00:03:01.806 CXX test/cpp_headers/ioat.o 00:03:01.806 CXX test/cpp_headers/ioat_spec.o 00:03:01.806 CC app/spdk_tgt/spdk_tgt.o 00:03:01.806 CXX test/cpp_headers/iscsi_spec.o 00:03:01.806 CXX test/cpp_headers/json.o 00:03:01.806 CC test/thread/poller_perf/poller_perf.o 00:03:01.806 CC test/env/vtophys/vtophys.o 00:03:01.806 CXX test/cpp_headers/jsonrpc.o 00:03:01.806 CXX test/cpp_headers/keyring.o 00:03:01.806 CC examples/ioat/perf/perf.o 00:03:01.806 CC examples/ioat/verify/verify.o 00:03:01.806 CC test/env/env_dpdk_post_init/env_dpdk_post_init.o 00:03:01.806 CC test/env/memory/memory_ut.o 00:03:01.806 CC test/app/jsoncat/jsoncat.o 00:03:01.806 CC test/app/stub/stub.o 00:03:01.806 CC test/app/histogram_perf/histogram_perf.o 00:03:01.806 CC app/spdk_dd/spdk_dd.o 00:03:01.806 CC test/env/pci/pci_ut.o 00:03:01.806 CC test/dma/test_dma/test_dma.o 00:03:01.806 CC app/fio/nvme/fio_plugin.o 00:03:02.068 CC test/app/bdev_svc/bdev_svc.o 00:03:02.068 LINK rpc_client_test 00:03:02.068 CC app/fio/bdev/fio_plugin.o 00:03:02.068 LINK spdk_lspci 00:03:02.068 LINK spdk_nvme_discover 00:03:02.068 CC test/env/mem_callbacks/mem_callbacks.o 00:03:02.330 CC test/app/fuzz/iscsi_fuzz/iscsi_fuzz.o 00:03:02.330 CC test/app/fuzz/nvme_fuzz/nvme_fuzz.o 00:03:02.330 LINK env_dpdk_post_init 00:03:02.330 LINK zipf 00:03:02.330 LINK vtophys 00:03:02.330 CXX test/cpp_headers/keyring_module.o 00:03:02.330 LINK jsoncat 00:03:02.330 LINK histogram_perf 00:03:02.330 LINK nvmf_tgt 00:03:02.330 LINK interrupt_tgt 00:03:02.330 CXX test/cpp_headers/likely.o 00:03:02.330 CXX test/cpp_headers/log.o 00:03:02.330 LINK iscsi_tgt 00:03:02.330 LINK stub 00:03:02.330 CXX test/cpp_headers/lvol.o 00:03:02.330 CXX test/cpp_headers/memory.o 00:03:02.330 CXX test/cpp_headers/mmio.o 00:03:02.330 LINK poller_perf 00:03:02.330 CXX test/cpp_headers/nbd.o 00:03:02.330 CXX test/cpp_headers/notify.o 00:03:02.330 LINK spdk_tgt 00:03:02.330 CXX test/cpp_headers/nvme.o 00:03:02.330 CXX test/cpp_headers/nvme_intel.o 00:03:02.330 
CXX test/cpp_headers/nvme_ocssd.o 00:03:02.330 CXX test/cpp_headers/nvme_ocssd_spec.o 00:03:02.330 CXX test/cpp_headers/nvme_spec.o 00:03:02.330 CXX test/cpp_headers/nvme_zns.o 00:03:02.330 CXX test/cpp_headers/nvmf_cmd.o 00:03:02.330 CXX test/cpp_headers/nvmf_fc_spec.o 00:03:02.330 CXX test/cpp_headers/nvmf.o 00:03:02.330 CXX test/cpp_headers/nvmf_spec.o 00:03:02.330 CXX test/cpp_headers/nvmf_transport.o 00:03:02.330 LINK spdk_trace_record 00:03:02.330 CC test/app/fuzz/vhost_fuzz/vhost_fuzz_rpc.o 00:03:02.330 CC test/app/fuzz/vhost_fuzz/vhost_fuzz.o 00:03:02.330 LINK bdev_svc 00:03:02.591 CXX test/cpp_headers/opal.o 00:03:02.591 CXX test/cpp_headers/opal_spec.o 00:03:02.591 CXX test/cpp_headers/pci_ids.o 00:03:02.591 CXX test/cpp_headers/pipe.o 00:03:02.591 CXX test/cpp_headers/queue.o 00:03:02.591 CXX test/cpp_headers/reduce.o 00:03:02.591 CXX test/cpp_headers/rpc.o 00:03:02.591 CXX test/cpp_headers/scheduler.o 00:03:02.591 CXX test/cpp_headers/scsi.o 00:03:02.591 LINK verify 00:03:02.591 CXX test/cpp_headers/scsi_spec.o 00:03:02.591 CXX test/cpp_headers/sock.o 00:03:02.591 CXX test/cpp_headers/stdinc.o 00:03:02.591 CXX test/cpp_headers/string.o 00:03:02.591 CXX test/cpp_headers/thread.o 00:03:02.591 LINK spdk_trace 00:03:02.591 CXX test/cpp_headers/trace.o 00:03:02.591 CXX test/cpp_headers/trace_parser.o 00:03:02.591 CXX test/cpp_headers/tree.o 00:03:02.591 CXX test/cpp_headers/ublk.o 00:03:02.591 CXX test/cpp_headers/util.o 00:03:02.591 CXX test/cpp_headers/uuid.o 00:03:02.591 CXX test/cpp_headers/version.o 00:03:02.591 CXX test/cpp_headers/vfio_user_pci.o 00:03:02.591 LINK pci_ut 00:03:02.591 CXX test/cpp_headers/vfio_user_spec.o 00:03:02.591 CXX test/cpp_headers/vhost.o 00:03:02.591 CXX test/cpp_headers/vmd.o 00:03:02.591 CXX test/cpp_headers/xor.o 00:03:02.591 CXX test/cpp_headers/zipf.o 00:03:02.924 LINK spdk_dd 00:03:02.924 LINK ioat_perf 00:03:02.924 CC examples/vmd/led/led.o 00:03:02.924 CC examples/vmd/lsvmd/lsvmd.o 00:03:02.924 LINK nvme_fuzz 
00:03:02.924 CC examples/sock/hello_world/hello_sock.o 00:03:02.924 CC examples/idxd/perf/perf.o 00:03:02.924 CC test/event/reactor_perf/reactor_perf.o 00:03:02.924 CC test/event/reactor/reactor.o 00:03:02.924 CC test/event/event_perf/event_perf.o 00:03:02.924 CC test/event/app_repeat/app_repeat.o 00:03:03.183 CC examples/thread/thread/thread_ex.o 00:03:03.183 LINK spdk_bdev 00:03:03.183 CC test/event/scheduler/scheduler.o 00:03:03.183 LINK lsvmd 00:03:03.183 LINK mem_callbacks 00:03:03.183 CC app/vhost/vhost.o 00:03:03.183 LINK spdk_nvme_perf 00:03:03.183 LINK vhost_fuzz 00:03:03.183 LINK reactor_perf 00:03:03.183 LINK reactor 00:03:03.183 LINK spdk_nvme 00:03:03.183 LINK event_perf 00:03:03.183 LINK spdk_nvme_identify 00:03:03.183 LINK led 00:03:03.183 LINK app_repeat 00:03:03.183 LINK test_dma 00:03:03.441 LINK idxd_perf 00:03:03.441 LINK scheduler 00:03:03.441 LINK thread 00:03:03.441 LINK hello_sock 00:03:03.441 LINK vhost 00:03:03.700 LINK memory_ut 00:03:03.959 CC test/nvme/aer/aer.o 00:03:03.959 CC test/nvme/err_injection/err_injection.o 00:03:03.959 CC test/nvme/connect_stress/connect_stress.o 00:03:03.959 CC test/nvme/reserve/reserve.o 00:03:03.959 CC test/nvme/sgl/sgl.o 00:03:03.959 CC test/nvme/doorbell_aers/doorbell_aers.o 00:03:03.959 CC test/nvme/overhead/overhead.o 00:03:03.959 CC test/nvme/reset/reset.o 00:03:03.959 CC test/nvme/compliance/nvme_compliance.o 00:03:03.959 CC test/nvme/e2edp/nvme_dp.o 00:03:03.959 CC test/nvme/fused_ordering/fused_ordering.o 00:03:03.959 CC test/nvme/boot_partition/boot_partition.o 00:03:03.959 CC test/nvme/simple_copy/simple_copy.o 00:03:03.959 CC examples/nvme/reconnect/reconnect.o 00:03:03.959 CC test/nvme/fdp/fdp.o 00:03:03.959 CC examples/nvme/hotplug/hotplug.o 00:03:03.959 CC test/nvme/cuse/cuse.o 00:03:03.959 CC examples/nvme/abort/abort.o 00:03:03.959 CC test/nvme/startup/startup.o 00:03:03.959 CC examples/nvme/pmr_persistence/pmr_persistence.o 00:03:03.959 CC examples/nvme/arbitration/arbitration.o 
00:03:03.959 CC examples/nvme/cmb_copy/cmb_copy.o 00:03:03.959 CC examples/nvme/hello_world/hello_world.o 00:03:03.959 CC examples/nvme/nvme_manage/nvme_manage.o 00:03:03.959 CC test/accel/dif/dif.o 00:03:03.959 CC test/blobfs/mkfs/mkfs.o 00:03:03.959 CC examples/accel/perf/accel_perf.o 00:03:03.959 CC examples/blob/cli/blobcli.o 00:03:03.959 CC examples/blob/hello_world/hello_blob.o 00:03:03.959 CC test/lvol/esnap/esnap.o 00:03:03.959 LINK reserve 00:03:03.959 LINK boot_partition 00:03:03.959 LINK err_injection 00:03:04.217 LINK doorbell_aers 00:03:04.217 LINK connect_stress 00:03:04.217 LINK startup 00:03:04.217 LINK cmb_copy 00:03:04.217 LINK fused_ordering 00:03:04.217 LINK aer 00:03:04.217 LINK pmr_persistence 00:03:04.217 LINK simple_copy 00:03:04.217 LINK hotplug 00:03:04.217 LINK reset 00:03:04.217 LINK nvme_dp 00:03:04.217 LINK hello_world 00:03:04.217 LINK overhead 00:03:04.217 LINK nvme_compliance 00:03:04.217 LINK reconnect 00:03:04.217 LINK arbitration 00:03:04.217 LINK fdp 00:03:04.217 LINK mkfs 00:03:04.217 LINK abort 00:03:04.217 LINK iscsi_fuzz 00:03:04.217 LINK dif 00:03:04.217 LINK hello_blob 00:03:04.477 LINK spdk_top 00:03:04.477 LINK nvme_manage 00:03:04.477 LINK accel_perf 00:03:04.477 LINK sgl 00:03:04.735 LINK blobcli 00:03:04.994 CC test/bdev/bdevio/bdevio.o 00:03:04.994 CC examples/bdev/bdevperf/bdevperf.o 00:03:04.994 CC examples/bdev/hello_world/hello_bdev.o 00:03:05.253 LINK cuse 00:03:05.253 LINK bdevio 00:03:05.512 LINK hello_bdev 00:03:05.772 LINK bdevperf 00:03:06.341 CC examples/nvmf/nvmf/nvmf.o 00:03:06.909 LINK nvmf 00:03:09.442 LINK esnap 00:03:09.442 00:03:09.442 real 1m33.266s 00:03:09.442 user 18m2.742s 00:03:09.442 sys 4m23.393s 00:03:09.442 20:17:01 make -- common/autotest_common.sh@1124 -- $ xtrace_disable 00:03:09.442 20:17:01 make -- common/autotest_common.sh@10 -- $ set +x 00:03:09.442 ************************************ 00:03:09.442 END TEST make 00:03:09.442 ************************************ 00:03:09.700 20:17:01 
-- common/autotest_common.sh@1142 -- $ return 0 00:03:09.700 20:17:01 -- spdk/autobuild.sh@1 -- $ stop_monitor_resources 00:03:09.700 20:17:01 -- pm/common@29 -- $ signal_monitor_resources TERM 00:03:09.700 20:17:01 -- pm/common@40 -- $ local monitor pid pids signal=TERM 00:03:09.700 20:17:01 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:03:09.700 20:17:01 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]] 00:03:09.700 20:17:01 -- pm/common@44 -- $ pid=1187949 00:03:09.700 20:17:01 -- pm/common@50 -- $ kill -TERM 1187949 00:03:09.700 20:17:01 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:03:09.700 20:17:01 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-vmstat.pid ]] 00:03:09.700 20:17:01 -- pm/common@44 -- $ pid=1187951 00:03:09.700 20:17:01 -- pm/common@50 -- $ kill -TERM 1187951 00:03:09.700 20:17:01 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:03:09.700 20:17:01 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]] 00:03:09.700 20:17:01 -- pm/common@44 -- $ pid=1187953 00:03:09.700 20:17:01 -- pm/common@50 -- $ kill -TERM 1187953 00:03:09.700 20:17:01 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:03:09.700 20:17:01 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]] 00:03:09.700 20:17:01 -- pm/common@44 -- $ pid=1187976 00:03:09.700 20:17:01 -- pm/common@50 -- $ sudo -E kill -TERM 1187976 00:03:09.700 20:17:01 -- spdk/autotest.sh@25 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:03:09.700 20:17:01 -- nvmf/common.sh@7 -- # uname -s 00:03:09.700 20:17:01 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:03:09.700 20:17:01 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:03:09.700 20:17:01 -- nvmf/common.sh@10 -- # 
NVMF_SECOND_PORT=4421 00:03:09.700 20:17:01 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:03:09.700 20:17:01 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:03:09.700 20:17:01 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:03:09.700 20:17:01 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:03:09.700 20:17:01 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:03:09.700 20:17:01 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:03:09.700 20:17:01 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:03:09.700 20:17:01 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562 00:03:09.700 20:17:01 -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562 00:03:09.700 20:17:01 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:03:09.700 20:17:01 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:03:09.700 20:17:01 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:03:09.700 20:17:01 -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:03:09.700 20:17:01 -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:03:09.700 20:17:01 -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:03:09.700 20:17:01 -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:03:09.700 20:17:01 -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:03:09.700 20:17:01 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:09.701 20:17:01 -- paths/export.sh@3 -- # 
PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:09.701 20:17:01 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:09.701 20:17:01 -- paths/export.sh@5 -- # export PATH 00:03:09.701 20:17:01 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:09.701 20:17:01 -- nvmf/common.sh@47 -- # : 0 00:03:09.701 20:17:01 -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:03:09.701 20:17:01 -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:03:09.701 20:17:01 -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:03:09.701 20:17:01 -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:03:09.701 20:17:01 -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:03:09.701 20:17:01 -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:03:09.701 20:17:01 -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:03:09.701 20:17:01 -- nvmf/common.sh@51 -- # have_pci_nics=0 00:03:09.701 20:17:01 -- spdk/autotest.sh@27 -- # '[' 0 -ne 0 ']' 00:03:09.701 20:17:01 -- spdk/autotest.sh@32 -- # uname -s 00:03:09.701 20:17:02 -- spdk/autotest.sh@32 -- # '[' Linux = Linux ']' 00:03:09.701 20:17:02 -- spdk/autotest.sh@33 -- # old_core_pattern='|/usr/lib/systemd/systemd-coredump %P %u %g %s %t %c %h' 00:03:09.701 20:17:02 -- spdk/autotest.sh@34 -- # mkdir -p 
/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/coredumps 00:03:09.701 20:17:02 -- spdk/autotest.sh@39 -- # echo '|/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/core-collector.sh %P %s %t' 00:03:09.701 20:17:02 -- spdk/autotest.sh@40 -- # echo /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/coredumps 00:03:09.701 20:17:02 -- spdk/autotest.sh@44 -- # modprobe nbd 00:03:09.701 20:17:02 -- spdk/autotest.sh@46 -- # type -P udevadm 00:03:09.701 20:17:02 -- spdk/autotest.sh@46 -- # udevadm=/usr/sbin/udevadm 00:03:09.701 20:17:02 -- spdk/autotest.sh@48 -- # udevadm_pid=1255319 00:03:09.701 20:17:02 -- spdk/autotest.sh@47 -- # /usr/sbin/udevadm monitor --property 00:03:09.701 20:17:02 -- spdk/autotest.sh@53 -- # start_monitor_resources 00:03:09.701 20:17:02 -- pm/common@17 -- # local monitor 00:03:09.701 20:17:02 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:03:09.701 20:17:02 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:03:09.701 20:17:02 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:03:09.701 20:17:02 -- pm/common@21 -- # date +%s 00:03:09.701 20:17:02 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:03:09.701 20:17:02 -- pm/common@21 -- # date +%s 00:03:09.701 20:17:02 -- pm/common@25 -- # sleep 1 00:03:09.701 20:17:02 -- pm/common@21 -- # date +%s 00:03:09.701 20:17:02 -- pm/common@21 -- # date +%s 00:03:09.701 20:17:02 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1721067422 00:03:09.701 20:17:02 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1721067422 00:03:09.701 20:17:02 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp 
-d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1721067422 00:03:09.701 20:17:02 -- pm/common@21 -- # sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1721067422 00:03:09.701 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1721067422_collect-vmstat.pm.log 00:03:09.959 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1721067422_collect-cpu-load.pm.log 00:03:09.959 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1721067422_collect-cpu-temp.pm.log 00:03:09.959 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1721067422_collect-bmc-pm.bmc.pm.log 00:03:10.894 20:17:03 -- spdk/autotest.sh@55 -- # trap 'autotest_cleanup || :; exit 1' SIGINT SIGTERM EXIT 00:03:10.894 20:17:03 -- spdk/autotest.sh@57 -- # timing_enter autotest 00:03:10.894 20:17:03 -- common/autotest_common.sh@722 -- # xtrace_disable 00:03:10.894 20:17:03 -- common/autotest_common.sh@10 -- # set +x 00:03:10.894 20:17:03 -- spdk/autotest.sh@59 -- # create_test_list 00:03:10.894 20:17:03 -- common/autotest_common.sh@746 -- # xtrace_disable 00:03:10.894 20:17:03 -- common/autotest_common.sh@10 -- # set +x 00:03:10.894 20:17:03 -- spdk/autotest.sh@61 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/autotest.sh 00:03:10.894 20:17:03 -- spdk/autotest.sh@61 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk 00:03:10.894 20:17:03 -- spdk/autotest.sh@61 -- # src=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:03:10.894 20:17:03 -- spdk/autotest.sh@62 -- # out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:03:10.894 20:17:03 -- spdk/autotest.sh@63 -- # cd /var/jenkins/workspace/crypto-phy-autotest/spdk 
00:03:10.894 20:17:03 -- spdk/autotest.sh@65 -- # freebsd_update_contigmem_mod 00:03:10.894 20:17:03 -- common/autotest_common.sh@1455 -- # uname 00:03:10.894 20:17:03 -- common/autotest_common.sh@1455 -- # '[' Linux = FreeBSD ']' 00:03:10.894 20:17:03 -- spdk/autotest.sh@66 -- # freebsd_set_maxsock_buf 00:03:10.894 20:17:03 -- common/autotest_common.sh@1475 -- # uname 00:03:10.894 20:17:03 -- common/autotest_common.sh@1475 -- # [[ Linux = FreeBSD ]] 00:03:10.894 20:17:03 -- spdk/autotest.sh@71 -- # grep CC_TYPE mk/cc.mk 00:03:10.894 20:17:03 -- spdk/autotest.sh@71 -- # CC_TYPE=CC_TYPE=gcc 00:03:10.894 20:17:03 -- spdk/autotest.sh@72 -- # hash lcov 00:03:10.894 20:17:03 -- spdk/autotest.sh@72 -- # [[ CC_TYPE=gcc == *\c\l\a\n\g* ]] 00:03:10.894 20:17:03 -- spdk/autotest.sh@80 -- # export 'LCOV_OPTS= 00:03:10.894 --rc lcov_branch_coverage=1 00:03:10.894 --rc lcov_function_coverage=1 00:03:10.894 --rc genhtml_branch_coverage=1 00:03:10.894 --rc genhtml_function_coverage=1 00:03:10.894 --rc genhtml_legend=1 00:03:10.894 --rc geninfo_all_blocks=1 00:03:10.894 ' 00:03:10.894 20:17:03 -- spdk/autotest.sh@80 -- # LCOV_OPTS=' 00:03:10.894 --rc lcov_branch_coverage=1 00:03:10.894 --rc lcov_function_coverage=1 00:03:10.894 --rc genhtml_branch_coverage=1 00:03:10.894 --rc genhtml_function_coverage=1 00:03:10.894 --rc genhtml_legend=1 00:03:10.894 --rc geninfo_all_blocks=1 00:03:10.894 ' 00:03:10.894 20:17:03 -- spdk/autotest.sh@81 -- # export 'LCOV=lcov 00:03:10.894 --rc lcov_branch_coverage=1 00:03:10.894 --rc lcov_function_coverage=1 00:03:10.894 --rc genhtml_branch_coverage=1 00:03:10.894 --rc genhtml_function_coverage=1 00:03:10.894 --rc genhtml_legend=1 00:03:10.894 --rc geninfo_all_blocks=1 00:03:10.894 --no-external' 00:03:10.894 20:17:03 -- spdk/autotest.sh@81 -- # LCOV='lcov 00:03:10.894 --rc lcov_branch_coverage=1 00:03:10.894 --rc lcov_function_coverage=1 00:03:10.894 --rc genhtml_branch_coverage=1 00:03:10.895 --rc genhtml_function_coverage=1 00:03:10.895 --rc 
genhtml_legend=1 00:03:10.895 --rc geninfo_all_blocks=1 00:03:10.895 --no-external' 00:03:10.895 20:17:03 -- spdk/autotest.sh@83 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -v 00:03:10.895 lcov: LCOV version 1.14 00:03:10.895 20:17:03 -- spdk/autotest.sh@85 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -c -i -t Baseline -d /var/jenkins/workspace/crypto-phy-autotest/spdk -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_base.info 00:03:32.851 /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/nvme/nvme_stubs.gcno:no functions found 00:03:32.851 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/nvme/nvme_stubs.gcno 00:03:50.992 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel_module.gcno:no functions found 00:03:50.993 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel_module.gcno 00:03:50.993 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel.gcno:no functions found 00:03:50.993 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel.gcno 00:03:50.993 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/assert.gcno:no functions found 00:03:50.993 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/assert.gcno 00:03:50.993 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev.gcno:no functions found 00:03:50.993 geninfo: WARNING: GCOV did not produce any data for 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev.gcno 00:03:50.993 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/barrier.gcno:no functions found 00:03:50.993 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/barrier.gcno 00:03:50.993 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/base64.gcno:no functions found 00:03:50.993 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/base64.gcno 00:03:50.993 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_zone.gcno:no functions found 00:03:50.993 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_zone.gcno 00:03:50.993 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_array.gcno:no functions found 00:03:50.993 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_array.gcno 00:03:50.993 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_module.gcno:no functions found 00:03:50.993 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_module.gcno 00:03:50.993 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob_bdev.gcno:no functions found 00:03:50.993 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob_bdev.gcno 00:03:50.993 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob.gcno:no functions found 00:03:50.993 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob.gcno 00:03:50.993 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs.gcno:no functions found 00:03:50.993 geninfo: WARNING: 
GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs.gcno 00:03:50.993 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_pool.gcno:no functions found 00:03:50.993 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_pool.gcno 00:03:50.993 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/cpuset.gcno:no functions found 00:03:50.993 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/cpuset.gcno 00:03:50.993 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/config.gcno:no functions found 00:03:50.993 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/config.gcno 00:03:50.993 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dma.gcno:no functions found 00:03:50.993 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dma.gcno 00:03:50.993 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc32.gcno:no functions found 00:03:50.993 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc32.gcno 00:03:50.993 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/conf.gcno:no functions found 00:03:50.993 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/conf.gcno 00:03:50.993 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc64.gcno:no functions found 00:03:50.993 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc64.gcno 00:03:50.993 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dif.gcno:no functions found 00:03:50.993 geninfo: WARNING: GCOV 
did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dif.gcno 00:03:50.993 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc16.gcno:no functions found 00:03:50.993 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc16.gcno 00:03:50.993 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/endian.gcno:no functions found 00:03:50.993 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/endian.gcno 00:03:50.993 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env_dpdk.gcno:no functions found 00:03:50.993 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env_dpdk.gcno 00:03:50.993 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd_group.gcno:no functions found 00:03:50.993 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd_group.gcno 00:03:50.993 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ftl.gcno:no functions found 00:03:50.993 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ftl.gcno 00:03:50.993 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs_bdev.gcno:no functions found 00:03:50.993 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs_bdev.gcno 00:03:50.993 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/histogram_data.gcno:no functions found 00:03:50.993 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/histogram_data.gcno 00:03:50.993 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env.gcno:no functions found 
00:03:50.993 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env.gcno 00:03:50.993 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd.gcno:no functions found 00:03:50.993 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd.gcno 00:03:50.993 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd.gcno:no functions found 00:03:50.993 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd.gcno 00:03:50.993 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/file.gcno:no functions found 00:03:50.993 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/file.gcno 00:03:50.993 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd_spec.gcno:no functions found 00:03:50.993 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd_spec.gcno 00:03:50.993 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/gpt_spec.gcno:no functions found 00:03:50.993 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/gpt_spec.gcno 00:03:50.993 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/hexlify.gcno:no functions found 00:03:50.993 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/hexlify.gcno 00:03:50.993 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/init.gcno:no functions found 00:03:50.993 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/init.gcno 00:03:50.993 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/iscsi_spec.gcno:no functions 
found 00:03:50.993 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/iscsi_spec.gcno 00:03:50.993 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/event.gcno:no functions found 00:03:50.993 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/event.gcno 00:03:50.993 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat.gcno:no functions found 00:03:50.993 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat.gcno 00:03:50.993 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/json.gcno:no functions found 00:03:50.993 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/json.gcno 00:03:50.993 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat_spec.gcno:no functions found 00:03:50.993 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat_spec.gcno 00:03:50.993 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring.gcno:no functions found 00:03:50.993 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring.gcno 00:03:50.993 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/jsonrpc.gcno:no functions found 00:03:50.993 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/jsonrpc.gcno 00:03:50.993 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring_module.gcno:no functions found 00:03:50.993 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring_module.gcno 00:03:50.993 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/likely.gcno:no functions found 00:03:50.993 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/likely.gcno 00:03:50.993 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/log.gcno:no functions found 00:03:50.993 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/log.gcno 00:03:50.993 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/memory.gcno:no functions found 00:03:50.993 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/memory.gcno 00:03:50.993 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/lvol.gcno:no functions found 00:03:50.993 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/lvol.gcno 00:03:50.993 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/mmio.gcno:no functions found 00:03:50.993 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/mmio.gcno 00:03:50.993 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nbd.gcno:no functions found 00:03:50.993 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nbd.gcno 00:03:50.993 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/notify.gcno:no functions found 00:03:50.993 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/notify.gcno 00:03:50.993 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme.gcno:no functions found 00:03:50.993 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme.gcno 00:03:50.993 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_intel.gcno:no functions found 00:03:50.993 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_intel.gcno 00:03:50.993 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd.gcno:no functions found 00:03:50.993 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd.gcno 00:03:50.993 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd_spec.gcno:no functions found 00:03:50.993 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd_spec.gcno 00:03:50.993 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_spec.gcno:no functions found 00:03:50.993 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_spec.gcno 00:03:50.993 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_zns.gcno:no functions found 00:03:50.994 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_zns.gcno 00:03:50.994 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_fc_spec.gcno:no functions found 00:03:50.994 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_fc_spec.gcno 00:03:50.994 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_cmd.gcno:no functions found 00:03:50.994 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_cmd.gcno 00:03:50.994 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_transport.gcno:no functions found 00:03:50.994 geninfo: WARNING: GCOV did not produce any data for 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_transport.gcno 00:03:50.994 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf.gcno:no functions found 00:03:50.994 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf.gcno 00:03:50.994 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_spec.gcno:no functions found 00:03:50.994 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_spec.gcno 00:03:50.994 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal.gcno:no functions found 00:03:50.994 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal.gcno 00:03:50.994 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal_spec.gcno:no functions found 00:03:50.994 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal_spec.gcno 00:03:50.994 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pci_ids.gcno:no functions found 00:03:50.994 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pci_ids.gcno 00:03:50.994 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pipe.gcno:no functions found 00:03:50.994 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pipe.gcno 00:03:50.994 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/queue.gcno:no functions found 00:03:50.994 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/queue.gcno 00:03:50.994 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/rpc.gcno:no functions found 00:03:50.994 geninfo: WARNING: GCOV did not produce 
any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/rpc.gcno 00:03:50.994 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/reduce.gcno:no functions found 00:03:50.994 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/reduce.gcno 00:03:50.994 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scheduler.gcno:no functions found 00:03:50.994 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scheduler.gcno 00:03:50.994 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi.gcno:no functions found 00:03:50.994 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi.gcno 00:03:50.994 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi_spec.gcno:no functions found 00:03:50.994 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi_spec.gcno 00:03:50.994 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/sock.gcno:no functions found 00:03:50.994 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/sock.gcno 00:03:50.994 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/stdinc.gcno:no functions found 00:03:50.994 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/stdinc.gcno 00:03:50.994 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/thread.gcno:no functions found 00:03:50.994 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/thread.gcno 00:03:50.994 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/string.gcno:no functions found 00:03:50.994 geninfo: WARNING: GCOV did 
not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/string.gcno 00:03:50.994 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace.gcno:no functions found 00:03:50.994 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace.gcno 00:03:50.994 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace_parser.gcno:no functions found 00:03:50.994 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace_parser.gcno 00:03:50.994 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/util.gcno:no functions found 00:03:50.994 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/util.gcno 00:03:50.994 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/tree.gcno:no functions found 00:03:50.994 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/tree.gcno 00:03:50.994 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ublk.gcno:no functions found 00:03:50.994 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ublk.gcno 00:03:50.994 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/uuid.gcno:no functions found 00:03:50.994 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/uuid.gcno 00:03:50.994 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/version.gcno:no functions found 00:03:50.994 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/version.gcno 00:03:50.994 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_pci.gcno:no functions found 00:03:50.994 geninfo: 
WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_pci.gcno 00:03:50.994 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_spec.gcno:no functions found 00:03:50.994 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_spec.gcno 00:03:50.994 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vhost.gcno:no functions found 00:03:50.994 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vhost.gcno 00:03:50.994 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vmd.gcno:no functions found 00:03:50.994 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vmd.gcno 00:03:50.994 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/xor.gcno:no functions found 00:03:50.994 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/xor.gcno 00:03:50.994 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/zipf.gcno:no functions found 00:03:50.994 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/zipf.gcno 00:03:55.187 20:17:47 -- spdk/autotest.sh@89 -- # timing_enter pre_cleanup 00:03:55.187 20:17:47 -- common/autotest_common.sh@722 -- # xtrace_disable 00:03:55.187 20:17:47 -- common/autotest_common.sh@10 -- # set +x 00:03:55.447 20:17:47 -- spdk/autotest.sh@91 -- # rm -f 00:03:55.447 20:17:47 -- spdk/autotest.sh@94 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:03:58.755 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:03:58.755 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:03:59.014 0000:5e:00.0 (8086 0b60): 
Already using the nvme driver 00:03:59.014 0000:00:04.7 (8086 2021): Already using the ioatdma driver 00:03:59.014 0000:00:04.6 (8086 2021): Already using the ioatdma driver 00:03:59.014 0000:00:04.5 (8086 2021): Already using the ioatdma driver 00:03:59.014 0000:00:04.4 (8086 2021): Already using the ioatdma driver 00:03:59.014 0000:00:04.3 (8086 2021): Already using the ioatdma driver 00:03:59.014 0000:00:04.2 (8086 2021): Already using the ioatdma driver 00:03:59.014 0000:00:04.1 (8086 2021): Already using the ioatdma driver 00:03:59.274 0000:00:04.0 (8086 2021): Already using the ioatdma driver 00:03:59.274 0000:80:04.7 (8086 2021): Already using the ioatdma driver 00:03:59.274 0000:80:04.6 (8086 2021): Already using the ioatdma driver 00:03:59.274 0000:80:04.5 (8086 2021): Already using the ioatdma driver 00:03:59.274 0000:80:04.4 (8086 2021): Already using the ioatdma driver 00:03:59.274 0000:80:04.3 (8086 2021): Already using the ioatdma driver 00:03:59.274 0000:80:04.2 (8086 2021): Already using the ioatdma driver 00:03:59.274 0000:80:04.1 (8086 2021): Already using the ioatdma driver 00:03:59.274 0000:80:04.0 (8086 2021): Already using the ioatdma driver 00:03:59.533 20:17:51 -- spdk/autotest.sh@96 -- # get_zoned_devs 00:03:59.533 20:17:51 -- common/autotest_common.sh@1669 -- # zoned_devs=() 00:03:59.533 20:17:51 -- common/autotest_common.sh@1669 -- # local -gA zoned_devs 00:03:59.533 20:17:51 -- common/autotest_common.sh@1670 -- # local nvme bdf 00:03:59.533 20:17:51 -- common/autotest_common.sh@1672 -- # for nvme in /sys/block/nvme* 00:03:59.533 20:17:51 -- common/autotest_common.sh@1673 -- # is_block_zoned nvme0n1 00:03:59.533 20:17:51 -- common/autotest_common.sh@1662 -- # local device=nvme0n1 00:03:59.534 20:17:51 -- common/autotest_common.sh@1664 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:03:59.534 20:17:51 -- common/autotest_common.sh@1665 -- # [[ none != none ]] 00:03:59.534 20:17:51 -- spdk/autotest.sh@98 -- # (( 0 > 0 )) 00:03:59.534 
20:17:51 -- spdk/autotest.sh@110 -- # for dev in /dev/nvme*n!(*p*) 00:03:59.534 20:17:51 -- spdk/autotest.sh@112 -- # [[ -z '' ]] 00:03:59.534 20:17:51 -- spdk/autotest.sh@113 -- # block_in_use /dev/nvme0n1 00:03:59.534 20:17:51 -- scripts/common.sh@378 -- # local block=/dev/nvme0n1 pt 00:03:59.534 20:17:51 -- scripts/common.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/spdk-gpt.py /dev/nvme0n1 00:03:59.534 No valid GPT data, bailing 00:03:59.534 20:17:51 -- scripts/common.sh@391 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:03:59.534 20:17:51 -- scripts/common.sh@391 -- # pt= 00:03:59.534 20:17:51 -- scripts/common.sh@392 -- # return 1 00:03:59.534 20:17:51 -- spdk/autotest.sh@114 -- # dd if=/dev/zero of=/dev/nvme0n1 bs=1M count=1 00:03:59.534 1+0 records in 00:03:59.534 1+0 records out 00:03:59.534 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00503409 s, 208 MB/s 00:03:59.534 20:17:51 -- spdk/autotest.sh@118 -- # sync 00:03:59.534 20:17:51 -- spdk/autotest.sh@120 -- # xtrace_disable_per_cmd reap_spdk_processes 00:03:59.534 20:17:51 -- common/autotest_common.sh@22 -- # eval 'reap_spdk_processes 12> /dev/null' 00:03:59.534 20:17:51 -- common/autotest_common.sh@22 -- # reap_spdk_processes 00:04:04.808 20:17:56 -- spdk/autotest.sh@124 -- # uname -s 00:04:04.809 20:17:56 -- spdk/autotest.sh@124 -- # '[' Linux = Linux ']' 00:04:04.809 20:17:56 -- spdk/autotest.sh@125 -- # run_test setup.sh /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/test-setup.sh 00:04:04.809 20:17:56 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:04.809 20:17:56 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:04.809 20:17:56 -- common/autotest_common.sh@10 -- # set +x 00:04:04.809 ************************************ 00:04:04.809 START TEST setup.sh 00:04:04.809 ************************************ 00:04:04.809 20:17:56 setup.sh -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/test-setup.sh 
00:04:04.809 * Looking for test storage... 00:04:04.809 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:04:04.809 20:17:56 setup.sh -- setup/test-setup.sh@10 -- # uname -s 00:04:04.809 20:17:56 setup.sh -- setup/test-setup.sh@10 -- # [[ Linux == Linux ]] 00:04:04.809 20:17:56 setup.sh -- setup/test-setup.sh@12 -- # run_test acl /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/acl.sh 00:04:04.809 20:17:56 setup.sh -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:04.809 20:17:56 setup.sh -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:04.809 20:17:56 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:04:04.809 ************************************ 00:04:04.809 START TEST acl 00:04:04.809 ************************************ 00:04:04.809 20:17:56 setup.sh.acl -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/acl.sh 00:04:04.809 * Looking for test storage... 00:04:04.809 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:04:04.809 20:17:57 setup.sh.acl -- setup/acl.sh@10 -- # get_zoned_devs 00:04:04.809 20:17:57 setup.sh.acl -- common/autotest_common.sh@1669 -- # zoned_devs=() 00:04:04.809 20:17:57 setup.sh.acl -- common/autotest_common.sh@1669 -- # local -gA zoned_devs 00:04:04.809 20:17:57 setup.sh.acl -- common/autotest_common.sh@1670 -- # local nvme bdf 00:04:04.809 20:17:57 setup.sh.acl -- common/autotest_common.sh@1672 -- # for nvme in /sys/block/nvme* 00:04:04.809 20:17:57 setup.sh.acl -- common/autotest_common.sh@1673 -- # is_block_zoned nvme0n1 00:04:04.809 20:17:57 setup.sh.acl -- common/autotest_common.sh@1662 -- # local device=nvme0n1 00:04:04.809 20:17:57 setup.sh.acl -- common/autotest_common.sh@1664 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:04:04.809 20:17:57 setup.sh.acl -- common/autotest_common.sh@1665 -- # [[ none != none ]] 00:04:04.809 20:17:57 setup.sh.acl -- setup/acl.sh@12 -- # 
devs=() 00:04:04.809 20:17:57 setup.sh.acl -- setup/acl.sh@12 -- # declare -a devs 00:04:04.809 20:17:57 setup.sh.acl -- setup/acl.sh@13 -- # drivers=() 00:04:04.809 20:17:57 setup.sh.acl -- setup/acl.sh@13 -- # declare -A drivers 00:04:04.809 20:17:57 setup.sh.acl -- setup/acl.sh@51 -- # setup reset 00:04:04.809 20:17:57 setup.sh.acl -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:04.809 20:17:57 setup.sh.acl -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:04:08.995 20:18:01 setup.sh.acl -- setup/acl.sh@52 -- # collect_setup_devs 00:04:08.995 20:18:01 setup.sh.acl -- setup/acl.sh@16 -- # local dev driver 00:04:08.995 20:18:01 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:08.995 20:18:01 setup.sh.acl -- setup/acl.sh@15 -- # setup output status 00:04:08.995 20:18:01 setup.sh.acl -- setup/common.sh@9 -- # [[ output == output ]] 00:04:08.995 20:18:01 setup.sh.acl -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status 00:04:13.188 20:18:04 setup.sh.acl -- setup/acl.sh@19 -- # [[ (8086 == *:*:*.* ]] 00:04:13.188 20:18:04 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:13.188 20:18:04 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.188 20:18:04 setup.sh.acl -- setup/acl.sh@19 -- # [[ (8086 == *:*:*.* ]] 00:04:13.188 20:18:04 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:13.188 20:18:04 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.188 Hugepages 00:04:13.188 node hugesize free / total 00:04:13.188 20:18:04 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:04:13.188 20:18:04 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:13.188 20:18:04 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.188 20:18:04 setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:04:13.188 20:18:04 setup.sh.acl -- setup/acl.sh@19 -- # continue 
00:04:13.188 20:18:04 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.188 20:18:04 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:04:13.188 20:18:04 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:13.188 20:18:04 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.188 00:04:13.188 Type BDF Vendor Device NUMA Driver Device Block devices 00:04:13.188 20:18:04 setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:04:13.188 20:18:04 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:13.188 20:18:04 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.188 20:18:04 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.0 == *:*:*.* ]] 00:04:13.188 20:18:04 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:13.188 20:18:04 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:13.188 20:18:04 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.188 20:18:04 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.1 == *:*:*.* ]] 00:04:13.188 20:18:04 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:13.188 20:18:04 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:13.188 20:18:04 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.188 20:18:04 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.2 == *:*:*.* ]] 00:04:13.188 20:18:04 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:13.188 20:18:04 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:13.188 20:18:04 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.188 20:18:04 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.3 == *:*:*.* ]] 00:04:13.188 20:18:04 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:13.188 20:18:04 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:13.188 20:18:04 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.188 20:18:04 setup.sh.acl -- 
setup/acl.sh@19 -- # [[ 0000:00:04.4 == *:*:*.* ]] 00:04:13.188 20:18:04 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:13.188 20:18:04 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:13.188 20:18:04 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.188 20:18:04 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.5 == *:*:*.* ]] 00:04:13.188 20:18:04 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:13.188 20:18:04 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:13.188 20:18:04 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.188 20:18:04 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.6 == *:*:*.* ]] 00:04:13.188 20:18:04 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:13.188 20:18:04 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:13.188 20:18:04 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.188 20:18:04 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.7 == *:*:*.* ]] 00:04:13.188 20:18:04 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:13.188 20:18:04 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:13.188 20:18:04 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.188 20:18:05 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:5e:00.0 == *:*:*.* ]] 00:04:13.188 20:18:05 setup.sh.acl -- setup/acl.sh@20 -- # [[ nvme == nvme ]] 00:04:13.188 20:18:05 setup.sh.acl -- setup/acl.sh@21 -- # [[ '' == *\0\0\0\0\:\5\e\:\0\0\.\0* ]] 00:04:13.188 20:18:05 setup.sh.acl -- setup/acl.sh@22 -- # devs+=("$dev") 00:04:13.188 20:18:05 setup.sh.acl -- setup/acl.sh@22 -- # drivers["$dev"]=nvme 00:04:13.188 20:18:05 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.188 20:18:05 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.0 == *:*:*.* ]] 00:04:13.188 20:18:05 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:13.188 20:18:05 setup.sh.acl -- setup/acl.sh@20 -- 
# continue 00:04:13.188 20:18:05 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.188 20:18:05 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.1 == *:*:*.* ]] 00:04:13.188 20:18:05 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:13.188 20:18:05 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:13.188 20:18:05 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.188 20:18:05 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.2 == *:*:*.* ]] 00:04:13.188 20:18:05 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:13.188 20:18:05 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:13.188 20:18:05 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.188 20:18:05 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.3 == *:*:*.* ]] 00:04:13.188 20:18:05 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:13.188 20:18:05 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:13.188 20:18:05 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.188 20:18:05 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.4 == *:*:*.* ]] 00:04:13.188 20:18:05 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:13.188 20:18:05 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:13.188 20:18:05 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.188 20:18:05 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.5 == *:*:*.* ]] 00:04:13.188 20:18:05 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:13.188 20:18:05 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:13.188 20:18:05 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.188 20:18:05 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.6 == *:*:*.* ]] 00:04:13.188 20:18:05 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:13.188 20:18:05 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:13.188 20:18:05 
setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.188 20:18:05 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.7 == *:*:*.* ]] 00:04:13.188 20:18:05 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:13.188 20:18:05 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:13.188 20:18:05 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.188 20:18:05 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:85:05.5 == *:*:*.* ]] 00:04:13.189 20:18:05 setup.sh.acl -- setup/acl.sh@20 -- # [[ vfio-pci == nvme ]] 00:04:13.189 20:18:05 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:13.189 20:18:05 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.189 20:18:05 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:d7:05.5 == *:*:*.* ]] 00:04:13.189 20:18:05 setup.sh.acl -- setup/acl.sh@20 -- # [[ vfio-pci == nvme ]] 00:04:13.189 20:18:05 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:13.189 20:18:05 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.189 20:18:05 setup.sh.acl -- setup/acl.sh@24 -- # (( 1 > 0 )) 00:04:13.189 20:18:05 setup.sh.acl -- setup/acl.sh@54 -- # run_test denied denied 00:04:13.189 20:18:05 setup.sh.acl -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:13.189 20:18:05 setup.sh.acl -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:13.189 20:18:05 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:04:13.189 ************************************ 00:04:13.189 START TEST denied 00:04:13.189 ************************************ 00:04:13.189 20:18:05 setup.sh.acl.denied -- common/autotest_common.sh@1123 -- # denied 00:04:13.189 20:18:05 setup.sh.acl.denied -- setup/acl.sh@38 -- # PCI_BLOCKED=' 0000:5e:00.0' 00:04:13.189 20:18:05 setup.sh.acl.denied -- setup/acl.sh@38 -- # setup output config 00:04:13.189 20:18:05 setup.sh.acl.denied -- setup/acl.sh@39 -- # grep 'Skipping denied controller at 0000:5e:00.0' 00:04:13.189 20:18:05 
setup.sh.acl.denied -- setup/common.sh@9 -- # [[ output == output ]] 00:04:13.189 20:18:05 setup.sh.acl.denied -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:04:17.459 0000:5e:00.0 (8086 0b60): Skipping denied controller at 0000:5e:00.0 00:04:17.459 20:18:09 setup.sh.acl.denied -- setup/acl.sh@40 -- # verify 0000:5e:00.0 00:04:17.459 20:18:09 setup.sh.acl.denied -- setup/acl.sh@28 -- # local dev driver 00:04:17.459 20:18:09 setup.sh.acl.denied -- setup/acl.sh@30 -- # for dev in "$@" 00:04:17.459 20:18:09 setup.sh.acl.denied -- setup/acl.sh@31 -- # [[ -e /sys/bus/pci/devices/0000:5e:00.0 ]] 00:04:17.459 20:18:09 setup.sh.acl.denied -- setup/acl.sh@32 -- # readlink -f /sys/bus/pci/devices/0000:5e:00.0/driver 00:04:17.459 20:18:09 setup.sh.acl.denied -- setup/acl.sh@32 -- # driver=/sys/bus/pci/drivers/nvme 00:04:17.459 20:18:09 setup.sh.acl.denied -- setup/acl.sh@33 -- # [[ nvme == \n\v\m\e ]] 00:04:17.459 20:18:09 setup.sh.acl.denied -- setup/acl.sh@41 -- # setup reset 00:04:17.459 20:18:09 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:17.459 20:18:09 setup.sh.acl.denied -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:04:22.733 00:04:22.733 real 0m9.193s 00:04:22.733 user 0m2.934s 00:04:22.733 sys 0m5.565s 00:04:22.733 20:18:14 setup.sh.acl.denied -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:22.733 20:18:14 setup.sh.acl.denied -- common/autotest_common.sh@10 -- # set +x 00:04:22.733 ************************************ 00:04:22.733 END TEST denied 00:04:22.733 ************************************ 00:04:22.733 20:18:14 setup.sh.acl -- common/autotest_common.sh@1142 -- # return 0 00:04:22.733 20:18:14 setup.sh.acl -- setup/acl.sh@55 -- # run_test allowed allowed 00:04:22.733 20:18:14 setup.sh.acl -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:22.733 20:18:14 setup.sh.acl -- common/autotest_common.sh@1105 -- 
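The `verify` calls traced in the denied test resolve each BDF's `driver` symlink under `/sys/bus/pci/devices` and compare the bound driver's name. A small sketch of that check (the fake sysfs tree is an assumption added so the snippet runs without PCI devices; the real function is `verify` in `test/setup/acl.sh`):

```shell
#!/usr/bin/env bash
# Sketch of acl.sh's verify step: for each BDF, readlink the device's
# driver symlink and confirm it points at the expected driver (nvme here).
sysfs=$(mktemp -d)
mkdir -p "$sysfs/drivers/nvme" "$sysfs/devices/0000:5e:00.0"
ln -s "$sysfs/drivers/nvme" "$sysfs/devices/0000:5e:00.0/driver"

verify() {
    local dev driver
    for dev in "$@"; do
        [[ -e $sysfs/devices/$dev ]] || return 1
        driver=$(readlink -f "$sysfs/devices/$dev/driver")
        [[ ${driver##*/} == nvme ]] || return 1
    done
}

verify 0000:5e:00.0; ok=$?
(( ok == 0 )) && echo "0000:5e:00.0 is bound to nvme"
rm -rf "$sysfs"
```

This mirrors the log's `driver=/sys/bus/pci/drivers/nvme` followed by `[[ nvme == \n\v\m\e ]]`; after `setup.sh reset` rebinds the controller to `vfio-pci`, the same check would fail, which is what the allowed test's `nvme -> vfio-pci` line records.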
# xtrace_disable 00:04:22.733 20:18:14 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:04:22.733 ************************************ 00:04:22.733 START TEST allowed 00:04:22.733 ************************************ 00:04:22.733 20:18:14 setup.sh.acl.allowed -- common/autotest_common.sh@1123 -- # allowed 00:04:22.733 20:18:14 setup.sh.acl.allowed -- setup/acl.sh@45 -- # PCI_ALLOWED=0000:5e:00.0 00:04:22.733 20:18:14 setup.sh.acl.allowed -- setup/acl.sh@45 -- # setup output config 00:04:22.733 20:18:14 setup.sh.acl.allowed -- setup/acl.sh@46 -- # grep -E '0000:5e:00.0 .*: nvme -> .*' 00:04:22.733 20:18:14 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ output == output ]] 00:04:22.733 20:18:14 setup.sh.acl.allowed -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:04:29.307 0000:5e:00.0 (8086 0b60): nvme -> vfio-pci 00:04:29.307 20:18:20 setup.sh.acl.allowed -- setup/acl.sh@47 -- # verify 00:04:29.307 20:18:20 setup.sh.acl.allowed -- setup/acl.sh@28 -- # local dev driver 00:04:29.307 20:18:20 setup.sh.acl.allowed -- setup/acl.sh@48 -- # setup reset 00:04:29.307 20:18:20 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:29.307 20:18:20 setup.sh.acl.allowed -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:04:33.505 00:04:33.505 real 0m10.708s 00:04:33.505 user 0m2.927s 00:04:33.505 sys 0m5.295s 00:04:33.505 20:18:25 setup.sh.acl.allowed -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:33.505 20:18:25 setup.sh.acl.allowed -- common/autotest_common.sh@10 -- # set +x 00:04:33.505 ************************************ 00:04:33.505 END TEST allowed 00:04:33.505 ************************************ 00:04:33.505 20:18:25 setup.sh.acl -- common/autotest_common.sh@1142 -- # return 0 00:04:33.505 00:04:33.505 real 0m28.238s 00:04:33.505 user 0m8.888s 00:04:33.505 sys 0m16.481s 00:04:33.505 20:18:25 setup.sh.acl -- 
common/autotest_common.sh@1124 -- # xtrace_disable 00:04:33.505 20:18:25 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:04:33.505 ************************************ 00:04:33.505 END TEST acl 00:04:33.505 ************************************ 00:04:33.505 20:18:25 setup.sh -- common/autotest_common.sh@1142 -- # return 0 00:04:33.505 20:18:25 setup.sh -- setup/test-setup.sh@13 -- # run_test hugepages /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/hugepages.sh 00:04:33.505 20:18:25 setup.sh -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:33.505 20:18:25 setup.sh -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:33.505 20:18:25 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:04:33.505 ************************************ 00:04:33.505 START TEST hugepages 00:04:33.505 ************************************ 00:04:33.505 20:18:25 setup.sh.hugepages -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/hugepages.sh 00:04:33.505 * Looking for test storage... 
00:04:33.505 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:04:33.505 20:18:25 setup.sh.hugepages -- setup/hugepages.sh@10 -- # nodes_sys=() 00:04:33.505 20:18:25 setup.sh.hugepages -- setup/hugepages.sh@10 -- # declare -a nodes_sys 00:04:33.505 20:18:25 setup.sh.hugepages -- setup/hugepages.sh@12 -- # declare -i default_hugepages=0 00:04:33.505 20:18:25 setup.sh.hugepages -- setup/hugepages.sh@13 -- # declare -i no_nodes=0 00:04:33.505 20:18:25 setup.sh.hugepages -- setup/hugepages.sh@14 -- # declare -i nr_hugepages=0 00:04:33.505 20:18:25 setup.sh.hugepages -- setup/hugepages.sh@16 -- # get_meminfo Hugepagesize 00:04:33.505 20:18:25 setup.sh.hugepages -- setup/common.sh@17 -- # local get=Hugepagesize 00:04:33.505 20:18:25 setup.sh.hugepages -- setup/common.sh@18 -- # local node= 00:04:33.505 20:18:25 setup.sh.hugepages -- setup/common.sh@19 -- # local var val 00:04:33.505 20:18:25 setup.sh.hugepages -- setup/common.sh@20 -- # local mem_f mem 00:04:33.505 20:18:25 setup.sh.hugepages -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:33.505 20:18:25 setup.sh.hugepages -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:33.505 20:18:25 setup.sh.hugepages -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:33.505 20:18:25 setup.sh.hugepages -- setup/common.sh@28 -- # mapfile -t mem 00:04:33.505 20:18:25 setup.sh.hugepages -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:33.505 20:18:25 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:33.505 20:18:25 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:33.505 20:18:25 setup.sh.hugepages -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 76657524 kB' 'MemAvailable: 79958300 kB' 'Buffers: 12176 kB' 'Cached: 9561488 kB' 'SwapCached: 0 kB' 'Active: 6619320 kB' 'Inactive: 3456260 kB' 'Active(anon): 6225736 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 
'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 505236 kB' 'Mapped: 199756 kB' 'Shmem: 5723820 kB' 'KReclaimable: 209148 kB' 'Slab: 540868 kB' 'SReclaimable: 209148 kB' 'SUnreclaim: 331720 kB' 'KernelStack: 16112 kB' 'PageTables: 8392 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52438188 kB' 'Committed_AS: 7648116 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200856 kB' 'VmallocChunk: 0 kB' 'Percpu: 56960 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 2048' 'HugePages_Free: 2048' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 4194304 kB' 'DirectMap4k: 986532 kB' 'DirectMap2M: 16515072 kB' 'DirectMap1G: 83886080 kB' 00:04:33.505 20:18:25 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:33.505 20:18:25 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:33.505 20:18:25 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:33.505 20:18:25 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:33.505 20:18:25 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:33.505 20:18:25 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:33.505 20:18:25 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:33.505 20:18:25 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:33.505 20:18:25 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:33.505 20:18:25 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:33.505 20:18:25 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:33.505 20:18:25 setup.sh.hugepages -- 
setup/common.sh@31 -- # read -r var val _ 00:04:33.505 20:18:25 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:33.505 20:18:25 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:33.505 20:18:25 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:33.505 20:18:25 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:33.505 20:18:25 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:33.505 20:18:25 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:33.505 20:18:25 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:33.505 20:18:25 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:33.505 20:18:25 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:33.505 20:18:25 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:33.505 20:18:25 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:33.505 20:18:25 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:33.505 20:18:25 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:33.506 
20:18:25 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@32 -- # [[ 
SwapTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:33.506 20:18:25 setup.sh.hugepages -- 
setup/common.sh@31 -- # read -r var val _ 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:33.506 
20:18:25 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Bounce 
== \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:33.506 20:18:25 setup.sh.hugepages -- setup/common.sh@32 -- # continue [... identical "IFS=': ' / read -r var val _ / continue" xtrace repeats for each remaining /proc/meminfo key (WritebackTmp through HugePages_Surp) ...] 00:04:33.507 20:18:25 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Hugepagesize == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:33.507 20:18:25 setup.sh.hugepages -- setup/common.sh@33 -- # echo 2048 00:04:33.507 20:18:25 setup.sh.hugepages -- setup/common.sh@33 -- # return 0 00:04:33.507 20:18:25 setup.sh.hugepages -- setup/hugepages.sh@16 -- # default_hugepages=2048 00:04:33.507 20:18:25 setup.sh.hugepages -- setup/hugepages.sh@17 -- # default_huge_nr=/sys/kernel/mm/hugepages/hugepages-2048kB/nr_hugepages 00:04:33.507 20:18:25 setup.sh.hugepages -- setup/hugepages.sh@18 -- # 
global_huge_nr=/proc/sys/vm/nr_hugepages 00:04:33.507 20:18:25 setup.sh.hugepages -- setup/hugepages.sh@21 -- # unset -v HUGE_EVEN_ALLOC 00:04:33.507 20:18:25 setup.sh.hugepages -- setup/hugepages.sh@22 -- # unset -v HUGEMEM 00:04:33.507 20:18:25 setup.sh.hugepages -- setup/hugepages.sh@23 -- # unset -v HUGENODE 00:04:33.507 20:18:25 setup.sh.hugepages -- setup/hugepages.sh@24 -- # unset -v NRHUGE 00:04:33.507 20:18:25 setup.sh.hugepages -- setup/hugepages.sh@207 -- # get_nodes 00:04:33.507 20:18:25 setup.sh.hugepages -- setup/hugepages.sh@27 -- # local node 00:04:33.507 20:18:25 setup.sh.hugepages -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:33.507 20:18:25 setup.sh.hugepages -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:04:33.507 20:18:25 setup.sh.hugepages -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:33.507 20:18:25 setup.sh.hugepages -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:04:33.507 20:18:25 setup.sh.hugepages -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:33.507 20:18:25 setup.sh.hugepages -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:33.507 20:18:25 setup.sh.hugepages -- setup/hugepages.sh@208 -- # clear_hp 00:04:33.507 20:18:25 setup.sh.hugepages -- setup/hugepages.sh@37 -- # local node hp 00:04:33.507 20:18:25 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:04:33.507 20:18:25 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:33.507 20:18:25 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:33.507 20:18:25 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:33.507 20:18:25 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:33.507 20:18:25 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 
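The long run of `IFS=': '` / `read -r var val _` / `continue` steps traced above is setup/common.sh's get_meminfo scanning /proc/meminfo key by key until it hits the requested one (here Hugepagesize, echoing 2048). A minimal standalone sketch of that lookup (reconstructed from the trace, not the exact SPDK helper; the optional source-file argument is an addition here so the demo is deterministic):

```shell
#!/usr/bin/env bash
# Walk a meminfo-style file with IFS=': ' until the requested key matches,
# echoing its value -- the pattern the xtrace above is stepping through.
get_meminfo() {
  local get=$1 src=${2:-/proc/meminfo} var val _
  while IFS=': ' read -r var val _; do
    [[ $var == "$get" ]] || continue   # every non-matching key hits 'continue'
    echo "$val"
    return 0
  done < "$src"
  return 1
}

# Demonstrate against a small sample so the result is deterministic:
sample=$(mktemp)
printf 'MemTotal: 92293472 kB\nHugepagesize: 2048 kB\n' > "$sample"
get_meminfo Hugepagesize "$sample"   # prints 2048, as in the trace
rm -f "$sample"
```

On the machine in this log the same lookup against the real /proc/meminfo returns 2048, which is where hugepages.sh gets default_hugepages=2048.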
00:04:33.507 20:18:25 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:33.507 20:18:25 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:33.507 20:18:25 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:33.507 20:18:25 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:33.507 20:18:25 setup.sh.hugepages -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:04:33.507 20:18:25 setup.sh.hugepages -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:04:33.507 20:18:25 setup.sh.hugepages -- setup/hugepages.sh@210 -- # run_test default_setup default_setup 00:04:33.507 20:18:25 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:33.507 20:18:25 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:33.507 20:18:25 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:33.507 ************************************ 00:04:33.507 START TEST default_setup 00:04:33.507 ************************************ 00:04:33.507 20:18:25 setup.sh.hugepages.default_setup -- common/autotest_common.sh@1123 -- # default_setup 00:04:33.507 20:18:25 setup.sh.hugepages.default_setup -- setup/hugepages.sh@136 -- # get_test_nr_hugepages 2097152 0 00:04:33.507 20:18:25 setup.sh.hugepages.default_setup -- setup/hugepages.sh@49 -- # local size=2097152 00:04:33.507 20:18:25 setup.sh.hugepages.default_setup -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:04:33.507 20:18:25 setup.sh.hugepages.default_setup -- setup/hugepages.sh@51 -- # shift 00:04:33.507 20:18:25 setup.sh.hugepages.default_setup -- setup/hugepages.sh@52 -- # node_ids=('0') 00:04:33.507 20:18:25 setup.sh.hugepages.default_setup -- setup/hugepages.sh@52 -- # local node_ids 00:04:33.507 20:18:25 setup.sh.hugepages.default_setup -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:33.507 20:18:25 
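get_nodes and clear_hp, traced just above, glob /sys/devices/system/node/node+([0-9]) to count NUMA nodes (no_nodes=2 on this machine) and then write 0 into every per-size nr_hugepages file before exporting CLEAR_HUGE=yes. A sketch of both steps (reconstructed from the trace; the base-directory parameter is an addition for illustration, and writing the real sysfs files requires root):

```shell
#!/usr/bin/env bash
shopt -s extglob nullglob   # extglob enables the node+([0-9]) pattern

# Count NUMA nodes and zero every hugepage pool under each of them,
# mirroring the get_nodes/clear_hp steps in the trace above.
clear_hp() {
  local base=${1:-/sys/devices/system/node} node hp no_nodes=0
  for node in "$base"/node+([0-9]); do
    no_nodes=$(( no_nodes + 1 ))
    for hp in "$node"/hugepages/hugepages-*/nr_hugepages; do
      echo 0 > "$hp"   # release any pages a previous test left behind
    done
  done
  export CLEAR_HUGE=yes
  echo "$no_nodes"
}
```

In the trace, two nodes with two pool sizes each produce the four "echo 0" lines before CLEAR_HUGE=yes is exported.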
setup.sh.hugepages.default_setup -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:04:33.507 20:18:25 setup.sh.hugepages.default_setup -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:04:33.507 20:18:25 setup.sh.hugepages.default_setup -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:04:33.507 20:18:25 setup.sh.hugepages.default_setup -- setup/hugepages.sh@62 -- # local user_nodes 00:04:33.507 20:18:25 setup.sh.hugepages.default_setup -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:33.507 20:18:25 setup.sh.hugepages.default_setup -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:33.507 20:18:25 setup.sh.hugepages.default_setup -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:33.507 20:18:25 setup.sh.hugepages.default_setup -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:33.507 20:18:25 setup.sh.hugepages.default_setup -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 00:04:33.507 20:18:25 setup.sh.hugepages.default_setup -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:04:33.507 20:18:25 setup.sh.hugepages.default_setup -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:04:33.507 20:18:25 setup.sh.hugepages.default_setup -- setup/hugepages.sh@73 -- # return 0 00:04:33.507 20:18:25 setup.sh.hugepages.default_setup -- setup/hugepages.sh@137 -- # setup output 00:04:33.507 20:18:25 setup.sh.hugepages.default_setup -- setup/common.sh@9 -- # [[ output == output ]] 00:04:33.507 20:18:25 setup.sh.hugepages.default_setup -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:36.801 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:04:36.801 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:04:36.801 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:04:36.801 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:04:37.060 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:37.060 0000:00:04.4 (8086 2021): ioatdma 
-> vfio-pci 00:04:37.060 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:04:37.060 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:37.060 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:37.060 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:37.060 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:04:37.060 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:04:37.060 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:37.320 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:04:37.320 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:04:37.320 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:37.320 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:37.320 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:39.863 0000:5e:00.0 (8086 0b60): nvme -> vfio-pci 00:04:39.863 20:18:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@138 -- # verify_nr_hugepages 00:04:39.863 20:18:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@89 -- # local node 00:04:39.863 20:18:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@90 -- # local sorted_t 00:04:39.863 20:18:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@91 -- # local sorted_s 00:04:39.863 20:18:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@92 -- # local surp 00:04:39.863 20:18:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@93 -- # local resv 00:04:39.863 20:18:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@94 -- # local anon 00:04:39.863 20:18:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:39.863 20:18:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:39.863 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:39.863 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:04:39.863 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local 
var val 00:04:39.863 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:04:39.863 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:39.863 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:39.863 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:39.863 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:04:39.863 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:39.863 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:39.863 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:39.863 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78781708 kB' 'MemAvailable: 82082420 kB' 'Buffers: 12176 kB' 'Cached: 9561608 kB' 'SwapCached: 0 kB' 'Active: 6639768 kB' 'Inactive: 3456260 kB' 'Active(anon): 6246184 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 525468 kB' 'Mapped: 199928 kB' 'Shmem: 5723940 kB' 'KReclaimable: 209020 kB' 'Slab: 540004 kB' 'SReclaimable: 209020 kB' 'SUnreclaim: 330984 kB' 'KernelStack: 16432 kB' 'PageTables: 9276 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7666224 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201208 kB' 'VmallocChunk: 0 kB' 'Percpu: 56960 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 
'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 986532 kB' 'DirectMap2M: 16515072 kB' 'DirectMap1G: 83886080 kB' [... identical "IFS=': ' / read -r var val _ / continue" xtrace repeats for each /proc/meminfo key preceding AnonHugePages (MemTotal through HardwareCorrupted) ...] 00:04:39.864 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:39.864 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:04:39.864 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:04:39.864 20:18:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@97 -- # anon=0 00:04:39.864 20:18:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:39.864 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:39.864 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:04:39.864 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:04:39.864 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:04:39.864 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:39.864 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:39.864 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:39.864 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # 
mapfile -t mem 00:04:39.864 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:39.864 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:39.864 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:39.864 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78781016 kB' 'MemAvailable: 82081728 kB' 'Buffers: 12176 kB' 'Cached: 9561612 kB' 'SwapCached: 0 kB' 'Active: 6639212 kB' 'Inactive: 3456260 kB' 'Active(anon): 6245628 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 525168 kB' 'Mapped: 199868 kB' 'Shmem: 5723944 kB' 'KReclaimable: 209020 kB' 'Slab: 540124 kB' 'SReclaimable: 209020 kB' 'SUnreclaim: 331104 kB' 'KernelStack: 16432 kB' 'PageTables: 8624 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7666244 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200936 kB' 'VmallocChunk: 0 kB' 'Percpu: 56960 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 986532 kB' 'DirectMap2M: 16515072 kB' 'DirectMap1G: 83886080 kB' 00:04:39.864 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.864 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:39.864 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 
00:04:39.864 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:39.866 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.866 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:04:39.866 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:04:39.866 20:18:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@99 -- # surp=0 00:04:39.866 20:18:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:39.866 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:39.866 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:04:39.866 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:04:39.866 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:04:39.866 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:39.866 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:39.866 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:39.866 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:04:39.866 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:39.866 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:39.866 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:39.866 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78780436 kB' 'MemAvailable: 82081148 kB' 'Buffers: 12176 kB' 'Cached: 9561624 kB' 'SwapCached: 0 kB' 'Active: 6638668 kB' 'Inactive: 3456260 kB' 'Active(anon): 6245084 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 524496 kB' 'Mapped: 199868 kB' 'Shmem: 5723956 kB' 'KReclaimable: 209020 kB' 'Slab: 540124 kB' 'SReclaimable: 209020 kB' 'SUnreclaim: 331104 kB' 'KernelStack: 16320 kB' 'PageTables: 8512 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7667676 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200952 kB' 'VmallocChunk: 0 kB' 'Percpu: 56960 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 986532 kB' 'DirectMap2M: 16515072 kB' 'DirectMap1G: 83886080 kB' 00:04:39.866 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 
00:04:39.866 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:39.866 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:39.866 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:39.867
20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:39.867 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:39.867 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:39.867 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:39.867 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:39.867 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:39.867 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:39.867 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:39.867 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:39.867 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:39.867 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:39.867 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:39.867 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:39.867 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:39.867 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:39.867 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:39.867 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:39.867 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:39.867 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:39.867 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 
-- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:39.867 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:39.867 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:39.867 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:39.867 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:39.868 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:39.868 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:39.868 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:39.868 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:39.868 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:39.868 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:39.868 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:39.868 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:39.868 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:39.868 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:39.868 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:39.868 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:39.868 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:39.868 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:39.868 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 
00:04:39.868 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:39.868 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:39.868 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:39.868 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:39.868 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:39.868 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:39.868 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:39.868 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:39.868 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:39.868 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:39.868 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:39.868 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:39.868 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:39.868 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:39.868 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:39.868 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:39.868 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:39.868 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:39.868 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:39.868 
20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:39.868 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:39.868 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:39.868 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:39.868 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:39.868 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:39.868 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:39.868 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:39.868 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:39.868 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:39.868 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:39.868 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:39.868 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:39.868 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:39.868 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:39.868 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:39.868 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:39.868 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:39.868 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:39.868 20:18:32 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:39.868 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:39.868 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:39.868 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:39.868 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:39.868 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:39.868 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:39.868 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:04:39.868 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:04:39.868 20:18:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@100 -- # resv=0 00:04:39.868 20:18:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:39.868 nr_hugepages=1024 00:04:39.868 20:18:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:39.868 resv_hugepages=0 00:04:39.868 20:18:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:39.868 surplus_hugepages=0 00:04:39.868 20:18:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:39.868 anon_hugepages=0 00:04:39.868 20:18:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:39.868 20:18:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:04:39.868 20:18:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:39.868 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local 
get=HugePages_Total 00:04:39.868 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:04:39.868 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:04:39.868 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:04:39.868 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:39.868 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:39.868 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:39.868 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:04:39.868 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:39.868 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:39.868 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:39.868 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78781016 kB' 'MemAvailable: 82081728 kB' 'Buffers: 12176 kB' 'Cached: 9561652 kB' 'SwapCached: 0 kB' 'Active: 6639640 kB' 'Inactive: 3456260 kB' 'Active(anon): 6246056 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 525436 kB' 'Mapped: 199868 kB' 'Shmem: 5723984 kB' 'KReclaimable: 209020 kB' 'Slab: 540028 kB' 'SReclaimable: 209020 kB' 'SUnreclaim: 331008 kB' 'KernelStack: 16320 kB' 'PageTables: 8740 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7698712 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201000 kB' 'VmallocChunk: 0 kB' 'Percpu: 56960 kB' 
'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 986532 kB' 'DirectMap2M: 16515072 kB' 'DirectMap1G: 83886080 kB' 00:04:39.868 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.868 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:39.868 20:18:32 [... identical IFS=': ' / read -r var val _ / compare / continue trace repeated for each /proc/meminfo field (MemFree through CmaFree) while get_meminfo scans for HugePages_Total ...] 00:04:39.870 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@31
-- # read -r var val _ 00:04:39.870 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.870 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:39.870 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:39.870 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:39.870 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:39.870 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 1024 00:04:39.870 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:04:39.870 20:18:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:39.870 20:18:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@112 -- # get_nodes 00:04:39.870 20:18:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@27 -- # local node 00:04:39.870 20:18:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:39.870 20:18:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:04:39.870 20:18:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:39.870 20:18:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:04:39.870 20:18:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:39.870 20:18:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:39.870 20:18:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:39.870 20:18:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@116 -- # (( 
nodes_test[node] += resv )) 00:04:39.870 20:18:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:39.870 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:39.870 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node=0 00:04:39.870 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:04:39.870 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:04:39.870 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:39.870 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:39.870 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:39.870 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:04:39.870 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:39.870 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:39.870 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:39.870 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116940 kB' 'MemFree: 36905908 kB' 'MemUsed: 11211032 kB' 'SwapCached: 0 kB' 'Active: 5000864 kB' 'Inactive: 3372048 kB' 'Active(anon): 4842960 kB' 'Inactive(anon): 0 kB' 'Active(file): 157904 kB' 'Inactive(file): 3372048 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8118288 kB' 'Mapped: 106600 kB' 'AnonPages: 257780 kB' 'Shmem: 4588336 kB' 'KernelStack: 8840 kB' 'PageTables: 4864 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 123564 kB' 'Slab: 331984 kB' 'SReclaimable: 123564 kB' 'SUnreclaim: 
208420 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:04:39.870 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.870 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:39.870 20:18:32
[... identical IFS=': ' / read / compare / continue xtrace repeated for each remaining node0 meminfo field (MemFree through FilePmdMapped) ...]
00:04:39.871 20:18:32 setup.sh.hugepages.default_setup -- 
setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.871 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:39.871 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:39.871 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:39.871 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.871 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:39.871 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:39.871 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:39.871 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.871 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:39.871 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:39.871 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:39.871 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:39.871 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:04:39.871 20:18:32 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:04:39.871 20:18:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:39.871 20:18:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:39.871 20:18:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:39.871 20:18:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:39.871 20:18:32 
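The repeated IFS/read/compare xtrace above is setup/common.sh's get_meminfo scanning meminfo output one "Key: value" line at a time until the requested field matches, then echoing the value (here `echo 0` for HugePages_Surp). A minimal standalone sketch of that scan pattern — the helper name `get_field` is hypothetical, not SPDK's actual function:

```shell
#!/usr/bin/env bash
# Sketch of the field-scan loop traced above: split each "Key: value" line
# on ': ' and print the value once the requested key is found.
get_field() {
  local get=$1 var val _
  while IFS=': ' read -r var val _; do
    # Mirrors the [[ $var == $get ]] / continue pattern in the trace.
    [[ $var == "$get" ]] && { echo "$val"; return 0; }
  done
  return 1
}

printf '%s\n' 'MemTotal: 48116940 kB' 'HugePages_Total: 1024' 'HugePages_Surp: 0' \
  | get_field HugePages_Total   # → 1024
```

The trailing `_` in the `read` soaks up the "kB" unit, which is why the traced helper can echo bare numbers like 1024 and 0.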
setup.sh.hugepages.default_setup -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:04:39.871 node0=1024 expecting 1024 00:04:39.871 20:18:32 setup.sh.hugepages.default_setup -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:04:39.871 00:04:39.871 real 0m6.723s 00:04:39.871 user 0m1.693s 00:04:39.871 sys 0m2.760s 00:04:39.871 20:18:32 setup.sh.hugepages.default_setup -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:39.871 20:18:32 setup.sh.hugepages.default_setup -- common/autotest_common.sh@10 -- # set +x 00:04:39.871 ************************************ 00:04:39.871 END TEST default_setup 00:04:39.871 ************************************ 00:04:40.131 20:18:32 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:04:40.131 20:18:32 setup.sh.hugepages -- setup/hugepages.sh@211 -- # run_test per_node_1G_alloc per_node_1G_alloc 00:04:40.131 20:18:32 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:40.131 20:18:32 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:40.131 20:18:32 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:40.131 ************************************ 00:04:40.131 START TEST per_node_1G_alloc 00:04:40.131 ************************************ 00:04:40.131 20:18:32 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@1123 -- # per_node_1G_alloc 00:04:40.131 20:18:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@143 -- # local IFS=, 00:04:40.131 20:18:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@145 -- # get_test_nr_hugepages 1048576 0 1 00:04:40.131 20:18:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@49 -- # local size=1048576 00:04:40.131 20:18:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@50 -- # (( 3 > 1 )) 00:04:40.131 20:18:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@51 -- # shift 00:04:40.131 20:18:32 
setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@52 -- # node_ids=('0' '1') 00:04:40.131 20:18:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@52 -- # local node_ids 00:04:40.131 20:18:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:40.131 20:18:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=512 00:04:40.131 20:18:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 1 00:04:40.131 20:18:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@62 -- # user_nodes=('0' '1') 00:04:40.131 20:18:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:40.131 20:18:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=512 00:04:40.131 20:18:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:40.131 20:18:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:40.131 20:18:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:40.131 20:18:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@69 -- # (( 2 > 0 )) 00:04:40.131 20:18:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:04:40.131 20:18:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512 00:04:40.131 20:18:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:04:40.131 20:18:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512 00:04:40.132 20:18:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@73 -- # return 0 00:04:40.132 20:18:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # NRHUGE=512 00:04:40.132 20:18:32 
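In the get_test_nr_hugepages_per_node trace above, a 1048576 kB request over 2048 kB default pages yields nr_hugepages=512, assigned to both user nodes 0 and 1. That derivation can be sketched as follows (hypothetical helper, assuming the per-node assignment behaviour shown in the trace):

```shell
#!/usr/bin/env bash
# Sketch: derive a hugepage count from a kB size target and assign it to
# every requested NUMA node, as the traced setup does (512 pages each on
# node0 and node1, 1024 total).
per_node_hugepages() {
  local size_kb=$1 hugepagesize_kb=$2; shift 2
  local pages=$(( size_kb / hugepagesize_kb ))
  local node
  for node in "$@"; do
    echo "node${node}=${pages}"
  done
}

per_node_hugepages 1048576 2048 0 1
# → node0=512
# → node1=512
```

This matches the later verification step, where the system-wide HugePages_Total of 1024 is checked against the sum over nodes.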
setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # HUGENODE=0,1 00:04:40.132 20:18:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # setup output 00:04:40.132 20:18:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:04:40.132 20:18:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:43.456 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:04:43.456 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:04:43.456 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:43.456 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver 00:04:43.456 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:43.456 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:43.456 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:43.456 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:43.456 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:43.456 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:43.456 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:43.456 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:43.456 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:43.456 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:43.456 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:43.456 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:43.456 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:43.456 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:43.456 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:43.721 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@147 -- # nr_hugepages=1024 00:04:43.721 20:18:35 
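Throughout this log, get_meminfo reads either the system-wide /proc/meminfo or, when a node id is supplied and its sysfs file exists, the per-node /sys/devices/system/node/nodeN/meminfo. That source selection can be sketched as (hypothetical helper name; the procfs/sysfs paths themselves are standard Linux):

```shell
#!/usr/bin/env bash
# Pick the meminfo source the way the traced get_meminfo does: the per-node
# sysfs file when a node id is supplied and present, /proc/meminfo otherwise.
mem_file() {
  local node=$1 mem_f=/proc/meminfo
  if [[ -n $node && -e /sys/devices/system/node/node${node}/meminfo ]]; then
    mem_f=/sys/devices/system/node/node${node}/meminfo
  fi
  echo "$mem_f"
}

mem_file ''    # → /proc/meminfo (no node requested)
```

Note that per-node meminfo lines carry a "Node N " prefix, which the traced code strips with `mem=("${mem[@]#Node +([0-9]) }")` before the field scan.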
setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@147 -- # verify_nr_hugepages 00:04:43.721 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@89 -- # local node 00:04:43.721 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:04:43.721 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:04:43.721 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@92 -- # local surp 00:04:43.721 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@93 -- # local resv 00:04:43.721 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@94 -- # local anon 00:04:43.721 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:43.721 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:43.721 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:43.721 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:04:43.721 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:04:43.721 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:43.721 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:43.721 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:43.721 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:43.721 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:43.721 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:43.721 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- 
# IFS=': ' 00:04:43.721 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.721 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78771932 kB' 'MemAvailable: 82072644 kB' 'Buffers: 12176 kB' 'Cached: 9561748 kB' 'SwapCached: 0 kB' 'Active: 6637344 kB' 'Inactive: 3456260 kB' 'Active(anon): 6243760 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 522840 kB' 'Mapped: 198792 kB' 'Shmem: 5724080 kB' 'KReclaimable: 209020 kB' 'Slab: 539676 kB' 'SReclaimable: 209020 kB' 'SUnreclaim: 330656 kB' 'KernelStack: 16128 kB' 'PageTables: 8168 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7659024 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200968 kB' 'VmallocChunk: 0 kB' 'Percpu: 56960 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 986532 kB' 'DirectMap2M: 16515072 kB' 'DirectMap1G: 83886080 kB' 00:04:43.721 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.721 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.721 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.721 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.721 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ 
MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.721 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.721 20:18:35
[... identical IFS=': ' / read / compare / continue xtrace repeated for each remaining meminfo field (MemAvailable through SwapTotal) ...]
00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapFree ==
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- 
# read -r var val _ 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.722 20:18:35 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.722 
20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.722 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@97 -- # anon=0 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile 
-t mem 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78772332 kB' 'MemAvailable: 82073044 kB' 'Buffers: 12176 kB' 'Cached: 9561752 kB' 'SwapCached: 0 kB' 'Active: 6637508 kB' 'Inactive: 3456260 kB' 'Active(anon): 6243924 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 523076 kB' 'Mapped: 198764 kB' 'Shmem: 5724084 kB' 'KReclaimable: 209020 kB' 'Slab: 539680 kB' 'SReclaimable: 209020 kB' 'SUnreclaim: 330660 kB' 'KernelStack: 16128 kB' 'PageTables: 8172 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7659044 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200952 kB' 'VmallocChunk: 0 kB' 'Percpu: 56960 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 986532 kB' 'DirectMap2M: 16515072 kB' 'DirectMap1G: 83886080 kB' 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.723 
20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.723 20:18:35 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.723 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.724 
20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ 
CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.724 20:18:35 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.724 20:18:35 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.724 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:04:43.725 20:18:35 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@99 -- # surp=0 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78772368 kB' 'MemAvailable: 82073080 kB' 'Buffers: 12176 kB' 'Cached: 9561788 kB' 'SwapCached: 0 kB' 'Active: 6637640 kB' 'Inactive: 3456260 kB' 'Active(anon): 6244056 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 
523116 kB' 'Mapped: 198764 kB' 'Shmem: 5724120 kB' 'KReclaimable: 209020 kB' 'Slab: 539680 kB' 'SReclaimable: 209020 kB' 'SUnreclaim: 330660 kB' 'KernelStack: 16128 kB' 'PageTables: 8176 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7660360 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200920 kB' 'VmallocChunk: 0 kB' 'Percpu: 56960 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 986532 kB' 'DirectMap2M: 16515072 kB' 'DirectMap1G: 83886080 kB' 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': 
' 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.725 20:18:35 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.725 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.726 20:18:35 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.726 
20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc 
-- setup/common.sh@31 -- # IFS=': ' 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 
00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # 
read -r var val _ 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.726 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.727 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.727 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.727 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.727 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.727 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.727 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.727 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.727 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.727 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.727 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.727 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.727 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.727 
20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.727 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.727 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.727 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.727 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.727 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.727 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.727 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.727 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.727 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.727 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.727 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.727 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.727 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.727 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.727 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.727 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.727 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.727 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ 
HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.727 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.727 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.727 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.727 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.727 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:04:43.727 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:04:43.727 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@100 -- # resv=0 00:04:43.727 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:43.727 nr_hugepages=1024 00:04:43.727 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:43.727 resv_hugepages=0 00:04:43.727 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:43.727 surplus_hugepages=0 00:04:43.727 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:43.727 anon_hugepages=0 00:04:43.727 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:43.727 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:04:43.727 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:43.727 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:43.727 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:04:43.727 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@19 -- # local var val 00:04:43.727 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:43.727 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:43.727 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:43.727 20:18:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:43.727 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:43.727 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:43.727 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.727 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.727 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78772768 kB' 'MemAvailable: 82073480 kB' 'Buffers: 12176 kB' 'Cached: 9561792 kB' 'SwapCached: 0 kB' 'Active: 6637788 kB' 'Inactive: 3456260 kB' 'Active(anon): 6244204 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 523376 kB' 'Mapped: 198764 kB' 'Shmem: 5724124 kB' 'KReclaimable: 209020 kB' 'Slab: 539680 kB' 'SReclaimable: 209020 kB' 'SUnreclaim: 330660 kB' 'KernelStack: 16080 kB' 'PageTables: 8044 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7660208 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200936 kB' 'VmallocChunk: 0 kB' 'Percpu: 56960 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 
'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 986532 kB' 'DirectMap2M: 16515072 kB' 'DirectMap1G: 83886080 kB' 00:04:43.727 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.727 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.727 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.727 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.727 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.727 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.727 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.727 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.727 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.727 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.727 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.727 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.727 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.727 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.727 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.727 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.727 
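The long run of `IFS=': ' read -r var val _` / `continue` records that follows is the trace of `get_meminfo` scanning every meminfo key until it hits `HugePages_Total`. A minimal standalone sketch of that pattern (this is an illustration, not the SPDK `setup/common.sh` code itself; the function name and sample file are made up):

```shell
#!/usr/bin/env bash
# Sketch of the get_meminfo scan traced above: split each meminfo-style
# line on ': ' into key and value, skip until the requested key matches,
# then print its value. Assumed helper name: get_meminfo_value.
get_meminfo_value() {
    local get=$1 mem_f=${2:-/proc/meminfo}
    local var val _
    while IFS=': ' read -r var val _; do
        # Every non-matching key takes the "continue" branch, exactly as
        # in the trace; the match prints the value and returns 0.
        [[ $var == "$get" ]] || continue
        echo "$val"
        return 0
    done < "$mem_f"
    return 1
}

# Fabricated sample input (values copied from the snapshot above):
printf '%s\n' 'MemTotal: 92293472 kB' 'HugePages_Total: 1024' > /tmp/meminfo.sample
get_meminfo_value HugePages_Total /tmp/meminfo.sample
```

With the sample file above this prints `1024`, matching the `echo 1024` / `return 0` the trace reaches once `HugePages_Total` is found.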
20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.727 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.727 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.727 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.727 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.727 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.727 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.727 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.727 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.727 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.727 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.727 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.727 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.727 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.727 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.727 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.727 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.727 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.727 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:43.727 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.727 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.727 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.727 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.727 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.727 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.727 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.727 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.727 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.727 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.727 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.727 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.727 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.728 
20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ 
KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.728 20:18:36 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.728 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.728 20:18:36 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var 
val _ 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.729 20:18:36 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 1024 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@27 -- # local node 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:43.729 20:18:36 
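At this point the trace moves from the global meminfo check (`1024 == nr_hugepages + surp + resv`) into `get_nodes`, which walks `/sys/devices/system/node/node+([0-9])` and assigns 512 pages to each of the two nodes it finds. A hedged sketch of that even split, with the node discovery replaced by a plain loop since sysfs layout varies by machine:

```shell
#!/usr/bin/env bash
# Sketch (not the SPDK get_nodes implementation) of the per-node split the
# trace shows: 1024 global hugepages divided across no_nodes=2 NUMA nodes,
# giving nodes_sys[0]=512 and nodes_sys[1]=512.
nr_hugepages=1024
no_nodes=2          # the trace discovered node0 and node1 under /sys/devices/system/node
nodes_test=()
for (( node = 0; node < no_nodes; node++ )); do
    # Even split; the real script derives the index from ${node##*node}.
    nodes_test[node]=$(( nr_hugepages / no_nodes ))
done
printf 'node%d=%d\n' 0 "${nodes_test[0]}" 1 "${nodes_test[1]}"
```

The subsequent `nodes_test[node] += resv` step in the trace then folds any reserved pages into each node's expected count before comparing against the per-node `HugePages_Surp` read.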
setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node=0 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf 
'%s\n' 'MemTotal: 48116940 kB' 'MemFree: 37954680 kB' 'MemUsed: 10162260 kB' 'SwapCached: 0 kB' 'Active: 5000496 kB' 'Inactive: 3372048 kB' 'Active(anon): 4842592 kB' 'Inactive(anon): 0 kB' 'Active(file): 157904 kB' 'Inactive(file): 3372048 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8118400 kB' 'Mapped: 106308 kB' 'AnonPages: 257372 kB' 'Shmem: 4588448 kB' 'KernelStack: 8744 kB' 'PageTables: 4468 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 123564 kB' 'Slab: 331592 kB' 'SReclaimable: 123564 kB' 'SUnreclaim: 208028 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # 
continue 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.729 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.730 
20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@32 -- # continue 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.730 20:18:36 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p 
]] 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:43.730 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node=1 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44176532 kB' 'MemFree: 40816596 kB' 'MemUsed: 3359936 kB' 'SwapCached: 0 kB' 'Active: 1637728 kB' 'Inactive: 84212 kB' 'Active(anon): 1402048 kB' 'Inactive(anon): 0 kB' 'Active(file): 235680 kB' 'Inactive(file): 84212 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1455592 kB' 'Mapped: 92456 kB' 'AnonPages: 266484 kB' 'Shmem: 1135700 kB' 'KernelStack: 7560 kB' 'PageTables: 3828 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 85456 kB' 'Slab: 208088 kB' 'SReclaimable: 85456 kB' 'SUnreclaim: 122632 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.731 20:18:36 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.731 
20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.731 20:18:36 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 
00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.731 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.732 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.732 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.732 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.732 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.732 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.732 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.732 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.732 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.732 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.732 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # 
[[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.732 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.732 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.732 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.732 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.732 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.732 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.732 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.732 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.732 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.732 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.732 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.732 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.732 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.732 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.732 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.732 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.732 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.732 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.732 20:18:36 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.732 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.732 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.732 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.732 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.732 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.732 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:43.732 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:43.732 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:43.732 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.732 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:04:43.732 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:04:43.732 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:43.732 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:43.732 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:43.732 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:43.732 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:04:43.732 node0=512 expecting 512 00:04:43.732 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@126 -- 
# for node in "${!nodes_test[@]}" 00:04:43.732 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:43.990 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:43.990 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512' 00:04:43.990 node1=512 expecting 512 00:04:43.990 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]] 00:04:43.990 00:04:43.990 real 0m3.803s 00:04:43.990 user 0m1.444s 00:04:43.990 sys 0m2.450s 00:04:43.990 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:43.990 20:18:36 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@10 -- # set +x 00:04:43.990 ************************************ 00:04:43.990 END TEST per_node_1G_alloc 00:04:43.990 ************************************ 00:04:43.990 20:18:36 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:04:43.990 20:18:36 setup.sh.hugepages -- setup/hugepages.sh@212 -- # run_test even_2G_alloc even_2G_alloc 00:04:43.990 20:18:36 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:43.990 20:18:36 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:43.990 20:18:36 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:43.990 ************************************ 00:04:43.990 START TEST even_2G_alloc 00:04:43.990 ************************************ 00:04:43.990 20:18:36 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1123 -- # even_2G_alloc 00:04:43.990 20:18:36 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@152 -- # get_test_nr_hugepages 2097152 00:04:43.990 20:18:36 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@49 -- # local size=2097152 00:04:43.990 20:18:36 setup.sh.hugepages.even_2G_alloc -- 
setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:43.990 20:18:36 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:43.990 20:18:36 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:04:43.990 20:18:36 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:04:43.990 20:18:36 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:43.990 20:18:36 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:43.990 20:18:36 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:43.990 20:18:36 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:43.990 20:18:36 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:43.990 20:18:36 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:43.990 20:18:36 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:43.990 20:18:36 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:04:43.990 20:18:36 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:43.990 20:18:36 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:04:43.990 20:18:36 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 512 00:04:43.990 20:18:36 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@84 -- # : 1 00:04:43.990 20:18:36 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:43.990 20:18:36 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:04:43.990 20:18:36 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 0 00:04:43.990 20:18:36 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@84 -- # : 0 00:04:43.990 
20:18:36 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:43.990 20:18:36 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # NRHUGE=1024 00:04:43.990 20:18:36 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # HUGE_EVEN_ALLOC=yes 00:04:43.990 20:18:36 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # setup output 00:04:43.990 20:18:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:04:43.990 20:18:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:48.208 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:04:48.208 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:04:48.208 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:48.208 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver 00:04:48.208 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:48.208 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:48.208 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:48.208 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:48.208 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:48.208 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:48.208 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:48.208 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:48.208 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:48.208 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:48.208 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:48.208 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:48.208 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:48.208 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 
00:04:48.208 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@154 -- # verify_nr_hugepages 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@89 -- # local node 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@92 -- # local surp 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@93 -- # local resv 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@94 -- # local anon 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78769060 kB' 'MemAvailable: 82069772 kB' 'Buffers: 12176 kB' 'Cached: 9561900 kB' 'SwapCached: 0 kB' 'Active: 6638328 kB' 'Inactive: 3456260 kB' 'Active(anon): 6244744 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 523656 kB' 'Mapped: 198852 kB' 'Shmem: 5724232 kB' 'KReclaimable: 209020 kB' 'Slab: 539776 kB' 'SReclaimable: 209020 kB' 'SUnreclaim: 330756 kB' 'KernelStack: 16112 kB' 'PageTables: 8124 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7659564 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201000 kB' 'VmallocChunk: 0 kB' 'Percpu: 56960 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 986532 kB' 'DirectMap2M: 16515072 kB' 'DirectMap1G: 83886080 kB' 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree 
== \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.209 20:18:39 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.209 20:18:39 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.209 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # continue 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s 
]] 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var 
val _ 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@97 -- # anon=0 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78768764 kB' 'MemAvailable: 82069476 kB' 'Buffers: 12176 kB' 'Cached: 9561904 kB' 'SwapCached: 0 kB' 'Active: 6638076 kB' 'Inactive: 3456260 kB' 'Active(anon): 6244492 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 
'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 523408 kB' 'Mapped: 198776 kB' 'Shmem: 5724236 kB' 'KReclaimable: 209020 kB' 'Slab: 539836 kB' 'SReclaimable: 209020 kB' 'SUnreclaim: 330816 kB' 'KernelStack: 16128 kB' 'PageTables: 8172 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7659580 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200984 kB' 'VmallocChunk: 0 kB' 'Percpu: 56960 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 986532 kB' 'DirectMap2M: 16515072 kB' 'DirectMap1G: 83886080 kB' 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:48.210 20:18:39 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.210 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:48.211 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.211 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.211 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.211 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:48.211 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.211 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.211 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.211 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:48.211 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.211 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.211 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.211 20:18:39 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:48.211 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.211 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.211 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.211 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:48.211 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.211 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.211 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.211 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:48.211 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.211 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.211 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.211 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:48.211 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.211 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.211 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.211 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:48.211 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.211 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.211 20:18:39 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:48.211 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:48.211 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.211 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.211 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.211 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:48.211 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.211 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.211 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.211 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:48.211 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.211 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.211 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.211 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:48.211 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.211 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.211 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.211 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:48.211 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.211 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:04:48.211 20:18:39 setup.sh.hugepages.even_2G_alloc -- [setup/common.sh@31/@32 per-key read/compare trace for Shmem … HugePages_Rsvd elided; no key matches HugePages_Surp]
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.212 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.212 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:48.212 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:04:48.212 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:48.212 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # surp=0 00:04:48.212 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:48.212 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:48.212 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:04:48.212 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:48.212 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:48.212 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:48.212 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:48.212 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:48.212 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:48.212 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:48.212 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78768716 kB' 'MemAvailable: 82069428 kB' 'Buffers: 12176 kB' 'Cached: 9561920 kB' 'SwapCached: 0 kB' 'Active: 6638112 kB' 'Inactive: 3456260 kB' 'Active(anon): 6244528 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 
kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 523408 kB' 'Mapped: 198776 kB' 'Shmem: 5724252 kB' 'KReclaimable: 209020 kB' 'Slab: 539836 kB' 'SReclaimable: 209020 kB' 'SUnreclaim: 330816 kB' 'KernelStack: 16128 kB' 'PageTables: 8172 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7659600 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200984 kB' 'VmallocChunk: 0 kB' 'Percpu: 56960 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 986532 kB' 'DirectMap2M: 16515072 kB' 'DirectMap1G: 83886080 kB' 00:04:48.212 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.212 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.212 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:48.212 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.212 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.212 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.212 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:48.212 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.212 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.212 20:18:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:04:48.212 20:18:40 setup.sh.hugepages.even_2G_alloc -- [setup/common.sh@31/@32 per-key read/compare trace for MemAvailable … HugePages_Free elided; no key matches HugePages_Rsvd]
[[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:48.214 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:04:48.214 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:48.214 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@100 -- # resv=0 00:04:48.214 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:48.214 nr_hugepages=1024 00:04:48.214 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:48.214 resv_hugepages=0 00:04:48.214 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:48.214 surplus_hugepages=0 00:04:48.214 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:48.214 anon_hugepages=0 00:04:48.214 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:48.214 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:04:48.214 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:48.214 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:48.214 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:04:48.214 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:48.214 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:48.214 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:48.214 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:48.214 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:48.214 20:18:40 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:48.214 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:48.214 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.214 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.214 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78770744 kB' 'MemAvailable: 82071456 kB' 'Buffers: 12176 kB' 'Cached: 9561944 kB' 'SwapCached: 0 kB' 'Active: 6638228 kB' 'Inactive: 3456260 kB' 'Active(anon): 6244644 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 523504 kB' 'Mapped: 198776 kB' 'Shmem: 5724276 kB' 'KReclaimable: 209020 kB' 'Slab: 539840 kB' 'SReclaimable: 209020 kB' 'SUnreclaim: 330820 kB' 'KernelStack: 16112 kB' 'PageTables: 8124 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7659624 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200984 kB' 'VmallocChunk: 0 kB' 'Percpu: 56960 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 986532 kB' 'DirectMap2M: 16515072 kB' 'DirectMap1G: 83886080 kB' 00:04:48.214 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:48.214 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.214 20:18:40 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.214 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.214 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:48.214 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.214 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.214 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.214 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:48.214 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.214 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.214 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.214 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:48.214 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.214 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.214 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.214 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:48.214 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.214 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.214 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.214 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:48.214 20:18:40 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.214 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.214 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.214 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:48.214 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.214 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.214 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.214 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:48.214 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.214 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.214 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.214 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:48.214 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.214 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.214 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.214 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:48.214 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.214 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.214 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.214 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- 
# [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:48.214 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.214 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.214 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.214 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:48.214 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.214 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.214 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.214 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:48.214 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.214 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.214 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.214 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:48.214 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.214 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.214 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.214 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:48.214 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.214 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.214 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:04:48.214 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:48.214 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.214 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.214 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.215 20:18:40 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.215 20:18:40 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:48.215 20:18:40 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 
-- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val 
_ 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 1024 
00:04:48.215 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:48.216 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:48.216 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:04:48.216 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@27 -- # local node 00:04:48.216 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:48.216 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:48.216 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:48.216 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:48.216 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:48.216 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:48.216 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:48.216 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:48.216 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:48.216 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:48.216 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=0 00:04:48.216 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:48.216 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:48.216 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:48.216 20:18:40 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:48.216 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:48.216 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:48.216 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:48.216 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.216 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.216 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116940 kB' 'MemFree: 37944548 kB' 'MemUsed: 10172392 kB' 'SwapCached: 0 kB' 'Active: 5001412 kB' 'Inactive: 3372048 kB' 'Active(anon): 4843508 kB' 'Inactive(anon): 0 kB' 'Active(file): 157904 kB' 'Inactive(file): 3372048 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8118528 kB' 'Mapped: 106328 kB' 'AnonPages: 258192 kB' 'Shmem: 4588576 kB' 'KernelStack: 8760 kB' 'PageTables: 4520 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 123564 kB' 'Slab: 331616 kB' 'SReclaimable: 123564 kB' 'SUnreclaim: 208052 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:48.216 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:48.216 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.216 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.216 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.216 20:18:40 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:48.216 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.216 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.216 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.216 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:48.216 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.216 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.216 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.216 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:48.216 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.216 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.216 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.216 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:48.216 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.216 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.216 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.216 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:48.216 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.216 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.216 20:18:40 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:48.216 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:48.216 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.216 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.216 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.216 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:48.216 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.216 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.216 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.216 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:48.216 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.216 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.216 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.216 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:48.216 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.216 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.216 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.216 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:48.216 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:48.216 20:18:40 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:48.216 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.216 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:48.216 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue
[... identical IFS=': ' / read -r var val _ / continue xtrace entries repeat for each remaining node0 meminfo field (Dirty through HugePages_Free) ...]
00:04:48.217 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:48.217 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:04:48.217 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:48.217 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:48.217 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:48.217 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:48.217 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:04:48.217 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:48.217 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=1 00:04:48.217 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:48.217 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:48.217 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:48.217 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:04:48.217 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:04:48.217 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:48.217 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:48.217 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.217 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.217 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44176532 kB' 'MemFree: 40829324 kB' 'MemUsed: 3347208 kB' 'SwapCached: 0 kB' 'Active: 1637328 kB' 'Inactive: 84212 kB' 'Active(anon): 1401648 kB' 'Inactive(anon): 0 kB' 'Active(file): 235680 kB' 'Inactive(file): 84212 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1455612 kB' 'Mapped: 92456 kB' 'AnonPages: 266000 kB' 'Shmem: 1135720 kB' 'KernelStack: 7464 kB' 'PageTables: 3564 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 85456 kB' 'Slab: 208224 kB' 'SReclaimable: 85456 kB' 'SUnreclaim: 122768 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
[... the same per-field scan repeats over the node1 output (MemTotal through HugePages_Free) until HugePages_Surp matches ...]
00:04:48.218 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:48.218 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:04:48.218 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:48.218 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:48.218 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:48.218 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:48.218 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:48.218 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:04:48.218 node0=512 expecting 512 00:04:48.218 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:48.218 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:48.218 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:48.218 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512' 00:04:48.218 node1=512 expecting 512 00:04:48.218 20:18:40 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]] 00:04:48.218 00:04:48.218 real 0m3.942s 00:04:48.218 user 0m1.523s 00:04:48.218 sys 0m2.528s 00:04:48.218 20:18:40 setup.sh.hugepages.even_2G_alloc -- 
common/autotest_common.sh@1124 -- # xtrace_disable 00:04:48.218 20:18:40 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@10 -- # set +x 00:04:48.218 ************************************ 00:04:48.218 END TEST even_2G_alloc 00:04:48.218 ************************************ 00:04:48.218 20:18:40 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:04:48.218 20:18:40 setup.sh.hugepages -- setup/hugepages.sh@213 -- # run_test odd_alloc odd_alloc 00:04:48.218 20:18:40 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:48.218 20:18:40 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:48.219 20:18:40 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:48.219 ************************************ 00:04:48.219 START TEST odd_alloc 00:04:48.219 ************************************ 00:04:48.219 20:18:40 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1123 -- # odd_alloc 00:04:48.219 20:18:40 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@159 -- # get_test_nr_hugepages 2098176 00:04:48.219 20:18:40 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@49 -- # local size=2098176 00:04:48.219 20:18:40 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:48.219 20:18:40 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:48.219 20:18:40 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1025 00:04:48.219 20:18:40 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:04:48.219 20:18:40 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:48.219 20:18:40 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:48.219 20:18:40 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1025 00:04:48.219 20:18:40 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@65 -- # 
local _no_nodes=2 00:04:48.219 20:18:40 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:48.219 20:18:40 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:48.219 20:18:40 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:48.219 20:18:40 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:04:48.219 20:18:40 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:48.219 20:18:40 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:04:48.219 20:18:40 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@83 -- # : 513 00:04:48.219 20:18:40 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@84 -- # : 1 00:04:48.219 20:18:40 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:48.219 20:18:40 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=513 00:04:48.219 20:18:40 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@83 -- # : 0 00:04:48.219 20:18:40 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@84 -- # : 0 00:04:48.219 20:18:40 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:48.219 20:18:40 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # HUGEMEM=2049 00:04:48.219 20:18:40 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # HUGE_EVEN_ALLOC=yes 00:04:48.219 20:18:40 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # setup output 00:04:48.219 20:18:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:04:48.219 20:18:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:51.536 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:04:51.536 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:04:51.536 
0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:51.536 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver 00:04:51.536 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:51.536 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:51.536 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:51.536 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:51.536 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:51.536 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:51.536 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:51.536 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:51.536 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:51.536 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:51.536 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:51.536 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:51.536 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:51.536 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:51.536 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@161 -- # verify_nr_hugepages 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@89 -- # local node 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@92 -- # local surp 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@93 -- # local resv 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@94 -- # local anon 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != 
*\[\n\e\v\e\r\]* ]] 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78764376 kB' 'MemAvailable: 82065088 kB' 'Buffers: 12176 kB' 'Cached: 9562060 kB' 'SwapCached: 0 kB' 'Active: 6639556 kB' 'Inactive: 3456260 kB' 'Active(anon): 6245972 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 524396 kB' 'Mapped: 198888 kB' 'Shmem: 5724392 kB' 'KReclaimable: 209020 kB' 'Slab: 540028 kB' 'SReclaimable: 209020 kB' 'SUnreclaim: 331008 kB' 'KernelStack: 16128 kB' 'PageTables: 8172 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 
'WritebackTmp: 0 kB' 'CommitLimit: 53485740 kB' 'Committed_AS: 7660264 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200872 kB' 'VmallocChunk: 0 kB' 'Percpu: 56960 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 986532 kB' 'DirectMap2M: 16515072 kB' 'DirectMap1G: 83886080 kB' 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.800 20:18:43 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.800 20:18:43 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.800 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.801 20:18:43 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.801 20:18:43 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.801 20:18:43 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@97 -- # anon=0 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.801 20:18:43 
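The trace above shows `get_meminfo` walking every `/proc/meminfo` line with `IFS=': '` and `read -r var val _`, skipping non-matching keys with `continue` until it reaches the requested field (here `AnonHugePages`, which yields 0). A minimal sketch of that parsing pattern, simplified from the trace (assumption: this is an illustrative reconstruction, not the verbatim SPDK `setup/common.sh` helper, which additionally supports per-NUMA-node meminfo files and strips their `Node N ` prefix):

```shell
#!/usr/bin/env bash
# get_meminfo KEY — print the value (in kB for sized fields) of one
# /proc/meminfo entry, mirroring the var/val split seen in the trace.
get_meminfo() {
    local get=$1 var val _
    # IFS=': ' splits "MemTotal:   92293472 kB" into var=MemTotal,
    # val=92293472, with the unit landing in the throwaway field.
    while IFS=': ' read -r var val _; do
        if [[ $var == "$get" ]]; then
            echo "$val"
            return 0
        fi
        # non-matching keys fall through, like the "continue" lines above
    done </proc/meminfo
    return 1
}

get_meminfo AnonHugePages
```

Called as `get_meminfo HugePages_Total`, the same loop returns the raw page count rather than a kB figure, since count-valued fields in `/proc/meminfo` carry no unit.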
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78764064 kB' 'MemAvailable: 82064776 kB' 'Buffers: 12176 kB' 'Cached: 9562076 kB' 'SwapCached: 0 kB' 'Active: 6638668 kB' 'Inactive: 3456260 kB' 'Active(anon): 6245084 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 523896 kB' 'Mapped: 198784 kB' 'Shmem: 5724408 kB' 'KReclaimable: 209020 kB' 'Slab: 540068 kB' 'SReclaimable: 209020 kB' 'SUnreclaim: 331048 kB' 'KernelStack: 16112 kB' 'PageTables: 8120 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485740 kB' 'Committed_AS: 7660280 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200856 kB' 'VmallocChunk: 0 kB' 'Percpu: 56960 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 986532 kB' 'DirectMap2M: 16515072 kB' 'DirectMap1G: 83886080 kB' 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.801 20:18:43 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.801 20:18:43 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.801 20:18:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.801 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.801 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.801 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.801 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.801 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.801 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.801 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.801 20:18:44 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.801 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.801 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.801 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.801 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.801 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.801 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.801 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.801 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.801 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.801 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.801 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.801 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.801 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.801 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.801 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.801 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.801 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.801 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.801 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.801 20:18:44 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.801 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.801 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.801 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.801 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.801 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.801 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.801 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.801 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.801 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.801 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.801 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.801 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.801 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.801 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.801 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.801 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.801 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.801 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.801 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.801 20:18:44 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.801 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.801 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.801 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.801 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.801 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.801 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.801 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.802 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.802 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.802 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.802 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.802 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.802 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.802 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.802 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.802 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.802 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.802 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.802 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.802 20:18:44 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.802 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.802 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.802 [identical "[[ <field> == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] / continue" trace repeated for KernelStack through HugePages_Rsvd] 00:04:51.802 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.802
20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:04:51.802 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:51.802 20:18:44 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # surp=0 00:04:51.802 20:18:44 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:51.802 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:51.802 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:04:51.802 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:51.802 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:51.802 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:51.802 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:51.802 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:51.802 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:51.802 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:51.802 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.802 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.802 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78767056 kB' 'MemAvailable: 82067768 kB' 'Buffers: 12176 kB' 'Cached: 9562076 kB' 'SwapCached: 0 kB' 'Active: 6639052 kB' 'Inactive: 3456260 kB' 'Active(anon): 6245468 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 524300 kB' 'Mapped: 198784 
kB' 'Shmem: 5724408 kB' 'KReclaimable: 209020 kB' 'Slab: 540044 kB' 'SReclaimable: 209020 kB' 'SUnreclaim: 331024 kB' 'KernelStack: 16128 kB' 'PageTables: 8168 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485740 kB' 'Committed_AS: 7660300 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200856 kB' 'VmallocChunk: 0 kB' 'Percpu: 56960 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 986532 kB' 'DirectMap2M: 16515072 kB' 'DirectMap1G: 83886080 kB' 00:04:51.802 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:51.802 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.802 [identical "[[ <field> == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] / continue" trace repeated for MemFree through HugePages_Free] 00:04:51.803 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:51.803 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:04:51.803 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:51.803 20:18:44 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@100 -- # resv=0 00:04:51.803 20:18:44 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1025 00:04:51.804 nr_hugepages=1025 20:18:44 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:51.804 resv_hugepages=0 20:18:44 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:51.804 surplus_hugepages=0 20:18:44 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:51.804 anon_hugepages=0 20:18:44 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@107 -- # (( 1025 == nr_hugepages + surp + resv )) 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@109 -- # (( 1025 == nr_hugepages )) 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@110 -- #
get_meminfo HugePages_Total 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78766568 kB' 'MemAvailable: 82067280 kB' 'Buffers: 12176 kB' 'Cached: 9562116 kB' 'SwapCached: 0 kB' 'Active: 6638736 kB' 'Inactive: 3456260 kB' 'Active(anon): 6245152 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 523896 kB' 'Mapped: 198784 kB' 'Shmem: 5724448 kB' 'KReclaimable: 209020 kB' 'Slab: 540044 kB' 'SReclaimable: 209020 kB' 'SUnreclaim: 331024 kB' 'KernelStack: 16112 kB' 'PageTables: 8120 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485740 kB' 'Committed_AS: 7660320 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200856 kB' 'VmallocChunk: 0 kB' 'Percpu: 56960 kB' 'HardwareCorrupted: 0 kB' 
'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 986532 kB' 'DirectMap2M: 16515072 kB' 'DirectMap1G: 83886080 kB' 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.804 [identical "[[ <field> == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] / continue" trace repeated for the remaining meminfo fields] 00:04:51.804
20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.804 20:18:44 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.804 20:18:44 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.804 20:18:44 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.804 20:18:44 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.804 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.805 20:18:44 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 1025 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 
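The long run of `continue` lines above is `setup/common.sh`'s `get_meminfo` scanning every `/proc/meminfo` key with `IFS=': '` until it reaches `HugePages_Total`, then echoing the value (`1025`). A minimal sketch of that pattern, with an illustrative function name and a file argument added so it is self-contained (the real helper also handles per-node meminfo files and the `Node N ` prefix, which this sketch omits):

```shell
#!/usr/bin/env bash
# Sketch of the get_meminfo loop traced above: split each meminfo line
# on ': ' into key and value, skip every non-matching key (the
# "continue" lines in the trace), and echo the value of the first
# match. get_meminfo_sketch is an illustrative name, not the exact
# helper in setup/common.sh.
get_meminfo_sketch() {
    local get=$1 mem_f=${2:-/proc/meminfo}
    local var val _
    while IFS=': ' read -r var val _; do
        # Non-matching keys are skipped, mirroring "continue" in the trace.
        [[ $var == "$get" ]] || continue
        echo "$val"
        return 0
    done < "$mem_f"
    return 1
}
```

With the meminfo contents printed in this run, `get_meminfo_sketch HugePages_Total` would yield `1025`, matching the `echo 1025` at the end of the trace.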
00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@110 -- # (( 1025 == nr_hugepages + surp + resv )) 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@27 -- # local node 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=513 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=0 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 
-- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116940 kB' 'MemFree: 37941488 kB' 'MemUsed: 10175452 kB' 'SwapCached: 0 kB' 'Active: 5001500 kB' 'Inactive: 3372048 kB' 'Active(anon): 4843596 kB' 'Inactive(anon): 0 kB' 'Active(file): 157904 kB' 'Inactive(file): 3372048 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8118652 kB' 'Mapped: 106328 kB' 'AnonPages: 258048 kB' 'Shmem: 4588700 kB' 'KernelStack: 8728 kB' 'PageTables: 4468 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 123564 kB' 'Slab: 331592 kB' 'SReclaimable: 123564 kB' 'SUnreclaim: 208028 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- 
# IFS=': ' 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.805 
20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.805 20:18:44 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.805 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.806 20:18:44 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.806 20:18:44 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:51.806 20:18:44 
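The xtrace above shows `get_meminfo` in `setup/common.sh` scanning a meminfo file key by key (falling back from the per-node sysfs file to `/proc/meminfo`) until the requested field matches, then echoing its value. A minimal standalone sketch of that parsing pattern follows; the function name and the optional file argument are illustrative, not the exact `common.sh` implementation:

```shell
#!/usr/bin/env bash
# Sketch of the get_meminfo pattern traced above: read "Key: value kB"
# lines and print the value for the requested key.
shopt -s extglob  # needed for the "Node +([0-9]) " prefix pattern below

get_meminfo_sketch() {
    local get=$1 node=${2:-} mem_f=${3:-/proc/meminfo} var val _ mem line
    # Per-node stats live under /sys when a node number is given.
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    mapfile -t mem < "$mem_f"
    # Per-node files prefix every line with "Node <n> "; strip it.
    mem=("${mem[@]#Node +([0-9]) }")
    for line in "${mem[@]}"; do
        # IFS=': ' splits on both the colon and spaces, so var gets the
        # key and val gets the number (the trailing "kB" lands in _).
        IFS=': ' read -r var val _ <<< "$line"
        if [[ $var == "$get" ]]; then
            echo "$val"
            return 0
        fi
    done
    return 1
}
```

The `continue`/`IFS=': '`/`read -r var val _` triples that dominate the trace are just this loop skipping non-matching keys one at a time, which is why a single `get_meminfo HugePages_Surp 1` call produces dozens of xtrace lines.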
setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=1 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44176532 kB' 'MemFree: 40827544 kB' 'MemUsed: 3348988 kB' 'SwapCached: 0 kB' 'Active: 1637744 kB' 'Inactive: 84212 kB' 'Active(anon): 1402064 kB' 'Inactive(anon): 0 kB' 'Active(file): 235680 kB' 'Inactive(file): 84212 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1455644 kB' 'Mapped: 92456 kB' 'AnonPages: 266380 kB' 
'Shmem: 1135752 kB' 'KernelStack: 7368 kB' 'PageTables: 3612 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 85456 kB' 'Slab: 208452 kB' 'SReclaimable: 85456 kB' 'SUnreclaim: 122996 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 513' 'HugePages_Free: 513' 'HugePages_Surp: 0' 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read 
-r var val _ 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.806 
20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.806 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.807 20:18:44 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.807 20:18:44 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.807 20:18:44 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.807 20:18:44 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 513' 00:04:51.807 node0=512 expecting 513 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # 
sorted_s[nodes_sys[node]]=1 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@128 -- # echo 'node1=513 expecting 512' 00:04:51.807 node1=513 expecting 512 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@130 -- # [[ 512 513 == \5\1\2\ \5\1\3 ]] 00:04:51.807 00:04:51.807 real 0m3.966s 00:04:51.807 user 0m1.551s 00:04:51.807 sys 0m2.524s 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:51.807 20:18:44 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@10 -- # set +x 00:04:51.807 ************************************ 00:04:51.807 END TEST odd_alloc 00:04:51.807 ************************************ 00:04:52.066 20:18:44 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:04:52.066 20:18:44 setup.sh.hugepages -- setup/hugepages.sh@214 -- # run_test custom_alloc custom_alloc 00:04:52.066 20:18:44 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:52.066 20:18:44 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:52.066 20:18:44 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:52.066 ************************************ 00:04:52.066 START TEST custom_alloc 00:04:52.066 ************************************ 00:04:52.066 20:18:44 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1123 -- # custom_alloc 00:04:52.066 20:18:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@167 -- # local IFS=, 00:04:52.066 20:18:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@169 -- # local node 00:04:52.066 20:18:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@170 -- # nodes_hp=() 00:04:52.066 20:18:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@170 -- # local nodes_hp 00:04:52.066 20:18:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@172 -- # local nr_hugepages=0 _nr_hugepages=0 00:04:52.066 20:18:44 setup.sh.hugepages.custom_alloc 
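The final odd_alloc check above (`[[ 512 513 == \5\1\2\ \5\1\3 ]]`) compares the per-node counts as order-insensitive sets: each count is used as an index into `sorted_t`/`sorted_s`, and bash lists indexed-array subscripts in ascending order, so the test passes even when the kernel places the odd extra page on the other node than expected. A hedged sketch of that trick, with example values standing in for the measured counts:

```shell
#!/usr/bin/env bash
# Sketch of the odd_alloc verification traced above: use hugepage counts
# as indexed-array subscripts so ${!arr[*]} yields them sorted ascending.
declare -a sorted_t=() sorted_s=()
nodes_test=(512 513)   # counts expected by the test (example values)
nodes_sys=(513 512)    # counts reported by the kernel (example values)
for node in "${!nodes_test[@]}"; do
    sorted_t[nodes_test[node]]=1
    sorted_s[nodes_sys[node]]=1
done
# Both subscript lists expand to "512 513", so the multisets match even
# though node0/node1 got the counts in the opposite order.
[[ ${!sorted_t[*]} == "${!sorted_s[*]}" ]] && echo match
```

This explains the paired `node0=512 expecting 513` / `node1=513 expecting 512` lines in the log: the placement differs per node, but the sorted comparison still succeeds.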
-- setup/hugepages.sh@174 -- # get_test_nr_hugepages 1048576 00:04:52.066 20:18:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@49 -- # local size=1048576 00:04:52.066 20:18:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:52.066 20:18:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:52.066 20:18:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=512 00:04:52.066 20:18:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:04:52.066 20:18:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:52.066 20:18:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:52.066 20:18:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=512 00:04:52.066 20:18:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:52.066 20:18:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:52.066 20:18:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:52.066 20:18:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:52.066 20:18:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:04:52.066 20:18:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:52.066 20:18:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256 00:04:52.066 20:18:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@83 -- # : 256 00:04:52.066 20:18:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@84 -- # : 1 00:04:52.066 20:18:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:52.066 20:18:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes 
- 1]=256 00:04:52.066 20:18:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@83 -- # : 0 00:04:52.066 20:18:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@84 -- # : 0 00:04:52.066 20:18:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:52.066 20:18:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@175 -- # nodes_hp[0]=512 00:04:52.066 20:18:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@176 -- # (( 2 > 1 )) 00:04:52.066 20:18:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@177 -- # get_test_nr_hugepages 2097152 00:04:52.066 20:18:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@49 -- # local size=2097152 00:04:52.066 20:18:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:52.066 20:18:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:52.066 20:18:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:04:52.066 20:18:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:04:52.066 20:18:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:52.066 20:18:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:52.066 20:18:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:52.066 20:18:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:52.066 20:18:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:52.066 20:18:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:52.066 20:18:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:52.066 20:18:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 1 > 0 )) 00:04:52.066 20:18:44 setup.sh.hugepages.custom_alloc -- 
setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:04:52.066 20:18:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:04:52.066 20:18:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@78 -- # return 0 00:04:52.066 20:18:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@178 -- # nodes_hp[1]=1024 00:04:52.066 20:18:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}" 00:04:52.066 20:18:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:04:52.066 20:18:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:04:52.066 20:18:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}" 00:04:52.066 20:18:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:04:52.066 20:18:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:04:52.066 20:18:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@186 -- # get_test_nr_hugepages_per_node 00:04:52.066 20:18:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:52.066 20:18:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:52.066 20:18:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:52.066 20:18:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:52.066 20:18:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:52.066 20:18:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:52.066 20:18:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:52.066 20:18:44 setup.sh.hugepages.custom_alloc -- 
setup/hugepages.sh@74 -- # (( 2 > 0 )) 00:04:52.066 20:18:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:04:52.066 20:18:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:04:52.066 20:18:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:04:52.066 20:18:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=1024 00:04:52.066 20:18:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@78 -- # return 0 00:04:52.066 20:18:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@187 -- # HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024' 00:04:52.066 20:18:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@187 -- # setup output 00:04:52.066 20:18:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:04:52.066 20:18:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:55.348 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:04:55.348 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:04:55.348 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:55.348 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver 00:04:55.348 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:55.348 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:55.348 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:55.348 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:55.610 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:55.610 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:55.610 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:55.610 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:55.610 
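The custom_alloc prologue traced above collects per-node targets into `nodes_hp` (512 pages on node 0, 1024 on node 1), then joins them into the `HUGENODE` string that `scripts/setup.sh` consumes and sums them into the total of 1536. A short sketch of that construction, mirroring the `hugepages.sh@181-187` lines:

```shell
#!/usr/bin/env bash
# Sketch of the HUGENODE construction traced above: per-node hugepage
# targets are joined into a comma-separated string for scripts/setup.sh.
nodes_hp=([0]=512 [1]=1024)
HUGENODE=()
_nr_hugepages=0
for node in "${!nodes_hp[@]}"; do
    HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}")
    (( _nr_hugepages += nodes_hp[node] ))
done
# Join the array elements with commas, as the local IFS=, in the real
# script does.
IFS=,
echo "HUGENODE=${HUGENODE[*]}"   # HUGENODE=nodes_hp[0]=512,nodes_hp[1]=1024
echo "total=$_nr_hugepages"      # total=1536
```

The `nr_hugepages=1536` that appears right after the PCI-driver lines is this sum, and `verify_nr_hugepages` then reads `HugePages_Total: 1536` back out of `/proc/meminfo` to confirm the kernel honored the split.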
0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:55.610 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:55.610 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:55.610 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:55.610 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:55.610 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:55.610 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:55.610 20:18:47 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@188 -- # nr_hugepages=1536 00:04:55.610 20:18:47 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@188 -- # verify_nr_hugepages 00:04:55.610 20:18:47 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@89 -- # local node 00:04:55.610 20:18:47 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:04:55.610 20:18:47 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:04:55.610 20:18:47 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@92 -- # local surp 00:04:55.610 20:18:47 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@93 -- # local resv 00:04:55.610 20:18:47 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@94 -- # local anon 00:04:55.610 20:18:47 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:55.610 20:18:47 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:55.610 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:55.610 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:04:55.610 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:04:55.610 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:55.610 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # 
mem_f=/proc/meminfo 00:04:55.610 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:55.610 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:55.610 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:55.610 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:55.610 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.610 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.610 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 77718912 kB' 'MemAvailable: 81019624 kB' 'Buffers: 12176 kB' 'Cached: 9562204 kB' 'SwapCached: 0 kB' 'Active: 6639904 kB' 'Inactive: 3456260 kB' 'Active(anon): 6246320 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 524928 kB' 'Mapped: 198828 kB' 'Shmem: 5724536 kB' 'KReclaimable: 209020 kB' 'Slab: 539084 kB' 'SReclaimable: 209020 kB' 'SUnreclaim: 330064 kB' 'KernelStack: 16112 kB' 'PageTables: 8116 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962476 kB' 'Committed_AS: 7661088 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200952 kB' 'VmallocChunk: 0 kB' 'Percpu: 56960 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 986532 kB' 'DirectMap2M: 16515072 kB' 'DirectMap1G: 83886080 
kB'
00:04:55.610 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:55.610 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue
00:04:55.610 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:55.610 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:55.611 20:18:47 setup.sh.hugepages.custom_alloc --
setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:55.611 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue
00:04:55.611 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:55.611 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:55.611 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:55.611 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0
00:04:55.611 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0
00:04:55.611 20:18:47 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@97 -- # anon=0
00:04:55.611 20:18:47 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:04:55.611 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:55.611 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=
00:04:55.611 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val
00:04:55.611 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:55.611 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:55.611 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:55.611 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:55.611 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:55.611 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:55.611 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:55.611 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:55.612
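The trace above is `get_meminfo` (setup/common.sh) scanning `/proc/meminfo` key by key with `IFS=': '` and `read -r var val _` until it reaches the requested field. A minimal self-contained sketch of that lookup, reconstructed from the trace rather than SPDK's actual source (the real helper also supports a per-NUMA-node meminfo file, skipped here; the echo-0 fallback for an absent key is an assumption):

```shell
# Hypothetical reconstruction of the get_meminfo lookup seen in the trace.
get_meminfo() {
    local get=$1 var val _
    # Split each "Key: value kB" line on ':' and space, mirroring the
    # IFS=': ' / read -r var val _ steps logged above.
    while IFS=': ' read -r var val _; do
        if [[ $var == "$get" ]]; then
            echo "${val:-0}"    # e.g. AnonHugePages resolved to 0 in this run
            return 0
        fi
    done </proc/meminfo
    echo 0    # fallback for absent keys (an assumption, not from the trace)
}

get_meminfo HugePages_Total
```

On the node captured in this log the call would print 1536, matching the `HugePages_Total: 1536` line of the dump and the `nr_hugepages=1536` the test just set.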
20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 77719340 kB' 'MemAvailable: 81020052 kB' 'Buffers: 12176 kB' 'Cached: 9562208 kB' 'SwapCached: 0 kB' 'Active: 6639620 kB' 'Inactive: 3456260 kB' 'Active(anon): 6246036 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 524660 kB' 'Mapped: 198800 kB' 'Shmem: 5724540 kB' 'KReclaimable: 209020 kB' 'Slab: 539116 kB' 'SReclaimable: 209020 kB' 'SUnreclaim: 330096 kB' 'KernelStack: 16128 kB' 'PageTables: 8172 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962476 kB' 'Committed_AS: 7661108 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200936 kB' 'VmallocChunk: 0 kB' 'Percpu: 56960 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 986532 kB' 'DirectMap2M: 16515072 kB' 'DirectMap1G: 83886080 kB'
00:04:55.612 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:55.612 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue
00:04:55.612 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:55.612 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:55.613 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:55.613 20:18:47 setup.sh.hugepages.custom_alloc
-- setup/common.sh@32 -- # continue 00:04:55.613 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.613 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.613 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.613 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:55.613 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.613 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.613 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.613 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:55.613 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.613 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.613 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.613 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:55.613 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.613 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.613 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.613 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:55.613 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.613 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.613 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p 
]] 00:04:55.613 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:55.613 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.613 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.613 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.613 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:04:55.613 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:04:55.613 20:18:47 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # surp=0 00:04:55.613 20:18:47 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:55.613 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:55.613 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:04:55.613 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:04:55.613 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:55.613 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:55.613 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:55.613 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:55.613 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:55.613 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:55.613 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.613 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.613 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf 
'%s\n' 'MemTotal: 92293472 kB' 'MemFree: 77719340 kB' 'MemAvailable: 81020052 kB' 'Buffers: 12176 kB' 'Cached: 9562224 kB' 'SwapCached: 0 kB' 'Active: 6639660 kB' 'Inactive: 3456260 kB' 'Active(anon): 6246076 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 524664 kB' 'Mapped: 198800 kB' 'Shmem: 5724556 kB' 'KReclaimable: 209020 kB' 'Slab: 539116 kB' 'SReclaimable: 209020 kB' 'SUnreclaim: 330096 kB' 'KernelStack: 16128 kB' 'PageTables: 8172 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962476 kB' 'Committed_AS: 7661128 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200936 kB' 'VmallocChunk: 0 kB' 'Percpu: 56960 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 986532 kB' 'DirectMap2M: 16515072 kB' 'DirectMap1G: 83886080 kB' 00:04:55.613 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.613 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:55.613 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.613 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.613 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.613 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:55.613 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:04:55.613 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.613 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.613 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc 
-- setup/common.sh@31 -- # IFS=': ' 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # 
continue 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.614 20:18:47 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.614 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.877 20:18:47 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.877 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:55.877 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.877 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.877 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.877 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:55.877 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.877 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.877 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.877 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:55.877 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.877 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.877 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.877 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:55.877 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.877 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.877 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.877 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:55.877 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.877 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:04:55.877 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.877 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:55.877 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.877 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.877 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.877 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:55.877 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.877 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.877 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.877 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:55.877 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.877 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.877 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.877 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:55.877 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.877 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.877 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.877 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:55.877 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.877 20:18:47 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.877 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.877 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:55.877 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.877 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.877 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.877 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:55.877 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.877 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.877 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.877 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:55.877 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.877 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.877 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.877 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:55.877 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.877 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.877 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.877 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:55.877 20:18:47 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:55.877 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.877 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.877 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:55.877 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.877 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.877 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.878 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:55.878 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.878 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.878 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.878 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:55.878 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.878 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.878 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.878 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:55.878 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.878 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.878 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.878 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 
00:04:55.878 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.878 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.878 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.878 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:55.878 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.878 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.878 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.878 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:55.878 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.878 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.878 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.878 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:55.878 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.878 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.878 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.878 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:55.878 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.878 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.878 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.878 20:18:47 
setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:04:55.878 20:18:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:04:55.878 20:18:47 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@100 -- # resv=0 00:04:55.878 20:18:47 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1536 00:04:55.878 nr_hugepages=1536 00:04:55.878 20:18:47 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:55.878 resv_hugepages=0 00:04:55.878 20:18:47 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:55.878 surplus_hugepages=0 00:04:55.878 20:18:47 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:55.878 anon_hugepages=0 00:04:55.878 20:18:47 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@107 -- # (( 1536 == nr_hugepages + surp + resv )) 00:04:55.878 20:18:47 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@109 -- # (( 1536 == nr_hugepages )) 00:04:55.878 20:18:48 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:55.878 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:55.878 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:04:55.878 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:04:55.878 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:55.878 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:55.878 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:55.878 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:55.878 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:55.878 20:18:48 
setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:55.878 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.878 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.878 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 77720520 kB' 'MemAvailable: 81021232 kB' 'Buffers: 12176 kB' 'Cached: 9562264 kB' 'SwapCached: 0 kB' 'Active: 6639328 kB' 'Inactive: 3456260 kB' 'Active(anon): 6245744 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 524264 kB' 'Mapped: 198800 kB' 'Shmem: 5724596 kB' 'KReclaimable: 209020 kB' 'Slab: 539116 kB' 'SReclaimable: 209020 kB' 'SUnreclaim: 330096 kB' 'KernelStack: 16112 kB' 'PageTables: 8124 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962476 kB' 'Committed_AS: 7661152 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200920 kB' 'VmallocChunk: 0 kB' 'Percpu: 56960 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 986532 kB' 'DirectMap2M: 16515072 kB' 'DirectMap1G: 83886080 kB' 00:04:55.878 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:55.878 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:55.878 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.878 20:18:48 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue
00:04:55.879 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:55.879 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:55.879 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:55.879 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue
00:04:55.879 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:55.879 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:55.879 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:55.879 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue
00:04:55.879 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:55.879 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:55.879 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:55.879 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue
00:04:55.879 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:55.879 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:55.879 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:55.880 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 1536
00:04:55.880 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0
00:04:55.880 20:18:48 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@110 -- # (( 1536 == nr_hugepages + surp + resv ))
00:04:55.880 20:18:48 setup.sh.hugepages.custom_alloc --
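The arithmetic the suite asserts at this point can be restated compactly. A hedged sketch using the values visible in this trace (1536 total pages, surplus and reserved both 0, and the custom per-node split of 512 on node0 plus 1024 on node1 that `get_nodes` records next); the variable names echo the trace but the script itself is ours:

```shell
#!/usr/bin/env bash
# Consistency checks mirroring the trace: the global HugePages_Total
# must equal requested + surplus + reserved, and the custom per-node
# allocation must sum to the same total. Values taken from the log.
nr_hugepages=1536 surp=0 resv=0 total=1536
declare -A nodes_test=([0]=512 [1]=1024)

(( total == nr_hugepages + surp + resv )) || echo "global mismatch"

sum=0
for n in "${!nodes_test[@]}"; do
  (( sum += nodes_test[n] ))
done
(( sum == total )) || echo "node split mismatch"
```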
setup/hugepages.sh@112 -- # get_nodes
00:04:55.880 20:18:48 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@27 -- # local node
00:04:55.880 20:18:48 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:55.880 20:18:48 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:04:55.880 20:18:48 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:55.880 20:18:48 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:04:55.880 20:18:48 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@32 -- # no_nodes=2
00:04:55.880 20:18:48 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:04:55.880 20:18:48 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:55.880 20:18:48 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:55.880 20:18:48 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:04:55.880 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:55.880 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=0
00:04:55.880 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val
00:04:55.880 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:55.880 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:55.880 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:04:55.880 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:04:55.880 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:55.880 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:55.880 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:55.880 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:55.880 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116940 kB' 'MemFree: 37939884 kB' 'MemUsed: 10177056 kB' 'SwapCached: 0 kB' 'Active: 5000500 kB' 'Inactive: 3372048 kB' 'Active(anon): 4842596 kB' 'Inactive(anon): 0 kB' 'Active(file): 157904 kB' 'Inactive(file): 3372048 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8118732 kB' 'Mapped: 106516 kB' 'AnonPages: 256892 kB' 'Shmem: 4588780 kB' 'KernelStack: 8712 kB' 'PageTables: 4376 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 123564 kB' 'Slab: 331400 kB' 'SReclaimable: 123564 kB' 'SUnreclaim: 207836 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
00:04:55.880 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:55.880 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue
00:04:55.880 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:55.880 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:55.880 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:55.880 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue
00:04:55.880 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.880 20:18:48 setup.sh.hugepages.custom_alloc --
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.881 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:55.881 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.881 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.881 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.881 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:55.881 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.881 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.881 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.881 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:55.881 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.881 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.881 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.881 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:55.881 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.881 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.881 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.881 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:55.881 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.881 20:18:48 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:55.881 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.881 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:04:55.881 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:04:55.881 20:18:48 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:55.881 20:18:48 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:55.881 20:18:48 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:55.881 20:18:48 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:04:55.881 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:55.881 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=1 00:04:55.881 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:04:55.881 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:55.881 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:55.881 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:04:55.881 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:04:55.881 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:55.881 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:55.881 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.881 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.881 20:18:48 
setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44176532 kB' 'MemFree: 39774640 kB' 'MemUsed: 4401892 kB' 'SwapCached: 0 kB' 'Active: 1639348 kB' 'Inactive: 84212 kB' 'Active(anon): 1403668 kB' 'Inactive(anon): 0 kB' 'Active(file): 235680 kB' 'Inactive(file): 84212 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1455732 kB' 'Mapped: 92608 kB' 'AnonPages: 267864 kB' 'Shmem: 1135840 kB' 'KernelStack: 7416 kB' 'PageTables: 3816 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 85456 kB' 'Slab: 207708 kB' 'SReclaimable: 85456 kB' 'SUnreclaim: 122252 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:04:55.881 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.881 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:55.881 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.881 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.881 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.881 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:55.881 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.881 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.881 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.881 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:55.881 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:04:55.881 20:18:48 [xtrace condensed: setup/common.sh@31-32 repeat `read -r var val _` and `continue` for every node1 meminfo field that is not HugePages_Surp (SwapCached through HugePages_Free)] 00:04:55.882 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.882 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:04:55.882 20:18:48 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return
0 00:04:55.882 20:18:48 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:55.882 20:18:48 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:55.882 20:18:48 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:55.882 20:18:48 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:55.882 20:18:48 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:04:55.882 node0=512 expecting 512 00:04:55.882 20:18:48 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:55.882 20:18:48 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:55.882 20:18:48 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:55.882 20:18:48 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@128 -- # echo 'node1=1024 expecting 1024' 00:04:55.882 node1=1024 expecting 1024 00:04:55.882 20:18:48 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@130 -- # [[ 512,1024 == \5\1\2\,\1\0\2\4 ]] 00:04:55.882 00:04:55.882 real 0m3.848s 00:04:55.882 user 0m1.541s 00:04:55.882 sys 0m2.403s 00:04:55.882 20:18:48 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:55.882 20:18:48 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@10 -- # set +x 00:04:55.882 ************************************ 00:04:55.882 END TEST custom_alloc 00:04:55.882 ************************************ 00:04:55.882 20:18:48 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:04:55.883 20:18:48 setup.sh.hugepages -- setup/hugepages.sh@215 -- # run_test no_shrink_alloc no_shrink_alloc 00:04:55.883 20:18:48 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:55.883 20:18:48 
setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:55.883 20:18:48 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:55.883 ************************************ 00:04:55.883 START TEST no_shrink_alloc 00:04:55.883 ************************************ 00:04:55.883 20:18:48 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1123 -- # no_shrink_alloc 00:04:55.883 20:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@195 -- # get_test_nr_hugepages 2097152 0 00:04:55.883 20:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@49 -- # local size=2097152 00:04:55.883 20:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:04:55.883 20:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@51 -- # shift 00:04:55.883 20:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@52 -- # node_ids=('0') 00:04:55.883 20:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@52 -- # local node_ids 00:04:55.883 20:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:55.883 20:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:04:55.883 20:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:04:55.883 20:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:04:55.883 20:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:55.883 20:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:55.883 20:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:55.883 20:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:55.883 20:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@67 -- # local -g 
nodes_test 00:04:55.883 20:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 00:04:55.883 20:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:04:55.883 20:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:04:55.883 20:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@73 -- # return 0 00:04:55.883 20:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@198 -- # setup output 00:04:55.883 20:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:04:55.883 20:18:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:05:00.109 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:05:00.109 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:05:00.109 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:05:00.109 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver 00:05:00.109 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:05:00.109 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:05:00.109 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:05:00.109 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:05:00.109 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:05:00.109 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:05:00.109 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:05:00.109 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:05:00.109 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:05:00.109 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:05:00.109 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:05:00.109 0000:80:04.3 (8086 2021): Already using the vfio-pci 
driver 00:05:00.109 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:05:00.109 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:05:00.109 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:05:00.109 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@199 -- # verify_nr_hugepages 00:05:00.109 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local node 00:05:00.109 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:05:00.109 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:05:00.109 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local surp 00:05:00.109 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 -- # local resv 00:05:00.109 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@94 -- # local anon 00:05:00.109 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:05:00.109 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:05:00.109 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:05:00.109 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:00.109 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:00.109 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:00.109 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:00.109 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:00.109 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:00.109 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # 
mapfile -t mem 00:05:00.109 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:00.109 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.109 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.109 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78733736 kB' 'MemAvailable: 82034448 kB' 'Buffers: 12176 kB' 'Cached: 9562360 kB' 'SwapCached: 0 kB' 'Active: 6641588 kB' 'Inactive: 3456260 kB' 'Active(anon): 6248004 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 525984 kB' 'Mapped: 198888 kB' 'Shmem: 5724692 kB' 'KReclaimable: 209020 kB' 'Slab: 539296 kB' 'SReclaimable: 209020 kB' 'SUnreclaim: 330276 kB' 'KernelStack: 16112 kB' 'PageTables: 8128 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7661628 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200920 kB' 'VmallocChunk: 0 kB' 'Percpu: 56960 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 986532 kB' 'DirectMap2M: 16515072 kB' 'DirectMap1G: 83886080 kB' 00:05:00.109 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.109 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.109 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # 
IFS=': ' [xtrace condensed: setup/common.sh@31-32 repeat `read -r var val _` and `continue` for each meminfo field that is not AnonHugePages (MemFree through SwapTotal, continuing)]
00:05:00.109 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.109 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.109 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.109 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.110 20:18:51 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.110 20:18:51 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.110 
20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.110 20:18:51 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.110 20:18:51 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # anon=0 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:00.110 
20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78733760 kB' 'MemAvailable: 82034472 kB' 'Buffers: 12176 kB' 'Cached: 9562364 kB' 'SwapCached: 0 kB' 'Active: 6641396 kB' 'Inactive: 3456260 kB' 'Active(anon): 6247812 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 526364 kB' 'Mapped: 198812 kB' 'Shmem: 5724696 kB' 'KReclaimable: 209020 kB' 'Slab: 539216 kB' 'SReclaimable: 209020 kB' 'SUnreclaim: 330196 kB' 'KernelStack: 16128 kB' 'PageTables: 8172 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7661648 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200904 kB' 'VmallocChunk: 0 kB' 'Percpu: 56960 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 986532 kB' 'DirectMap2M: 16515072 kB' 'DirectMap1G: 83886080 kB' 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.110 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.111 20:18:51 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.111 20:18:51 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 
00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.111 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ 
Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 
-- # read -r var val _ 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.112 20:18:51 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # surp=0 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:00.112 20:18:51 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78733760 kB' 'MemAvailable: 82034472 kB' 'Buffers: 12176 kB' 'Cached: 9562364 kB' 'SwapCached: 0 kB' 'Active: 6641396 kB' 'Inactive: 3456260 kB' 'Active(anon): 6247812 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 526364 kB' 'Mapped: 198812 kB' 'Shmem: 5724696 kB' 'KReclaimable: 209020 kB' 'Slab: 539216 kB' 'SReclaimable: 209020 kB' 'SUnreclaim: 330196 kB' 'KernelStack: 16128 kB' 'PageTables: 8172 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7661668 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200904 kB' 'VmallocChunk: 0 kB' 'Percpu: 56960 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 
'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 986532 kB' 'DirectMap2M: 16515072 kB' 'DirectMap1G: 83886080 kB' 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 
00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.112 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:00.113 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.113 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.113 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.113 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:00.113 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.113 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.113 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.113 20:18:51 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:00.113 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.113 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.113 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.113 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:00.113 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.113 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.113 20:18:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.113 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:00.113 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.113 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.113 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.113 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:00.113 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.113 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.113 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.113 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:00.113 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.113 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:05:00.113 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.113 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:00.113 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.113 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.113 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.113 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:00.113 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.113 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.113 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.113 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:00.113 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.113 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.113 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.113 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:00.113 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.113 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.113 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.113 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:00.113 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 
00:05:00.113 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.113 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.113 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:00.113 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.113 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.113 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.113 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:00.113 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.113 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.113 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.113 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:00.113 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.113 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.113 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.113 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:00.113 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.113 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.113 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.113 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d 
]] 00:05:00.113 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.113 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.113 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.113 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:00.113 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.113 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.113 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.113 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:00.113 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.113 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.113 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.113 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:00.113 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.113 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.113 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.113 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:00.113 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.113 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.113 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.113 20:18:52 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:00.113 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.113 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.113 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.113 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:00.113 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.113 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.113 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.113 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:00.113 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.113 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.113 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.113 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:00.113 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.113 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.113 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.113 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:00.113 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.113 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.113 
20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.113 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:00.113 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # 
continue 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped 
== \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r 
var val _ 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # resv=0 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:05:00.114 nr_hugepages=1024 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:05:00.114 resv_hugepages=0 00:05:00.114 
20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:05:00.114 surplus_hugepages=0 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:05:00.114 anon_hugepages=0 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78740060 kB' 'MemAvailable: 82040772 kB' 'Buffers: 12176 kB' 'Cached: 9562420 kB' 
'SwapCached: 0 kB' 'Active: 6641080 kB' 'Inactive: 3456260 kB' 'Active(anon): 6247496 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 525960 kB' 'Mapped: 198812 kB' 'Shmem: 5724752 kB' 'KReclaimable: 209020 kB' 'Slab: 539216 kB' 'SReclaimable: 209020 kB' 'SUnreclaim: 330196 kB' 'KernelStack: 16112 kB' 'PageTables: 8124 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7661692 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200904 kB' 'VmallocChunk: 0 kB' 'Percpu: 56960 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 986532 kB' 'DirectMap2M: 16515072 kB' 'DirectMap1G: 83886080 kB' 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.114 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.115 20:18:52 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.115 20:18:52 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.115 20:18:52 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 
00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # 
read -r var val _ 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.115 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.116 20:18:52 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- 
# (( 1024 == nr_hugepages + surp + resv )) 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@27 -- # local node 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:05:00.116 20:18:52 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116940 kB' 'MemFree: 36891612 kB' 'MemUsed: 11225328 kB' 'SwapCached: 0 kB' 'Active: 5000796 kB' 'Inactive: 3372048 kB' 'Active(anon): 4842892 kB' 'Inactive(anon): 0 kB' 'Active(file): 157904 kB' 'Inactive(file): 3372048 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8118732 kB' 'Mapped: 106356 kB' 'AnonPages: 257192 kB' 'Shmem: 4588780 kB' 'KernelStack: 8712 kB' 'PageTables: 4376 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 123564 kB' 'Slab: 331380 kB' 'SReclaimable: 123564 kB' 'SUnreclaim: 207816 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.116 20:18:52 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc 
-- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.116 20:18:52 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.116 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.117 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.117 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.117 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.117 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.117 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.117 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.117 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.117 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:00.117 20:18:52 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31-32 -- # (repeated read/skip xtrace condensed: AnonPages, Shmem, KernelStack, PageTables, SecPageTables, NFS_Unstable, Bounce, WritebackTmp, KReclaimable, Slab, SReclaimable, SUnreclaim, AnonHugePages, ShmemHugePages, ShmemPmdMapped, FileHugePages, FilePmdMapped, Unaccepted, HugePages_Total and HugePages_Free all fail the HugePages_Surp match and continue)
00:05:00.117 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:00.117 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:05:00.117 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:05:00.117 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:05:00.117 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:05:00.117 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:05:00.117 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:05:00.117 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024'
00:05:00.117 node0=1024 expecting 1024
00:05:00.117 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]]
00:05:00.117 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # CLEAR_HUGE=no
00:05:00.117 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # NRHUGE=512
00:05:00.117 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # setup output
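The per-field xtrace above comes from a simple extract-one-field-from-meminfo loop in setup/common.sh: split each line on `': '`, skip until the requested field matches, then echo its value. A minimal standalone sketch of that pattern (a simplified stand-in, not SPDK's exact `get_meminfo`; the optional FILE argument is added here purely for illustration):

```shell
#!/usr/bin/env bash
# Hedged sketch of the meminfo field lookup visible in the xtrace above.
# Assumption: simplified stand-in for setup/common.sh's get_meminfo;
# the optional FILE argument is an illustration-only addition.
get_meminfo() {
    local get=$1 mem_f=${2:-/proc/meminfo}
    local var val _
    # ': ' in IFS splits on both the colon and the padding spaces,
    # so "HugePages_Total:    1024" yields var=HugePages_Total val=1024.
    while IFS=': ' read -r var val _; do
        [[ $var == "$get" ]] || continue   # skip every non-matching field
        echo "${val:-0}"
        return 0
    done < "$mem_f"
    echo 0   # field absent: report 0, like the trace's final "echo 0"
}

# Demonstrate against a small sample file rather than the live /proc/meminfo.
sample=$(mktemp)
printf 'MemTotal: 92293472 kB\nHugePages_Total: 1024\n' > "$sample"
get_meminfo HugePages_Total "$sample"   # -> 1024
rm -f "$sample"
```

The node-specific variant in the trace reads /sys/devices/system/node/node&lt;N&gt;/meminfo instead of /proc/meminfo when a node is given.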
00:05:00.117 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:05:00.117 20:18:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:05:03.408 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:05:03.408 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:05:03.408 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:05:03.408 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver 00:05:03.408 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:05:03.408 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:05:03.408 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:05:03.408 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:05:03.408 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:05:03.408 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:05:03.408 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:05:03.408 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:05:03.408 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:05:03.408 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:05:03.408 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:05:03.408 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:05:03.408 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:05:03.408 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:05:03.408 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:05:03.408 INFO: Requested 512 hugepages but 1024 already allocated on node0 00:05:03.697 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@204 -- # verify_nr_hugepages 00:05:03.697 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local node 00:05:03.697 20:18:55 
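The INFO line above ("Requested 512 hugepages but 1024 already allocated on node0") reflects a simple comparison: when a node already holds at least the requested number of hugepages, setup leaves the allocation alone. A hedged sketch of that decision (hypothetical helper, not SPDK's actual scripts/setup.sh code):

```shell
#!/usr/bin/env bash
# Hypothetical helper mirroring the INFO message in the log above;
# SPDK's scripts/setup.sh implements this decision differently in detail.
check_node_hugepages() {
    local requested=$1 allocated=$2 node=$3
    if (( allocated >= requested )); then
        echo "INFO: Requested $requested hugepages but $allocated already allocated on node$node"
        return 0
    fi
    return 1   # caller would need to allocate more pages on this node
}

check_node_hugepages 512 1024 0
```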
setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:05:03.697 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:05:03.697 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local surp 00:05:03.697 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 -- # local resv 00:05:03.697 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@94 -- # local anon 00:05:03.697 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:05:03.697 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:05:03.697 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:05:03.697 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:03.697 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:03.697 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:03.697 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:03.697 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:03.697 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:03.697 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:03.697 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:03.697 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.697 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.698 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 
'MemFree: 78733036 kB' 'MemAvailable: 82033732 kB' 'Buffers: 12176 kB' 'Cached: 9562492 kB' 'SwapCached: 0 kB' 'Active: 6642148 kB' 'Inactive: 3456260 kB' 'Active(anon): 6248564 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 526972 kB' 'Mapped: 198852 kB' 'Shmem: 5724824 kB' 'KReclaimable: 208988 kB' 'Slab: 539460 kB' 'SReclaimable: 208988 kB' 'SUnreclaim: 330472 kB' 'KernelStack: 16128 kB' 'PageTables: 8180 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7662128 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200808 kB' 'VmallocChunk: 0 kB' 'Percpu: 56960 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 986532 kB' 'DirectMap2M: 16515072 kB' 'DirectMap1G: 83886080 kB' 00:05:03.698 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.698 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.698 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.698 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.698 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.698 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.698 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.698 20:18:55 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31-32 -- # (repeated read/skip xtrace condensed: MemAvailable, Buffers, Cached, SwapCached, Active, Inactive, Active(anon), Inactive(anon), Active(file), Inactive(file), Unevictable, Mlocked, SwapTotal, SwapFree, Zswap, Zswapped, Dirty, Writeback, AnonPages, Mapped, Shmem, KReclaimable, Slab, SReclaimable, SUnreclaim, KernelStack, PageTables, SecPageTables, NFS_Unstable, Bounce, WritebackTmp, CommitLimit, Committed_AS, VmallocTotal, VmallocUsed, VmallocChunk, Percpu and HardwareCorrupted all fail the AnonHugePages match and continue) 00:05:03.699
20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.699 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.699 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:03.699 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:03.699 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # anon=0 00:05:03.699 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:05:03.699 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:03.699 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:03.699 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:03.699 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:03.699 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:03.699 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:03.699 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:03.699 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:03.699 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:03.699 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.699 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.699 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78732952 kB' 'MemAvailable: 82033648 kB' 'Buffers: 12176 kB' 'Cached: 9562492 kB' 'SwapCached: 0 kB' 
'Active: 6641680 kB' 'Inactive: 3456260 kB' 'Active(anon): 6248096 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 526512 kB' 'Mapped: 198816 kB' 'Shmem: 5724824 kB' 'KReclaimable: 208988 kB' 'Slab: 539480 kB' 'SReclaimable: 208988 kB' 'SUnreclaim: 330492 kB' 'KernelStack: 16096 kB' 'PageTables: 8128 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7662144 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200792 kB' 'VmallocChunk: 0 kB' 'Percpu: 56960 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 986532 kB' 'DirectMap2M: 16515072 kB' 'DirectMap1G: 83886080 kB' 00:05:03.699 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.699 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.699 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.699 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.699 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.699 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.699 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.699 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.699 20:18:55 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31-32 -- # (repeated read/skip xtrace condensed: MemAvailable, Buffers, Cached, SwapCached, Active, Inactive, Active(anon), Inactive(anon), Active(file) and Inactive(file) fail the HugePages_Surp match and continue)
00:05:03.699 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.699 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.699 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.699 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.699 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.699 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.699 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.699 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.699 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.699 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.699 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.699 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.699 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.699 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.699 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.699 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p 
]] 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.700 20:18:55 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.700 20:18:55 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 
00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r 
var val _ 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.700 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.701 20:18:55 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # surp=0 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:03.701 20:18:55 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78733544 kB' 'MemAvailable: 82034240 kB' 'Buffers: 12176 kB' 'Cached: 9562512 kB' 'SwapCached: 0 kB' 'Active: 6642160 kB' 'Inactive: 3456260 kB' 'Active(anon): 6248576 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 527004 kB' 'Mapped: 198816 kB' 'Shmem: 5724844 kB' 'KReclaimable: 208988 kB' 'Slab: 539480 kB' 'SReclaimable: 208988 kB' 'SUnreclaim: 330492 kB' 'KernelStack: 16112 kB' 'PageTables: 8176 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7662168 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200792 kB' 'VmallocChunk: 0 kB' 'Percpu: 56960 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 986532 kB' 'DirectMap2M: 16515072 kB' 'DirectMap1G: 83886080 kB' 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.701 20:18:55 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.701 
20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.701 20:18:55 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.701 
20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.701 
20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.701 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.702 
20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.702 20:18:55 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.702 
20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # 
continue 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ 
FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- 
# read -r var val _ 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.702 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # resv=0 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:05:03.703 nr_hugepages=1024 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:05:03.703 resv_hugepages=0 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:05:03.703 surplus_hugepages=0 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:05:03.703 anon_hugepages=0 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:03.703 
20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78737428 kB' 'MemAvailable: 82038124 kB' 'Buffers: 12176 kB' 'Cached: 9562552 kB' 'SwapCached: 0 kB' 'Active: 6641552 kB' 'Inactive: 3456260 kB' 'Active(anon): 6247968 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 526300 kB' 'Mapped: 198816 kB' 'Shmem: 5724884 kB' 'KReclaimable: 208988 kB' 'Slab: 539464 
kB' 'SReclaimable: 208988 kB' 'SUnreclaim: 330476 kB' 'KernelStack: 16096 kB' 'PageTables: 8128 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7662188 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200792 kB' 'VmallocChunk: 0 kB' 'Percpu: 56960 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 986532 kB' 'DirectMap2M: 16515072 kB' 'DirectMap1G: 83886080 kB' 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc 
-- setup/common.sh@32 -- # continue 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ 
Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.703 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # 
read -r var val _ 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.704 20:18:55 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.704 20:18:55 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc 
-- setup/common.sh@32 -- # continue 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@27 -- # local node 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:05:03.704 
20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:03.704 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:05:03.705 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:05:03.705 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:03.705 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:03.705 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.705 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.705 
20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116940 kB' 'MemFree: 36894944 kB' 'MemUsed: 11221996 kB' 'SwapCached: 0 kB' 'Active: 5001024 kB' 'Inactive: 3372048 kB' 'Active(anon): 4843120 kB' 'Inactive(anon): 0 kB' 'Active(file): 157904 kB' 'Inactive(file): 3372048 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8118740 kB' 'Mapped: 106360 kB' 'AnonPages: 257472 kB' 'Shmem: 4588788 kB' 'KernelStack: 8712 kB' 'PageTables: 4468 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 123564 kB' 'Slab: 331620 kB' 'SReclaimable: 123564 kB' 'SUnreclaim: 208056 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:05:03.705 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.705 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.705 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.705 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.705 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.705 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.705 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.705 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.705 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.705 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.705 20:18:55 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.705 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.705 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.705 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.705 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.705 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.705 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.705 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.705 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.705 20:18:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.705 20:18:56 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.705 20:18:56 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 
00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.705 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.706 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.706 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.706 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.706 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.706 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.706 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.706 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.706 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.706 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # 
read -r var val _ 00:05:03.706 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.706 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.706 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.706 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.706 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.706 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.706 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.706 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.706 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.706 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.706 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.706 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.706 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.706 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.706 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.706 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.706 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.706 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.706 20:18:56 setup.sh.hugepages.no_shrink_alloc 
-- setup/common.sh@31 -- # IFS=': ' 00:05:03.706 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.706 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.706 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:03.706 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:03.706 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:05:03.706 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:05:03.706 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:05:03.706 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:05:03.706 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:05:03.706 node0=1024 expecting 1024 00:05:03.706 20:18:56 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:05:03.706 00:05:03.706 real 0m7.836s 00:05:03.706 user 0m2.981s 00:05:03.706 sys 0m5.062s 00:05:03.706 20:18:56 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:03.706 20:18:56 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@10 -- # set +x 00:05:03.706 ************************************ 00:05:03.706 END TEST no_shrink_alloc 00:05:03.706 ************************************ 00:05:03.706 20:18:56 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:05:03.706 20:18:56 setup.sh.hugepages -- setup/hugepages.sh@217 -- # clear_hp 00:05:03.706 20:18:56 setup.sh.hugepages -- setup/hugepages.sh@37 -- # local node hp 00:05:03.706 20:18:56 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 
00:05:03.706 20:18:56 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:03.706 20:18:56 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:05:03.706 20:18:56 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:03.706 20:18:56 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:05:03.965 20:18:56 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:05:03.965 20:18:56 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:03.965 20:18:56 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:05:03.965 20:18:56 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:03.965 20:18:56 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:05:03.965 20:18:56 setup.sh.hugepages -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:05:03.965 20:18:56 setup.sh.hugepages -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:05:03.965 00:05:03.965 real 0m30.809s 00:05:03.965 user 0m10.988s 00:05:03.965 sys 0m18.213s 00:05:03.965 20:18:56 setup.sh.hugepages -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:03.965 20:18:56 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:05:03.965 ************************************ 00:05:03.965 END TEST hugepages 00:05:03.965 ************************************ 00:05:03.965 20:18:56 setup.sh -- common/autotest_common.sh@1142 -- # return 0 00:05:03.965 20:18:56 setup.sh -- setup/test-setup.sh@14 -- # run_test driver /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/driver.sh 00:05:03.965 20:18:56 setup.sh -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:03.965 20:18:56 setup.sh -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:03.965 20:18:56 setup.sh -- 
common/autotest_common.sh@10 -- # set +x 00:05:03.965 ************************************ 00:05:03.965 START TEST driver 00:05:03.965 ************************************ 00:05:03.965 20:18:56 setup.sh.driver -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/driver.sh 00:05:03.965 * Looking for test storage... 00:05:03.965 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:05:03.965 20:18:56 setup.sh.driver -- setup/driver.sh@68 -- # setup reset 00:05:03.965 20:18:56 setup.sh.driver -- setup/common.sh@9 -- # [[ reset == output ]] 00:05:03.965 20:18:56 setup.sh.driver -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:05:09.239 20:19:01 setup.sh.driver -- setup/driver.sh@69 -- # run_test guess_driver guess_driver 00:05:09.239 20:19:01 setup.sh.driver -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:09.239 20:19:01 setup.sh.driver -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:09.239 20:19:01 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x 00:05:09.239 ************************************ 00:05:09.239 START TEST guess_driver 00:05:09.239 ************************************ 00:05:09.239 20:19:01 setup.sh.driver.guess_driver -- common/autotest_common.sh@1123 -- # guess_driver 00:05:09.239 20:19:01 setup.sh.driver.guess_driver -- setup/driver.sh@46 -- # local driver setup_driver marker 00:05:09.239 20:19:01 setup.sh.driver.guess_driver -- setup/driver.sh@47 -- # local fail=0 00:05:09.239 20:19:01 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # pick_driver 00:05:09.239 20:19:01 setup.sh.driver.guess_driver -- setup/driver.sh@36 -- # vfio 00:05:09.239 20:19:01 setup.sh.driver.guess_driver -- setup/driver.sh@21 -- # local iommu_grups 00:05:09.239 20:19:01 setup.sh.driver.guess_driver -- setup/driver.sh@22 -- # local unsafe_vfio 00:05:09.239 20:19:01 setup.sh.driver.guess_driver -- 
setup/driver.sh@24 -- # [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]] 00:05:09.239 20:19:01 setup.sh.driver.guess_driver -- setup/driver.sh@25 -- # unsafe_vfio=N 00:05:09.239 20:19:01 setup.sh.driver.guess_driver -- setup/driver.sh@27 -- # iommu_groups=(/sys/kernel/iommu_groups/*) 00:05:09.239 20:19:01 setup.sh.driver.guess_driver -- setup/driver.sh@29 -- # (( 216 > 0 )) 00:05:09.239 20:19:01 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # is_driver vfio_pci 00:05:09.239 20:19:01 setup.sh.driver.guess_driver -- setup/driver.sh@14 -- # mod vfio_pci 00:05:09.239 20:19:01 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # dep vfio_pci 00:05:09.239 20:19:01 setup.sh.driver.guess_driver -- setup/driver.sh@11 -- # modprobe --show-depends vfio_pci 00:05:09.239 20:19:01 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # [[ insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/virt/lib/irqbypass.ko.xz 00:05:09.239 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:05:09.239 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:05:09.239 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:05:09.239 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:05:09.239 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio_iommu_type1.ko.xz 00:05:09.239 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci-core.ko.xz 00:05:09.239 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci.ko.xz == *\.\k\o* ]] 00:05:09.239 20:19:01 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # return 0 00:05:09.239 20:19:01 setup.sh.driver.guess_driver -- setup/driver.sh@37 -- # echo vfio-pci 00:05:09.239 20:19:01 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # driver=vfio-pci 00:05:09.239 20:19:01 setup.sh.driver.guess_driver -- setup/driver.sh@51 -- # [[ vfio-pci == \N\o\ \v\a\l\i\d\ 
\d\r\i\v\e\r\ \f\o\u\n\d ]] 00:05:09.239 20:19:01 setup.sh.driver.guess_driver -- setup/driver.sh@56 -- # echo 'Looking for driver=vfio-pci' 00:05:09.239 Looking for driver=vfio-pci 00:05:09.239 20:19:01 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:09.239 20:19:01 setup.sh.driver.guess_driver -- setup/driver.sh@45 -- # setup output config 00:05:09.239 20:19:01 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ output == output ]] 00:05:09.239 20:19:01 setup.sh.driver.guess_driver -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:05:13.434 20:19:04 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ not == \-\> ]] 00:05:13.434 20:19:04 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # continue 00:05:13.434 20:19:04 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:13.434 20:19:04 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ not == \-\> ]] 00:05:13.434 20:19:04 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # continue 00:05:13.434 20:19:04 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:13.434 20:19:05 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:13.434 20:19:05 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:13.434 20:19:05 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:13.434 20:19:05 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:13.434 20:19:05 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:13.434 20:19:05 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:13.434 20:19:05 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:13.434 20:19:05 
setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:13.434 20:19:05 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:13.434 20:19:05 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:13.434 20:19:05 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:13.434 20:19:05 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:13.434 20:19:05 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:13.434 20:19:05 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:13.434 20:19:05 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:13.434 20:19:05 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:13.434 20:19:05 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:13.434 20:19:05 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:13.434 20:19:05 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:13.434 20:19:05 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:13.434 20:19:05 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:13.434 20:19:05 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:13.434 20:19:05 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:13.434 20:19:05 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:13.434 20:19:05 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:13.434 20:19:05 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:13.434 20:19:05 
setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:13.434 20:19:05 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:13.435 20:19:05 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:13.435 20:19:05 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:13.435 20:19:05 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:13.435 20:19:05 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:13.435 20:19:05 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:13.435 20:19:05 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:13.435 20:19:05 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:13.435 20:19:05 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:13.435 20:19:05 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:13.435 20:19:05 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:13.435 20:19:05 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:13.435 20:19:05 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:13.435 20:19:05 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:13.435 20:19:05 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:13.435 20:19:05 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:13.435 20:19:05 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:13.435 20:19:05 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:13.435 20:19:05 
setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:13.435 20:19:05 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:13.435 20:19:05 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:15.966 20:19:07 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:15.966 20:19:07 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:15.966 20:19:07 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:15.966 20:19:07 setup.sh.driver.guess_driver -- setup/driver.sh@64 -- # (( fail == 0 )) 00:05:15.966 20:19:07 setup.sh.driver.guess_driver -- setup/driver.sh@65 -- # setup reset 00:05:15.966 20:19:07 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ reset == output ]] 00:05:15.966 20:19:07 setup.sh.driver.guess_driver -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:05:21.240 00:05:21.240 real 0m11.631s 00:05:21.240 user 0m3.024s 00:05:21.240 sys 0m5.632s 00:05:21.240 20:19:13 setup.sh.driver.guess_driver -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:21.240 20:19:13 setup.sh.driver.guess_driver -- common/autotest_common.sh@10 -- # set +x 00:05:21.240 ************************************ 00:05:21.240 END TEST guess_driver 00:05:21.240 ************************************ 00:05:21.240 20:19:13 setup.sh.driver -- common/autotest_common.sh@1142 -- # return 0 00:05:21.240 00:05:21.240 real 0m16.902s 00:05:21.240 user 0m4.564s 00:05:21.240 sys 0m8.569s 00:05:21.240 20:19:13 setup.sh.driver -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:21.240 20:19:13 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x 00:05:21.240 ************************************ 00:05:21.240 END TEST driver 00:05:21.240 ************************************ 00:05:21.240 20:19:13 setup.sh -- 
common/autotest_common.sh@1142 -- # return 0 00:05:21.240 20:19:13 setup.sh -- setup/test-setup.sh@15 -- # run_test devices /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/devices.sh 00:05:21.240 20:19:13 setup.sh -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:21.240 20:19:13 setup.sh -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:21.240 20:19:13 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:05:21.240 ************************************ 00:05:21.240 START TEST devices 00:05:21.240 ************************************ 00:05:21.240 20:19:13 setup.sh.devices -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/devices.sh 00:05:21.240 * Looking for test storage... 00:05:21.240 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:05:21.240 20:19:13 setup.sh.devices -- setup/devices.sh@190 -- # trap cleanup EXIT 00:05:21.240 20:19:13 setup.sh.devices -- setup/devices.sh@192 -- # setup reset 00:05:21.240 20:19:13 setup.sh.devices -- setup/common.sh@9 -- # [[ reset == output ]] 00:05:21.240 20:19:13 setup.sh.devices -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:05:25.432 20:19:17 setup.sh.devices -- setup/devices.sh@194 -- # get_zoned_devs 00:05:25.432 20:19:17 setup.sh.devices -- common/autotest_common.sh@1669 -- # zoned_devs=() 00:05:25.432 20:19:17 setup.sh.devices -- common/autotest_common.sh@1669 -- # local -gA zoned_devs 00:05:25.432 20:19:17 setup.sh.devices -- common/autotest_common.sh@1670 -- # local nvme bdf 00:05:25.432 20:19:17 setup.sh.devices -- common/autotest_common.sh@1672 -- # for nvme in /sys/block/nvme* 00:05:25.432 20:19:17 setup.sh.devices -- common/autotest_common.sh@1673 -- # is_block_zoned nvme0n1 00:05:25.432 20:19:17 setup.sh.devices -- common/autotest_common.sh@1662 -- # local device=nvme0n1 00:05:25.432 20:19:17 setup.sh.devices -- common/autotest_common.sh@1664 -- 
# [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:05:25.432 20:19:17 setup.sh.devices -- common/autotest_common.sh@1665 -- # [[ none != none ]] 00:05:25.432 20:19:17 setup.sh.devices -- setup/devices.sh@196 -- # blocks=() 00:05:25.432 20:19:17 setup.sh.devices -- setup/devices.sh@196 -- # declare -a blocks 00:05:25.432 20:19:17 setup.sh.devices -- setup/devices.sh@197 -- # blocks_to_pci=() 00:05:25.432 20:19:17 setup.sh.devices -- setup/devices.sh@197 -- # declare -A blocks_to_pci 00:05:25.432 20:19:17 setup.sh.devices -- setup/devices.sh@198 -- # min_disk_size=3221225472 00:05:25.432 20:19:17 setup.sh.devices -- setup/devices.sh@200 -- # for block in "/sys/block/nvme"!(*c*) 00:05:25.432 20:19:17 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0n1 00:05:25.432 20:19:17 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0 00:05:25.432 20:19:17 setup.sh.devices -- setup/devices.sh@202 -- # pci=0000:5e:00.0 00:05:25.432 20:19:17 setup.sh.devices -- setup/devices.sh@203 -- # [[ '' == *\0\0\0\0\:\5\e\:\0\0\.\0* ]] 00:05:25.432 20:19:17 setup.sh.devices -- setup/devices.sh@204 -- # block_in_use nvme0n1 00:05:25.432 20:19:17 setup.sh.devices -- scripts/common.sh@378 -- # local block=nvme0n1 pt 00:05:25.432 20:19:17 setup.sh.devices -- scripts/common.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/spdk-gpt.py nvme0n1 00:05:25.432 No valid GPT data, bailing 00:05:25.432 20:19:17 setup.sh.devices -- scripts/common.sh@391 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:05:25.432 20:19:17 setup.sh.devices -- scripts/common.sh@391 -- # pt= 00:05:25.432 20:19:17 setup.sh.devices -- scripts/common.sh@392 -- # return 1 00:05:25.432 20:19:17 setup.sh.devices -- setup/devices.sh@204 -- # sec_size_to_bytes nvme0n1 00:05:25.432 20:19:17 setup.sh.devices -- setup/common.sh@76 -- # local dev=nvme0n1 00:05:25.432 20:19:17 setup.sh.devices -- setup/common.sh@78 -- # [[ -e /sys/block/nvme0n1 ]] 00:05:25.432 20:19:17 setup.sh.devices -- setup/common.sh@80 -- # echo 
7681501126656 00:05:25.432 20:19:17 setup.sh.devices -- setup/devices.sh@204 -- # (( 7681501126656 >= min_disk_size )) 00:05:25.432 20:19:17 setup.sh.devices -- setup/devices.sh@205 -- # blocks+=("${block##*/}") 00:05:25.432 20:19:17 setup.sh.devices -- setup/devices.sh@206 -- # blocks_to_pci["${block##*/}"]=0000:5e:00.0 00:05:25.432 20:19:17 setup.sh.devices -- setup/devices.sh@209 -- # (( 1 > 0 )) 00:05:25.432 20:19:17 setup.sh.devices -- setup/devices.sh@211 -- # declare -r test_disk=nvme0n1 00:05:25.432 20:19:17 setup.sh.devices -- setup/devices.sh@213 -- # run_test nvme_mount nvme_mount 00:05:25.432 20:19:17 setup.sh.devices -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:25.432 20:19:17 setup.sh.devices -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:25.432 20:19:17 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:05:25.432 ************************************ 00:05:25.432 START TEST nvme_mount 00:05:25.432 ************************************ 00:05:25.432 20:19:17 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1123 -- # nvme_mount 00:05:25.432 20:19:17 setup.sh.devices.nvme_mount -- setup/devices.sh@95 -- # nvme_disk=nvme0n1 00:05:25.432 20:19:17 setup.sh.devices.nvme_mount -- setup/devices.sh@96 -- # nvme_disk_p=nvme0n1p1 00:05:25.432 20:19:17 setup.sh.devices.nvme_mount -- setup/devices.sh@97 -- # nvme_mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:25.432 20:19:17 setup.sh.devices.nvme_mount -- setup/devices.sh@98 -- # nvme_dummy_test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:25.432 20:19:17 setup.sh.devices.nvme_mount -- setup/devices.sh@101 -- # partition_drive nvme0n1 1 00:05:25.432 20:19:17 setup.sh.devices.nvme_mount -- setup/common.sh@39 -- # local disk=nvme0n1 00:05:25.432 20:19:17 setup.sh.devices.nvme_mount -- setup/common.sh@40 -- # local part_no=1 00:05:25.432 20:19:17 setup.sh.devices.nvme_mount -- 
setup/common.sh@41 -- # local size=1073741824 00:05:25.432 20:19:17 setup.sh.devices.nvme_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:05:25.432 20:19:17 setup.sh.devices.nvme_mount -- setup/common.sh@44 -- # parts=() 00:05:25.432 20:19:17 setup.sh.devices.nvme_mount -- setup/common.sh@44 -- # local parts 00:05:25.432 20:19:17 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part = 1 )) 00:05:25.432 20:19:17 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:25.432 20:19:17 setup.sh.devices.nvme_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:05:25.432 20:19:17 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part++ )) 00:05:25.432 20:19:17 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:25.432 20:19:17 setup.sh.devices.nvme_mount -- setup/common.sh@51 -- # (( size /= 512 )) 00:05:25.432 20:19:17 setup.sh.devices.nvme_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:05:25.432 20:19:17 setup.sh.devices.nvme_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 00:05:26.367 Creating new GPT entries in memory. 00:05:26.367 GPT data structures destroyed! You may now partition the disk using fdisk or 00:05:26.367 other utilities. 00:05:26.367 20:19:18 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part = 1 )) 00:05:26.367 20:19:18 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:26.367 20:19:18 setup.sh.devices.nvme_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:05:26.367 20:19:18 setup.sh.devices.nvme_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:05:26.367 20:19:18 setup.sh.devices.nvme_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:05:27.304 Creating new GPT entries in memory. 
00:05:27.304 The operation has completed successfully. 00:05:27.304 20:19:19 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part++ )) 00:05:27.304 20:19:19 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:27.304 20:19:19 setup.sh.devices.nvme_mount -- setup/common.sh@62 -- # wait 1290380 00:05:27.304 20:19:19 setup.sh.devices.nvme_mount -- setup/devices.sh@102 -- # mkfs /dev/nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:27.304 20:19:19 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1p1 mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount size= 00:05:27.304 20:19:19 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:27.304 20:19:19 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1p1 ]] 00:05:27.304 20:19:19 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1p1 00:05:27.304 20:19:19 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:27.563 20:19:19 setup.sh.devices.nvme_mount -- setup/devices.sh@105 -- # verify 0000:5e:00.0 nvme0n1:nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:27.563 20:19:19 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:05:27.563 20:19:19 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1p1 00:05:27.563 20:19:19 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:27.563 20:19:19 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local 
test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:27.563 20:19:19 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:05:27.563 20:19:19 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:27.563 20:19:19 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # : 00:05:27.563 20:19:19 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:05:27.563 20:19:19 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:27.563 20:19:19 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:05:27.563 20:19:19 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:05:27.563 20:19:19 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:27.563 20:19:19 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:05:31.897 20:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:31.897 20:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1p1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1\p\1* ]] 00:05:31.897 20:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:05:31.897 20:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:31.897 20:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:31.897 20:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:31.897 20:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:31.897 20:19:23 
setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:31.897 20:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:31.897 20:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:31.897 20:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:31.897 20:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:31.897 20:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:31.897 20:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:31.897 20:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:31.897 20:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:31.897 20:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:31.897 20:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:31.897 20:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:31.897 20:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:31.897 20:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:31.897 20:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:31.897 20:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:31.897 20:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:31.897 20:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == 
\0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:31.897 20:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:31.897 20:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:31.897 20:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:31.897 20:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:31.897 20:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:31.897 20:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:31.897 20:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:31.897 20:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:31.897 20:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:31.897 20:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:31.897 20:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:31.897 20:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:31.897 20:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:31.897 20:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:31.897 20:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:31.897 20:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:31.897 20:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:31.897 20:19:23 setup.sh.devices.nvme_mount -- 
setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:31.897 20:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:31.897 20:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:31.897 20:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount ]] 00:05:31.897 20:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:31.897 20:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:31.897 20:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:31.897 20:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@110 -- # cleanup_nvme 00:05:31.897 20:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:31.897 20:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@21 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:31.897 20:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:31.897 20:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:05:31.897 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:05:31.897 20:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:05:31.897 20:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:05:31.897 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:05:31.897 /dev/nvme0n1: 8 bytes were erased at offset 0x6fc7d255e00 (gpt): 45 
46 49 20 50 41 52 54 00:05:31.897 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:05:31.897 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:05:31.897 20:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@113 -- # mkfs /dev/nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 1024M 00:05:31.897 20:19:23 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1 mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount size=1024M 00:05:31.897 20:19:23 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:31.897 20:19:23 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1 ]] 00:05:31.897 20:19:23 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1 1024M 00:05:31.897 20:19:23 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:31.898 20:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@116 -- # verify 0000:5e:00.0 nvme0n1:nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:31.898 20:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:05:31.898 20:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1 00:05:31.898 20:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:31.898 20:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:31.898 20:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:05:31.898 20:19:23 
setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:31.898 20:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # : 00:05:31.898 20:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:05:31.898 20:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:31.898 20:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:05:31.898 20:19:23 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:05:31.898 20:19:23 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:31.898 20:19:23 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:05:35.179 20:19:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:35.179 20:19:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1* ]] 00:05:35.179 20:19:27 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:05:35.179 20:19:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:35.179 20:19:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:35.179 20:19:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:35.179 20:19:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:35.179 20:19:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:35.179 20:19:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:35.179 20:19:27 
setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:35.179 20:19:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:35.179 20:19:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:35.179 20:19:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:35.179 20:19:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:35.179 20:19:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:35.179 20:19:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:35.179 20:19:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:35.179 20:19:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:35.179 20:19:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:35.179 20:19:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:35.179 20:19:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:35.179 20:19:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:35.179 20:19:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:35.179 20:19:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:35.179 20:19:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:35.179 20:19:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:35.179 20:19:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == 
\0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:35.179 20:19:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:35.179 20:19:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:35.179 20:19:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:35.179 20:19:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:35.179 20:19:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:35.179 20:19:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:35.179 20:19:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:35.179 20:19:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:35.179 20:19:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:35.179 20:19:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:35.179 20:19:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:35.179 20:19:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:35.179 20:19:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:35.179 20:19:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:35.179 20:19:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:35.179 20:19:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:35.179 20:19:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:35.437 20:19:27 setup.sh.devices.nvme_mount -- 
setup/devices.sh@66 -- # (( found == 1 )) 00:05:35.437 20:19:27 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount ]] 00:05:35.437 20:19:27 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:35.437 20:19:27 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:35.437 20:19:27 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:35.437 20:19:27 setup.sh.devices.nvme_mount -- setup/devices.sh@123 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:35.437 20:19:27 setup.sh.devices.nvme_mount -- setup/devices.sh@125 -- # verify 0000:5e:00.0 data@nvme0n1 '' '' 00:05:35.437 20:19:27 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:05:35.437 20:19:27 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=data@nvme0n1 00:05:35.437 20:19:27 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point= 00:05:35.437 20:19:27 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file= 00:05:35.437 20:19:27 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:05:35.437 20:19:27 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n '' ]] 00:05:35.437 20:19:27 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:05:35.437 20:19:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:35.437 20:19:27 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:05:35.438 20:19:27 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:05:35.438 20:19:27 setup.sh.devices.nvme_mount -- 
setup/common.sh@9 -- # [[ output == output ]] 00:05:35.438 20:19:27 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:05:39.634 20:19:31 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:39.634 20:19:31 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: data@nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\d\a\t\a\@\n\v\m\e\0\n\1* ]] 00:05:39.634 20:19:31 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:05:39.634 20:19:31 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:39.634 20:19:31 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:39.634 20:19:31 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:39.634 20:19:31 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:39.634 20:19:31 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:39.634 20:19:31 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:39.634 20:19:31 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:39.634 20:19:31 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:39.634 20:19:31 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:39.634 20:19:31 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:39.634 20:19:31 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:39.634 20:19:31 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:39.634 20:19:31 
setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:39.634 20:19:31 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:39.634 20:19:31 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:39.634 20:19:31 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:39.634 20:19:31 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:39.634 20:19:31 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:39.634 20:19:31 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:39.634 20:19:31 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:39.634 20:19:31 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:39.634 20:19:31 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:39.634 20:19:31 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:39.634 20:19:31 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:39.634 20:19:31 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:39.634 20:19:31 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:39.634 20:19:31 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:39.634 20:19:31 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:39.634 20:19:31 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:39.634 20:19:31 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == 
\0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:39.634 20:19:31 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:39.634 20:19:31 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:39.634 20:19:31 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:39.634 20:19:31 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:39.634 20:19:31 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:39.634 20:19:31 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:39.634 20:19:31 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:39.634 20:19:31 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:39.634 20:19:31 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:39.634 20:19:31 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:39.634 20:19:31 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:39.634 20:19:31 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:39.634 20:19:31 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n '' ]] 00:05:39.634 20:19:31 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # return 0 00:05:39.634 20:19:31 setup.sh.devices.nvme_mount -- setup/devices.sh@128 -- # cleanup_nvme 00:05:39.634 20:19:31 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:39.634 20:19:31 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:39.634 20:19:31 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:05:39.634 
20:19:31 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:05:39.634 /dev/nvme0n1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:05:39.634 00:05:39.634 real 0m13.938s 00:05:39.634 user 0m4.231s 00:05:39.634 sys 0m7.713s 00:05:39.634 20:19:31 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:39.634 20:19:31 setup.sh.devices.nvme_mount -- common/autotest_common.sh@10 -- # set +x 00:05:39.634 ************************************ 00:05:39.634 END TEST nvme_mount 00:05:39.634 ************************************ 00:05:39.634 20:19:31 setup.sh.devices -- common/autotest_common.sh@1142 -- # return 0 00:05:39.634 20:19:31 setup.sh.devices -- setup/devices.sh@214 -- # run_test dm_mount dm_mount 00:05:39.634 20:19:31 setup.sh.devices -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:39.634 20:19:31 setup.sh.devices -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:39.634 20:19:31 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:05:39.634 ************************************ 00:05:39.634 START TEST dm_mount 00:05:39.634 ************************************ 00:05:39.634 20:19:31 setup.sh.devices.dm_mount -- common/autotest_common.sh@1123 -- # dm_mount 00:05:39.634 20:19:31 setup.sh.devices.dm_mount -- setup/devices.sh@144 -- # pv=nvme0n1 00:05:39.634 20:19:31 setup.sh.devices.dm_mount -- setup/devices.sh@145 -- # pv0=nvme0n1p1 00:05:39.634 20:19:31 setup.sh.devices.dm_mount -- setup/devices.sh@146 -- # pv1=nvme0n1p2 00:05:39.634 20:19:31 setup.sh.devices.dm_mount -- setup/devices.sh@148 -- # partition_drive nvme0n1 00:05:39.634 20:19:31 setup.sh.devices.dm_mount -- setup/common.sh@39 -- # local disk=nvme0n1 00:05:39.634 20:19:31 setup.sh.devices.dm_mount -- setup/common.sh@40 -- # local part_no=2 00:05:39.634 20:19:31 setup.sh.devices.dm_mount -- setup/common.sh@41 -- # local size=1073741824 00:05:39.634 20:19:31 setup.sh.devices.dm_mount -- 
setup/common.sh@43 -- # local part part_start=0 part_end=0 00:05:39.634 20:19:31 setup.sh.devices.dm_mount -- setup/common.sh@44 -- # parts=() 00:05:39.634 20:19:31 setup.sh.devices.dm_mount -- setup/common.sh@44 -- # local parts 00:05:39.634 20:19:31 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part = 1 )) 00:05:39.634 20:19:31 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:39.634 20:19:31 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:05:39.634 20:19:31 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ )) 00:05:39.634 20:19:31 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:39.634 20:19:31 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:05:39.634 20:19:31 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ )) 00:05:39.634 20:19:31 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:39.634 20:19:31 setup.sh.devices.dm_mount -- setup/common.sh@51 -- # (( size /= 512 )) 00:05:39.634 20:19:31 setup.sh.devices.dm_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:05:39.634 20:19:31 setup.sh.devices.dm_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 nvme0n1p2 00:05:40.573 Creating new GPT entries in memory. 00:05:40.573 GPT data structures destroyed! You may now partition the disk using fdisk or 00:05:40.573 other utilities. 00:05:40.573 20:19:32 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part = 1 )) 00:05:40.573 20:19:32 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:40.573 20:19:32 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 
2048 : part_end + 1 )) 00:05:40.573 20:19:32 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:05:40.573 20:19:32 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:05:41.509 Creating new GPT entries in memory. 00:05:41.509 The operation has completed successfully. 00:05:41.509 20:19:33 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ )) 00:05:41.509 20:19:33 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:41.509 20:19:33 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:05:41.509 20:19:33 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:05:41.509 20:19:33 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=2:2099200:4196351 00:05:42.449 The operation has completed successfully. 00:05:42.449 20:19:34 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ )) 00:05:42.449 20:19:34 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:42.449 20:19:34 setup.sh.devices.dm_mount -- setup/common.sh@62 -- # wait 1294676 00:05:42.449 20:19:34 setup.sh.devices.dm_mount -- setup/devices.sh@150 -- # dm_name=nvme_dm_test 00:05:42.449 20:19:34 setup.sh.devices.dm_mount -- setup/devices.sh@151 -- # dm_mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:42.449 20:19:34 setup.sh.devices.dm_mount -- setup/devices.sh@152 -- # dm_dummy_test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:42.449 20:19:34 setup.sh.devices.dm_mount -- setup/devices.sh@155 -- # dmsetup create nvme_dm_test 00:05:42.449 20:19:34 setup.sh.devices.dm_mount -- setup/devices.sh@160 -- # for t in {1..5} 00:05:42.449 20:19:34 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # [[ -e 
/dev/mapper/nvme_dm_test ]] 00:05:42.449 20:19:34 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # break 00:05:42.449 20:19:34 setup.sh.devices.dm_mount -- setup/devices.sh@164 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:05:42.449 20:19:34 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # readlink -f /dev/mapper/nvme_dm_test 00:05:42.449 20:19:34 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # dm=/dev/dm-0 00:05:42.449 20:19:34 setup.sh.devices.dm_mount -- setup/devices.sh@166 -- # dm=dm-0 00:05:42.449 20:19:34 setup.sh.devices.dm_mount -- setup/devices.sh@168 -- # [[ -e /sys/class/block/nvme0n1p1/holders/dm-0 ]] 00:05:42.449 20:19:34 setup.sh.devices.dm_mount -- setup/devices.sh@169 -- # [[ -e /sys/class/block/nvme0n1p2/holders/dm-0 ]] 00:05:42.449 20:19:34 setup.sh.devices.dm_mount -- setup/devices.sh@171 -- # mkfs /dev/mapper/nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:42.449 20:19:34 setup.sh.devices.dm_mount -- setup/common.sh@66 -- # local dev=/dev/mapper/nvme_dm_test mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount size= 00:05:42.449 20:19:34 setup.sh.devices.dm_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:42.449 20:19:34 setup.sh.devices.dm_mount -- setup/common.sh@70 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:05:42.449 20:19:34 setup.sh.devices.dm_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/mapper/nvme_dm_test 00:05:42.449 20:19:34 setup.sh.devices.dm_mount -- setup/common.sh@72 -- # mount /dev/mapper/nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:42.449 20:19:34 setup.sh.devices.dm_mount -- setup/devices.sh@174 -- # verify 0000:5e:00.0 nvme0n1:nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:42.449 20:19:34 setup.sh.devices.dm_mount -- 
setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:05:42.449 20:19:34 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme_dm_test 00:05:42.449 20:19:34 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:42.449 20:19:34 setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:42.449 20:19:34 setup.sh.devices.dm_mount -- setup/devices.sh@53 -- # local found=0 00:05:42.449 20:19:34 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:05:42.449 20:19:34 setup.sh.devices.dm_mount -- setup/devices.sh@56 -- # : 00:05:42.449 20:19:34 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status 00:05:42.449 20:19:34 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:42.449 20:19:34 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:05:42.449 20:19:34 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config 00:05:42.449 20:19:34 setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:42.449 20:19:34 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:05:46.640 20:19:38 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:46.640 20:19:38 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0,mount@nvme0n1:nvme_dm_test, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\_\d\m\_\t\e\s\t* ]] 00:05:46.640 20:19:38 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1 00:05:46.640 20:19:38 setup.sh.devices.dm_mount -- setup/devices.sh@60 
-- # read -r pci _ _ status 00:05:46.640 20:19:38 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:46.640 20:19:38 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:46.640 20:19:38 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:46.640 20:19:38 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:46.640 20:19:38 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:46.640 20:19:38 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:46.640 20:19:38 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:46.640 20:19:38 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:46.640 20:19:38 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:46.640 20:19:38 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:46.640 20:19:38 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:46.640 20:19:38 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:46.640 20:19:38 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:46.640 20:19:38 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:46.640 20:19:38 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:46.640 20:19:38 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:46.640 20:19:38 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:46.640 20:19:38 setup.sh.devices.dm_mount -- setup/devices.sh@60 
-- # read -r pci _ _ status 00:05:46.640 20:19:38 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:46.640 20:19:38 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:46.640 20:19:38 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:46.640 20:19:38 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:46.640 20:19:38 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:46.640 20:19:38 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:46.640 20:19:38 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:46.640 20:19:38 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:46.640 20:19:38 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:46.640 20:19:38 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:46.640 20:19:38 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:46.640 20:19:38 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:46.640 20:19:38 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:46.640 20:19:38 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:46.640 20:19:38 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:46.640 20:19:38 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:46.640 20:19:38 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:46.640 20:19:38 setup.sh.devices.dm_mount -- setup/devices.sh@60 
-- # read -r pci _ _ status 00:05:46.640 20:19:38 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:46.640 20:19:38 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:46.640 20:19:38 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:46.640 20:19:38 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:46.640 20:19:38 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:46.640 20:19:38 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount ]] 00:05:46.640 20:19:38 setup.sh.devices.dm_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:46.640 20:19:38 setup.sh.devices.dm_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:05:46.640 20:19:38 setup.sh.devices.dm_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:46.640 20:19:38 setup.sh.devices.dm_mount -- setup/devices.sh@182 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:46.640 20:19:38 setup.sh.devices.dm_mount -- setup/devices.sh@184 -- # verify 0000:5e:00.0 holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 '' '' 00:05:46.640 20:19:38 setup.sh.devices.dm_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:05:46.640 20:19:38 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local mounts=holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 00:05:46.640 20:19:38 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point= 00:05:46.640 20:19:38 setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local test_file= 00:05:46.640 20:19:38 setup.sh.devices.dm_mount -- setup/devices.sh@53 -- # local 
found=0 00:05:46.640 20:19:38 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n '' ]] 00:05:46.640 20:19:38 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status 00:05:46.640 20:19:38 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:05:46.640 20:19:38 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:46.640 20:19:38 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config 00:05:46.640 20:19:38 setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:46.640 20:19:38 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:05:49.931 20:19:42 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:49.931 20:19:42 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\1\:\d\m\-\0\,\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\2\:\d\m\-\0* ]] 00:05:49.931 20:19:42 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1 00:05:49.931 20:19:42 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:49.931 20:19:42 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:49.931 20:19:42 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:49.931 20:19:42 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:49.931 20:19:42 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:49.931 20:19:42 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:49.931 20:19:42 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 
00:05:49.931 20:19:42 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:49.931 20:19:42 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:49.931 20:19:42 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:49.931 20:19:42 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:49.931 20:19:42 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:49.931 20:19:42 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:49.931 20:19:42 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:49.931 20:19:42 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:49.931 20:19:42 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:49.931 20:19:42 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:49.931 20:19:42 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:49.931 20:19:42 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:49.931 20:19:42 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:49.931 20:19:42 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:49.931 20:19:42 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:49.931 20:19:42 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:49.931 20:19:42 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:49.931 20:19:42 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 
00:05:49.931 20:19:42 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:49.931 20:19:42 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:49.931 20:19:42 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:49.931 20:19:42 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:49.931 20:19:42 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:49.931 20:19:42 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:49.931 20:19:42 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:49.931 20:19:42 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:49.931 20:19:42 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:49.931 20:19:42 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:49.931 20:19:42 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:49.931 20:19:42 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:49.931 20:19:42 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:49.931 20:19:42 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:49.931 20:19:42 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:49.931 20:19:42 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:49.931 20:19:42 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:49.931 20:19:42 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n '' ]] 00:05:49.931 20:19:42 
setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # return 0 00:05:49.931 20:19:42 setup.sh.devices.dm_mount -- setup/devices.sh@187 -- # cleanup_dm 00:05:49.931 20:19:42 setup.sh.devices.dm_mount -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:49.931 20:19:42 setup.sh.devices.dm_mount -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:05:49.931 20:19:42 setup.sh.devices.dm_mount -- setup/devices.sh@37 -- # dmsetup remove --force nvme_dm_test 00:05:50.191 20:19:42 setup.sh.devices.dm_mount -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:50.191 20:19:42 setup.sh.devices.dm_mount -- setup/devices.sh@40 -- # wipefs --all /dev/nvme0n1p1 00:05:50.191 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:05:50.191 20:19:42 setup.sh.devices.dm_mount -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:05:50.191 20:19:42 setup.sh.devices.dm_mount -- setup/devices.sh@43 -- # wipefs --all /dev/nvme0n1p2 00:05:50.191 00:05:50.191 real 0m10.766s 00:05:50.191 user 0m2.785s 00:05:50.191 sys 0m5.114s 00:05:50.191 20:19:42 setup.sh.devices.dm_mount -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:50.191 20:19:42 setup.sh.devices.dm_mount -- common/autotest_common.sh@10 -- # set +x 00:05:50.191 ************************************ 00:05:50.191 END TEST dm_mount 00:05:50.191 ************************************ 00:05:50.191 20:19:42 setup.sh.devices -- common/autotest_common.sh@1142 -- # return 0 00:05:50.191 20:19:42 setup.sh.devices -- setup/devices.sh@1 -- # cleanup 00:05:50.191 20:19:42 setup.sh.devices -- setup/devices.sh@11 -- # cleanup_nvme 00:05:50.191 20:19:42 setup.sh.devices -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:50.191 20:19:42 setup.sh.devices -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:50.191 20:19:42 setup.sh.devices -- setup/devices.sh@25 -- # wipefs 
--all /dev/nvme0n1p1 00:05:50.191 20:19:42 setup.sh.devices -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:05:50.191 20:19:42 setup.sh.devices -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:05:50.451 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:05:50.451 /dev/nvme0n1: 8 bytes were erased at offset 0x6fc7d255e00 (gpt): 45 46 49 20 50 41 52 54 00:05:50.451 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:05:50.451 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:05:50.451 20:19:42 setup.sh.devices -- setup/devices.sh@12 -- # cleanup_dm 00:05:50.451 20:19:42 setup.sh.devices -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:50.451 20:19:42 setup.sh.devices -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:05:50.451 20:19:42 setup.sh.devices -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:50.451 20:19:42 setup.sh.devices -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:05:50.451 20:19:42 setup.sh.devices -- setup/devices.sh@14 -- # [[ -b /dev/nvme0n1 ]] 00:05:50.451 20:19:42 setup.sh.devices -- setup/devices.sh@15 -- # wipefs --all /dev/nvme0n1 00:05:50.451 00:05:50.451 real 0m29.541s 00:05:50.451 user 0m8.593s 00:05:50.451 sys 0m16.024s 00:05:50.451 20:19:42 setup.sh.devices -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:50.451 20:19:42 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:05:50.451 ************************************ 00:05:50.451 END TEST devices 00:05:50.451 ************************************ 00:05:50.451 20:19:42 setup.sh -- common/autotest_common.sh@1142 -- # return 0 00:05:50.451 00:05:50.451 real 1m45.943s 00:05:50.451 user 0m33.211s 00:05:50.451 sys 0m59.600s 00:05:50.451 20:19:42 setup.sh -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:50.451 20:19:42 setup.sh -- common/autotest_common.sh@10 -- # set +x 
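The cleanup path just executed (`cleanup_dm` then `cleanup_nvme`) follows a strict order: unmount the test mount point, remove the device-mapper node, then wipe the partition and whole-disk signatures. The sketch below reproduces that order under stated assumptions — the mount point, dm name, and disk are placeholders taken from this run, and `DRY_RUN` defaults to printing the commands so nothing is actually touched.

```shell
#!/usr/bin/env bash
# Sketch of the teardown order visible in the log above: unmount first,
# then remove the device-mapper node, then wipe signatures from the
# partitions and finally the whole disk. With DRY_RUN=1 (the default)
# the commands are only printed, never executed.
set -euo pipefail

cleanup_dm() {
    local mount_point=$1 dm_name=$2 disk=$3
    local run=(echo)                     # dry-run: print instead of execute
    if [[ ${DRY_RUN:-1} -eq 0 ]]; then
        run=(sudo)
    fi

    "${run[@]}" umount "$mount_point"
    "${run[@]}" dmsetup remove --force "$dm_name"
    "${run[@]}" wipefs --all "${disk}p1" "${disk}p2"
    "${run[@]}" wipefs --all "$disk"
}

cleanup_dm /tmp/dm_mount nvme_dm_test /dev/nvme0n1
```

Wiping the partitions before the parent disk matters: once the GPT on the whole disk is erased, the partition device nodes may disappear and their stale filesystem signatures would survive.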
00:05:50.451 ************************************ 00:05:50.451 END TEST setup.sh 00:05:50.451 ************************************ 00:05:50.451 20:19:42 -- common/autotest_common.sh@1142 -- # return 0 00:05:50.451 20:19:42 -- spdk/autotest.sh@128 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status 00:05:54.645 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:05:54.645 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:05:54.645 Hugepages 00:05:54.645 node hugesize free / total 00:05:54.645 node0 1048576kB 0 / 0 00:05:54.645 node0 2048kB 1024 / 1024 00:05:54.645 node1 1048576kB 0 / 0 00:05:54.645 node1 2048kB 1024 / 1024 00:05:54.645 00:05:54.645 Type BDF Vendor Device NUMA Driver Device Block devices 00:05:54.645 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:05:54.645 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:05:54.645 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:05:54.645 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:05:54.645 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:05:54.645 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:05:54.645 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:05:54.645 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:05:54.645 NVMe 0000:5e:00.0 8086 0b60 0 nvme nvme0 nvme0n1 00:05:54.645 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:05:54.645 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:05:54.645 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:05:54.645 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:05:54.645 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:05:54.645 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:05:54.645 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:05:54.645 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:05:54.645 VMD 0000:85:05.5 8086 201d 1 vfio-pci - - 00:05:54.645 VMD 0000:d7:05.5 8086 201d 1 vfio-pci - - 00:05:54.645 20:19:46 -- spdk/autotest.sh@130 -- # uname -s 00:05:54.645 20:19:46 -- spdk/autotest.sh@130 -- # [[ Linux == Linux ]] 
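The repeated `[[ 0000:xx:xx.x == \0\0\0\0\:\5\e\:\0\0\.\0 ]]` lines throughout the run are the `verify()` helper walking a `setup.sh status`-style table and comparing every BDF against `PCI_ALLOWED`. A minimal standalone sketch of that scan, using a trimmed sample of this run's device table rather than live `setup.sh` output:

```shell
#!/usr/bin/env bash
# Sketch of the device scan performed by the verify loops above: read a
# status table line by line, pick out the PCI address column, and set
# found=1 when the allowed device appears. The table is a trimmed
# sample from this run, not queried from the machine.
set -euo pipefail

PCI_ALLOWED="0000:5e:00.0"
found=0

status_table='NVMe 0000:5e:00.0 8086 0b60 0 nvme nvme0 nvme0n1
I/OAT 0000:00:04.0 8086 2021 0 ioatdma - -
I/OAT 0000:80:04.7 8086 2021 1 ioatdma - -'

# Field 1 is the device type, field 2 the BDF; the rest is ignored here.
while read -r _type pci _rest; do
    if [[ $pci == "$PCI_ALLOWED" ]]; then
        found=1
    fi
done <<< "$status_table"

if (( found == 1 )); then
    echo "allowed device $PCI_ALLOWED present"
fi
```

In the real helper the comparison target is emitted by xtrace with every character backslash-escaped (`\0\0\0\0\:\5\e\:\0\0\.\0`), which is just how bash prints a glob-quoted right-hand side of `==`; the match itself is a literal string comparison as above.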
00:05:54.645 20:19:46 -- spdk/autotest.sh@132 -- # nvme_namespace_revert 00:05:54.645 20:19:46 -- common/autotest_common.sh@1531 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:05:57.980 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:05:57.980 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:05:57.980 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:57.980 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:58.239 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:58.239 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:58.239 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:58.239 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:58.239 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:58.239 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:58.239 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:58.239 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:58.239 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:58.239 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:58.498 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:58.498 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:58.498 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:58.498 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:06:01.036 0000:5e:00.0 (8086 0b60): nvme -> vfio-pci 00:06:01.036 20:19:53 -- common/autotest_common.sh@1532 -- # sleep 1 00:06:01.969 20:19:54 -- common/autotest_common.sh@1533 -- # bdfs=() 00:06:01.969 20:19:54 -- common/autotest_common.sh@1533 -- # local bdfs 00:06:01.969 20:19:54 -- common/autotest_common.sh@1534 -- # bdfs=($(get_nvme_bdfs)) 00:06:01.969 20:19:54 -- common/autotest_common.sh@1534 -- # get_nvme_bdfs 00:06:01.969 20:19:54 -- common/autotest_common.sh@1513 -- # bdfs=() 00:06:01.969 20:19:54 -- common/autotest_common.sh@1513 -- # local bdfs 00:06:01.969 20:19:54 -- common/autotest_common.sh@1514 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r 
'.config[].params.traddr')) 00:06:01.969 20:19:54 -- common/autotest_common.sh@1514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:06:01.969 20:19:54 -- common/autotest_common.sh@1514 -- # jq -r '.config[].params.traddr' 00:06:01.969 20:19:54 -- common/autotest_common.sh@1515 -- # (( 1 == 0 )) 00:06:01.969 20:19:54 -- common/autotest_common.sh@1519 -- # printf '%s\n' 0000:5e:00.0 00:06:01.969 20:19:54 -- common/autotest_common.sh@1536 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:06:06.158 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:06:06.158 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:06:06.158 Waiting for block devices as requested 00:06:06.158 0000:5e:00.0 (8086 0b60): vfio-pci -> nvme 00:06:06.158 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:06:06.158 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:06:06.158 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:06:06.158 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:06:06.158 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:06:06.418 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:06:06.418 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:06:06.418 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:06:06.679 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:06:06.679 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:06:06.679 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:06:06.938 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:06:06.938 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:06:06.938 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:06:07.198 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:06:07.198 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:06:07.198 20:19:59 -- common/autotest_common.sh@1538 -- # for bdf in "${bdfs[@]}" 00:06:07.198 20:19:59 -- common/autotest_common.sh@1539 -- # get_nvme_ctrlr_from_bdf 0000:5e:00.0 00:06:07.198 20:19:59 -- 
common/autotest_common.sh@1502 -- # readlink -f /sys/class/nvme/nvme0 00:06:07.198 20:19:59 -- common/autotest_common.sh@1502 -- # grep 0000:5e:00.0/nvme/nvme 00:06:07.198 20:19:59 -- common/autotest_common.sh@1502 -- # bdf_sysfs_path=/sys/devices/pci0000:5d/0000:5d:02.0/0000:5e:00.0/nvme/nvme0 00:06:07.198 20:19:59 -- common/autotest_common.sh@1503 -- # [[ -z /sys/devices/pci0000:5d/0000:5d:02.0/0000:5e:00.0/nvme/nvme0 ]] 00:06:07.198 20:19:59 -- common/autotest_common.sh@1507 -- # basename /sys/devices/pci0000:5d/0000:5d:02.0/0000:5e:00.0/nvme/nvme0 00:06:07.198 20:19:59 -- common/autotest_common.sh@1507 -- # printf '%s\n' nvme0 00:06:07.198 20:19:59 -- common/autotest_common.sh@1539 -- # nvme_ctrlr=/dev/nvme0 00:06:07.198 20:19:59 -- common/autotest_common.sh@1540 -- # [[ -z /dev/nvme0 ]] 00:06:07.198 20:19:59 -- common/autotest_common.sh@1545 -- # nvme id-ctrl /dev/nvme0 00:06:07.198 20:19:59 -- common/autotest_common.sh@1545 -- # grep oacs 00:06:07.198 20:19:59 -- common/autotest_common.sh@1545 -- # cut -d: -f2 00:06:07.198 20:19:59 -- common/autotest_common.sh@1545 -- # oacs=' 0x3f' 00:06:07.198 20:19:59 -- common/autotest_common.sh@1546 -- # oacs_ns_manage=8 00:06:07.198 20:19:59 -- common/autotest_common.sh@1548 -- # [[ 8 -ne 0 ]] 00:06:07.198 20:19:59 -- common/autotest_common.sh@1554 -- # nvme id-ctrl /dev/nvme0 00:06:07.198 20:19:59 -- common/autotest_common.sh@1554 -- # grep unvmcap 00:06:07.198 20:19:59 -- common/autotest_common.sh@1554 -- # cut -d: -f2 00:06:07.198 20:19:59 -- common/autotest_common.sh@1554 -- # unvmcap=' 0' 00:06:07.198 20:19:59 -- common/autotest_common.sh@1555 -- # [[ 0 -eq 0 ]] 00:06:07.198 20:19:59 -- common/autotest_common.sh@1557 -- # continue 00:06:07.198 20:19:59 -- spdk/autotest.sh@135 -- # timing_exit pre_cleanup 00:06:07.198 20:19:59 -- common/autotest_common.sh@728 -- # xtrace_disable 00:06:07.198 20:19:59 -- common/autotest_common.sh@10 -- # set +x 00:06:07.457 20:19:59 -- spdk/autotest.sh@138 -- # timing_enter afterboot 
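The `grep oacs | cut -d: -f2` pipeline in the trace above extracts the controller's Optional Admin Command Support word from `nvme id-ctrl` output and then tests the namespace-management bit. A self-contained sketch, with the `id-ctrl` line reproduced from this run (`oacs : 0x3f`) instead of being queried from hardware:

```shell
#!/usr/bin/env bash
# Sketch of the OACS check near the end of the log: pull the oacs field
# out of (sample) `nvme id-ctrl` output, then mask bit 3 (0x8), which
# advertises namespace management support.
set -euo pipefail

id_ctrl_output='oacs      : 0x3f'

oacs=$(grep oacs <<< "$id_ctrl_output" | cut -d: -f2)   # yields " 0x3f"
oacs_ns_manage=$(( oacs & 0x8 ))                        # isolate bit 3

if [[ $oacs_ns_manage -ne 0 ]]; then
    echo "controller supports namespace management"
fi
```

This matches the trace, where `oacs=' 0x3f'` yields `oacs_ns_manage=8` and the `[[ 8 -ne 0 ]]` branch is taken; bash arithmetic expansion tolerates the leading space and parses the `0x` prefix as hex.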
00:06:07.457 20:19:59 -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:07.457 20:19:59 -- common/autotest_common.sh@10 -- # set +x 00:06:07.458 20:19:59 -- spdk/autotest.sh@139 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:06:10.752 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:06:10.752 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:06:10.752 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:06:11.011 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:06:11.011 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:06:11.011 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:06:11.011 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:06:11.011 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:06:11.011 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:06:11.011 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:06:11.011 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:06:11.011 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:06:11.011 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:06:11.270 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:06:11.270 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:06:11.270 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:06:11.270 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:06:11.270 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:06:13.806 0000:5e:00.0 (8086 0b60): nvme -> vfio-pci 00:06:13.806 20:20:05 -- spdk/autotest.sh@140 -- # timing_exit afterboot 00:06:13.806 20:20:05 -- common/autotest_common.sh@728 -- # xtrace_disable 00:06:13.806 20:20:05 -- common/autotest_common.sh@10 -- # set +x 00:06:13.806 20:20:05 -- spdk/autotest.sh@144 -- # opal_revert_cleanup 00:06:13.806 20:20:05 -- common/autotest_common.sh@1591 -- # mapfile -t bdfs 00:06:13.806 20:20:05 -- common/autotest_common.sh@1591 -- # get_nvme_bdfs_by_id 0x0a54 00:06:13.806 20:20:05 -- common/autotest_common.sh@1577 -- # bdfs=() 00:06:13.806 20:20:05 -- 
common/autotest_common.sh@1577 -- # local bdfs 00:06:13.806 20:20:05 -- common/autotest_common.sh@1579 -- # get_nvme_bdfs 00:06:13.806 20:20:05 -- common/autotest_common.sh@1513 -- # bdfs=() 00:06:13.806 20:20:05 -- common/autotest_common.sh@1513 -- # local bdfs 00:06:13.806 20:20:05 -- common/autotest_common.sh@1514 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:06:13.806 20:20:05 -- common/autotest_common.sh@1514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:06:13.806 20:20:05 -- common/autotest_common.sh@1514 -- # jq -r '.config[].params.traddr' 00:06:13.806 20:20:05 -- common/autotest_common.sh@1515 -- # (( 1 == 0 )) 00:06:13.806 20:20:05 -- common/autotest_common.sh@1519 -- # printf '%s\n' 0000:5e:00.0 00:06:13.806 20:20:05 -- common/autotest_common.sh@1579 -- # for bdf in $(get_nvme_bdfs) 00:06:13.806 20:20:05 -- common/autotest_common.sh@1580 -- # cat /sys/bus/pci/devices/0000:5e:00.0/device 00:06:13.806 20:20:05 -- common/autotest_common.sh@1580 -- # device=0x0b60 00:06:13.806 20:20:05 -- common/autotest_common.sh@1581 -- # [[ 0x0b60 == \0\x\0\a\5\4 ]] 00:06:13.806 20:20:05 -- common/autotest_common.sh@1586 -- # printf '%s\n' 00:06:13.806 20:20:05 -- common/autotest_common.sh@1592 -- # [[ -z '' ]] 00:06:13.806 20:20:05 -- common/autotest_common.sh@1593 -- # return 0 00:06:13.806 20:20:05 -- spdk/autotest.sh@150 -- # '[' 0 -eq 1 ']' 00:06:13.806 20:20:05 -- spdk/autotest.sh@154 -- # '[' 1 -eq 1 ']' 00:06:13.806 20:20:05 -- spdk/autotest.sh@155 -- # [[ 1 -eq 1 ]] 00:06:13.806 20:20:05 -- spdk/autotest.sh@156 -- # [[ 0 -eq 1 ]] 00:06:13.806 20:20:05 -- spdk/autotest.sh@159 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/qat_setup.sh 00:06:14.375 Restarting all devices. 
00:06:18.599 lstat() error: No such file or directory 00:06:18.599 QAT Error: No GENERAL section found 00:06:18.599 Failed to configure qat_dev0 00:06:18.599 lstat() error: No such file or directory 00:06:18.599 QAT Error: No GENERAL section found 00:06:18.599 Failed to configure qat_dev1 00:06:18.599 lstat() error: No such file or directory 00:06:18.599 QAT Error: No GENERAL section found 00:06:18.599 Failed to configure qat_dev2 00:06:18.599 enable sriov 00:06:18.599 Checking status of all devices. 00:06:18.599 There is 3 QAT acceleration device(s) in the system: 00:06:18.599 qat_dev0 - type: c6xx, inst_id: 0, node_id: 0, bsf: 0000:3d:00.0, #accel: 5 #engines: 10 state: down 00:06:18.599 qat_dev1 - type: c6xx, inst_id: 1, node_id: 0, bsf: 0000:3f:00.0, #accel: 5 #engines: 10 state: down 00:06:18.599 qat_dev2 - type: c6xx, inst_id: 2, node_id: 1, bsf: 0000:da:00.0, #accel: 5 #engines: 10 state: down 00:06:19.551 0000:3d:00.0 set to 16 VFs 00:06:20.930 0000:3f:00.0 set to 16 VFs 00:06:23.462 0000:da:00.0 set to 16 VFs 00:06:26.741 Properly configured the qat device with driver uio_pci_generic. 
00:06:26.741 20:20:18 -- spdk/autotest.sh@162 -- # timing_enter lib 00:06:26.741 20:20:18 -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:26.741 20:20:18 -- common/autotest_common.sh@10 -- # set +x 00:06:26.741 20:20:18 -- spdk/autotest.sh@164 -- # [[ 0 -eq 1 ]] 00:06:26.741 20:20:18 -- spdk/autotest.sh@168 -- # run_test env /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env.sh 00:06:26.741 20:20:18 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:26.741 20:20:18 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:26.741 20:20:18 -- common/autotest_common.sh@10 -- # set +x 00:06:26.741 ************************************ 00:06:26.741 START TEST env 00:06:26.741 ************************************ 00:06:26.741 20:20:18 env -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env.sh 00:06:26.741 * Looking for test storage... 00:06:26.741 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env 00:06:26.741 20:20:18 env -- env/env.sh@10 -- # run_test env_memory /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/memory/memory_ut 00:06:26.741 20:20:18 env -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:26.741 20:20:18 env -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:26.741 20:20:18 env -- common/autotest_common.sh@10 -- # set +x 00:06:26.741 ************************************ 00:06:26.741 START TEST env_memory 00:06:26.741 ************************************ 00:06:26.741 20:20:18 env.env_memory -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/memory/memory_ut 00:06:26.741 00:06:26.741 00:06:26.741 CUnit - A unit testing framework for C - Version 2.1-3 00:06:26.741 http://cunit.sourceforge.net/ 00:06:26.741 00:06:26.741 00:06:26.741 Suite: memory 00:06:26.741 Test: alloc and free memory map ...[2024-07-15 20:20:18.798802] 
/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 283:spdk_mem_map_alloc: *ERROR*: Initial mem_map notify failed 00:06:26.741 passed 00:06:26.741 Test: mem map translation ...[2024-07-15 20:20:18.835931] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 590:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=2097152 len=1234 00:06:26.741 [2024-07-15 20:20:18.835959] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 590:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=1234 len=2097152 00:06:26.742 [2024-07-15 20:20:18.836030] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 584:spdk_mem_map_set_translation: *ERROR*: invalid usermode virtual address 281474976710656 00:06:26.742 [2024-07-15 20:20:18.836047] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 600:spdk_mem_map_set_translation: *ERROR*: could not get 0xffffffe00000 map 00:06:26.742 passed 00:06:26.742 Test: mem map registration ...[2024-07-15 20:20:18.909832] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 346:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x200000 len=1234 00:06:26.742 [2024-07-15 20:20:18.909862] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 346:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x4d2 len=2097152 00:06:26.742 passed 00:06:26.742 Test: mem map adjacent registrations ...passed 00:06:26.742 00:06:26.742 Run Summary: Type Total Ran Passed Failed Inactive 00:06:26.742 suites 1 1 n/a 0 0 00:06:26.742 tests 4 4 4 0 0 00:06:26.742 asserts 152 152 152 0 n/a 00:06:26.742 00:06:26.742 Elapsed time = 0.248 seconds 00:06:26.742 00:06:26.742 real 0m0.259s 00:06:26.742 user 0m0.248s 00:06:26.742 sys 0m0.010s 00:06:26.742 20:20:19 env.env_memory -- common/autotest_common.sh@1124 -- # xtrace_disable 
00:06:26.742 20:20:19 env.env_memory -- common/autotest_common.sh@10 -- # set +x 00:06:26.742 ************************************ 00:06:26.742 END TEST env_memory 00:06:26.742 ************************************ 00:06:26.742 20:20:19 env -- common/autotest_common.sh@1142 -- # return 0 00:06:26.742 20:20:19 env -- env/env.sh@11 -- # run_test env_vtophys /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/vtophys/vtophys 00:06:26.742 20:20:19 env -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:26.742 20:20:19 env -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:26.742 20:20:19 env -- common/autotest_common.sh@10 -- # set +x 00:06:26.742 ************************************ 00:06:26.742 START TEST env_vtophys 00:06:26.742 ************************************ 00:06:26.742 20:20:19 env.env_vtophys -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/vtophys/vtophys 00:06:26.742 EAL: lib.eal log level changed from notice to debug 00:06:26.742 EAL: Detected lcore 0 as core 0 on socket 0 00:06:26.742 EAL: Detected lcore 1 as core 1 on socket 0 00:06:26.742 EAL: Detected lcore 2 as core 2 on socket 0 00:06:26.742 EAL: Detected lcore 3 as core 3 on socket 0 00:06:26.742 EAL: Detected lcore 4 as core 4 on socket 0 00:06:26.742 EAL: Detected lcore 5 as core 8 on socket 0 00:06:26.742 EAL: Detected lcore 6 as core 9 on socket 0 00:06:26.742 EAL: Detected lcore 7 as core 10 on socket 0 00:06:26.742 EAL: Detected lcore 8 as core 11 on socket 0 00:06:26.742 EAL: Detected lcore 9 as core 16 on socket 0 00:06:26.742 EAL: Detected lcore 10 as core 17 on socket 0 00:06:26.742 EAL: Detected lcore 11 as core 18 on socket 0 00:06:26.742 EAL: Detected lcore 12 as core 19 on socket 0 00:06:26.742 EAL: Detected lcore 13 as core 20 on socket 0 00:06:26.742 EAL: Detected lcore 14 as core 24 on socket 0 00:06:26.742 EAL: Detected lcore 15 as core 25 on socket 0 00:06:26.742 EAL: Detected lcore 16 as core 26 on socket 0 
00:06:26.742 EAL: Detected lcore 17 as core 27 on socket 0 00:06:26.742 EAL: Detected lcore 18 as core 0 on socket 1 00:06:26.742 EAL: Detected lcore 19 as core 1 on socket 1 00:06:26.742 EAL: Detected lcore 20 as core 2 on socket 1 00:06:26.742 EAL: Detected lcore 21 as core 3 on socket 1 00:06:26.742 EAL: Detected lcore 22 as core 4 on socket 1 00:06:26.742 EAL: Detected lcore 23 as core 8 on socket 1 00:06:26.742 EAL: Detected lcore 24 as core 9 on socket 1 00:06:26.742 EAL: Detected lcore 25 as core 10 on socket 1 00:06:26.742 EAL: Detected lcore 26 as core 11 on socket 1 00:06:26.742 EAL: Detected lcore 27 as core 16 on socket 1 00:06:26.742 EAL: Detected lcore 28 as core 17 on socket 1 00:06:26.742 EAL: Detected lcore 29 as core 18 on socket 1 00:06:26.742 EAL: Detected lcore 30 as core 19 on socket 1 00:06:26.742 EAL: Detected lcore 31 as core 20 on socket 1 00:06:26.742 EAL: Detected lcore 32 as core 24 on socket 1 00:06:26.742 EAL: Detected lcore 33 as core 25 on socket 1 00:06:26.742 EAL: Detected lcore 34 as core 26 on socket 1 00:06:26.742 EAL: Detected lcore 35 as core 27 on socket 1 00:06:26.742 EAL: Detected lcore 36 as core 0 on socket 0 00:06:26.742 EAL: Detected lcore 37 as core 1 on socket 0 00:06:26.742 EAL: Detected lcore 38 as core 2 on socket 0 00:06:26.742 EAL: Detected lcore 39 as core 3 on socket 0 00:06:26.742 EAL: Detected lcore 40 as core 4 on socket 0 00:06:26.742 EAL: Detected lcore 41 as core 8 on socket 0 00:06:26.742 EAL: Detected lcore 42 as core 9 on socket 0 00:06:26.742 EAL: Detected lcore 43 as core 10 on socket 0 00:06:26.742 EAL: Detected lcore 44 as core 11 on socket 0 00:06:26.742 EAL: Detected lcore 45 as core 16 on socket 0 00:06:26.742 EAL: Detected lcore 46 as core 17 on socket 0 00:06:26.742 EAL: Detected lcore 47 as core 18 on socket 0 00:06:26.742 EAL: Detected lcore 48 as core 19 on socket 0 00:06:26.742 EAL: Detected lcore 49 as core 20 on socket 0 00:06:26.742 EAL: Detected lcore 50 as core 24 on socket 0 
00:06:26.742 EAL: Detected lcore 51 as core 25 on socket 0 00:06:26.742 EAL: Detected lcore 52 as core 26 on socket 0 00:06:26.742 EAL: Detected lcore 53 as core 27 on socket 0 00:06:26.742 EAL: Detected lcore 54 as core 0 on socket 1 00:06:26.742 EAL: Detected lcore 55 as core 1 on socket 1 00:06:26.742 EAL: Detected lcore 56 as core 2 on socket 1 00:06:26.742 EAL: Detected lcore 57 as core 3 on socket 1 00:06:26.742 EAL: Detected lcore 58 as core 4 on socket 1 00:06:26.742 EAL: Detected lcore 59 as core 8 on socket 1 00:06:26.742 EAL: Detected lcore 60 as core 9 on socket 1 00:06:26.742 EAL: Detected lcore 61 as core 10 on socket 1 00:06:26.742 EAL: Detected lcore 62 as core 11 on socket 1 00:06:26.742 EAL: Detected lcore 63 as core 16 on socket 1 00:06:27.003 EAL: Detected lcore 64 as core 17 on socket 1 00:06:27.003 EAL: Detected lcore 65 as core 18 on socket 1 00:06:27.003 EAL: Detected lcore 66 as core 19 on socket 1 00:06:27.003 EAL: Detected lcore 67 as core 20 on socket 1 00:06:27.003 EAL: Detected lcore 68 as core 24 on socket 1 00:06:27.003 EAL: Detected lcore 69 as core 25 on socket 1 00:06:27.003 EAL: Detected lcore 70 as core 26 on socket 1 00:06:27.003 EAL: Detected lcore 71 as core 27 on socket 1 00:06:27.003 EAL: Maximum logical cores by configuration: 128 00:06:27.003 EAL: Detected CPU lcores: 72 00:06:27.003 EAL: Detected NUMA nodes: 2 00:06:27.003 EAL: Checking presence of .so 'librte_eal.so.24.1' 00:06:27.003 EAL: Detected shared linkage of DPDK 00:06:27.003 EAL: No shared files mode enabled, IPC will be disabled 00:06:27.003 EAL: No shared files mode enabled, IPC is disabled 00:06:27.003 EAL: PCI driver qat for device 0000:3d:01.0 wants IOVA as 'PA' 00:06:27.003 EAL: PCI driver qat for device 0000:3d:01.1 wants IOVA as 'PA' 00:06:27.003 EAL: PCI driver qat for device 0000:3d:01.2 wants IOVA as 'PA' 00:06:27.003 EAL: PCI driver qat for device 0000:3d:01.3 wants IOVA as 'PA' 00:06:27.003 EAL: PCI driver qat for device 0000:3d:01.4 wants IOVA as 
'PA' 00:06:27.003 EAL: PCI driver qat for device 0000:3d:01.5 wants IOVA as 'PA' 00:06:27.003 EAL: PCI driver qat for device 0000:3d:01.6 wants IOVA as 'PA' 00:06:27.003 EAL: PCI driver qat for device 0000:3d:01.7 wants IOVA as 'PA' 00:06:27.003 EAL: PCI driver qat for device 0000:3d:02.0 wants IOVA as 'PA' 00:06:27.003 EAL: PCI driver qat for device 0000:3d:02.1 wants IOVA as 'PA' 00:06:27.003 EAL: PCI driver qat for device 0000:3d:02.2 wants IOVA as 'PA' 00:06:27.003 EAL: PCI driver qat for device 0000:3d:02.3 wants IOVA as 'PA' 00:06:27.003 EAL: PCI driver qat for device 0000:3d:02.4 wants IOVA as 'PA' 00:06:27.003 EAL: PCI driver qat for device 0000:3d:02.5 wants IOVA as 'PA' 00:06:27.003 EAL: PCI driver qat for device 0000:3d:02.6 wants IOVA as 'PA' 00:06:27.003 EAL: PCI driver qat for device 0000:3d:02.7 wants IOVA as 'PA' 00:06:27.003 EAL: PCI driver qat for device 0000:3f:01.0 wants IOVA as 'PA' 00:06:27.003 EAL: PCI driver qat for device 0000:3f:01.1 wants IOVA as 'PA' 00:06:27.003 EAL: PCI driver qat for device 0000:3f:01.2 wants IOVA as 'PA' 00:06:27.003 EAL: PCI driver qat for device 0000:3f:01.3 wants IOVA as 'PA' 00:06:27.003 EAL: PCI driver qat for device 0000:3f:01.4 wants IOVA as 'PA' 00:06:27.003 EAL: PCI driver qat for device 0000:3f:01.5 wants IOVA as 'PA' 00:06:27.003 EAL: PCI driver qat for device 0000:3f:01.6 wants IOVA as 'PA' 00:06:27.003 EAL: PCI driver qat for device 0000:3f:01.7 wants IOVA as 'PA' 00:06:27.003 EAL: PCI driver qat for device 0000:3f:02.0 wants IOVA as 'PA' 00:06:27.003 EAL: PCI driver qat for device 0000:3f:02.1 wants IOVA as 'PA' 00:06:27.003 EAL: PCI driver qat for device 0000:3f:02.2 wants IOVA as 'PA' 00:06:27.003 EAL: PCI driver qat for device 0000:3f:02.3 wants IOVA as 'PA' 00:06:27.003 EAL: PCI driver qat for device 0000:3f:02.4 wants IOVA as 'PA' 00:06:27.003 EAL: PCI driver qat for device 0000:3f:02.5 wants IOVA as 'PA' 00:06:27.003 EAL: PCI driver qat for device 0000:3f:02.6 wants IOVA as 'PA' 00:06:27.003 EAL: 
PCI driver qat for device 0000:3f:02.7 wants IOVA as 'PA' 00:06:27.003 EAL: PCI driver qat for device 0000:da:01.0 wants IOVA as 'PA' 00:06:27.003 EAL: PCI driver qat for device 0000:da:01.1 wants IOVA as 'PA' 00:06:27.003 EAL: PCI driver qat for device 0000:da:01.2 wants IOVA as 'PA' 00:06:27.003 EAL: PCI driver qat for device 0000:da:01.3 wants IOVA as 'PA' 00:06:27.003 EAL: PCI driver qat for device 0000:da:01.4 wants IOVA as 'PA' 00:06:27.003 EAL: PCI driver qat for device 0000:da:01.5 wants IOVA as 'PA' 00:06:27.003 EAL: PCI driver qat for device 0000:da:01.6 wants IOVA as 'PA' 00:06:27.003 EAL: PCI driver qat for device 0000:da:01.7 wants IOVA as 'PA' 00:06:27.003 EAL: PCI driver qat for device 0000:da:02.0 wants IOVA as 'PA' 00:06:27.003 EAL: PCI driver qat for device 0000:da:02.1 wants IOVA as 'PA' 00:06:27.003 EAL: PCI driver qat for device 0000:da:02.2 wants IOVA as 'PA' 00:06:27.003 EAL: PCI driver qat for device 0000:da:02.3 wants IOVA as 'PA' 00:06:27.003 EAL: PCI driver qat for device 0000:da:02.4 wants IOVA as 'PA' 00:06:27.003 EAL: PCI driver qat for device 0000:da:02.5 wants IOVA as 'PA' 00:06:27.003 EAL: PCI driver qat for device 0000:da:02.6 wants IOVA as 'PA' 00:06:27.003 EAL: PCI driver qat for device 0000:da:02.7 wants IOVA as 'PA' 00:06:27.003 EAL: Bus pci wants IOVA as 'PA' 00:06:27.003 EAL: Bus auxiliary wants IOVA as 'DC' 00:06:27.003 EAL: Bus vdev wants IOVA as 'DC' 00:06:27.003 EAL: Selected IOVA mode 'PA' 00:06:27.003 EAL: Probing VFIO support... 00:06:27.003 EAL: IOMMU type 1 (Type 1) is supported 00:06:27.003 EAL: IOMMU type 7 (sPAPR) is not supported 00:06:27.003 EAL: IOMMU type 8 (No-IOMMU) is not supported 00:06:27.003 EAL: VFIO support initialized 00:06:27.003 EAL: Ask a virtual area of 0x2e000 bytes 00:06:27.003 EAL: Virtual area found at 0x200000000000 (size = 0x2e000) 00:06:27.003 EAL: Setting up physically contiguous memory... 
00:06:27.003 EAL: Setting maximum number of open files to 524288 00:06:27.003 EAL: Detected memory type: socket_id:0 hugepage_sz:2097152 00:06:27.003 EAL: Detected memory type: socket_id:1 hugepage_sz:2097152 00:06:27.003 EAL: Creating 4 segment lists: n_segs:8192 socket_id:0 hugepage_sz:2097152 00:06:27.003 EAL: Ask a virtual area of 0x61000 bytes 00:06:27.003 EAL: Virtual area found at 0x20000002e000 (size = 0x61000) 00:06:27.003 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:06:27.003 EAL: Ask a virtual area of 0x400000000 bytes 00:06:27.003 EAL: Virtual area found at 0x200000200000 (size = 0x400000000) 00:06:27.003 EAL: VA reserved for memseg list at 0x200000200000, size 400000000 00:06:27.003 EAL: Ask a virtual area of 0x61000 bytes 00:06:27.003 EAL: Virtual area found at 0x200400200000 (size = 0x61000) 00:06:27.003 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:06:27.003 EAL: Ask a virtual area of 0x400000000 bytes 00:06:27.003 EAL: Virtual area found at 0x200400400000 (size = 0x400000000) 00:06:27.003 EAL: VA reserved for memseg list at 0x200400400000, size 400000000 00:06:27.003 EAL: Ask a virtual area of 0x61000 bytes 00:06:27.003 EAL: Virtual area found at 0x200800400000 (size = 0x61000) 00:06:27.003 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:06:27.003 EAL: Ask a virtual area of 0x400000000 bytes 00:06:27.003 EAL: Virtual area found at 0x200800600000 (size = 0x400000000) 00:06:27.003 EAL: VA reserved for memseg list at 0x200800600000, size 400000000 00:06:27.003 EAL: Ask a virtual area of 0x61000 bytes 00:06:27.003 EAL: Virtual area found at 0x200c00600000 (size = 0x61000) 00:06:27.003 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:06:27.003 EAL: Ask a virtual area of 0x400000000 bytes 00:06:27.003 EAL: Virtual area found at 0x200c00800000 (size = 0x400000000) 00:06:27.003 EAL: VA reserved for memseg list at 0x200c00800000, size 400000000 00:06:27.003 EAL: Creating 4 segment lists: n_segs:8192 
socket_id:1 hugepage_sz:2097152 00:06:27.003 EAL: Ask a virtual area of 0x61000 bytes 00:06:27.003 EAL: Virtual area found at 0x201000800000 (size = 0x61000) 00:06:27.003 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:06:27.003 EAL: Ask a virtual area of 0x400000000 bytes 00:06:27.003 EAL: Virtual area found at 0x201000a00000 (size = 0x400000000) 00:06:27.003 EAL: VA reserved for memseg list at 0x201000a00000, size 400000000 00:06:27.003 EAL: Ask a virtual area of 0x61000 bytes 00:06:27.003 EAL: Virtual area found at 0x201400a00000 (size = 0x61000) 00:06:27.003 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:06:27.003 EAL: Ask a virtual area of 0x400000000 bytes 00:06:27.003 EAL: Virtual area found at 0x201400c00000 (size = 0x400000000) 00:06:27.003 EAL: VA reserved for memseg list at 0x201400c00000, size 400000000 00:06:27.003 EAL: Ask a virtual area of 0x61000 bytes 00:06:27.003 EAL: Virtual area found at 0x201800c00000 (size = 0x61000) 00:06:27.003 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:06:27.003 EAL: Ask a virtual area of 0x400000000 bytes 00:06:27.003 EAL: Virtual area found at 0x201800e00000 (size = 0x400000000) 00:06:27.004 EAL: VA reserved for memseg list at 0x201800e00000, size 400000000 00:06:27.004 EAL: Ask a virtual area of 0x61000 bytes 00:06:27.004 EAL: Virtual area found at 0x201c00e00000 (size = 0x61000) 00:06:27.004 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:06:27.004 EAL: Ask a virtual area of 0x400000000 bytes 00:06:27.004 EAL: Virtual area found at 0x201c01000000 (size = 0x400000000) 00:06:27.004 EAL: VA reserved for memseg list at 0x201c01000000, size 400000000 00:06:27.004 EAL: Hugepages will be freed exactly as allocated. 
00:06:27.004 EAL: No shared files mode enabled, IPC is disabled 00:06:27.004 EAL: No shared files mode enabled, IPC is disabled 00:06:27.004 EAL: TSC frequency is ~2300000 KHz 00:06:27.004 EAL: Main lcore 0 is ready (tid=7f2c0d478b00;cpuset=[0]) 00:06:27.004 EAL: Trying to obtain current memory policy. 00:06:27.004 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:27.004 EAL: Restoring previous memory policy: 0 00:06:27.004 EAL: request: mp_malloc_sync 00:06:27.004 EAL: No shared files mode enabled, IPC is disabled 00:06:27.004 EAL: Heap on socket 0 was expanded by 2MB 00:06:27.004 EAL: PCI device 0000:3d:01.0 on NUMA socket 0 00:06:27.004 EAL: probe driver: 8086:37c9 qat 00:06:27.004 EAL: PCI memory mapped at 0x202001000000 00:06:27.004 EAL: PCI memory mapped at 0x202001001000 00:06:27.004 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.0 (socket 0) 00:06:27.004 EAL: PCI device 0000:3d:01.1 on NUMA socket 0 00:06:27.004 EAL: probe driver: 8086:37c9 qat 00:06:27.004 EAL: PCI memory mapped at 0x202001002000 00:06:27.004 EAL: PCI memory mapped at 0x202001003000 00:06:27.004 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.1 (socket 0) 00:06:27.004 EAL: PCI device 0000:3d:01.2 on NUMA socket 0 00:06:27.004 EAL: probe driver: 8086:37c9 qat 00:06:27.004 EAL: PCI memory mapped at 0x202001004000 00:06:27.004 EAL: PCI memory mapped at 0x202001005000 00:06:27.004 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.2 (socket 0) 00:06:27.004 EAL: PCI device 0000:3d:01.3 on NUMA socket 0 00:06:27.004 EAL: probe driver: 8086:37c9 qat 00:06:27.004 EAL: PCI memory mapped at 0x202001006000 00:06:27.004 EAL: PCI memory mapped at 0x202001007000 00:06:27.004 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.3 (socket 0) 00:06:27.004 EAL: PCI device 0000:3d:01.4 on NUMA socket 0 00:06:27.004 EAL: probe driver: 8086:37c9 qat 00:06:27.004 EAL: PCI memory mapped at 0x202001008000 00:06:27.004 EAL: PCI memory mapped at 0x202001009000 00:06:27.004 EAL: 
Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.4 (socket 0) 00:06:27.004 EAL: PCI device 0000:3d:01.5 on NUMA socket 0 00:06:27.004 EAL: probe driver: 8086:37c9 qat 00:06:27.004 EAL: PCI memory mapped at 0x20200100a000 00:06:27.004 EAL: PCI memory mapped at 0x20200100b000 00:06:27.004 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.5 (socket 0) 00:06:27.004 EAL: PCI device 0000:3d:01.6 on NUMA socket 0 00:06:27.004 EAL: probe driver: 8086:37c9 qat 00:06:27.004 EAL: PCI memory mapped at 0x20200100c000 00:06:27.004 EAL: PCI memory mapped at 0x20200100d000 00:06:27.004 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.6 (socket 0) 00:06:27.004 EAL: PCI device 0000:3d:01.7 on NUMA socket 0 00:06:27.004 EAL: probe driver: 8086:37c9 qat 00:06:27.004 EAL: PCI memory mapped at 0x20200100e000 00:06:27.004 EAL: PCI memory mapped at 0x20200100f000 00:06:27.004 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.7 (socket 0) 00:06:27.004 EAL: PCI device 0000:3d:02.0 on NUMA socket 0 00:06:27.004 EAL: probe driver: 8086:37c9 qat 00:06:27.004 EAL: PCI memory mapped at 0x202001010000 00:06:27.004 EAL: PCI memory mapped at 0x202001011000 00:06:27.004 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.0 (socket 0) 00:06:27.004 EAL: PCI device 0000:3d:02.1 on NUMA socket 0 00:06:27.004 EAL: probe driver: 8086:37c9 qat 00:06:27.004 EAL: PCI memory mapped at 0x202001012000 00:06:27.004 EAL: PCI memory mapped at 0x202001013000 00:06:27.004 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.1 (socket 0) 00:06:27.004 EAL: PCI device 0000:3d:02.2 on NUMA socket 0 00:06:27.004 EAL: probe driver: 8086:37c9 qat 00:06:27.004 EAL: PCI memory mapped at 0x202001014000 00:06:27.004 EAL: PCI memory mapped at 0x202001015000 00:06:27.004 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.2 (socket 0) 00:06:27.004 EAL: PCI device 0000:3d:02.3 on NUMA socket 0 00:06:27.004 EAL: probe driver: 8086:37c9 qat 00:06:27.004 EAL: PCI memory mapped at 
0x202001016000 00:06:27.004 EAL: PCI memory mapped at 0x202001017000 00:06:27.004 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.3 (socket 0) 00:06:27.004 EAL: PCI device 0000:3d:02.4 on NUMA socket 0 00:06:27.004 EAL: probe driver: 8086:37c9 qat 00:06:27.004 EAL: PCI memory mapped at 0x202001018000 00:06:27.004 EAL: PCI memory mapped at 0x202001019000 00:06:27.004 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.4 (socket 0) 00:06:27.004 EAL: PCI device 0000:3d:02.5 on NUMA socket 0 00:06:27.004 EAL: probe driver: 8086:37c9 qat 00:06:27.004 EAL: PCI memory mapped at 0x20200101a000 00:06:27.004 EAL: PCI memory mapped at 0x20200101b000 00:06:27.004 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.5 (socket 0) 00:06:27.004 EAL: PCI device 0000:3d:02.6 on NUMA socket 0 00:06:27.004 EAL: probe driver: 8086:37c9 qat 00:06:27.004 EAL: PCI memory mapped at 0x20200101c000 00:06:27.004 EAL: PCI memory mapped at 0x20200101d000 00:06:27.004 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.6 (socket 0) 00:06:27.004 EAL: PCI device 0000:3d:02.7 on NUMA socket 0 00:06:27.004 EAL: probe driver: 8086:37c9 qat 00:06:27.004 EAL: PCI memory mapped at 0x20200101e000 00:06:27.004 EAL: PCI memory mapped at 0x20200101f000 00:06:27.004 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.7 (socket 0) 00:06:27.004 EAL: PCI device 0000:3f:01.0 on NUMA socket 0 00:06:27.004 EAL: probe driver: 8086:37c9 qat 00:06:27.004 EAL: PCI memory mapped at 0x202001020000 00:06:27.004 EAL: PCI memory mapped at 0x202001021000 00:06:27.004 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.0 (socket 0) 00:06:27.004 EAL: PCI device 0000:3f:01.1 on NUMA socket 0 00:06:27.004 EAL: probe driver: 8086:37c9 qat 00:06:27.004 EAL: PCI memory mapped at 0x202001022000 00:06:27.004 EAL: PCI memory mapped at 0x202001023000 00:06:27.004 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.1 (socket 0) 00:06:27.004 EAL: PCI device 0000:3f:01.2 on NUMA socket 0 
00:06:27.004 EAL: probe driver: 8086:37c9 qat
00:06:27.004 EAL: PCI memory mapped at 0x202001024000
00:06:27.004 EAL: PCI memory mapped at 0x202001025000
00:06:27.004 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.2 (socket 0)
00:06:27.004 EAL: PCI device 0000:3f:01.3 on NUMA socket 0
00:06:27.004 EAL: probe driver: 8086:37c9 qat
00:06:27.004 EAL: PCI memory mapped at 0x202001026000
00:06:27.004 EAL: PCI memory mapped at 0x202001027000
00:06:27.004 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.3 (socket 0)
00:06:27.004 EAL: PCI device 0000:3f:01.4 on NUMA socket 0
00:06:27.004 EAL: probe driver: 8086:37c9 qat
00:06:27.004 EAL: PCI memory mapped at 0x202001028000
00:06:27.004 EAL: PCI memory mapped at 0x202001029000
00:06:27.004 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.4 (socket 0)
00:06:27.004 EAL: PCI device 0000:3f:01.5 on NUMA socket 0
00:06:27.004 EAL: probe driver: 8086:37c9 qat
00:06:27.004 EAL: PCI memory mapped at 0x20200102a000
00:06:27.004 EAL: PCI memory mapped at 0x20200102b000
00:06:27.004 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.5 (socket 0)
00:06:27.004 EAL: PCI device 0000:3f:01.6 on NUMA socket 0
00:06:27.004 EAL: probe driver: 8086:37c9 qat
00:06:27.004 EAL: PCI memory mapped at 0x20200102c000
00:06:27.004 EAL: PCI memory mapped at 0x20200102d000
00:06:27.004 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.6 (socket 0)
00:06:27.004 EAL: PCI device 0000:3f:01.7 on NUMA socket 0
00:06:27.004 EAL: probe driver: 8086:37c9 qat
00:06:27.004 EAL: PCI memory mapped at 0x20200102e000
00:06:27.004 EAL: PCI memory mapped at 0x20200102f000
00:06:27.004 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.7 (socket 0)
00:06:27.004 EAL: PCI device 0000:3f:02.0 on NUMA socket 0
00:06:27.004 EAL: probe driver: 8086:37c9 qat
00:06:27.004 EAL: PCI memory mapped at 0x202001030000
00:06:27.004 EAL: PCI memory mapped at 0x202001031000
00:06:27.004 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.0 (socket 0)
00:06:27.004 EAL: PCI device 0000:3f:02.1 on NUMA socket 0
00:06:27.004 EAL: probe driver: 8086:37c9 qat
00:06:27.004 EAL: PCI memory mapped at 0x202001032000
00:06:27.004 EAL: PCI memory mapped at 0x202001033000
00:06:27.004 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.1 (socket 0)
00:06:27.004 EAL: PCI device 0000:3f:02.2 on NUMA socket 0
00:06:27.004 EAL: probe driver: 8086:37c9 qat
00:06:27.004 EAL: PCI memory mapped at 0x202001034000
00:06:27.004 EAL: PCI memory mapped at 0x202001035000
00:06:27.004 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.2 (socket 0)
00:06:27.004 EAL: PCI device 0000:3f:02.3 on NUMA socket 0
00:06:27.004 EAL: probe driver: 8086:37c9 qat
00:06:27.004 EAL: PCI memory mapped at 0x202001036000
00:06:27.004 EAL: PCI memory mapped at 0x202001037000
00:06:27.004 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.3 (socket 0)
00:06:27.004 EAL: PCI device 0000:3f:02.4 on NUMA socket 0
00:06:27.004 EAL: probe driver: 8086:37c9 qat
00:06:27.004 EAL: PCI memory mapped at 0x202001038000
00:06:27.004 EAL: PCI memory mapped at 0x202001039000
00:06:27.004 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.4 (socket 0)
00:06:27.004 EAL: PCI device 0000:3f:02.5 on NUMA socket 0
00:06:27.004 EAL: probe driver: 8086:37c9 qat
00:06:27.004 EAL: PCI memory mapped at 0x20200103a000
00:06:27.004 EAL: PCI memory mapped at 0x20200103b000
00:06:27.004 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.5 (socket 0)
00:06:27.004 EAL: PCI device 0000:3f:02.6 on NUMA socket 0
00:06:27.004 EAL: probe driver: 8086:37c9 qat
00:06:27.004 EAL: PCI memory mapped at 0x20200103c000
00:06:27.004 EAL: PCI memory mapped at 0x20200103d000
00:06:27.004 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.6 (socket 0)
00:06:27.004 EAL: PCI device 0000:3f:02.7 on NUMA socket 0
00:06:27.004 EAL: probe driver: 8086:37c9 qat
00:06:27.004 EAL: PCI memory mapped at 0x20200103e000
00:06:27.004 EAL: PCI memory mapped at 0x20200103f000
00:06:27.004 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.7 (socket 0)
00:06:27.004 EAL: PCI device 0000:da:01.0 on NUMA socket 1
00:06:27.004 EAL: probe driver: 8086:37c9 qat
00:06:27.004 EAL: PCI memory mapped at 0x202001040000
00:06:27.004 EAL: PCI memory mapped at 0x202001041000
00:06:27.004 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.0 (socket 1)
00:06:27.004 EAL: Trying to obtain current memory policy.
00:06:27.004 EAL: Setting policy MPOL_PREFERRED for socket 1
00:06:27.004 EAL: Restoring previous memory policy: 4
00:06:27.004 EAL: request: mp_malloc_sync
00:06:27.004 EAL: No shared files mode enabled, IPC is disabled
00:06:27.004 EAL: Heap on socket 1 was expanded by 2MB
00:06:27.004 EAL: PCI device 0000:da:01.1 on NUMA socket 1
00:06:27.004 EAL: probe driver: 8086:37c9 qat
00:06:27.004 EAL: PCI memory mapped at 0x202001042000
00:06:27.004 EAL: PCI memory mapped at 0x202001043000
00:06:27.004 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.1 (socket 1)
00:06:27.004 EAL: PCI device 0000:da:01.2 on NUMA socket 1
00:06:27.004 EAL: probe driver: 8086:37c9 qat
00:06:27.004 EAL: PCI memory mapped at 0x202001044000
00:06:27.004 EAL: PCI memory mapped at 0x202001045000
00:06:27.005 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.2 (socket 1)
00:06:27.005 EAL: PCI device 0000:da:01.3 on NUMA socket 1
00:06:27.005 EAL: probe driver: 8086:37c9 qat
00:06:27.005 EAL: PCI memory mapped at 0x202001046000
00:06:27.005 EAL: PCI memory mapped at 0x202001047000
00:06:27.005 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.3 (socket 1)
00:06:27.005 EAL: PCI device 0000:da:01.4 on NUMA socket 1
00:06:27.005 EAL: probe driver: 8086:37c9 qat
00:06:27.005 EAL: PCI memory mapped at 0x202001048000
00:06:27.005 EAL: PCI memory mapped at 0x202001049000
00:06:27.005 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.4 (socket 1)
00:06:27.005 EAL: PCI device 0000:da:01.5 on NUMA socket 1
00:06:27.005 EAL: probe driver: 8086:37c9 qat
00:06:27.005 EAL: PCI memory mapped at 0x20200104a000
00:06:27.005 EAL: PCI memory mapped at 0x20200104b000
00:06:27.005 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.5 (socket 1)
00:06:27.005 EAL: PCI device 0000:da:01.6 on NUMA socket 1
00:06:27.005 EAL: probe driver: 8086:37c9 qat
00:06:27.005 EAL: PCI memory mapped at 0x20200104c000
00:06:27.005 EAL: PCI memory mapped at 0x20200104d000
00:06:27.005 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.6 (socket 1)
00:06:27.005 EAL: PCI device 0000:da:01.7 on NUMA socket 1
00:06:27.005 EAL: probe driver: 8086:37c9 qat
00:06:27.005 EAL: PCI memory mapped at 0x20200104e000
00:06:27.005 EAL: PCI memory mapped at 0x20200104f000
00:06:27.005 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.7 (socket 1)
00:06:27.005 EAL: PCI device 0000:da:02.0 on NUMA socket 1
00:06:27.005 EAL: probe driver: 8086:37c9 qat
00:06:27.005 EAL: PCI memory mapped at 0x202001050000
00:06:27.005 EAL: PCI memory mapped at 0x202001051000
00:06:27.005 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.0 (socket 1)
00:06:27.005 EAL: PCI device 0000:da:02.1 on NUMA socket 1
00:06:27.005 EAL: probe driver: 8086:37c9 qat
00:06:27.005 EAL: PCI memory mapped at 0x202001052000
00:06:27.005 EAL: PCI memory mapped at 0x202001053000
00:06:27.005 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.1 (socket 1)
00:06:27.005 EAL: PCI device 0000:da:02.2 on NUMA socket 1
00:06:27.005 EAL: probe driver: 8086:37c9 qat
00:06:27.005 EAL: PCI memory mapped at 0x202001054000
00:06:27.005 EAL: PCI memory mapped at 0x202001055000
00:06:27.005 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.2 (socket 1)
00:06:27.005 EAL: PCI device 0000:da:02.3 on NUMA socket 1
00:06:27.005 EAL: probe driver: 8086:37c9 qat
00:06:27.005 EAL: PCI memory mapped at 0x202001056000
00:06:27.005 EAL: PCI memory mapped at 0x202001057000
00:06:27.005 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.3 (socket 1)
00:06:27.005 EAL: PCI device 0000:da:02.4 on NUMA socket 1
00:06:27.005 EAL: probe driver: 8086:37c9 qat
00:06:27.005 EAL: PCI memory mapped at 0x202001058000
00:06:27.005 EAL: PCI memory mapped at 0x202001059000
00:06:27.005 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.4 (socket 1)
00:06:27.005 EAL: PCI device 0000:da:02.5 on NUMA socket 1
00:06:27.005 EAL: probe driver: 8086:37c9 qat
00:06:27.005 EAL: PCI memory mapped at 0x20200105a000
00:06:27.005 EAL: PCI memory mapped at 0x20200105b000
00:06:27.005 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.5 (socket 1)
00:06:27.005 EAL: PCI device 0000:da:02.6 on NUMA socket 1
00:06:27.005 EAL: probe driver: 8086:37c9 qat
00:06:27.005 EAL: PCI memory mapped at 0x20200105c000
00:06:27.005 EAL: PCI memory mapped at 0x20200105d000
00:06:27.005 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.6 (socket 1)
00:06:27.005 EAL: PCI device 0000:da:02.7 on NUMA socket 1
00:06:27.005 EAL: probe driver: 8086:37c9 qat
00:06:27.005 EAL: PCI memory mapped at 0x20200105e000
00:06:27.005 EAL: PCI memory mapped at 0x20200105f000
00:06:27.005 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.7 (socket 1)
00:06:27.005 EAL: No shared files mode enabled, IPC is disabled
00:06:27.005 EAL: No shared files mode enabled, IPC is disabled
00:06:27.005 EAL: No PCI address specified using 'addr=' in: bus=pci
00:06:27.005 EAL: Mem event callback 'spdk:(nil)' registered
00:06:27.005
00:06:27.005
00:06:27.005      CUnit - A unit testing framework for C - Version 2.1-3
00:06:27.005      http://cunit.sourceforge.net/
00:06:27.005
00:06:27.005
00:06:27.005 Suite: components_suite
00:06:27.005   Test: vtophys_malloc_test ...passed
00:06:27.005   Test: vtophys_spdk_malloc_test ...EAL: Trying to obtain current memory policy.
00:06:27.005 EAL: Setting policy MPOL_PREFERRED for socket 0
00:06:27.005 EAL: Restoring previous memory policy: 4
00:06:27.005 EAL: Calling mem event callback 'spdk:(nil)'
00:06:27.005 EAL: request: mp_malloc_sync
00:06:27.005 EAL: No shared files mode enabled, IPC is disabled
00:06:27.005 EAL: Heap on socket 0 was expanded by 4MB
00:06:27.005 EAL: Calling mem event callback 'spdk:(nil)'
00:06:27.005 EAL: request: mp_malloc_sync
00:06:27.005 EAL: No shared files mode enabled, IPC is disabled
00:06:27.005 EAL: Heap on socket 0 was shrunk by 4MB
00:06:27.005 EAL: Trying to obtain current memory policy.
00:06:27.005 EAL: Setting policy MPOL_PREFERRED for socket 0
00:06:27.005 EAL: Restoring previous memory policy: 4
00:06:27.005 EAL: Calling mem event callback 'spdk:(nil)'
00:06:27.005 EAL: request: mp_malloc_sync
00:06:27.005 EAL: No shared files mode enabled, IPC is disabled
00:06:27.005 EAL: Heap on socket 0 was expanded by 6MB
00:06:27.005 EAL: Calling mem event callback 'spdk:(nil)'
00:06:27.005 EAL: request: mp_malloc_sync
00:06:27.005 EAL: No shared files mode enabled, IPC is disabled
00:06:27.005 EAL: Heap on socket 0 was shrunk by 6MB
00:06:27.005 EAL: Trying to obtain current memory policy.
00:06:27.005 EAL: Setting policy MPOL_PREFERRED for socket 0
00:06:27.005 EAL: Restoring previous memory policy: 4
00:06:27.005 EAL: Calling mem event callback 'spdk:(nil)'
00:06:27.005 EAL: request: mp_malloc_sync
00:06:27.005 EAL: No shared files mode enabled, IPC is disabled
00:06:27.005 EAL: Heap on socket 0 was expanded by 10MB
00:06:27.005 EAL: Calling mem event callback 'spdk:(nil)'
00:06:27.005 EAL: request: mp_malloc_sync
00:06:27.005 EAL: No shared files mode enabled, IPC is disabled
00:06:27.005 EAL: Heap on socket 0 was shrunk by 10MB
00:06:27.005 EAL: Trying to obtain current memory policy.
00:06:27.005 EAL: Setting policy MPOL_PREFERRED for socket 0
00:06:27.005 EAL: Restoring previous memory policy: 4
00:06:27.005 EAL: Calling mem event callback 'spdk:(nil)'
00:06:27.005 EAL: request: mp_malloc_sync
00:06:27.005 EAL: No shared files mode enabled, IPC is disabled
00:06:27.005 EAL: Heap on socket 0 was expanded by 18MB
00:06:27.005 EAL: Calling mem event callback 'spdk:(nil)'
00:06:27.005 EAL: request: mp_malloc_sync
00:06:27.005 EAL: No shared files mode enabled, IPC is disabled
00:06:27.005 EAL: Heap on socket 0 was shrunk by 18MB
00:06:27.005 EAL: Trying to obtain current memory policy.
00:06:27.005 EAL: Setting policy MPOL_PREFERRED for socket 0
00:06:27.005 EAL: Restoring previous memory policy: 4
00:06:27.005 EAL: Calling mem event callback 'spdk:(nil)'
00:06:27.005 EAL: request: mp_malloc_sync
00:06:27.005 EAL: No shared files mode enabled, IPC is disabled
00:06:27.005 EAL: Heap on socket 0 was expanded by 34MB
00:06:27.005 EAL: Calling mem event callback 'spdk:(nil)'
00:06:27.005 EAL: request: mp_malloc_sync
00:06:27.005 EAL: No shared files mode enabled, IPC is disabled
00:06:27.005 EAL: Heap on socket 0 was shrunk by 34MB
00:06:27.005 EAL: Trying to obtain current memory policy.
00:06:27.005 EAL: Setting policy MPOL_PREFERRED for socket 0
00:06:27.005 EAL: Restoring previous memory policy: 4
00:06:27.005 EAL: Calling mem event callback 'spdk:(nil)'
00:06:27.005 EAL: request: mp_malloc_sync
00:06:27.005 EAL: No shared files mode enabled, IPC is disabled
00:06:27.005 EAL: Heap on socket 0 was expanded by 66MB
00:06:27.005 EAL: Calling mem event callback 'spdk:(nil)'
00:06:27.005 EAL: request: mp_malloc_sync
00:06:27.005 EAL: No shared files mode enabled, IPC is disabled
00:06:27.005 EAL: Heap on socket 0 was shrunk by 66MB
00:06:27.005 EAL: Trying to obtain current memory policy.
00:06:27.005 EAL: Setting policy MPOL_PREFERRED for socket 0
00:06:27.005 EAL: Restoring previous memory policy: 4
00:06:27.005 EAL: Calling mem event callback 'spdk:(nil)'
00:06:27.005 EAL: request: mp_malloc_sync
00:06:27.005 EAL: No shared files mode enabled, IPC is disabled
00:06:27.005 EAL: Heap on socket 0 was expanded by 130MB
00:06:27.005 EAL: Calling mem event callback 'spdk:(nil)'
00:06:27.265 EAL: request: mp_malloc_sync
00:06:27.265 EAL: No shared files mode enabled, IPC is disabled
00:06:27.265 EAL: Heap on socket 0 was shrunk by 130MB
00:06:27.265 EAL: Trying to obtain current memory policy.
00:06:27.265 EAL: Setting policy MPOL_PREFERRED for socket 0
00:06:27.265 EAL: Restoring previous memory policy: 4
00:06:27.265 EAL: Calling mem event callback 'spdk:(nil)'
00:06:27.265 EAL: request: mp_malloc_sync
00:06:27.265 EAL: No shared files mode enabled, IPC is disabled
00:06:27.265 EAL: Heap on socket 0 was expanded by 258MB
00:06:27.265 EAL: Calling mem event callback 'spdk:(nil)'
00:06:27.265 EAL: request: mp_malloc_sync
00:06:27.265 EAL: No shared files mode enabled, IPC is disabled
00:06:27.265 EAL: Heap on socket 0 was shrunk by 258MB
00:06:27.265 EAL: Trying to obtain current memory policy.
00:06:27.265 EAL: Setting policy MPOL_PREFERRED for socket 0
00:06:27.524 EAL: Restoring previous memory policy: 4
00:06:27.524 EAL: Calling mem event callback 'spdk:(nil)'
00:06:27.524 EAL: request: mp_malloc_sync
00:06:27.524 EAL: No shared files mode enabled, IPC is disabled
00:06:27.524 EAL: Heap on socket 0 was expanded by 514MB
00:06:27.524 EAL: Calling mem event callback 'spdk:(nil)'
00:06:27.524 EAL: request: mp_malloc_sync
00:06:27.524 EAL: No shared files mode enabled, IPC is disabled
00:06:27.524 EAL: Heap on socket 0 was shrunk by 514MB
00:06:27.524 EAL: Trying to obtain current memory policy.
00:06:27.524 EAL: Setting policy MPOL_PREFERRED for socket 0
00:06:27.784 EAL: Restoring previous memory policy: 4
00:06:27.784 EAL: Calling mem event callback 'spdk:(nil)'
00:06:27.784 EAL: request: mp_malloc_sync
00:06:27.784 EAL: No shared files mode enabled, IPC is disabled
00:06:27.784 EAL: Heap on socket 0 was expanded by 1026MB
00:06:28.044 EAL: Calling mem event callback 'spdk:(nil)'
00:06:28.304 EAL: request: mp_malloc_sync
00:06:28.304 EAL: No shared files mode enabled, IPC is disabled
00:06:28.304 EAL: Heap on socket 0 was shrunk by 1026MB
00:06:28.304 passed
00:06:28.304
00:06:28.304 Run Summary:    Type  Total    Ran Passed Failed Inactive
00:06:28.304               suites      1      1    n/a      0        0
00:06:28.304                tests      2      2      2      0        0
00:06:28.304              asserts   5701   5701   5701      0      n/a
00:06:28.304
00:06:28.304 Elapsed time = 1.167 seconds
00:06:28.304 EAL: No shared files mode enabled, IPC is disabled
00:06:28.304 EAL: No shared files mode enabled, IPC is disabled
00:06:28.304 EAL: No shared files mode enabled, IPC is disabled
00:06:28.304
00:06:28.304 real 0m1.371s
00:06:28.304 user 0m0.759s
00:06:28.304 sys 0m0.577s
00:06:28.304 20:20:20 env.env_vtophys -- common/autotest_common.sh@1124 -- # xtrace_disable
00:06:28.304 20:20:20 env.env_vtophys -- common/autotest_common.sh@10 -- # set +x
00:06:28.304 ************************************
00:06:28.304 END TEST env_vtophys
00:06:28.304 ************************************
00:06:28.304 20:20:20 env -- common/autotest_common.sh@1142 -- # return 0
00:06:28.304 20:20:20 env -- env/env.sh@12 -- # run_test env_pci /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/pci/pci_ut
00:06:28.304 20:20:20 env -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:06:28.304 20:20:20 env -- common/autotest_common.sh@1105 -- # xtrace_disable
00:06:28.304 20:20:20 env -- common/autotest_common.sh@10 -- # set +x
00:06:28.304 ************************************
00:06:28.304 START TEST env_pci
00:06:28.304 ************************************
00:06:28.304 20:20:20 env.env_pci -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/pci/pci_ut
00:06:28.304
00:06:28.304
00:06:28.304      CUnit - A unit testing framework for C - Version 2.1-3
00:06:28.304      http://cunit.sourceforge.net/
00:06:28.304
00:06:28.304
00:06:28.304 Suite: pci
00:06:28.304   Test: pci_hook ...[2024-07-15 20:20:20.566282] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/pci.c:1040:spdk_pci_device_claim: *ERROR*: Cannot create lock on device /var/tmp/spdk_pci_lock_10000:00:01.0, probably process 1305988 has claimed it
00:06:28.304 EAL: Cannot find device (10000:00:01.0)
00:06:28.304 EAL: Failed to attach device on primary process
00:06:28.304 passed
00:06:28.304
00:06:28.304 Run Summary:    Type  Total    Ran Passed Failed Inactive
00:06:28.304               suites      1      1    n/a      0        0
00:06:28.304                tests      1      1      1      0        0
00:06:28.304              asserts     25     25     25      0      n/a
00:06:28.304
00:06:28.304 Elapsed time = 0.042 seconds
00:06:28.304
00:06:28.304 real 0m0.070s
00:06:28.304 user 0m0.019s
00:06:28.304 sys 0m0.051s
00:06:28.304 20:20:20 env.env_pci -- common/autotest_common.sh@1124 -- # xtrace_disable
00:06:28.304 20:20:20 env.env_pci -- common/autotest_common.sh@10 -- # set +x
00:06:28.304 ************************************
00:06:28.304 END TEST env_pci
00:06:28.304 ************************************
00:06:28.304 20:20:20 env -- common/autotest_common.sh@1142 -- # return 0
00:06:28.304 20:20:20 env -- env/env.sh@14 -- # argv='-c 0x1 '
00:06:28.304 20:20:20 env -- env/env.sh@15 -- # uname
00:06:28.304 20:20:20 env -- env/env.sh@15 -- # '[' Linux = Linux ']'
00:06:28.304 20:20:20 env -- env/env.sh@22 -- # argv+=--base-virtaddr=0x200000000000
00:06:28.304 20:20:20 env -- env/env.sh@24 -- # run_test env_dpdk_post_init /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000
00:06:28.304 20:20:20 env -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']'
00:06:28.304 20:20:20 env -- common/autotest_common.sh@1105 -- # xtrace_disable
00:06:28.304 20:20:20 env -- common/autotest_common.sh@10 -- # set +x
00:06:28.565 ************************************
00:06:28.565 START TEST env_dpdk_post_init
00:06:28.565 ************************************
00:06:28.565 20:20:20 env.env_dpdk_post_init -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000
00:06:28.565 EAL: Detected CPU lcores: 72
00:06:28.565 EAL: Detected NUMA nodes: 2
00:06:28.565 EAL: Detected shared linkage of DPDK
00:06:28.565 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket
00:06:28.565 EAL: Selected IOVA mode 'PA'
00:06:28.565 EAL: VFIO support initialized
00:06:28.565 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.0 (socket 0)
00:06:28.565 CRYPTODEV: Creating cryptodev 0000:3d:01.0_qat_asym
00:06:28.565 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.0_qat_asym,socket id: 0, max queue pairs: 0
00:06:28.565 CRYPTODEV: Creating cryptodev 0000:3d:01.0_qat_sym
00:06:28.565 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.0_qat_sym,socket id: 0, max queue pairs: 0
00:06:28.565 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.1 (socket 0)
00:06:28.565 CRYPTODEV: Creating cryptodev 0000:3d:01.1_qat_asym
00:06:28.565 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.1_qat_asym,socket id: 0, max queue pairs: 0
00:06:28.565 CRYPTODEV: Creating cryptodev 0000:3d:01.1_qat_sym
00:06:28.565 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.1_qat_sym,socket id: 0, max queue pairs: 0
00:06:28.565 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.2 (socket 0)
00:06:28.565 CRYPTODEV: Creating cryptodev 0000:3d:01.2_qat_asym
00:06:28.565 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.2_qat_asym,socket id: 0, max queue pairs: 0
00:06:28.565 CRYPTODEV: Creating cryptodev 0000:3d:01.2_qat_sym
00:06:28.565 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.2_qat_sym,socket id: 0, max queue pairs: 0
00:06:28.565 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.3 (socket 0)
00:06:28.565 CRYPTODEV: Creating cryptodev 0000:3d:01.3_qat_asym
00:06:28.565 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.3_qat_asym,socket id: 0, max queue pairs: 0
00:06:28.565 CRYPTODEV: Creating cryptodev 0000:3d:01.3_qat_sym
00:06:28.565 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.3_qat_sym,socket id: 0, max queue pairs: 0
00:06:28.565 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.4 (socket 0)
00:06:28.565 CRYPTODEV: Creating cryptodev 0000:3d:01.4_qat_asym
00:06:28.565 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.4_qat_asym,socket id: 0, max queue pairs: 0
00:06:28.565 CRYPTODEV: Creating cryptodev 0000:3d:01.4_qat_sym
00:06:28.565 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.4_qat_sym,socket id: 0, max queue pairs: 0
00:06:28.565 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.5 (socket 0)
00:06:28.566 CRYPTODEV: Creating cryptodev 0000:3d:01.5_qat_asym
00:06:28.566 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.5_qat_asym,socket id: 0, max queue pairs: 0
00:06:28.566 CRYPTODEV: Creating cryptodev 0000:3d:01.5_qat_sym
00:06:28.566 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.5_qat_sym,socket id: 0, max queue pairs: 0
00:06:28.566 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.6 (socket 0)
00:06:28.566 CRYPTODEV: Creating cryptodev 0000:3d:01.6_qat_asym
00:06:28.566 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.6_qat_asym,socket id: 0, max queue pairs: 0
00:06:28.566 CRYPTODEV: Creating cryptodev 0000:3d:01.6_qat_sym
00:06:28.566 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.6_qat_sym,socket id: 0, max queue pairs: 0
00:06:28.566 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.7 (socket 0)
00:06:28.566 CRYPTODEV: Creating cryptodev 0000:3d:01.7_qat_asym
00:06:28.566 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.7_qat_asym,socket id: 0, max queue pairs: 0
00:06:28.566 CRYPTODEV: Creating cryptodev 0000:3d:01.7_qat_sym
00:06:28.566 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.7_qat_sym,socket id: 0, max queue pairs: 0
00:06:28.566 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.0 (socket 0)
00:06:28.566 CRYPTODEV: Creating cryptodev 0000:3d:02.0_qat_asym
00:06:28.566 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.0_qat_asym,socket id: 0, max queue pairs: 0
00:06:28.566 CRYPTODEV: Creating cryptodev 0000:3d:02.0_qat_sym
00:06:28.566 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.0_qat_sym,socket id: 0, max queue pairs: 0
00:06:28.566 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.1 (socket 0)
00:06:28.566 CRYPTODEV: Creating cryptodev 0000:3d:02.1_qat_asym
00:06:28.566 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.1_qat_asym,socket id: 0, max queue pairs: 0
00:06:28.566 CRYPTODEV: Creating cryptodev 0000:3d:02.1_qat_sym
00:06:28.566 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.1_qat_sym,socket id: 0, max queue pairs: 0
00:06:28.566 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.2 (socket 0)
00:06:28.566 CRYPTODEV: Creating cryptodev 0000:3d:02.2_qat_asym
00:06:28.566 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.2_qat_asym,socket id: 0, max queue pairs: 0
00:06:28.566 CRYPTODEV: Creating cryptodev 0000:3d:02.2_qat_sym
00:06:28.566 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.2_qat_sym,socket id: 0, max queue pairs: 0
00:06:28.566 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.3 (socket 0)
00:06:28.566 CRYPTODEV: Creating cryptodev 0000:3d:02.3_qat_asym
00:06:28.566 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.3_qat_asym,socket id: 0, max queue pairs: 0
00:06:28.566 CRYPTODEV: Creating cryptodev 0000:3d:02.3_qat_sym
00:06:28.566 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.3_qat_sym,socket id: 0, max queue pairs: 0
00:06:28.566 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.4 (socket 0)
00:06:28.566 CRYPTODEV: Creating cryptodev 0000:3d:02.4_qat_asym
00:06:28.566 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.4_qat_asym,socket id: 0, max queue pairs: 0
00:06:28.566 CRYPTODEV: Creating cryptodev 0000:3d:02.4_qat_sym
00:06:28.566 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.4_qat_sym,socket id: 0, max queue pairs: 0
00:06:28.566 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.5 (socket 0)
00:06:28.566 CRYPTODEV: Creating cryptodev 0000:3d:02.5_qat_asym
00:06:28.566 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.5_qat_asym,socket id: 0, max queue pairs: 0
00:06:28.566 CRYPTODEV: Creating cryptodev 0000:3d:02.5_qat_sym
00:06:28.566 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.5_qat_sym,socket id: 0, max queue pairs: 0
00:06:28.566 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.6 (socket 0)
00:06:28.566 CRYPTODEV: Creating cryptodev 0000:3d:02.6_qat_asym
00:06:28.566 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.6_qat_asym,socket id: 0, max queue pairs: 0
00:06:28.566 CRYPTODEV: Creating cryptodev 0000:3d:02.6_qat_sym
00:06:28.566 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.6_qat_sym,socket id: 0, max queue pairs: 0
00:06:28.566 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.7 (socket 0)
00:06:28.566 CRYPTODEV: Creating cryptodev 0000:3d:02.7_qat_asym
00:06:28.566 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.7_qat_asym,socket id: 0, max queue pairs: 0
00:06:28.566 CRYPTODEV: Creating cryptodev 0000:3d:02.7_qat_sym
00:06:28.566 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.7_qat_sym,socket id: 0, max queue pairs: 0
00:06:28.566 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.0 (socket 0)
00:06:28.566 CRYPTODEV: Creating cryptodev 0000:3f:01.0_qat_asym
00:06:28.566 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.0_qat_asym,socket id: 0, max queue pairs: 0
00:06:28.566 CRYPTODEV: Creating cryptodev 0000:3f:01.0_qat_sym
00:06:28.566 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.0_qat_sym,socket id: 0, max queue pairs: 0
00:06:28.566 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.1 (socket 0)
00:06:28.566 CRYPTODEV: Creating cryptodev 0000:3f:01.1_qat_asym
00:06:28.566 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.1_qat_asym,socket id: 0, max queue pairs: 0
00:06:28.566 CRYPTODEV: Creating cryptodev 0000:3f:01.1_qat_sym
00:06:28.566 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.1_qat_sym,socket id: 0, max queue pairs: 0
00:06:28.566 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.2 (socket 0)
00:06:28.566 CRYPTODEV: Creating cryptodev 0000:3f:01.2_qat_asym
00:06:28.566 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.2_qat_asym,socket id: 0, max queue pairs: 0
00:06:28.566 CRYPTODEV: Creating cryptodev 0000:3f:01.2_qat_sym
00:06:28.566 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.2_qat_sym,socket id: 0, max queue pairs: 0
00:06:28.566 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.3 (socket 0)
00:06:28.566 CRYPTODEV: Creating cryptodev 0000:3f:01.3_qat_asym
00:06:28.566 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.3_qat_asym,socket id: 0, max queue pairs: 0
00:06:28.566 CRYPTODEV: Creating cryptodev 0000:3f:01.3_qat_sym
00:06:28.566 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.3_qat_sym,socket id: 0, max queue pairs: 0
00:06:28.566 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.4 (socket 0)
00:06:28.566 CRYPTODEV: Creating cryptodev 0000:3f:01.4_qat_asym
00:06:28.566 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.4_qat_asym,socket id: 0, max queue pairs: 0
00:06:28.566 CRYPTODEV: Creating cryptodev 0000:3f:01.4_qat_sym
00:06:28.566 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.4_qat_sym,socket id: 0, max queue pairs: 0
00:06:28.566 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.5 (socket 0)
00:06:28.566 CRYPTODEV: Creating cryptodev 0000:3f:01.5_qat_asym
00:06:28.566 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.5_qat_asym,socket id: 0, max queue pairs: 0
00:06:28.566 CRYPTODEV: Creating cryptodev 0000:3f:01.5_qat_sym
00:06:28.566 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.5_qat_sym,socket id: 0, max queue pairs: 0
00:06:28.566 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.6 (socket 0)
00:06:28.566 CRYPTODEV: Creating cryptodev 0000:3f:01.6_qat_asym
00:06:28.566 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.6_qat_asym,socket id: 0, max queue pairs: 0
00:06:28.566 CRYPTODEV: Creating cryptodev 0000:3f:01.6_qat_sym
00:06:28.566 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.6_qat_sym,socket id: 0, max queue pairs: 0
00:06:28.566 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.7 (socket 0)
00:06:28.566 CRYPTODEV: Creating cryptodev 0000:3f:01.7_qat_asym
00:06:28.566 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.7_qat_asym,socket id: 0, max queue pairs: 0
00:06:28.566 CRYPTODEV: Creating cryptodev 0000:3f:01.7_qat_sym
00:06:28.566 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.7_qat_sym,socket id: 0, max queue pairs: 0
00:06:28.566 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.0 (socket 0)
00:06:28.566 CRYPTODEV: Creating cryptodev 0000:3f:02.0_qat_asym
00:06:28.566 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.0_qat_asym,socket id: 0, max queue pairs: 0
00:06:28.566 CRYPTODEV: Creating cryptodev 0000:3f:02.0_qat_sym
00:06:28.566 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.0_qat_sym,socket id: 0, max queue pairs: 0
00:06:28.566 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.1 (socket 0)
00:06:28.566 CRYPTODEV: Creating cryptodev 0000:3f:02.1_qat_asym
00:06:28.566 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.1_qat_asym,socket id: 0, max queue pairs: 0
00:06:28.566 CRYPTODEV: Creating cryptodev 0000:3f:02.1_qat_sym
00:06:28.566 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.1_qat_sym,socket id: 0, max queue pairs: 0
00:06:28.566 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.2 (socket 0)
00:06:28.566 CRYPTODEV: Creating cryptodev 0000:3f:02.2_qat_asym
00:06:28.566 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.2_qat_asym,socket id: 0, max queue pairs: 0
00:06:28.566 CRYPTODEV: Creating cryptodev 0000:3f:02.2_qat_sym
00:06:28.566 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.2_qat_sym,socket id: 0, max queue pairs: 0
00:06:28.566 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.3 (socket 0)
00:06:28.566 CRYPTODEV: Creating cryptodev 0000:3f:02.3_qat_asym
00:06:28.566 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.3_qat_asym,socket id: 0, max queue pairs: 0
00:06:28.566 CRYPTODEV: Creating cryptodev 0000:3f:02.3_qat_sym
00:06:28.566 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.3_qat_sym,socket id: 0, max queue pairs: 0
00:06:28.566 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.4 (socket 0)
00:06:28.566 CRYPTODEV: Creating cryptodev 0000:3f:02.4_qat_asym
00:06:28.566 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.4_qat_asym,socket id: 0, max queue pairs: 0
00:06:28.566 CRYPTODEV: Creating cryptodev 0000:3f:02.4_qat_sym
00:06:28.566 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.4_qat_sym,socket id: 0, max queue pairs: 0
00:06:28.566 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.5 (socket 0)
00:06:28.566 CRYPTODEV: Creating cryptodev 0000:3f:02.5_qat_asym
00:06:28.566 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.5_qat_asym,socket id: 0, max queue pairs: 0
00:06:28.566 CRYPTODEV: Creating cryptodev 0000:3f:02.5_qat_sym
00:06:28.566 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.5_qat_sym,socket id: 0, max queue pairs: 0
00:06:28.566 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.6 (socket 0)
00:06:28.566 CRYPTODEV: Creating cryptodev 0000:3f:02.6_qat_asym
00:06:28.566 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.6_qat_asym,socket id: 0, max queue pairs: 0
00:06:28.566 CRYPTODEV: Creating cryptodev 0000:3f:02.6_qat_sym
00:06:28.566 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.6_qat_sym,socket id: 0, max queue pairs: 0
00:06:28.566 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.7 (socket 0)
00:06:28.566 CRYPTODEV: Creating cryptodev 0000:3f:02.7_qat_asym
00:06:28.566 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.7_qat_asym,socket id: 0, max queue pairs: 0
00:06:28.566 CRYPTODEV: Creating cryptodev 0000:3f:02.7_qat_sym
00:06:28.566 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.7_qat_sym,socket id: 0, max queue pairs: 0
00:06:28.566 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.0 (socket 1)
00:06:28.566 CRYPTODEV: Creating cryptodev 0000:da:01.0_qat_asym
00:06:28.566 CRYPTODEV: Initialisation parameters - name: 0000:da:01.0_qat_asym,socket id: 1, max queue pairs: 0
00:06:28.566 CRYPTODEV: Creating cryptodev 0000:da:01.0_qat_sym
00:06:28.566 CRYPTODEV: Initialisation parameters - name: 0000:da:01.0_qat_sym,socket id: 1, max queue pairs: 0
00:06:28.566 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.1 (socket 1)
00:06:28.566 CRYPTODEV: Creating cryptodev 0000:da:01.1_qat_asym
00:06:28.566 CRYPTODEV: Initialisation parameters - name: 0000:da:01.1_qat_asym,socket id: 1, max queue pairs: 0
00:06:28.566 CRYPTODEV: Creating cryptodev 0000:da:01.1_qat_sym
00:06:28.566 CRYPTODEV: Initialisation parameters - name: 0000:da:01.1_qat_sym,socket id: 1, max queue pairs: 0
00:06:28.566 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.2 (socket 1)
00:06:28.566 CRYPTODEV: Creating cryptodev 0000:da:01.2_qat_asym
00:06:28.566 CRYPTODEV: Initialisation parameters - name: 0000:da:01.2_qat_asym,socket id: 1, max queue pairs: 0
00:06:28.566 CRYPTODEV: Creating cryptodev 0000:da:01.2_qat_sym
00:06:28.566 CRYPTODEV: Initialisation parameters - name: 0000:da:01.2_qat_sym,socket id: 1, max queue pairs: 0
00:06:28.566 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.3 (socket 1)
00:06:28.567 CRYPTODEV: Creating cryptodev 0000:da:01.3_qat_asym
00:06:28.567 CRYPTODEV: Initialisation parameters - name: 0000:da:01.3_qat_asym,socket id: 1, max queue pairs: 0
00:06:28.567 CRYPTODEV: Creating cryptodev 0000:da:01.3_qat_sym
00:06:28.567 CRYPTODEV: Initialisation parameters - name: 0000:da:01.3_qat_sym,socket id: 1, max queue pairs: 0
00:06:28.567 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.4 (socket 1)
00:06:28.567 CRYPTODEV: Creating cryptodev 0000:da:01.4_qat_asym
00:06:28.567 CRYPTODEV: Initialisation parameters - name: 0000:da:01.4_qat_asym,socket id: 1, max queue pairs: 0
00:06:28.567 CRYPTODEV: Creating cryptodev 0000:da:01.4_qat_sym
00:06:28.567 CRYPTODEV: Initialisation parameters - name: 0000:da:01.4_qat_sym,socket id: 1, max queue pairs: 0
00:06:28.567 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.5 (socket 1)
00:06:28.567 CRYPTODEV: Creating cryptodev 0000:da:01.5_qat_asym
00:06:28.567 CRYPTODEV: Initialisation parameters - name: 0000:da:01.5_qat_asym,socket id: 1, max queue pairs: 0
00:06:28.567 CRYPTODEV: Creating cryptodev 0000:da:01.5_qat_sym
00:06:28.567 CRYPTODEV: Initialisation parameters - name: 0000:da:01.5_qat_sym,socket id: 1, max queue pairs: 0
00:06:28.567 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.6 (socket 1)
00:06:28.567 CRYPTODEV: Creating cryptodev 0000:da:01.6_qat_asym
00:06:28.567 CRYPTODEV: Initialisation parameters - name: 0000:da:01.6_qat_asym,socket id: 1, max queue pairs: 0
00:06:28.567 CRYPTODEV: Creating cryptodev 0000:da:01.6_qat_sym
00:06:28.567 CRYPTODEV: Initialisation parameters - name: 0000:da:01.6_qat_sym,socket id: 1, max queue pairs: 0
00:06:28.567 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.7 (socket 1)
00:06:28.567 CRYPTODEV: Creating cryptodev 0000:da:01.7_qat_asym
00:06:28.567 CRYPTODEV: Initialisation parameters - name: 0000:da:01.7_qat_asym,socket id: 1, max queue pairs: 0
00:06:28.567 CRYPTODEV: Creating cryptodev 0000:da:01.7_qat_sym
00:06:28.567 CRYPTODEV: Initialisation parameters - name: 0000:da:01.7_qat_sym,socket id: 1, max queue pairs: 0
00:06:28.567 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.0 (socket 1)
00:06:28.567 CRYPTODEV: Creating cryptodev 0000:da:02.0_qat_asym
00:06:28.567 CRYPTODEV: Initialisation parameters - name: 0000:da:02.0_qat_asym,socket id: 1, max queue pairs: 0
00:06:28.567 CRYPTODEV: Creating cryptodev 0000:da:02.0_qat_sym
00:06:28.567 CRYPTODEV: Initialisation parameters - name: 0000:da:02.0_qat_sym,socket id: 1, max queue pairs: 0
00:06:28.567 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.1 (socket 1)
00:06:28.567 CRYPTODEV: Creating cryptodev 0000:da:02.1_qat_asym
00:06:28.567 CRYPTODEV: Initialisation parameters - name: 0000:da:02.1_qat_asym,socket id: 1, max queue pairs: 0
00:06:28.567 CRYPTODEV: Creating cryptodev 0000:da:02.1_qat_sym
00:06:28.567 CRYPTODEV: Initialisation parameters - name: 0000:da:02.1_qat_sym,socket id: 1, max queue pairs: 0
00:06:28.567 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.2 (socket 1)
00:06:28.567 CRYPTODEV: Creating cryptodev 0000:da:02.2_qat_asym
00:06:28.567 CRYPTODEV: Initialisation parameters - name: 0000:da:02.2_qat_asym,socket id: 1, max queue pairs: 0
00:06:28.567 CRYPTODEV: Creating cryptodev 0000:da:02.2_qat_sym
00:06:28.567 CRYPTODEV: Initialisation parameters - name: 0000:da:02.2_qat_sym,socket id: 1, max queue pairs: 0
00:06:28.567 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.3 (socket 1)
00:06:28.567 CRYPTODEV: Creating cryptodev 0000:da:02.3_qat_asym
00:06:28.567 CRYPTODEV: Initialisation parameters - name: 0000:da:02.3_qat_asym,socket id: 1, max queue pairs: 0
00:06:28.567 CRYPTODEV: Creating cryptodev 0000:da:02.3_qat_sym
00:06:28.567 CRYPTODEV: Initialisation parameters - name: 0000:da:02.3_qat_sym,socket id: 1, max queue pairs: 0
00:06:28.567 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.4 (socket 1)
00:06:28.567 CRYPTODEV: Creating cryptodev 0000:da:02.4_qat_asym
00:06:28.567 CRYPTODEV: Initialisation parameters - name: 0000:da:02.4_qat_asym,socket id: 1, max queue pairs: 0
00:06:28.567 CRYPTODEV: Creating cryptodev 0000:da:02.4_qat_sym
00:06:28.567 CRYPTODEV: Initialisation parameters - name: 0000:da:02.4_qat_sym,socket id: 1, max queue pairs: 0
00:06:28.567 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.5 (socket 1)
00:06:28.567 CRYPTODEV: Creating cryptodev 0000:da:02.5_qat_asym
00:06:28.567 CRYPTODEV: Initialisation parameters - name: 0000:da:02.5_qat_asym,socket id: 1, max queue pairs: 0
00:06:28.567 CRYPTODEV: Creating cryptodev 0000:da:02.5_qat_sym
00:06:28.567 CRYPTODEV: Initialisation parameters - name: 0000:da:02.5_qat_sym,socket id: 1, max queue pairs: 0
00:06:28.567 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.6 (socket 1)
00:06:28.567 CRYPTODEV: Creating cryptodev 0000:da:02.6_qat_asym
00:06:28.567 CRYPTODEV: Initialisation parameters - name: 0000:da:02.6_qat_asym,socket id: 1, max queue pairs: 0
00:06:28.567 CRYPTODEV: Creating cryptodev 0000:da:02.6_qat_sym
00:06:28.567 CRYPTODEV: Initialisation parameters - name: 0000:da:02.6_qat_sym,socket id: 1, max queue pairs: 0
00:06:28.567 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.7 (socket 1)
00:06:28.567 CRYPTODEV: Creating cryptodev 0000:da:02.7_qat_asym
00:06:28.567 CRYPTODEV: Initialisation parameters - name: 0000:da:02.7_qat_asym,socket id: 1, max queue pairs: 0
00:06:28.567 CRYPTODEV: Creating cryptodev 0000:da:02.7_qat_sym
00:06:28.567 CRYPTODEV: Initialisation parameters - name: 0000:da:02.7_qat_sym,socket id: 1, max queue pairs: 0
00:06:28.567 TELEMETRY: No legacy callbacks, legacy socket not created
00:06:28.826 EAL: Using IOMMU type 1 (Type 1)
00:06:28.826 EAL: Ignore
mapping IO port bar(1) 00:06:28.826 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.0 (socket 0) 00:06:28.826 EAL: Ignore mapping IO port bar(1) 00:06:28.826 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.1 (socket 0) 00:06:28.826 EAL: Ignore mapping IO port bar(1) 00:06:28.826 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.2 (socket 0) 00:06:28.826 EAL: Ignore mapping IO port bar(1) 00:06:28.826 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.3 (socket 0) 00:06:28.826 EAL: Ignore mapping IO port bar(1) 00:06:28.826 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.4 (socket 0) 00:06:28.826 EAL: Ignore mapping IO port bar(1) 00:06:28.826 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.5 (socket 0) 00:06:28.826 EAL: Ignore mapping IO port bar(1) 00:06:28.826 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.6 (socket 0) 00:06:28.826 EAL: Ignore mapping IO port bar(1) 00:06:28.826 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.7 (socket 0) 00:06:29.085 EAL: Probe PCI driver: spdk_nvme (8086:0b60) device: 0000:5e:00.0 (socket 0) 00:06:29.085 EAL: Ignore mapping IO port bar(1) 00:06:29.085 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.0 (socket 1) 00:06:29.085 EAL: Ignore mapping IO port bar(1) 00:06:29.085 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.1 (socket 1) 00:06:29.085 EAL: Ignore mapping IO port bar(1) 00:06:29.085 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.2 (socket 1) 00:06:29.085 EAL: Ignore mapping IO port bar(1) 00:06:29.085 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.3 (socket 1) 00:06:29.085 EAL: Ignore mapping IO port bar(1) 00:06:29.085 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.4 (socket 1) 00:06:29.085 EAL: Ignore mapping IO port bar(1) 00:06:29.085 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.5 (socket 1) 
00:06:29.085 EAL: Ignore mapping IO port bar(1) 00:06:29.085 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.6 (socket 1) 00:06:29.085 EAL: Ignore mapping IO port bar(1) 00:06:29.085 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.7 (socket 1) 00:06:29.085 EAL: Ignore mapping IO port bar(1) 00:06:29.085 EAL: Ignore mapping IO port bar(5) 00:06:29.085 EAL: Probe PCI driver: spdk_vmd (8086:201d) device: 0000:85:05.5 (socket 1) 00:06:29.085 EAL: Ignore mapping IO port bar(1) 00:06:29.085 EAL: Ignore mapping IO port bar(5) 00:06:29.085 EAL: Probe PCI driver: spdk_vmd (8086:201d) device: 0000:d7:05.5 (socket 1) 00:06:32.378 EAL: Releasing PCI mapped resource for 0000:5e:00.0 00:06:32.378 EAL: Calling pci_unmap_resource for 0000:5e:00.0 at 0x202001080000 00:06:32.378 Starting DPDK initialization... 00:06:32.378 Starting SPDK post initialization... 00:06:32.378 SPDK NVMe probe 00:06:32.378 Attaching to 0000:5e:00.0 00:06:32.378 Attached to 0000:5e:00.0 00:06:32.378 Cleaning up... 
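A pattern worth noting in the probe output above: each QAT endpoint (bus 3d, 3f, and da in this log) exposes exactly 16 virtual functions, at device numbers 01 and 02 with functions 0 through 7, and each VF gets a `_qat_asym` and a `_qat_sym` cryptodev. A quick sketch of enumerating those BDF strings (bus addresses taken from this log; the helper name is made up for illustration):

```python
def qat_vf_bdfs(bus):
    """Enumerate the 16 QAT VF PCI addresses seen per bus in this log:
    devices 01 and 02, functions 0 through 7."""
    return [f"0000:{bus}:{dev:02d}.{fn}" for dev in (1, 2) for fn in range(8)]

vfs = qat_vf_bdfs("da")
# first entry "0000:da:01.0", last "0000:da:02.7", 16 total,
# matching the 0000:da:01.x / 0000:da:02.x probe lines above
```

With two cryptodevs created per VF, the 48 VFs across the three buses account for the 96 CRYPTODEV creation messages repeated through this log.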
00:06:32.378 00:06:32.378 real 0m3.538s 00:06:32.378 user 0m2.431s 00:06:32.378 sys 0m0.668s 00:06:32.378 20:20:24 env.env_dpdk_post_init -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:32.378 20:20:24 env.env_dpdk_post_init -- common/autotest_common.sh@10 -- # set +x 00:06:32.378 ************************************ 00:06:32.378 END TEST env_dpdk_post_init 00:06:32.378 ************************************ 00:06:32.378 20:20:24 env -- common/autotest_common.sh@1142 -- # return 0 00:06:32.378 20:20:24 env -- env/env.sh@26 -- # uname 00:06:32.378 20:20:24 env -- env/env.sh@26 -- # '[' Linux = Linux ']' 00:06:32.378 20:20:24 env -- env/env.sh@29 -- # run_test env_mem_callbacks /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:06:32.378 20:20:24 env -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:32.378 20:20:24 env -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:32.378 20:20:24 env -- common/autotest_common.sh@10 -- # set +x 00:06:32.379 ************************************ 00:06:32.379 START TEST env_mem_callbacks 00:06:32.379 ************************************ 00:06:32.379 20:20:24 env.env_mem_callbacks -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:06:32.379 EAL: Detected CPU lcores: 72 00:06:32.379 EAL: Detected NUMA nodes: 2 00:06:32.379 EAL: Detected shared linkage of DPDK 00:06:32.379 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:06:32.379 EAL: Selected IOVA mode 'PA' 00:06:32.379 EAL: VFIO support initialized 00:06:32.379 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.0 (socket 0) 00:06:32.379 CRYPTODEV: Creating cryptodev 0000:3d:01.0_qat_asym 00:06:32.379 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:06:32.379 CRYPTODEV: Creating cryptodev 0000:3d:01.0_qat_sym 00:06:32.379 CRYPTODEV: Initialisation parameters - name: 
0000:3d:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:06:32.379 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.1 (socket 0) 00:06:32.379 CRYPTODEV: Creating cryptodev 0000:3d:01.1_qat_asym 00:06:32.379 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:06:32.379 CRYPTODEV: Creating cryptodev 0000:3d:01.1_qat_sym 00:06:32.379 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:06:32.379 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.2 (socket 0) 00:06:32.379 CRYPTODEV: Creating cryptodev 0000:3d:01.2_qat_asym 00:06:32.379 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:06:32.379 CRYPTODEV: Creating cryptodev 0000:3d:01.2_qat_sym 00:06:32.379 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:06:32.379 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.3 (socket 0) 00:06:32.379 CRYPTODEV: Creating cryptodev 0000:3d:01.3_qat_asym 00:06:32.379 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:06:32.379 CRYPTODEV: Creating cryptodev 0000:3d:01.3_qat_sym 00:06:32.379 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:06:32.379 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.4 (socket 0) 00:06:32.379 CRYPTODEV: Creating cryptodev 0000:3d:01.4_qat_asym 00:06:32.379 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:06:32.379 CRYPTODEV: Creating cryptodev 0000:3d:01.4_qat_sym 00:06:32.379 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:06:32.379 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.5 (socket 0) 00:06:32.379 CRYPTODEV: Creating cryptodev 0000:3d:01.5_qat_asym 00:06:32.379 CRYPTODEV: Initialisation 
parameters - name: 0000:3d:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:06:32.379 CRYPTODEV: Creating cryptodev 0000:3d:01.5_qat_sym 00:06:32.379 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:06:32.379 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.6 (socket 0) 00:06:32.379 CRYPTODEV: Creating cryptodev 0000:3d:01.6_qat_asym 00:06:32.379 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:06:32.379 CRYPTODEV: Creating cryptodev 0000:3d:01.6_qat_sym 00:06:32.379 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:06:32.379 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.7 (socket 0) 00:06:32.379 CRYPTODEV: Creating cryptodev 0000:3d:01.7_qat_asym 00:06:32.379 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:06:32.379 CRYPTODEV: Creating cryptodev 0000:3d:01.7_qat_sym 00:06:32.379 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:06:32.379 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.0 (socket 0) 00:06:32.379 CRYPTODEV: Creating cryptodev 0000:3d:02.0_qat_asym 00:06:32.379 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:06:32.379 CRYPTODEV: Creating cryptodev 0000:3d:02.0_qat_sym 00:06:32.379 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:06:32.379 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.1 (socket 0) 00:06:32.379 CRYPTODEV: Creating cryptodev 0000:3d:02.1_qat_asym 00:06:32.379 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:06:32.379 CRYPTODEV: Creating cryptodev 0000:3d:02.1_qat_sym 00:06:32.379 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.1_qat_sym,socket id: 0, max queue pairs: 
0 00:06:32.379 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.2 (socket 0) 00:06:32.379 CRYPTODEV: Creating cryptodev 0000:3d:02.2_qat_asym 00:06:32.379 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:06:32.379 CRYPTODEV: Creating cryptodev 0000:3d:02.2_qat_sym 00:06:32.379 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:06:32.379 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.3 (socket 0) 00:06:32.379 CRYPTODEV: Creating cryptodev 0000:3d:02.3_qat_asym 00:06:32.379 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:06:32.379 CRYPTODEV: Creating cryptodev 0000:3d:02.3_qat_sym 00:06:32.379 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:06:32.379 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.4 (socket 0) 00:06:32.379 CRYPTODEV: Creating cryptodev 0000:3d:02.4_qat_asym 00:06:32.379 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:06:32.379 CRYPTODEV: Creating cryptodev 0000:3d:02.4_qat_sym 00:06:32.379 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:06:32.379 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.5 (socket 0) 00:06:32.379 CRYPTODEV: Creating cryptodev 0000:3d:02.5_qat_asym 00:06:32.379 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:06:32.379 CRYPTODEV: Creating cryptodev 0000:3d:02.5_qat_sym 00:06:32.379 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:06:32.379 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.6 (socket 0) 00:06:32.379 CRYPTODEV: Creating cryptodev 0000:3d:02.6_qat_asym 00:06:32.379 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.6_qat_asym,socket id: 0, 
max queue pairs: 0 00:06:32.379 CRYPTODEV: Creating cryptodev 0000:3d:02.6_qat_sym 00:06:32.379 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:06:32.379 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.7 (socket 0) 00:06:32.379 CRYPTODEV: Creating cryptodev 0000:3d:02.7_qat_asym 00:06:32.379 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:06:32.379 CRYPTODEV: Creating cryptodev 0000:3d:02.7_qat_sym 00:06:32.379 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:06:32.379 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.0 (socket 0) 00:06:32.379 CRYPTODEV: Creating cryptodev 0000:3f:01.0_qat_asym 00:06:32.379 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:06:32.379 CRYPTODEV: Creating cryptodev 0000:3f:01.0_qat_sym 00:06:32.379 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:06:32.379 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.1 (socket 0) 00:06:32.379 CRYPTODEV: Creating cryptodev 0000:3f:01.1_qat_asym 00:06:32.379 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:06:32.379 CRYPTODEV: Creating cryptodev 0000:3f:01.1_qat_sym 00:06:32.379 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:06:32.379 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.2 (socket 0) 00:06:32.379 CRYPTODEV: Creating cryptodev 0000:3f:01.2_qat_asym 00:06:32.379 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:06:32.379 CRYPTODEV: Creating cryptodev 0000:3f:01.2_qat_sym 00:06:32.379 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:06:32.379 EAL: Probe PCI driver: qat (8086:37c9) 
device: 0000:3f:01.3 (socket 0) 00:06:32.379 CRYPTODEV: Creating cryptodev 0000:3f:01.3_qat_asym 00:06:32.379 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:06:32.379 CRYPTODEV: Creating cryptodev 0000:3f:01.3_qat_sym 00:06:32.379 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:06:32.379 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.4 (socket 0) 00:06:32.379 CRYPTODEV: Creating cryptodev 0000:3f:01.4_qat_asym 00:06:32.379 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:06:32.379 CRYPTODEV: Creating cryptodev 0000:3f:01.4_qat_sym 00:06:32.379 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:06:32.379 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.5 (socket 0) 00:06:32.379 CRYPTODEV: Creating cryptodev 0000:3f:01.5_qat_asym 00:06:32.379 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:06:32.379 CRYPTODEV: Creating cryptodev 0000:3f:01.5_qat_sym 00:06:32.379 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:06:32.379 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.6 (socket 0) 00:06:32.379 CRYPTODEV: Creating cryptodev 0000:3f:01.6_qat_asym 00:06:32.379 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:06:32.379 CRYPTODEV: Creating cryptodev 0000:3f:01.6_qat_sym 00:06:32.379 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:06:32.379 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.7 (socket 0) 00:06:32.379 CRYPTODEV: Creating cryptodev 0000:3f:01.7_qat_asym 00:06:32.379 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:06:32.379 CRYPTODEV: Creating 
cryptodev 0000:3f:01.7_qat_sym 00:06:32.379 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:06:32.379 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.0 (socket 0) 00:06:32.379 CRYPTODEV: Creating cryptodev 0000:3f:02.0_qat_asym 00:06:32.379 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:06:32.379 CRYPTODEV: Creating cryptodev 0000:3f:02.0_qat_sym 00:06:32.379 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:06:32.379 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.1 (socket 0) 00:06:32.379 CRYPTODEV: Creating cryptodev 0000:3f:02.1_qat_asym 00:06:32.379 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:06:32.379 CRYPTODEV: Creating cryptodev 0000:3f:02.1_qat_sym 00:06:32.379 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:06:32.379 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.2 (socket 0) 00:06:32.379 CRYPTODEV: Creating cryptodev 0000:3f:02.2_qat_asym 00:06:32.379 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:06:32.379 CRYPTODEV: Creating cryptodev 0000:3f:02.2_qat_sym 00:06:32.379 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:06:32.379 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.3 (socket 0) 00:06:32.379 CRYPTODEV: Creating cryptodev 0000:3f:02.3_qat_asym 00:06:32.379 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:06:32.379 CRYPTODEV: Creating cryptodev 0000:3f:02.3_qat_sym 00:06:32.379 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:06:32.379 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.4 (socket 0) 00:06:32.379 
CRYPTODEV: Creating cryptodev 0000:3f:02.4_qat_asym 00:06:32.379 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:06:32.379 CRYPTODEV: Creating cryptodev 0000:3f:02.4_qat_sym 00:06:32.380 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:06:32.380 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.5 (socket 0) 00:06:32.380 CRYPTODEV: Creating cryptodev 0000:3f:02.5_qat_asym 00:06:32.380 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:06:32.380 CRYPTODEV: Creating cryptodev 0000:3f:02.5_qat_sym 00:06:32.380 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:06:32.380 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.6 (socket 0) 00:06:32.380 CRYPTODEV: Creating cryptodev 0000:3f:02.6_qat_asym 00:06:32.380 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:06:32.380 CRYPTODEV: Creating cryptodev 0000:3f:02.6_qat_sym 00:06:32.380 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:06:32.380 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.7 (socket 0) 00:06:32.380 CRYPTODEV: Creating cryptodev 0000:3f:02.7_qat_asym 00:06:32.380 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:06:32.380 CRYPTODEV: Creating cryptodev 0000:3f:02.7_qat_sym 00:06:32.380 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:06:32.380 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.0 (socket 1) 00:06:32.380 CRYPTODEV: Creating cryptodev 0000:da:01.0_qat_asym 00:06:32.380 CRYPTODEV: Initialisation parameters - name: 0000:da:01.0_qat_asym,socket id: 1, max queue pairs: 0 00:06:32.380 CRYPTODEV: Creating cryptodev 0000:da:01.0_qat_sym 00:06:32.380 
CRYPTODEV: Initialisation parameters - name: 0000:da:01.0_qat_sym,socket id: 1, max queue pairs: 0 00:06:32.380 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.1 (socket 1) 00:06:32.380 CRYPTODEV: Creating cryptodev 0000:da:01.1_qat_asym 00:06:32.380 CRYPTODEV: Initialisation parameters - name: 0000:da:01.1_qat_asym,socket id: 1, max queue pairs: 0 00:06:32.380 CRYPTODEV: Creating cryptodev 0000:da:01.1_qat_sym 00:06:32.380 CRYPTODEV: Initialisation parameters - name: 0000:da:01.1_qat_sym,socket id: 1, max queue pairs: 0 00:06:32.380 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.2 (socket 1) 00:06:32.380 CRYPTODEV: Creating cryptodev 0000:da:01.2_qat_asym 00:06:32.380 CRYPTODEV: Initialisation parameters - name: 0000:da:01.2_qat_asym,socket id: 1, max queue pairs: 0 00:06:32.380 CRYPTODEV: Creating cryptodev 0000:da:01.2_qat_sym 00:06:32.380 CRYPTODEV: Initialisation parameters - name: 0000:da:01.2_qat_sym,socket id: 1, max queue pairs: 0 00:06:32.380 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.3 (socket 1) 00:06:32.380 CRYPTODEV: Creating cryptodev 0000:da:01.3_qat_asym 00:06:32.380 CRYPTODEV: Initialisation parameters - name: 0000:da:01.3_qat_asym,socket id: 1, max queue pairs: 0 00:06:32.380 CRYPTODEV: Creating cryptodev 0000:da:01.3_qat_sym 00:06:32.380 CRYPTODEV: Initialisation parameters - name: 0000:da:01.3_qat_sym,socket id: 1, max queue pairs: 0 00:06:32.380 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.4 (socket 1) 00:06:32.380 CRYPTODEV: Creating cryptodev 0000:da:01.4_qat_asym 00:06:32.380 CRYPTODEV: Initialisation parameters - name: 0000:da:01.4_qat_asym,socket id: 1, max queue pairs: 0 00:06:32.380 CRYPTODEV: Creating cryptodev 0000:da:01.4_qat_sym 00:06:32.380 CRYPTODEV: Initialisation parameters - name: 0000:da:01.4_qat_sym,socket id: 1, max queue pairs: 0 00:06:32.380 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.5 (socket 1) 00:06:32.380 CRYPTODEV: Creating cryptodev 0000:da:01.5_qat_asym 
00:06:32.380 CRYPTODEV: Initialisation parameters - name: 0000:da:01.5_qat_asym,socket id: 1, max queue pairs: 0 00:06:32.380 CRYPTODEV: Creating cryptodev 0000:da:01.5_qat_sym 00:06:32.380 CRYPTODEV: Initialisation parameters - name: 0000:da:01.5_qat_sym,socket id: 1, max queue pairs: 0 00:06:32.380 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.6 (socket 1) 00:06:32.380 CRYPTODEV: Creating cryptodev 0000:da:01.6_qat_asym 00:06:32.380 CRYPTODEV: Initialisation parameters - name: 0000:da:01.6_qat_asym,socket id: 1, max queue pairs: 0 00:06:32.380 CRYPTODEV: Creating cryptodev 0000:da:01.6_qat_sym 00:06:32.380 CRYPTODEV: Initialisation parameters - name: 0000:da:01.6_qat_sym,socket id: 1, max queue pairs: 0 00:06:32.380 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.7 (socket 1) 00:06:32.380 CRYPTODEV: Creating cryptodev 0000:da:01.7_qat_asym 00:06:32.380 CRYPTODEV: Initialisation parameters - name: 0000:da:01.7_qat_asym,socket id: 1, max queue pairs: 0 00:06:32.380 CRYPTODEV: Creating cryptodev 0000:da:01.7_qat_sym 00:06:32.380 CRYPTODEV: Initialisation parameters - name: 0000:da:01.7_qat_sym,socket id: 1, max queue pairs: 0 00:06:32.380 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.0 (socket 1) 00:06:32.380 CRYPTODEV: Creating cryptodev 0000:da:02.0_qat_asym 00:06:32.380 CRYPTODEV: Initialisation parameters - name: 0000:da:02.0_qat_asym,socket id: 1, max queue pairs: 0 00:06:32.380 CRYPTODEV: Creating cryptodev 0000:da:02.0_qat_sym 00:06:32.380 CRYPTODEV: Initialisation parameters - name: 0000:da:02.0_qat_sym,socket id: 1, max queue pairs: 0 00:06:32.380 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.1 (socket 1) 00:06:32.380 CRYPTODEV: Creating cryptodev 0000:da:02.1_qat_asym 00:06:32.380 CRYPTODEV: Initialisation parameters - name: 0000:da:02.1_qat_asym,socket id: 1, max queue pairs: 0 00:06:32.380 CRYPTODEV: Creating cryptodev 0000:da:02.1_qat_sym 00:06:32.380 CRYPTODEV: Initialisation parameters - name: 
0000:da:02.1_qat_sym,socket id: 1, max queue pairs: 0 00:06:32.380 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.2 (socket 1) 00:06:32.380 CRYPTODEV: Creating cryptodev 0000:da:02.2_qat_asym 00:06:32.380 CRYPTODEV: Initialisation parameters - name: 0000:da:02.2_qat_asym,socket id: 1, max queue pairs: 0 00:06:32.380 CRYPTODEV: Creating cryptodev 0000:da:02.2_qat_sym 00:06:32.380 CRYPTODEV: Initialisation parameters - name: 0000:da:02.2_qat_sym,socket id: 1, max queue pairs: 0 00:06:32.380 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.3 (socket 1) 00:06:32.380 CRYPTODEV: Creating cryptodev 0000:da:02.3_qat_asym 00:06:32.380 CRYPTODEV: Initialisation parameters - name: 0000:da:02.3_qat_asym,socket id: 1, max queue pairs: 0 00:06:32.380 CRYPTODEV: Creating cryptodev 0000:da:02.3_qat_sym 00:06:32.380 CRYPTODEV: Initialisation parameters - name: 0000:da:02.3_qat_sym,socket id: 1, max queue pairs: 0 00:06:32.380 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.4 (socket 1) 00:06:32.380 CRYPTODEV: Creating cryptodev 0000:da:02.4_qat_asym 00:06:32.380 CRYPTODEV: Initialisation parameters - name: 0000:da:02.4_qat_asym,socket id: 1, max queue pairs: 0 00:06:32.380 CRYPTODEV: Creating cryptodev 0000:da:02.4_qat_sym 00:06:32.380 CRYPTODEV: Initialisation parameters - name: 0000:da:02.4_qat_sym,socket id: 1, max queue pairs: 0 00:06:32.380 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.5 (socket 1) 00:06:32.380 CRYPTODEV: Creating cryptodev 0000:da:02.5_qat_asym 00:06:32.380 CRYPTODEV: Initialisation parameters - name: 0000:da:02.5_qat_asym,socket id: 1, max queue pairs: 0 00:06:32.380 CRYPTODEV: Creating cryptodev 0000:da:02.5_qat_sym 00:06:32.380 CRYPTODEV: Initialisation parameters - name: 0000:da:02.5_qat_sym,socket id: 1, max queue pairs: 0 00:06:32.380 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.6 (socket 1) 00:06:32.380 CRYPTODEV: Creating cryptodev 0000:da:02.6_qat_asym 00:06:32.380 CRYPTODEV: Initialisation 
parameters - name: 0000:da:02.6_qat_asym,socket id: 1, max queue pairs: 0 00:06:32.380 CRYPTODEV: Creating cryptodev 0000:da:02.6_qat_sym 00:06:32.380 CRYPTODEV: Initialisation parameters - name: 0000:da:02.6_qat_sym,socket id: 1, max queue pairs: 0 00:06:32.380 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.7 (socket 1) 00:06:32.380 CRYPTODEV: Creating cryptodev 0000:da:02.7_qat_asym 00:06:32.380 CRYPTODEV: Initialisation parameters - name: 0000:da:02.7_qat_asym,socket id: 1, max queue pairs: 0 00:06:32.380 CRYPTODEV: Creating cryptodev 0000:da:02.7_qat_sym 00:06:32.380 CRYPTODEV: Initialisation parameters - name: 0000:da:02.7_qat_sym,socket id: 1, max queue pairs: 0 00:06:32.380 TELEMETRY: No legacy callbacks, legacy socket not created 00:06:32.380 00:06:32.380 00:06:32.380 CUnit - A unit testing framework for C - Version 2.1-3 00:06:32.380 http://cunit.sourceforge.net/ 00:06:32.380 00:06:32.380 00:06:32.380 Suite: memory 00:06:32.380 Test: test ... 00:06:32.380 register 0x200000200000 2097152 00:06:32.380 register 0x201000a00000 2097152 00:06:32.380 malloc 3145728 00:06:32.380 register 0x200000400000 4194304 00:06:32.380 buf 0x200000500000 len 3145728 PASSED 00:06:32.380 malloc 64 00:06:32.380 buf 0x2000004fff40 len 64 PASSED 00:06:32.380 malloc 4194304 00:06:32.380 register 0x200000800000 6291456 00:06:32.380 buf 0x200000a00000 len 4194304 PASSED 00:06:32.380 free 0x200000500000 3145728 00:06:32.380 free 0x2000004fff40 64 00:06:32.380 unregister 0x200000400000 4194304 PASSED 00:06:32.380 free 0x200000a00000 4194304 00:06:32.380 unregister 0x200000800000 6291456 PASSED 00:06:32.380 malloc 8388608 00:06:32.380 register 0x200000400000 10485760 00:06:32.380 buf 0x200000600000 len 8388608 PASSED 00:06:32.380 free 0x200000600000 8388608 00:06:32.380 unregister 0x200000400000 10485760 PASSED 00:06:32.380 passed 00:06:32.380 00:06:32.380 Run Summary: Type Total Ran Passed Failed Inactive 00:06:32.380 suites 1 1 n/a 0 0 00:06:32.380 tests 1 1 1 0 0 
00:06:32.380 asserts 16 16 16 0 n/a 00:06:32.380 00:06:32.380 Elapsed time = 0.008 seconds 00:06:32.380 00:06:32.380 real 0m0.182s 00:06:32.380 user 0m0.040s 00:06:32.380 sys 0m0.139s 00:06:32.380 20:20:24 env.env_mem_callbacks -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:32.380 20:20:24 env.env_mem_callbacks -- common/autotest_common.sh@10 -- # set +x 00:06:32.380 ************************************ 00:06:32.380 END TEST env_mem_callbacks 00:06:32.380 ************************************ 00:06:32.380 20:20:24 env -- common/autotest_common.sh@1142 -- # return 0 00:06:32.380 00:06:32.380 real 0m5.949s 00:06:32.380 user 0m3.672s 00:06:32.380 sys 0m1.844s 00:06:32.380 20:20:24 env -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:32.380 20:20:24 env -- common/autotest_common.sh@10 -- # set +x 00:06:32.381 ************************************ 00:06:32.381 END TEST env 00:06:32.381 ************************************ 00:06:32.381 20:20:24 -- common/autotest_common.sh@1142 -- # return 0 00:06:32.381 20:20:24 -- spdk/autotest.sh@169 -- # run_test rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/rpc.sh 00:06:32.381 20:20:24 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:32.381 20:20:24 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:32.381 20:20:24 -- common/autotest_common.sh@10 -- # set +x 00:06:32.381 ************************************ 00:06:32.381 START TEST rpc 00:06:32.381 ************************************ 00:06:32.381 20:20:24 rpc -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/rpc.sh 00:06:32.381 * Looking for test storage... 
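The env_mem_callbacks trace above (register 0x200000400000 4194304 after malloc 3145728, but no new register for the following malloc 64) illustrates region-granular memory events: the allocator fires a register callback only when it must grow the heap by whole hugepages, and small allocations are served from the slack of already-registered regions. A toy Python model of that bookkeeping (this is an illustration of the pattern, not SPDK or DPDK code):

```python
HUGEPAGE = 2 * 1024 * 1024  # 2 MiB, matching the register sizes in the log

class MemEventTracker:
    """Toy model of DPDK-style memory event callbacks: fire a 'register'
    event only when the heap must grow (rounded up to whole hugepages),
    so small allocations after a large one produce no new event."""

    def __init__(self):
        self.events = []
        self.free_bytes = 0

    def malloc(self, size):
        if size > self.free_bytes:
            # grow by enough whole hugepages to cover the shortfall
            grow = -(-(size - self.free_bytes) // HUGEPAGE) * HUGEPAGE
            self.events.append(("register", grow))
            self.free_bytes += grow
        self.free_bytes -= size

t = MemEventTracker()
t.malloc(3145728)  # 3 MiB request -> one register event of 4 MiB
t.malloc(64)       # fits in the remaining slack -> no new event
```

Under this model the 3 MiB malloc registers 4194304 bytes and the 64-byte malloc registers nothing, which is exactly the sequence the CUnit test prints.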
00:06:32.381 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:06:32.381 20:20:24 rpc -- rpc/rpc.sh@65 -- # spdk_pid=1306680 00:06:32.381 20:20:24 rpc -- rpc/rpc.sh@66 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:06:32.381 20:20:24 rpc -- rpc/rpc.sh@64 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -e bdev 00:06:32.381 20:20:24 rpc -- rpc/rpc.sh@67 -- # waitforlisten 1306680 00:06:32.381 20:20:24 rpc -- common/autotest_common.sh@829 -- # '[' -z 1306680 ']' 00:06:32.381 20:20:24 rpc -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:32.381 20:20:24 rpc -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:32.381 20:20:24 rpc -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:32.381 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:32.381 20:20:24 rpc -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:32.381 20:20:24 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:32.640 [2024-07-15 20:20:24.813933] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 00:06:32.640 [2024-07-15 20:20:24.814012] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1306680 ] 00:06:32.640 [2024-07-15 20:20:24.943007] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:32.899 [2024-07-15 20:20:25.047688] app.c: 603:app_setup_trace: *NOTICE*: Tracepoint Group Mask bdev specified. 00:06:32.899 [2024-07-15 20:20:25.047736] app.c: 604:app_setup_trace: *NOTICE*: Use 'spdk_trace -s spdk_tgt -p 1306680' to capture a snapshot of events at runtime. 
00:06:32.899 [2024-07-15 20:20:25.047750] app.c: 609:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:06:32.899 [2024-07-15 20:20:25.047764] app.c: 610:app_setup_trace: *NOTICE*: SPDK application currently running. 00:06:32.899 [2024-07-15 20:20:25.047775] app.c: 611:app_setup_trace: *NOTICE*: Or copy /dev/shm/spdk_tgt_trace.pid1306680 for offline analysis/debug. 00:06:32.899 [2024-07-15 20:20:25.047806] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:33.465 20:20:25 rpc -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:33.465 20:20:25 rpc -- common/autotest_common.sh@862 -- # return 0 00:06:33.465 20:20:25 rpc -- rpc/rpc.sh@69 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:06:33.465 20:20:25 rpc -- rpc/rpc.sh@69 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:06:33.465 20:20:25 rpc -- rpc/rpc.sh@72 -- # rpc=rpc_cmd 00:06:33.465 20:20:25 rpc -- rpc/rpc.sh@73 -- # run_test rpc_integrity rpc_integrity 00:06:33.465 20:20:25 rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:33.465 20:20:25 rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:33.465 20:20:25 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:33.465 ************************************ 00:06:33.465 START TEST rpc_integrity 00:06:33.465 ************************************ 00:06:33.465 20:20:25 rpc.rpc_integrity -- common/autotest_common.sh@1123 -- # rpc_integrity 
00:06:33.465 20:20:25 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:06:33.465 20:20:25 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:33.465 20:20:25 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:33.465 20:20:25 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:33.465 20:20:25 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # bdevs='[]' 00:06:33.465 20:20:25 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # jq length 00:06:33.465 20:20:25 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:06:33.465 20:20:25 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:06:33.465 20:20:25 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:33.465 20:20:25 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:33.724 20:20:25 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:33.724 20:20:25 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc0 00:06:33.724 20:20:25 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:06:33.724 20:20:25 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:33.724 20:20:25 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:33.724 20:20:25 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:33.724 20:20:25 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:06:33.724 { 00:06:33.724 "name": "Malloc0", 00:06:33.724 "aliases": [ 00:06:33.724 "b7f325c2-ad0a-4189-92e4-9421979faa36" 00:06:33.724 ], 00:06:33.724 "product_name": "Malloc disk", 00:06:33.724 "block_size": 512, 00:06:33.724 "num_blocks": 16384, 00:06:33.724 "uuid": "b7f325c2-ad0a-4189-92e4-9421979faa36", 00:06:33.724 "assigned_rate_limits": { 00:06:33.724 "rw_ios_per_sec": 0, 00:06:33.724 "rw_mbytes_per_sec": 0, 00:06:33.724 "r_mbytes_per_sec": 0, 00:06:33.724 "w_mbytes_per_sec": 0 00:06:33.724 }, 00:06:33.724 "claimed": false, 00:06:33.724 
"zoned": false, 00:06:33.724 "supported_io_types": { 00:06:33.724 "read": true, 00:06:33.724 "write": true, 00:06:33.724 "unmap": true, 00:06:33.724 "flush": true, 00:06:33.724 "reset": true, 00:06:33.724 "nvme_admin": false, 00:06:33.724 "nvme_io": false, 00:06:33.724 "nvme_io_md": false, 00:06:33.724 "write_zeroes": true, 00:06:33.724 "zcopy": true, 00:06:33.724 "get_zone_info": false, 00:06:33.724 "zone_management": false, 00:06:33.724 "zone_append": false, 00:06:33.724 "compare": false, 00:06:33.724 "compare_and_write": false, 00:06:33.724 "abort": true, 00:06:33.724 "seek_hole": false, 00:06:33.724 "seek_data": false, 00:06:33.724 "copy": true, 00:06:33.724 "nvme_iov_md": false 00:06:33.724 }, 00:06:33.724 "memory_domains": [ 00:06:33.724 { 00:06:33.724 "dma_device_id": "system", 00:06:33.724 "dma_device_type": 1 00:06:33.724 }, 00:06:33.724 { 00:06:33.724 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:33.724 "dma_device_type": 2 00:06:33.724 } 00:06:33.724 ], 00:06:33.724 "driver_specific": {} 00:06:33.724 } 00:06:33.724 ]' 00:06:33.724 20:20:25 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # jq length 00:06:33.724 20:20:25 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:06:33.724 20:20:25 rpc.rpc_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc0 -p Passthru0 00:06:33.724 20:20:25 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:33.724 20:20:25 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:33.724 [2024-07-15 20:20:25.928890] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc0 00:06:33.724 [2024-07-15 20:20:25.928939] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:06:33.724 [2024-07-15 20:20:25.928960] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x9a5eb0 00:06:33.724 [2024-07-15 20:20:25.928973] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:06:33.724 [2024-07-15 
20:20:25.930457] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:06:33.724 [2024-07-15 20:20:25.930485] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:06:33.724 Passthru0 00:06:33.724 20:20:25 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:33.724 20:20:25 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:06:33.724 20:20:25 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:33.724 20:20:25 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:33.724 20:20:25 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:33.724 20:20:25 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # bdevs='[ 00:06:33.724 { 00:06:33.724 "name": "Malloc0", 00:06:33.724 "aliases": [ 00:06:33.724 "b7f325c2-ad0a-4189-92e4-9421979faa36" 00:06:33.724 ], 00:06:33.724 "product_name": "Malloc disk", 00:06:33.724 "block_size": 512, 00:06:33.724 "num_blocks": 16384, 00:06:33.724 "uuid": "b7f325c2-ad0a-4189-92e4-9421979faa36", 00:06:33.724 "assigned_rate_limits": { 00:06:33.724 "rw_ios_per_sec": 0, 00:06:33.724 "rw_mbytes_per_sec": 0, 00:06:33.724 "r_mbytes_per_sec": 0, 00:06:33.724 "w_mbytes_per_sec": 0 00:06:33.724 }, 00:06:33.724 "claimed": true, 00:06:33.724 "claim_type": "exclusive_write", 00:06:33.724 "zoned": false, 00:06:33.724 "supported_io_types": { 00:06:33.724 "read": true, 00:06:33.724 "write": true, 00:06:33.724 "unmap": true, 00:06:33.724 "flush": true, 00:06:33.724 "reset": true, 00:06:33.724 "nvme_admin": false, 00:06:33.724 "nvme_io": false, 00:06:33.724 "nvme_io_md": false, 00:06:33.724 "write_zeroes": true, 00:06:33.724 "zcopy": true, 00:06:33.724 "get_zone_info": false, 00:06:33.724 "zone_management": false, 00:06:33.724 "zone_append": false, 00:06:33.724 "compare": false, 00:06:33.724 "compare_and_write": false, 00:06:33.724 "abort": true, 00:06:33.724 "seek_hole": false, 00:06:33.724 "seek_data": false, 
00:06:33.724 "copy": true, 00:06:33.724 "nvme_iov_md": false 00:06:33.724 }, 00:06:33.724 "memory_domains": [ 00:06:33.724 { 00:06:33.724 "dma_device_id": "system", 00:06:33.724 "dma_device_type": 1 00:06:33.724 }, 00:06:33.724 { 00:06:33.724 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:33.724 "dma_device_type": 2 00:06:33.724 } 00:06:33.724 ], 00:06:33.724 "driver_specific": {} 00:06:33.724 }, 00:06:33.724 { 00:06:33.724 "name": "Passthru0", 00:06:33.724 "aliases": [ 00:06:33.724 "07a69963-9445-5e05-8327-e8b9069b3f91" 00:06:33.724 ], 00:06:33.724 "product_name": "passthru", 00:06:33.724 "block_size": 512, 00:06:33.724 "num_blocks": 16384, 00:06:33.725 "uuid": "07a69963-9445-5e05-8327-e8b9069b3f91", 00:06:33.725 "assigned_rate_limits": { 00:06:33.725 "rw_ios_per_sec": 0, 00:06:33.725 "rw_mbytes_per_sec": 0, 00:06:33.725 "r_mbytes_per_sec": 0, 00:06:33.725 "w_mbytes_per_sec": 0 00:06:33.725 }, 00:06:33.725 "claimed": false, 00:06:33.725 "zoned": false, 00:06:33.725 "supported_io_types": { 00:06:33.725 "read": true, 00:06:33.725 "write": true, 00:06:33.725 "unmap": true, 00:06:33.725 "flush": true, 00:06:33.725 "reset": true, 00:06:33.725 "nvme_admin": false, 00:06:33.725 "nvme_io": false, 00:06:33.725 "nvme_io_md": false, 00:06:33.725 "write_zeroes": true, 00:06:33.725 "zcopy": true, 00:06:33.725 "get_zone_info": false, 00:06:33.725 "zone_management": false, 00:06:33.725 "zone_append": false, 00:06:33.725 "compare": false, 00:06:33.725 "compare_and_write": false, 00:06:33.725 "abort": true, 00:06:33.725 "seek_hole": false, 00:06:33.725 "seek_data": false, 00:06:33.725 "copy": true, 00:06:33.725 "nvme_iov_md": false 00:06:33.725 }, 00:06:33.725 "memory_domains": [ 00:06:33.725 { 00:06:33.725 "dma_device_id": "system", 00:06:33.725 "dma_device_type": 1 00:06:33.725 }, 00:06:33.725 { 00:06:33.725 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:33.725 "dma_device_type": 2 00:06:33.725 } 00:06:33.725 ], 00:06:33.725 "driver_specific": { 00:06:33.725 "passthru": { 
00:06:33.725 "name": "Passthru0", 00:06:33.725 "base_bdev_name": "Malloc0" 00:06:33.725 } 00:06:33.725 } 00:06:33.725 } 00:06:33.725 ]' 00:06:33.725 20:20:25 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # jq length 00:06:33.725 20:20:26 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:06:33.725 20:20:26 rpc.rpc_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:06:33.725 20:20:26 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:33.725 20:20:26 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:33.725 20:20:26 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:33.725 20:20:26 rpc.rpc_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc0 00:06:33.725 20:20:26 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:33.725 20:20:26 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:33.725 20:20:26 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:33.725 20:20:26 rpc.rpc_integrity -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:06:33.725 20:20:26 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:33.725 20:20:26 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:33.725 20:20:26 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:33.725 20:20:26 rpc.rpc_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:06:33.725 20:20:26 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # jq length 00:06:33.725 20:20:26 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:06:33.725 00:06:33.725 real 0m0.307s 00:06:33.725 user 0m0.187s 00:06:33.725 sys 0m0.055s 00:06:33.725 20:20:26 rpc.rpc_integrity -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:33.725 20:20:26 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:33.725 ************************************ 00:06:33.725 END TEST rpc_integrity 00:06:33.725 
************************************ 00:06:33.984 20:20:26 rpc -- common/autotest_common.sh@1142 -- # return 0 00:06:33.984 20:20:26 rpc -- rpc/rpc.sh@74 -- # run_test rpc_plugins rpc_plugins 00:06:33.984 20:20:26 rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:33.984 20:20:26 rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:33.984 20:20:26 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:33.984 ************************************ 00:06:33.984 START TEST rpc_plugins 00:06:33.984 ************************************ 00:06:33.984 20:20:26 rpc.rpc_plugins -- common/autotest_common.sh@1123 -- # rpc_plugins 00:06:33.984 20:20:26 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # rpc_cmd --plugin rpc_plugin create_malloc 00:06:33.984 20:20:26 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:33.984 20:20:26 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:33.984 20:20:26 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:33.984 20:20:26 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # malloc=Malloc1 00:06:33.984 20:20:26 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # rpc_cmd bdev_get_bdevs 00:06:33.984 20:20:26 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:33.984 20:20:26 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:33.984 20:20:26 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:33.984 20:20:26 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # bdevs='[ 00:06:33.984 { 00:06:33.984 "name": "Malloc1", 00:06:33.984 "aliases": [ 00:06:33.984 "a5ef8433-2875-4547-a15c-6a0f938801a4" 00:06:33.984 ], 00:06:33.984 "product_name": "Malloc disk", 00:06:33.984 "block_size": 4096, 00:06:33.984 "num_blocks": 256, 00:06:33.984 "uuid": "a5ef8433-2875-4547-a15c-6a0f938801a4", 00:06:33.984 "assigned_rate_limits": { 00:06:33.984 "rw_ios_per_sec": 0, 00:06:33.984 "rw_mbytes_per_sec": 0, 00:06:33.984 "r_mbytes_per_sec": 0, 00:06:33.984 "w_mbytes_per_sec": 0 
00:06:33.984 }, 00:06:33.984 "claimed": false, 00:06:33.984 "zoned": false, 00:06:33.984 "supported_io_types": { 00:06:33.984 "read": true, 00:06:33.984 "write": true, 00:06:33.984 "unmap": true, 00:06:33.984 "flush": true, 00:06:33.984 "reset": true, 00:06:33.984 "nvme_admin": false, 00:06:33.984 "nvme_io": false, 00:06:33.984 "nvme_io_md": false, 00:06:33.984 "write_zeroes": true, 00:06:33.984 "zcopy": true, 00:06:33.984 "get_zone_info": false, 00:06:33.984 "zone_management": false, 00:06:33.984 "zone_append": false, 00:06:33.984 "compare": false, 00:06:33.984 "compare_and_write": false, 00:06:33.984 "abort": true, 00:06:33.984 "seek_hole": false, 00:06:33.984 "seek_data": false, 00:06:33.984 "copy": true, 00:06:33.984 "nvme_iov_md": false 00:06:33.984 }, 00:06:33.984 "memory_domains": [ 00:06:33.984 { 00:06:33.984 "dma_device_id": "system", 00:06:33.984 "dma_device_type": 1 00:06:33.984 }, 00:06:33.984 { 00:06:33.984 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:33.984 "dma_device_type": 2 00:06:33.984 } 00:06:33.984 ], 00:06:33.984 "driver_specific": {} 00:06:33.984 } 00:06:33.984 ]' 00:06:33.984 20:20:26 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # jq length 00:06:33.984 20:20:26 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # '[' 1 == 1 ']' 00:06:33.984 20:20:26 rpc.rpc_plugins -- rpc/rpc.sh@34 -- # rpc_cmd --plugin rpc_plugin delete_malloc Malloc1 00:06:33.984 20:20:26 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:33.984 20:20:26 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:33.984 20:20:26 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:33.984 20:20:26 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # rpc_cmd bdev_get_bdevs 00:06:33.984 20:20:26 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:33.984 20:20:26 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:33.984 20:20:26 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:33.984 20:20:26 
rpc.rpc_plugins -- rpc/rpc.sh@35 -- # bdevs='[]' 00:06:33.984 20:20:26 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # jq length 00:06:33.984 20:20:26 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # '[' 0 == 0 ']' 00:06:33.984 00:06:33.984 real 0m0.155s 00:06:33.984 user 0m0.090s 00:06:33.984 sys 0m0.030s 00:06:33.984 20:20:26 rpc.rpc_plugins -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:33.984 20:20:26 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:33.984 ************************************ 00:06:33.984 END TEST rpc_plugins 00:06:33.984 ************************************ 00:06:34.243 20:20:26 rpc -- common/autotest_common.sh@1142 -- # return 0 00:06:34.243 20:20:26 rpc -- rpc/rpc.sh@75 -- # run_test rpc_trace_cmd_test rpc_trace_cmd_test 00:06:34.243 20:20:26 rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:34.243 20:20:26 rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:34.243 20:20:26 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:34.243 ************************************ 00:06:34.243 START TEST rpc_trace_cmd_test 00:06:34.243 ************************************ 00:06:34.243 20:20:26 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1123 -- # rpc_trace_cmd_test 00:06:34.243 20:20:26 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@40 -- # local info 00:06:34.243 20:20:26 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # rpc_cmd trace_get_info 00:06:34.243 20:20:26 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:34.243 20:20:26 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:06:34.243 20:20:26 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:34.243 20:20:26 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # info='{ 00:06:34.243 "tpoint_shm_path": "/dev/shm/spdk_tgt_trace.pid1306680", 00:06:34.243 "tpoint_group_mask": "0x8", 00:06:34.243 "iscsi_conn": { 00:06:34.243 "mask": "0x2", 00:06:34.243 "tpoint_mask": "0x0" 00:06:34.243 }, 00:06:34.243 
"scsi": { 00:06:34.243 "mask": "0x4", 00:06:34.243 "tpoint_mask": "0x0" 00:06:34.243 }, 00:06:34.243 "bdev": { 00:06:34.243 "mask": "0x8", 00:06:34.243 "tpoint_mask": "0xffffffffffffffff" 00:06:34.243 }, 00:06:34.243 "nvmf_rdma": { 00:06:34.243 "mask": "0x10", 00:06:34.243 "tpoint_mask": "0x0" 00:06:34.243 }, 00:06:34.243 "nvmf_tcp": { 00:06:34.243 "mask": "0x20", 00:06:34.243 "tpoint_mask": "0x0" 00:06:34.243 }, 00:06:34.243 "ftl": { 00:06:34.243 "mask": "0x40", 00:06:34.243 "tpoint_mask": "0x0" 00:06:34.243 }, 00:06:34.243 "blobfs": { 00:06:34.243 "mask": "0x80", 00:06:34.243 "tpoint_mask": "0x0" 00:06:34.243 }, 00:06:34.243 "dsa": { 00:06:34.243 "mask": "0x200", 00:06:34.243 "tpoint_mask": "0x0" 00:06:34.243 }, 00:06:34.243 "thread": { 00:06:34.243 "mask": "0x400", 00:06:34.243 "tpoint_mask": "0x0" 00:06:34.243 }, 00:06:34.243 "nvme_pcie": { 00:06:34.243 "mask": "0x800", 00:06:34.243 "tpoint_mask": "0x0" 00:06:34.243 }, 00:06:34.243 "iaa": { 00:06:34.243 "mask": "0x1000", 00:06:34.243 "tpoint_mask": "0x0" 00:06:34.243 }, 00:06:34.243 "nvme_tcp": { 00:06:34.243 "mask": "0x2000", 00:06:34.243 "tpoint_mask": "0x0" 00:06:34.243 }, 00:06:34.243 "bdev_nvme": { 00:06:34.243 "mask": "0x4000", 00:06:34.243 "tpoint_mask": "0x0" 00:06:34.243 }, 00:06:34.243 "sock": { 00:06:34.243 "mask": "0x8000", 00:06:34.243 "tpoint_mask": "0x0" 00:06:34.243 } 00:06:34.243 }' 00:06:34.243 20:20:26 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # jq length 00:06:34.243 20:20:26 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # '[' 16 -gt 2 ']' 00:06:34.243 20:20:26 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # jq 'has("tpoint_group_mask")' 00:06:34.243 20:20:26 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # '[' true = true ']' 00:06:34.243 20:20:26 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # jq 'has("tpoint_shm_path")' 00:06:34.243 20:20:26 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # '[' true = true ']' 00:06:34.243 20:20:26 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # jq 'has("bdev")' 00:06:34.243 
20:20:26 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # '[' true = true ']' 00:06:34.243 20:20:26 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # jq -r .bdev.tpoint_mask 00:06:34.502 20:20:26 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # '[' 0xffffffffffffffff '!=' 0x0 ']' 00:06:34.502 00:06:34.502 real 0m0.246s 00:06:34.502 user 0m0.202s 00:06:34.502 sys 0m0.036s 00:06:34.502 20:20:26 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:34.502 20:20:26 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:06:34.502 ************************************ 00:06:34.502 END TEST rpc_trace_cmd_test 00:06:34.502 ************************************ 00:06:34.502 20:20:26 rpc -- common/autotest_common.sh@1142 -- # return 0 00:06:34.502 20:20:26 rpc -- rpc/rpc.sh@76 -- # [[ 0 -eq 1 ]] 00:06:34.502 20:20:26 rpc -- rpc/rpc.sh@80 -- # rpc=rpc_cmd 00:06:34.502 20:20:26 rpc -- rpc/rpc.sh@81 -- # run_test rpc_daemon_integrity rpc_integrity 00:06:34.502 20:20:26 rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:34.502 20:20:26 rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:34.502 20:20:26 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:34.502 ************************************ 00:06:34.502 START TEST rpc_daemon_integrity 00:06:34.502 ************************************ 00:06:34.502 20:20:26 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1123 -- # rpc_integrity 00:06:34.502 20:20:26 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:06:34.502 20:20:26 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:34.502 20:20:26 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:34.502 20:20:26 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:34.502 20:20:26 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # bdevs='[]' 00:06:34.502 20:20:26 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # jq length 
00:06:34.502 20:20:26 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:06:34.502 20:20:26 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:06:34.502 20:20:26 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:34.502 20:20:26 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:34.502 20:20:26 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:34.502 20:20:26 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc2 00:06:34.502 20:20:26 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:06:34.502 20:20:26 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:34.502 20:20:26 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:34.502 20:20:26 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:34.502 20:20:26 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:06:34.502 { 00:06:34.502 "name": "Malloc2", 00:06:34.502 "aliases": [ 00:06:34.502 "8432a275-a982-4fcd-a5bf-301f5780918d" 00:06:34.502 ], 00:06:34.502 "product_name": "Malloc disk", 00:06:34.502 "block_size": 512, 00:06:34.503 "num_blocks": 16384, 00:06:34.503 "uuid": "8432a275-a982-4fcd-a5bf-301f5780918d", 00:06:34.503 "assigned_rate_limits": { 00:06:34.503 "rw_ios_per_sec": 0, 00:06:34.503 "rw_mbytes_per_sec": 0, 00:06:34.503 "r_mbytes_per_sec": 0, 00:06:34.503 "w_mbytes_per_sec": 0 00:06:34.503 }, 00:06:34.503 "claimed": false, 00:06:34.503 "zoned": false, 00:06:34.503 "supported_io_types": { 00:06:34.503 "read": true, 00:06:34.503 "write": true, 00:06:34.503 "unmap": true, 00:06:34.503 "flush": true, 00:06:34.503 "reset": true, 00:06:34.503 "nvme_admin": false, 00:06:34.503 "nvme_io": false, 00:06:34.503 "nvme_io_md": false, 00:06:34.503 "write_zeroes": true, 00:06:34.503 "zcopy": true, 00:06:34.503 "get_zone_info": false, 00:06:34.503 "zone_management": 
false, 00:06:34.503 "zone_append": false, 00:06:34.503 "compare": false, 00:06:34.503 "compare_and_write": false, 00:06:34.503 "abort": true, 00:06:34.503 "seek_hole": false, 00:06:34.503 "seek_data": false, 00:06:34.503 "copy": true, 00:06:34.503 "nvme_iov_md": false 00:06:34.503 }, 00:06:34.503 "memory_domains": [ 00:06:34.503 { 00:06:34.503 "dma_device_id": "system", 00:06:34.503 "dma_device_type": 1 00:06:34.503 }, 00:06:34.503 { 00:06:34.503 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:34.503 "dma_device_type": 2 00:06:34.503 } 00:06:34.503 ], 00:06:34.503 "driver_specific": {} 00:06:34.503 } 00:06:34.503 ]' 00:06:34.503 20:20:26 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # jq length 00:06:34.503 20:20:26 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:06:34.503 20:20:26 rpc.rpc_daemon_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc2 -p Passthru0 00:06:34.503 20:20:26 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:34.762 20:20:26 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:34.762 [2024-07-15 20:20:26.887597] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc2 00:06:34.762 [2024-07-15 20:20:26.887633] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:06:34.762 [2024-07-15 20:20:26.887657] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x9a6b20 00:06:34.762 [2024-07-15 20:20:26.887669] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:06:34.762 [2024-07-15 20:20:26.889037] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:06:34.762 [2024-07-15 20:20:26.889065] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:06:34.762 Passthru0 00:06:34.762 20:20:26 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:34.762 20:20:26 rpc.rpc_daemon_integrity -- 
rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:06:34.762 20:20:26 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:34.762 20:20:26 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:34.762 20:20:26 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:34.762 20:20:26 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # bdevs='[ 00:06:34.762 { 00:06:34.762 "name": "Malloc2", 00:06:34.762 "aliases": [ 00:06:34.762 "8432a275-a982-4fcd-a5bf-301f5780918d" 00:06:34.762 ], 00:06:34.762 "product_name": "Malloc disk", 00:06:34.762 "block_size": 512, 00:06:34.762 "num_blocks": 16384, 00:06:34.762 "uuid": "8432a275-a982-4fcd-a5bf-301f5780918d", 00:06:34.762 "assigned_rate_limits": { 00:06:34.762 "rw_ios_per_sec": 0, 00:06:34.762 "rw_mbytes_per_sec": 0, 00:06:34.762 "r_mbytes_per_sec": 0, 00:06:34.762 "w_mbytes_per_sec": 0 00:06:34.762 }, 00:06:34.762 "claimed": true, 00:06:34.762 "claim_type": "exclusive_write", 00:06:34.762 "zoned": false, 00:06:34.762 "supported_io_types": { 00:06:34.762 "read": true, 00:06:34.762 "write": true, 00:06:34.762 "unmap": true, 00:06:34.762 "flush": true, 00:06:34.762 "reset": true, 00:06:34.762 "nvme_admin": false, 00:06:34.762 "nvme_io": false, 00:06:34.762 "nvme_io_md": false, 00:06:34.762 "write_zeroes": true, 00:06:34.762 "zcopy": true, 00:06:34.762 "get_zone_info": false, 00:06:34.762 "zone_management": false, 00:06:34.762 "zone_append": false, 00:06:34.762 "compare": false, 00:06:34.762 "compare_and_write": false, 00:06:34.762 "abort": true, 00:06:34.762 "seek_hole": false, 00:06:34.762 "seek_data": false, 00:06:34.762 "copy": true, 00:06:34.762 "nvme_iov_md": false 00:06:34.762 }, 00:06:34.762 "memory_domains": [ 00:06:34.762 { 00:06:34.762 "dma_device_id": "system", 00:06:34.762 "dma_device_type": 1 00:06:34.762 }, 00:06:34.762 { 00:06:34.762 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:34.762 "dma_device_type": 2 00:06:34.762 } 00:06:34.762 ], 
00:06:34.762 "driver_specific": {} 00:06:34.762 }, 00:06:34.762 { 00:06:34.762 "name": "Passthru0", 00:06:34.762 "aliases": [ 00:06:34.762 "bce369da-8346-5dfc-8441-c31e639a711d" 00:06:34.762 ], 00:06:34.762 "product_name": "passthru", 00:06:34.762 "block_size": 512, 00:06:34.762 "num_blocks": 16384, 00:06:34.762 "uuid": "bce369da-8346-5dfc-8441-c31e639a711d", 00:06:34.762 "assigned_rate_limits": { 00:06:34.762 "rw_ios_per_sec": 0, 00:06:34.762 "rw_mbytes_per_sec": 0, 00:06:34.762 "r_mbytes_per_sec": 0, 00:06:34.762 "w_mbytes_per_sec": 0 00:06:34.762 }, 00:06:34.762 "claimed": false, 00:06:34.762 "zoned": false, 00:06:34.762 "supported_io_types": { 00:06:34.762 "read": true, 00:06:34.762 "write": true, 00:06:34.762 "unmap": true, 00:06:34.762 "flush": true, 00:06:34.762 "reset": true, 00:06:34.762 "nvme_admin": false, 00:06:34.762 "nvme_io": false, 00:06:34.762 "nvme_io_md": false, 00:06:34.762 "write_zeroes": true, 00:06:34.762 "zcopy": true, 00:06:34.762 "get_zone_info": false, 00:06:34.762 "zone_management": false, 00:06:34.762 "zone_append": false, 00:06:34.762 "compare": false, 00:06:34.762 "compare_and_write": false, 00:06:34.762 "abort": true, 00:06:34.762 "seek_hole": false, 00:06:34.762 "seek_data": false, 00:06:34.762 "copy": true, 00:06:34.762 "nvme_iov_md": false 00:06:34.762 }, 00:06:34.762 "memory_domains": [ 00:06:34.762 { 00:06:34.762 "dma_device_id": "system", 00:06:34.762 "dma_device_type": 1 00:06:34.762 }, 00:06:34.762 { 00:06:34.762 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:34.762 "dma_device_type": 2 00:06:34.762 } 00:06:34.762 ], 00:06:34.762 "driver_specific": { 00:06:34.762 "passthru": { 00:06:34.762 "name": "Passthru0", 00:06:34.762 "base_bdev_name": "Malloc2" 00:06:34.762 } 00:06:34.762 } 00:06:34.762 } 00:06:34.762 ]' 00:06:34.762 20:20:26 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # jq length 00:06:34.762 20:20:26 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:06:34.762 20:20:26 rpc.rpc_daemon_integrity -- 
rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:06:34.762 20:20:26 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:34.762 20:20:26 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:34.762 20:20:26 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:34.762 20:20:26 rpc.rpc_daemon_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc2 00:06:34.762 20:20:26 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:34.762 20:20:26 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:34.762 20:20:26 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:34.762 20:20:26 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:06:34.762 20:20:26 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:34.762 20:20:26 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:34.762 20:20:27 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:34.762 20:20:27 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:06:34.762 20:20:27 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # jq length 00:06:34.762 20:20:27 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:06:34.762 00:06:34.762 real 0m0.310s 00:06:34.762 user 0m0.197s 00:06:34.762 sys 0m0.050s 00:06:34.762 20:20:27 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:34.762 20:20:27 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:34.762 ************************************ 00:06:34.762 END TEST rpc_daemon_integrity 00:06:34.762 ************************************ 00:06:34.762 20:20:27 rpc -- common/autotest_common.sh@1142 -- # return 0 00:06:34.762 20:20:27 rpc -- rpc/rpc.sh@83 -- # trap - SIGINT SIGTERM EXIT 00:06:34.762 20:20:27 rpc -- rpc/rpc.sh@84 -- # 
killprocess 1306680 00:06:34.762 20:20:27 rpc -- common/autotest_common.sh@948 -- # '[' -z 1306680 ']' 00:06:34.762 20:20:27 rpc -- common/autotest_common.sh@952 -- # kill -0 1306680 00:06:34.762 20:20:27 rpc -- common/autotest_common.sh@953 -- # uname 00:06:34.762 20:20:27 rpc -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:06:34.762 20:20:27 rpc -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1306680 00:06:35.022 20:20:27 rpc -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:06:35.022 20:20:27 rpc -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:06:35.022 20:20:27 rpc -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1306680' 00:06:35.022 killing process with pid 1306680 00:06:35.022 20:20:27 rpc -- common/autotest_common.sh@967 -- # kill 1306680 00:06:35.022 20:20:27 rpc -- common/autotest_common.sh@972 -- # wait 1306680 00:06:35.280 00:06:35.280 real 0m2.866s 00:06:35.280 user 0m3.633s 00:06:35.280 sys 0m0.959s 00:06:35.280 20:20:27 rpc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:35.280 20:20:27 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:35.280 ************************************ 00:06:35.280 END TEST rpc 00:06:35.280 ************************************ 00:06:35.280 20:20:27 -- common/autotest_common.sh@1142 -- # return 0 00:06:35.280 20:20:27 -- spdk/autotest.sh@170 -- # run_test skip_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:06:35.280 20:20:27 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:35.280 20:20:27 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:35.280 20:20:27 -- common/autotest_common.sh@10 -- # set +x 00:06:35.280 ************************************ 00:06:35.280 START TEST skip_rpc 00:06:35.280 ************************************ 00:06:35.280 20:20:27 skip_rpc -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:06:35.539 * 
Looking for test storage... 00:06:35.539 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:06:35.539 20:20:27 skip_rpc -- rpc/skip_rpc.sh@11 -- # CONFIG_PATH=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:06:35.539 20:20:27 skip_rpc -- rpc/skip_rpc.sh@12 -- # LOG_PATH=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:06:35.539 20:20:27 skip_rpc -- rpc/skip_rpc.sh@73 -- # run_test skip_rpc test_skip_rpc 00:06:35.539 20:20:27 skip_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:35.539 20:20:27 skip_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:35.539 20:20:27 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:35.539 ************************************ 00:06:35.539 START TEST skip_rpc 00:06:35.539 ************************************ 00:06:35.539 20:20:27 skip_rpc.skip_rpc -- common/autotest_common.sh@1123 -- # test_skip_rpc 00:06:35.539 20:20:27 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@16 -- # local spdk_pid=1307328 00:06:35.539 20:20:27 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@18 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:06:35.539 20:20:27 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 00:06:35.539 20:20:27 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@19 -- # sleep 5 00:06:35.539 [2024-07-15 20:20:27.820549] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
00:06:35.539 [2024-07-15 20:20:27.820618] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1307328 ] 00:06:35.798 [2024-07-15 20:20:27.952199] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:35.799 [2024-07-15 20:20:28.052002] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:41.105 20:20:32 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@21 -- # NOT rpc_cmd spdk_get_version 00:06:41.105 20:20:32 skip_rpc.skip_rpc -- common/autotest_common.sh@648 -- # local es=0 00:06:41.105 20:20:32 skip_rpc.skip_rpc -- common/autotest_common.sh@650 -- # valid_exec_arg rpc_cmd spdk_get_version 00:06:41.105 20:20:32 skip_rpc.skip_rpc -- common/autotest_common.sh@636 -- # local arg=rpc_cmd 00:06:41.105 20:20:32 skip_rpc.skip_rpc -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:41.105 20:20:32 skip_rpc.skip_rpc -- common/autotest_common.sh@640 -- # type -t rpc_cmd 00:06:41.105 20:20:32 skip_rpc.skip_rpc -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:41.105 20:20:32 skip_rpc.skip_rpc -- common/autotest_common.sh@651 -- # rpc_cmd spdk_get_version 00:06:41.105 20:20:32 skip_rpc.skip_rpc -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:41.105 20:20:32 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:41.105 20:20:32 skip_rpc.skip_rpc -- common/autotest_common.sh@587 -- # [[ 1 == 0 ]] 00:06:41.105 20:20:32 skip_rpc.skip_rpc -- common/autotest_common.sh@651 -- # es=1 00:06:41.105 20:20:32 skip_rpc.skip_rpc -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:06:41.105 20:20:32 skip_rpc.skip_rpc -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:06:41.105 20:20:32 skip_rpc.skip_rpc -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:06:41.105 20:20:32 skip_rpc.skip_rpc -- 
rpc/skip_rpc.sh@22 -- # trap - SIGINT SIGTERM EXIT 00:06:41.105 20:20:32 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@23 -- # killprocess 1307328 00:06:41.105 20:20:32 skip_rpc.skip_rpc -- common/autotest_common.sh@948 -- # '[' -z 1307328 ']' 00:06:41.105 20:20:32 skip_rpc.skip_rpc -- common/autotest_common.sh@952 -- # kill -0 1307328 00:06:41.105 20:20:32 skip_rpc.skip_rpc -- common/autotest_common.sh@953 -- # uname 00:06:41.105 20:20:32 skip_rpc.skip_rpc -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:06:41.105 20:20:32 skip_rpc.skip_rpc -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1307328 00:06:41.105 20:20:32 skip_rpc.skip_rpc -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:06:41.105 20:20:32 skip_rpc.skip_rpc -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:06:41.105 20:20:32 skip_rpc.skip_rpc -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1307328' 00:06:41.105 killing process with pid 1307328 00:06:41.105 20:20:32 skip_rpc.skip_rpc -- common/autotest_common.sh@967 -- # kill 1307328 00:06:41.105 20:20:32 skip_rpc.skip_rpc -- common/autotest_common.sh@972 -- # wait 1307328 00:06:41.105 00:06:41.105 real 0m5.459s 00:06:41.105 user 0m5.090s 00:06:41.105 sys 0m0.390s 00:06:41.105 20:20:33 skip_rpc.skip_rpc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:41.105 20:20:33 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:41.105 ************************************ 00:06:41.105 END TEST skip_rpc 00:06:41.105 ************************************ 00:06:41.105 20:20:33 skip_rpc -- common/autotest_common.sh@1142 -- # return 0 00:06:41.105 20:20:33 skip_rpc -- rpc/skip_rpc.sh@74 -- # run_test skip_rpc_with_json test_skip_rpc_with_json 00:06:41.105 20:20:33 skip_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:41.105 20:20:33 skip_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:41.105 20:20:33 skip_rpc -- common/autotest_common.sh@10 
-- # set +x 00:06:41.105 ************************************ 00:06:41.105 START TEST skip_rpc_with_json 00:06:41.105 ************************************ 00:06:41.105 20:20:33 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1123 -- # test_skip_rpc_with_json 00:06:41.105 20:20:33 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@44 -- # gen_json_config 00:06:41.105 20:20:33 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@28 -- # local spdk_pid=1308058 00:06:41.105 20:20:33 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@30 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:06:41.105 20:20:33 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:41.105 20:20:33 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@31 -- # waitforlisten 1308058 00:06:41.105 20:20:33 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@829 -- # '[' -z 1308058 ']' 00:06:41.105 20:20:33 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:41.105 20:20:33 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:41.105 20:20:33 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:41.105 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:41.105 20:20:33 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:41.105 20:20:33 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:41.105 [2024-07-15 20:20:33.364440] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
00:06:41.105 [2024-07-15 20:20:33.364509] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1308058 ] 00:06:41.363 [2024-07-15 20:20:33.494865] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:41.363 [2024-07-15 20:20:33.600221] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:41.930 20:20:34 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:41.930 20:20:34 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@862 -- # return 0 00:06:41.930 20:20:34 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_get_transports --trtype tcp 00:06:41.930 20:20:34 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:41.930 20:20:34 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:41.930 [2024-07-15 20:20:34.287186] nvmf_rpc.c:2562:rpc_nvmf_get_transports: *ERROR*: transport 'tcp' does not exist 00:06:41.930 request: 00:06:41.930 { 00:06:41.930 "trtype": "tcp", 00:06:41.930 "method": "nvmf_get_transports", 00:06:41.930 "req_id": 1 00:06:41.930 } 00:06:41.930 Got JSON-RPC error response 00:06:41.930 response: 00:06:41.930 { 00:06:41.930 "code": -19, 00:06:41.930 "message": "No such device" 00:06:41.930 } 00:06:41.930 20:20:34 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@587 -- # [[ 1 == 0 ]] 00:06:41.930 20:20:34 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_create_transport -t tcp 00:06:41.930 20:20:34 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:41.930 20:20:34 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:41.930 [2024-07-15 20:20:34.299339] tcp.c: 672:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:06:41.930 20:20:34 
skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:41.930 20:20:34 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@36 -- # rpc_cmd save_config 00:06:41.930 20:20:34 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:41.930 20:20:34 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:42.189 20:20:34 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:42.189 20:20:34 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@37 -- # cat /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:06:42.189 { 00:06:42.189 "subsystems": [ 00:06:42.189 { 00:06:42.189 "subsystem": "keyring", 00:06:42.189 "config": [] 00:06:42.189 }, 00:06:42.189 { 00:06:42.189 "subsystem": "iobuf", 00:06:42.189 "config": [ 00:06:42.189 { 00:06:42.189 "method": "iobuf_set_options", 00:06:42.189 "params": { 00:06:42.189 "small_pool_count": 8192, 00:06:42.189 "large_pool_count": 1024, 00:06:42.189 "small_bufsize": 8192, 00:06:42.189 "large_bufsize": 135168 00:06:42.189 } 00:06:42.189 } 00:06:42.189 ] 00:06:42.189 }, 00:06:42.189 { 00:06:42.189 "subsystem": "sock", 00:06:42.189 "config": [ 00:06:42.189 { 00:06:42.189 "method": "sock_set_default_impl", 00:06:42.189 "params": { 00:06:42.189 "impl_name": "posix" 00:06:42.189 } 00:06:42.189 }, 00:06:42.189 { 00:06:42.189 "method": "sock_impl_set_options", 00:06:42.189 "params": { 00:06:42.189 "impl_name": "ssl", 00:06:42.189 "recv_buf_size": 4096, 00:06:42.189 "send_buf_size": 4096, 00:06:42.189 "enable_recv_pipe": true, 00:06:42.189 "enable_quickack": false, 00:06:42.189 "enable_placement_id": 0, 00:06:42.189 "enable_zerocopy_send_server": true, 00:06:42.190 "enable_zerocopy_send_client": false, 00:06:42.190 "zerocopy_threshold": 0, 00:06:42.190 "tls_version": 0, 00:06:42.190 "enable_ktls": false 00:06:42.190 } 00:06:42.190 }, 00:06:42.190 { 00:06:42.190 "method": "sock_impl_set_options", 00:06:42.190 "params": { 
00:06:42.190 "impl_name": "posix", 00:06:42.190 "recv_buf_size": 2097152, 00:06:42.190 "send_buf_size": 2097152, 00:06:42.190 "enable_recv_pipe": true, 00:06:42.190 "enable_quickack": false, 00:06:42.190 "enable_placement_id": 0, 00:06:42.190 "enable_zerocopy_send_server": true, 00:06:42.190 "enable_zerocopy_send_client": false, 00:06:42.190 "zerocopy_threshold": 0, 00:06:42.190 "tls_version": 0, 00:06:42.190 "enable_ktls": false 00:06:42.190 } 00:06:42.190 } 00:06:42.190 ] 00:06:42.190 }, 00:06:42.190 { 00:06:42.190 "subsystem": "vmd", 00:06:42.190 "config": [] 00:06:42.190 }, 00:06:42.190 { 00:06:42.190 "subsystem": "accel", 00:06:42.190 "config": [ 00:06:42.190 { 00:06:42.190 "method": "accel_set_options", 00:06:42.190 "params": { 00:06:42.190 "small_cache_size": 128, 00:06:42.190 "large_cache_size": 16, 00:06:42.190 "task_count": 2048, 00:06:42.190 "sequence_count": 2048, 00:06:42.190 "buf_count": 2048 00:06:42.190 } 00:06:42.190 } 00:06:42.190 ] 00:06:42.190 }, 00:06:42.190 { 00:06:42.190 "subsystem": "bdev", 00:06:42.190 "config": [ 00:06:42.190 { 00:06:42.190 "method": "bdev_set_options", 00:06:42.190 "params": { 00:06:42.190 "bdev_io_pool_size": 65535, 00:06:42.190 "bdev_io_cache_size": 256, 00:06:42.190 "bdev_auto_examine": true, 00:06:42.190 "iobuf_small_cache_size": 128, 00:06:42.190 "iobuf_large_cache_size": 16 00:06:42.190 } 00:06:42.190 }, 00:06:42.190 { 00:06:42.190 "method": "bdev_raid_set_options", 00:06:42.190 "params": { 00:06:42.190 "process_window_size_kb": 1024 00:06:42.190 } 00:06:42.190 }, 00:06:42.190 { 00:06:42.190 "method": "bdev_iscsi_set_options", 00:06:42.190 "params": { 00:06:42.190 "timeout_sec": 30 00:06:42.190 } 00:06:42.190 }, 00:06:42.190 { 00:06:42.190 "method": "bdev_nvme_set_options", 00:06:42.190 "params": { 00:06:42.190 "action_on_timeout": "none", 00:06:42.190 "timeout_us": 0, 00:06:42.190 "timeout_admin_us": 0, 00:06:42.190 "keep_alive_timeout_ms": 10000, 00:06:42.190 "arbitration_burst": 0, 00:06:42.190 
"low_priority_weight": 0, 00:06:42.190 "medium_priority_weight": 0, 00:06:42.190 "high_priority_weight": 0, 00:06:42.190 "nvme_adminq_poll_period_us": 10000, 00:06:42.190 "nvme_ioq_poll_period_us": 0, 00:06:42.190 "io_queue_requests": 0, 00:06:42.190 "delay_cmd_submit": true, 00:06:42.190 "transport_retry_count": 4, 00:06:42.190 "bdev_retry_count": 3, 00:06:42.190 "transport_ack_timeout": 0, 00:06:42.190 "ctrlr_loss_timeout_sec": 0, 00:06:42.190 "reconnect_delay_sec": 0, 00:06:42.190 "fast_io_fail_timeout_sec": 0, 00:06:42.190 "disable_auto_failback": false, 00:06:42.190 "generate_uuids": false, 00:06:42.190 "transport_tos": 0, 00:06:42.190 "nvme_error_stat": false, 00:06:42.190 "rdma_srq_size": 0, 00:06:42.190 "io_path_stat": false, 00:06:42.190 "allow_accel_sequence": false, 00:06:42.190 "rdma_max_cq_size": 0, 00:06:42.190 "rdma_cm_event_timeout_ms": 0, 00:06:42.190 "dhchap_digests": [ 00:06:42.190 "sha256", 00:06:42.190 "sha384", 00:06:42.190 "sha512" 00:06:42.190 ], 00:06:42.190 "dhchap_dhgroups": [ 00:06:42.190 "null", 00:06:42.190 "ffdhe2048", 00:06:42.190 "ffdhe3072", 00:06:42.190 "ffdhe4096", 00:06:42.190 "ffdhe6144", 00:06:42.190 "ffdhe8192" 00:06:42.190 ] 00:06:42.190 } 00:06:42.190 }, 00:06:42.190 { 00:06:42.190 "method": "bdev_nvme_set_hotplug", 00:06:42.190 "params": { 00:06:42.190 "period_us": 100000, 00:06:42.190 "enable": false 00:06:42.190 } 00:06:42.190 }, 00:06:42.190 { 00:06:42.190 "method": "bdev_wait_for_examine" 00:06:42.190 } 00:06:42.190 ] 00:06:42.190 }, 00:06:42.190 { 00:06:42.190 "subsystem": "scsi", 00:06:42.190 "config": null 00:06:42.190 }, 00:06:42.190 { 00:06:42.190 "subsystem": "scheduler", 00:06:42.190 "config": [ 00:06:42.190 { 00:06:42.190 "method": "framework_set_scheduler", 00:06:42.190 "params": { 00:06:42.190 "name": "static" 00:06:42.190 } 00:06:42.190 } 00:06:42.190 ] 00:06:42.190 }, 00:06:42.190 { 00:06:42.190 "subsystem": "vhost_scsi", 00:06:42.190 "config": [] 00:06:42.190 }, 00:06:42.190 { 00:06:42.190 "subsystem": 
"vhost_blk", 00:06:42.190 "config": [] 00:06:42.190 }, 00:06:42.190 { 00:06:42.190 "subsystem": "ublk", 00:06:42.190 "config": [] 00:06:42.190 }, 00:06:42.190 { 00:06:42.190 "subsystem": "nbd", 00:06:42.190 "config": [] 00:06:42.190 }, 00:06:42.190 { 00:06:42.190 "subsystem": "nvmf", 00:06:42.190 "config": [ 00:06:42.190 { 00:06:42.190 "method": "nvmf_set_config", 00:06:42.190 "params": { 00:06:42.190 "discovery_filter": "match_any", 00:06:42.190 "admin_cmd_passthru": { 00:06:42.190 "identify_ctrlr": false 00:06:42.190 } 00:06:42.190 } 00:06:42.190 }, 00:06:42.190 { 00:06:42.190 "method": "nvmf_set_max_subsystems", 00:06:42.190 "params": { 00:06:42.190 "max_subsystems": 1024 00:06:42.190 } 00:06:42.190 }, 00:06:42.190 { 00:06:42.190 "method": "nvmf_set_crdt", 00:06:42.190 "params": { 00:06:42.190 "crdt1": 0, 00:06:42.190 "crdt2": 0, 00:06:42.190 "crdt3": 0 00:06:42.190 } 00:06:42.190 }, 00:06:42.190 { 00:06:42.190 "method": "nvmf_create_transport", 00:06:42.190 "params": { 00:06:42.190 "trtype": "TCP", 00:06:42.190 "max_queue_depth": 128, 00:06:42.190 "max_io_qpairs_per_ctrlr": 127, 00:06:42.190 "in_capsule_data_size": 4096, 00:06:42.190 "max_io_size": 131072, 00:06:42.190 "io_unit_size": 131072, 00:06:42.190 "max_aq_depth": 128, 00:06:42.190 "num_shared_buffers": 511, 00:06:42.190 "buf_cache_size": 4294967295, 00:06:42.190 "dif_insert_or_strip": false, 00:06:42.190 "zcopy": false, 00:06:42.190 "c2h_success": true, 00:06:42.190 "sock_priority": 0, 00:06:42.190 "abort_timeout_sec": 1, 00:06:42.190 "ack_timeout": 0, 00:06:42.190 "data_wr_pool_size": 0 00:06:42.190 } 00:06:42.190 } 00:06:42.190 ] 00:06:42.190 }, 00:06:42.190 { 00:06:42.190 "subsystem": "iscsi", 00:06:42.190 "config": [ 00:06:42.190 { 00:06:42.190 "method": "iscsi_set_options", 00:06:42.190 "params": { 00:06:42.190 "node_base": "iqn.2016-06.io.spdk", 00:06:42.190 "max_sessions": 128, 00:06:42.190 "max_connections_per_session": 2, 00:06:42.190 "max_queue_depth": 64, 00:06:42.190 "default_time2wait": 2, 
00:06:42.190 "default_time2retain": 20, 00:06:42.190 "first_burst_length": 8192, 00:06:42.190 "immediate_data": true, 00:06:42.190 "allow_duplicated_isid": false, 00:06:42.190 "error_recovery_level": 0, 00:06:42.190 "nop_timeout": 60, 00:06:42.190 "nop_in_interval": 30, 00:06:42.190 "disable_chap": false, 00:06:42.190 "require_chap": false, 00:06:42.190 "mutual_chap": false, 00:06:42.190 "chap_group": 0, 00:06:42.190 "max_large_datain_per_connection": 64, 00:06:42.190 "max_r2t_per_connection": 4, 00:06:42.190 "pdu_pool_size": 36864, 00:06:42.190 "immediate_data_pool_size": 16384, 00:06:42.190 "data_out_pool_size": 2048 00:06:42.190 } 00:06:42.190 } 00:06:42.190 ] 00:06:42.190 } 00:06:42.190 ] 00:06:42.190 } 00:06:42.190 20:20:34 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@39 -- # trap - SIGINT SIGTERM EXIT 00:06:42.190 20:20:34 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@40 -- # killprocess 1308058 00:06:42.190 20:20:34 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@948 -- # '[' -z 1308058 ']' 00:06:42.190 20:20:34 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@952 -- # kill -0 1308058 00:06:42.190 20:20:34 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # uname 00:06:42.190 20:20:34 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:06:42.190 20:20:34 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1308058 00:06:42.190 20:20:34 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:06:42.190 20:20:34 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:06:42.190 20:20:34 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1308058' 00:06:42.190 killing process with pid 1308058 00:06:42.190 20:20:34 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@967 -- # kill 1308058 00:06:42.190 20:20:34 
skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@972 -- # wait 1308058 00:06:42.758 20:20:34 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@47 -- # local spdk_pid=1308252 00:06:42.758 20:20:34 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@48 -- # sleep 5 00:06:42.758 20:20:34 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:06:48.035 20:20:39 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@50 -- # killprocess 1308252 00:06:48.035 20:20:39 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@948 -- # '[' -z 1308252 ']' 00:06:48.035 20:20:39 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@952 -- # kill -0 1308252 00:06:48.035 20:20:39 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # uname 00:06:48.035 20:20:39 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:06:48.035 20:20:39 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1308252 00:06:48.035 20:20:39 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:06:48.035 20:20:39 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:06:48.035 20:20:39 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1308252' 00:06:48.035 killing process with pid 1308252 00:06:48.035 20:20:39 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@967 -- # kill 1308252 00:06:48.035 20:20:39 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@972 -- # wait 1308252 00:06:48.035 20:20:40 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@51 -- # grep -q 'TCP Transport Init' /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:06:48.035 20:20:40 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@52 -- # rm 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:06:48.035 00:06:48.035 real 0m7.058s 00:06:48.035 user 0m6.764s 00:06:48.035 sys 0m0.858s 00:06:48.035 20:20:40 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:48.035 20:20:40 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:48.035 ************************************ 00:06:48.035 END TEST skip_rpc_with_json 00:06:48.035 ************************************ 00:06:48.035 20:20:40 skip_rpc -- common/autotest_common.sh@1142 -- # return 0 00:06:48.035 20:20:40 skip_rpc -- rpc/skip_rpc.sh@75 -- # run_test skip_rpc_with_delay test_skip_rpc_with_delay 00:06:48.035 20:20:40 skip_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:48.035 20:20:40 skip_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:48.035 20:20:40 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:48.295 ************************************ 00:06:48.295 START TEST skip_rpc_with_delay 00:06:48.295 ************************************ 00:06:48.295 20:20:40 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1123 -- # test_skip_rpc_with_delay 00:06:48.295 20:20:40 skip_rpc.skip_rpc_with_delay -- rpc/skip_rpc.sh@57 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:06:48.295 20:20:40 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@648 -- # local es=0 00:06:48.295 20:20:40 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:06:48.295 20:20:40 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:48.295 20:20:40 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:48.295 20:20:40 
skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:48.295 20:20:40 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:48.295 20:20:40 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:48.295 20:20:40 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:48.295 20:20:40 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:48.295 20:20:40 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:06:48.295 20:20:40 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:06:48.295 [2024-07-15 20:20:40.503088] app.c: 832:spdk_app_start: *ERROR*: Cannot use '--wait-for-rpc' if no RPC server is going to be started. 
00:06:48.295 [2024-07-15 20:20:40.503184] app.c: 711:unclaim_cpu_cores: *ERROR*: Failed to unlink lock fd for core 0, errno: 2 00:06:48.295 20:20:40 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@651 -- # es=1 00:06:48.295 20:20:40 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:06:48.295 20:20:40 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:06:48.295 20:20:40 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:06:48.295 00:06:48.295 real 0m0.095s 00:06:48.295 user 0m0.054s 00:06:48.295 sys 0m0.041s 00:06:48.295 20:20:40 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:48.295 20:20:40 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@10 -- # set +x 00:06:48.295 ************************************ 00:06:48.295 END TEST skip_rpc_with_delay 00:06:48.295 ************************************ 00:06:48.295 20:20:40 skip_rpc -- common/autotest_common.sh@1142 -- # return 0 00:06:48.295 20:20:40 skip_rpc -- rpc/skip_rpc.sh@77 -- # uname 00:06:48.295 20:20:40 skip_rpc -- rpc/skip_rpc.sh@77 -- # '[' Linux '!=' FreeBSD ']' 00:06:48.295 20:20:40 skip_rpc -- rpc/skip_rpc.sh@78 -- # run_test exit_on_failed_rpc_init test_exit_on_failed_rpc_init 00:06:48.295 20:20:40 skip_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:48.295 20:20:40 skip_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:48.295 20:20:40 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:48.295 ************************************ 00:06:48.295 START TEST exit_on_failed_rpc_init 00:06:48.295 ************************************ 00:06:48.295 20:20:40 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1123 -- # test_exit_on_failed_rpc_init 00:06:48.295 20:20:40 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@62 -- # local spdk_pid=1309001 00:06:48.295 20:20:40 skip_rpc.exit_on_failed_rpc_init -- 
rpc/skip_rpc.sh@63 -- # waitforlisten 1309001 00:06:48.295 20:20:40 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:48.295 20:20:40 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@829 -- # '[' -z 1309001 ']' 00:06:48.295 20:20:40 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:48.295 20:20:40 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:48.296 20:20:40 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:48.296 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:48.296 20:20:40 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:48.296 20:20:40 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:06:48.556 [2024-07-15 20:20:40.678248] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
00:06:48.556 [2024-07-15 20:20:40.678320] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1309001 ] 00:06:48.556 [2024-07-15 20:20:40.808430] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:48.556 [2024-07-15 20:20:40.915061] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:49.495 20:20:41 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:49.495 20:20:41 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@862 -- # return 0 00:06:49.495 20:20:41 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@65 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:06:49.495 20:20:41 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@67 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:06:49.495 20:20:41 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@648 -- # local es=0 00:06:49.495 20:20:41 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:06:49.495 20:20:41 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:49.495 20:20:41 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:49.495 20:20:41 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:49.495 20:20:41 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:49.495 20:20:41 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # type -P 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:49.495 20:20:41 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:49.495 20:20:41 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:49.495 20:20:41 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:06:49.495 20:20:41 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:06:49.495 [2024-07-15 20:20:41.674542] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 00:06:49.495 [2024-07-15 20:20:41.674609] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1309182 ] 00:06:49.495 [2024-07-15 20:20:41.808135] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:49.755 [2024-07-15 20:20:41.920355] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:06:49.755 [2024-07-15 20:20:41.920448] rpc.c: 180:_spdk_rpc_listen: *ERROR*: RPC Unix domain socket path /var/tmp/spdk.sock in use. Specify another. 
00:06:49.755 [2024-07-15 20:20:41.920469] rpc.c: 166:spdk_rpc_initialize: *ERROR*: Unable to start RPC service at /var/tmp/spdk.sock 00:06:49.755 [2024-07-15 20:20:41.920485] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:06:49.755 20:20:42 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@651 -- # es=234 00:06:49.755 20:20:42 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:06:49.755 20:20:42 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@660 -- # es=106 00:06:49.755 20:20:42 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@661 -- # case "$es" in 00:06:49.755 20:20:42 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@668 -- # es=1 00:06:49.755 20:20:42 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:06:49.755 20:20:42 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@69 -- # trap - SIGINT SIGTERM EXIT 00:06:49.755 20:20:42 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@70 -- # killprocess 1309001 00:06:49.755 20:20:42 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@948 -- # '[' -z 1309001 ']' 00:06:49.755 20:20:42 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@952 -- # kill -0 1309001 00:06:49.755 20:20:42 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@953 -- # uname 00:06:49.755 20:20:42 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:06:49.755 20:20:42 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1309001 00:06:49.755 20:20:42 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:06:49.755 20:20:42 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:06:49.756 20:20:42 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1309001' 
00:06:49.756 killing process with pid 1309001 00:06:49.756 20:20:42 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@967 -- # kill 1309001 00:06:49.756 20:20:42 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@972 -- # wait 1309001 00:06:50.337 00:06:50.337 real 0m1.849s 00:06:50.337 user 0m2.178s 00:06:50.337 sys 0m0.606s 00:06:50.337 20:20:42 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:50.337 20:20:42 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:06:50.337 ************************************ 00:06:50.337 END TEST exit_on_failed_rpc_init 00:06:50.337 ************************************ 00:06:50.337 20:20:42 skip_rpc -- common/autotest_common.sh@1142 -- # return 0 00:06:50.337 20:20:42 skip_rpc -- rpc/skip_rpc.sh@81 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:06:50.337 00:06:50.337 real 0m14.904s 00:06:50.337 user 0m14.229s 00:06:50.337 sys 0m2.228s 00:06:50.337 20:20:42 skip_rpc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:50.337 20:20:42 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:50.337 ************************************ 00:06:50.337 END TEST skip_rpc 00:06:50.337 ************************************ 00:06:50.337 20:20:42 -- common/autotest_common.sh@1142 -- # return 0 00:06:50.337 20:20:42 -- spdk/autotest.sh@171 -- # run_test rpc_client /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:06:50.337 20:20:42 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:50.337 20:20:42 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:50.337 20:20:42 -- common/autotest_common.sh@10 -- # set +x 00:06:50.337 ************************************ 00:06:50.337 START TEST rpc_client 00:06:50.337 ************************************ 00:06:50.337 20:20:42 rpc_client -- common/autotest_common.sh@1123 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:06:50.337 * Looking for test storage... 00:06:50.337 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client 00:06:50.337 20:20:42 rpc_client -- rpc_client/rpc_client.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client_test 00:06:50.596 OK 00:06:50.596 20:20:42 rpc_client -- rpc_client/rpc_client.sh@12 -- # trap - SIGINT SIGTERM EXIT 00:06:50.596 00:06:50.596 real 0m0.137s 00:06:50.596 user 0m0.054s 00:06:50.596 sys 0m0.094s 00:06:50.596 20:20:42 rpc_client -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:50.596 20:20:42 rpc_client -- common/autotest_common.sh@10 -- # set +x 00:06:50.596 ************************************ 00:06:50.596 END TEST rpc_client 00:06:50.596 ************************************ 00:06:50.596 20:20:42 -- common/autotest_common.sh@1142 -- # return 0 00:06:50.596 20:20:42 -- spdk/autotest.sh@172 -- # run_test json_config /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config.sh 00:06:50.596 20:20:42 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:50.596 20:20:42 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:50.596 20:20:42 -- common/autotest_common.sh@10 -- # set +x 00:06:50.596 ************************************ 00:06:50.596 START TEST json_config 00:06:50.596 ************************************ 00:06:50.596 20:20:42 json_config -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config.sh 00:06:50.596 20:20:42 json_config -- json_config/json_config.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:06:50.596 20:20:42 json_config -- nvmf/common.sh@7 -- # uname -s 00:06:50.596 20:20:42 json_config -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:06:50.596 20:20:42 json_config -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:06:50.596 20:20:42 
json_config -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:06:50.596 20:20:42 json_config -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:06:50.596 20:20:42 json_config -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:06:50.596 20:20:42 json_config -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:06:50.596 20:20:42 json_config -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:06:50.596 20:20:42 json_config -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:06:50.596 20:20:42 json_config -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:06:50.596 20:20:42 json_config -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:06:50.596 20:20:42 json_config -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562 00:06:50.597 20:20:42 json_config -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562 00:06:50.597 20:20:42 json_config -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:06:50.597 20:20:42 json_config -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:06:50.597 20:20:42 json_config -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:06:50.597 20:20:42 json_config -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:06:50.597 20:20:42 json_config -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:06:50.597 20:20:42 json_config -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:06:50.597 20:20:42 json_config -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:06:50.597 20:20:42 json_config -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:06:50.597 20:20:42 json_config -- paths/export.sh@2 -- # 
PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:50.597 20:20:42 json_config -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:50.597 20:20:42 json_config -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:50.597 20:20:42 json_config -- paths/export.sh@5 -- # export PATH 00:06:50.597 20:20:42 json_config -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:50.597 20:20:42 json_config -- nvmf/common.sh@47 -- # : 0 00:06:50.597 20:20:42 json_config -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:06:50.597 
20:20:42 json_config -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:06:50.597 20:20:42 json_config -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:06:50.597 20:20:42 json_config -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:06:50.597 20:20:42 json_config -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:06:50.597 20:20:42 json_config -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:06:50.597 20:20:42 json_config -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:06:50.597 20:20:42 json_config -- nvmf/common.sh@51 -- # have_pci_nics=0 00:06:50.597 20:20:42 json_config -- json_config/json_config.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/common.sh 00:06:50.597 20:20:42 json_config -- json_config/json_config.sh@11 -- # [[ 0 -eq 1 ]] 00:06:50.597 20:20:42 json_config -- json_config/json_config.sh@15 -- # [[ 0 -ne 1 ]] 00:06:50.597 20:20:42 json_config -- json_config/json_config.sh@15 -- # [[ 0 -eq 1 ]] 00:06:50.597 20:20:42 json_config -- json_config/json_config.sh@26 -- # (( SPDK_TEST_BLOCKDEV + SPDK_TEST_ISCSI + SPDK_TEST_NVMF + SPDK_TEST_VHOST + SPDK_TEST_VHOST_INIT + SPDK_TEST_RBD == 0 )) 00:06:50.597 20:20:42 json_config -- json_config/json_config.sh@31 -- # app_pid=(['target']='' ['initiator']='') 00:06:50.597 20:20:42 json_config -- json_config/json_config.sh@31 -- # declare -A app_pid 00:06:50.597 20:20:42 json_config -- json_config/json_config.sh@32 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock' ['initiator']='/var/tmp/spdk_initiator.sock') 00:06:50.597 20:20:42 json_config -- json_config/json_config.sh@32 -- # declare -A app_socket 00:06:50.597 20:20:42 json_config -- json_config/json_config.sh@33 -- # app_params=(['target']='-m 0x1 -s 1024' ['initiator']='-m 0x2 -g -u -s 1024') 00:06:50.597 20:20:42 json_config -- json_config/json_config.sh@33 -- # declare -A app_params 00:06:50.597 20:20:42 json_config -- json_config/json_config.sh@34 -- # 
configs_path=(['target']='/var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json' ['initiator']='/var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_initiator_config.json') 00:06:50.597 20:20:42 json_config -- json_config/json_config.sh@34 -- # declare -A configs_path 00:06:50.597 20:20:42 json_config -- json_config/json_config.sh@40 -- # last_event_id=0 00:06:50.597 20:20:42 json_config -- json_config/json_config.sh@355 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:06:50.597 20:20:42 json_config -- json_config/json_config.sh@356 -- # echo 'INFO: JSON configuration test init' 00:06:50.597 INFO: JSON configuration test init 00:06:50.597 20:20:42 json_config -- json_config/json_config.sh@357 -- # json_config_test_init 00:06:50.597 20:20:42 json_config -- json_config/json_config.sh@262 -- # timing_enter json_config_test_init 00:06:50.597 20:20:42 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:50.597 20:20:42 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:50.597 20:20:42 json_config -- json_config/json_config.sh@263 -- # timing_enter json_config_setup_target 00:06:50.597 20:20:42 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:50.597 20:20:42 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:50.597 20:20:42 json_config -- json_config/json_config.sh@265 -- # json_config_test_start_app target --wait-for-rpc 00:06:50.597 20:20:42 json_config -- json_config/common.sh@9 -- # local app=target 00:06:50.597 20:20:42 json_config -- json_config/common.sh@10 -- # shift 00:06:50.597 20:20:42 json_config -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:06:50.597 20:20:42 json_config -- json_config/common.sh@13 -- # [[ -z '' ]] 00:06:50.597 20:20:42 json_config -- json_config/common.sh@15 -- # local app_extra_params= 00:06:50.597 20:20:42 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:06:50.597 20:20:42 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 
00:06:50.597 20:20:42 json_config -- json_config/common.sh@22 -- # app_pid["$app"]=1309464 00:06:50.597 20:20:42 json_config -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:06:50.597 Waiting for target to run... 00:06:50.597 20:20:42 json_config -- json_config/common.sh@25 -- # waitforlisten 1309464 /var/tmp/spdk_tgt.sock 00:06:50.597 20:20:42 json_config -- common/autotest_common.sh@829 -- # '[' -z 1309464 ']' 00:06:50.597 20:20:42 json_config -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --wait-for-rpc 00:06:50.597 20:20:42 json_config -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:06:50.597 20:20:42 json_config -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:50.597 20:20:42 json_config -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:06:50.597 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:06:50.597 20:20:42 json_config -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:50.597 20:20:42 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:50.856 [2024-07-15 20:20:43.005324] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
00:06:50.856 [2024-07-15 20:20:43.005403] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1309464 ] 00:06:51.424 [2024-07-15 20:20:43.588569] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:51.424 [2024-07-15 20:20:43.688365] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:51.683 20:20:43 json_config -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:51.683 20:20:43 json_config -- common/autotest_common.sh@862 -- # return 0 00:06:51.683 20:20:43 json_config -- json_config/common.sh@26 -- # echo '' 00:06:51.683 00:06:51.683 20:20:43 json_config -- json_config/json_config.sh@269 -- # create_accel_config 00:06:51.683 20:20:43 json_config -- json_config/json_config.sh@93 -- # timing_enter create_accel_config 00:06:51.683 20:20:43 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:51.683 20:20:43 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:51.683 20:20:43 json_config -- json_config/json_config.sh@95 -- # [[ 1 -eq 1 ]] 00:06:51.683 20:20:43 json_config -- json_config/json_config.sh@96 -- # tgt_rpc dpdk_cryptodev_scan_accel_module 00:06:51.683 20:20:43 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock dpdk_cryptodev_scan_accel_module 00:06:51.943 20:20:44 json_config -- json_config/json_config.sh@97 -- # tgt_rpc accel_assign_opc -o encrypt -m dpdk_cryptodev 00:06:51.943 20:20:44 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock accel_assign_opc -o encrypt -m dpdk_cryptodev 00:06:52.202 [2024-07-15 20:20:44.406589] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module 
dpdk_cryptodev 00:06:52.202 20:20:44 json_config -- json_config/json_config.sh@98 -- # tgt_rpc accel_assign_opc -o decrypt -m dpdk_cryptodev 00:06:52.202 20:20:44 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock accel_assign_opc -o decrypt -m dpdk_cryptodev 00:06:52.461 [2024-07-15 20:20:44.647206] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:06:52.461 20:20:44 json_config -- json_config/json_config.sh@101 -- # timing_exit create_accel_config 00:06:52.461 20:20:44 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:06:52.461 20:20:44 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:52.461 20:20:44 json_config -- json_config/json_config.sh@273 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --json-with-subsystems 00:06:52.461 20:20:44 json_config -- json_config/json_config.sh@274 -- # tgt_rpc load_config 00:06:52.461 20:20:44 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock load_config 00:06:52.720 [2024-07-15 20:20:44.956701] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:06:55.259 20:20:47 json_config -- json_config/json_config.sh@276 -- # tgt_check_notification_types 00:06:55.259 20:20:47 json_config -- json_config/json_config.sh@43 -- # timing_enter tgt_check_notification_types 00:06:55.259 20:20:47 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:55.259 20:20:47 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:55.259 20:20:47 json_config -- json_config/json_config.sh@45 -- # local ret=0 00:06:55.259 20:20:47 json_config -- json_config/json_config.sh@46 -- # enabled_types=('bdev_register' 'bdev_unregister') 00:06:55.259 20:20:47 json_config -- json_config/json_config.sh@46 -- # local enabled_types 
00:06:55.259 20:20:47 json_config -- json_config/json_config.sh@48 -- # tgt_rpc notify_get_types 00:06:55.259 20:20:47 json_config -- json_config/json_config.sh@48 -- # jq -r '.[]' 00:06:55.259 20:20:47 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_types 00:06:55.519 20:20:47 json_config -- json_config/json_config.sh@48 -- # get_types=('bdev_register' 'bdev_unregister') 00:06:55.519 20:20:47 json_config -- json_config/json_config.sh@48 -- # local get_types 00:06:55.519 20:20:47 json_config -- json_config/json_config.sh@49 -- # [[ bdev_register bdev_unregister != \b\d\e\v\_\r\e\g\i\s\t\e\r\ \b\d\e\v\_\u\n\r\e\g\i\s\t\e\r ]] 00:06:55.519 20:20:47 json_config -- json_config/json_config.sh@54 -- # timing_exit tgt_check_notification_types 00:06:55.519 20:20:47 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:06:55.519 20:20:47 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:55.519 20:20:47 json_config -- json_config/json_config.sh@55 -- # return 0 00:06:55.519 20:20:47 json_config -- json_config/json_config.sh@278 -- # [[ 1 -eq 1 ]] 00:06:55.519 20:20:47 json_config -- json_config/json_config.sh@279 -- # create_bdev_subsystem_config 00:06:55.519 20:20:47 json_config -- json_config/json_config.sh@105 -- # timing_enter create_bdev_subsystem_config 00:06:55.519 20:20:47 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:55.519 20:20:47 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:55.519 20:20:47 json_config -- json_config/json_config.sh@107 -- # expected_notifications=() 00:06:55.519 20:20:47 json_config -- json_config/json_config.sh@107 -- # local expected_notifications 00:06:55.519 20:20:47 json_config -- json_config/json_config.sh@111 -- # expected_notifications+=($(get_notifications)) 00:06:55.519 20:20:47 json_config -- json_config/json_config.sh@111 -- # get_notifications 00:06:55.519 20:20:47 
json_config -- json_config/json_config.sh@59 -- # local ev_type ev_ctx event_id 00:06:55.519 20:20:47 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:55.519 20:20:47 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:55.519 20:20:47 json_config -- json_config/json_config.sh@58 -- # tgt_rpc notify_get_notifications -i 0 00:06:55.519 20:20:47 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_notifications -i 0 00:06:55.519 20:20:47 json_config -- json_config/json_config.sh@58 -- # jq -r '.[] | "\(.type):\(.ctx):\(.id)"' 00:06:55.778 20:20:48 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1 00:06:55.778 20:20:48 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:55.778 20:20:48 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:55.778 20:20:48 json_config -- json_config/json_config.sh@113 -- # [[ 1 -eq 1 ]] 00:06:55.778 20:20:48 json_config -- json_config/json_config.sh@114 -- # local lvol_store_base_bdev=Nvme0n1 00:06:55.778 20:20:48 json_config -- json_config/json_config.sh@116 -- # tgt_rpc bdev_split_create Nvme0n1 2 00:06:55.778 20:20:48 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_split_create Nvme0n1 2 00:06:56.347 Nvme0n1p0 Nvme0n1p1 00:06:56.347 20:20:48 json_config -- json_config/json_config.sh@117 -- # tgt_rpc bdev_split_create Malloc0 3 00:06:56.347 20:20:48 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_split_create Malloc0 3 00:06:56.606 [2024-07-15 20:20:48.843466] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:06:56.606 [2024-07-15 20:20:48.843523] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find 
bdev with name: Malloc0 00:06:56.606 00:06:56.606 20:20:48 json_config -- json_config/json_config.sh@118 -- # tgt_rpc bdev_malloc_create 8 4096 --name Malloc3 00:06:56.606 20:20:48 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 4096 --name Malloc3 00:06:57.175 Malloc3 00:06:57.175 20:20:49 json_config -- json_config/json_config.sh@119 -- # tgt_rpc bdev_passthru_create -b Malloc3 -p PTBdevFromMalloc3 00:06:57.175 20:20:49 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_passthru_create -b Malloc3 -p PTBdevFromMalloc3 00:06:57.745 [2024-07-15 20:20:49.854335] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:06:57.745 [2024-07-15 20:20:49.854389] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:06:57.745 [2024-07-15 20:20:49.854415] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x8d9660 00:06:57.745 [2024-07-15 20:20:49.854427] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:06:57.745 [2024-07-15 20:20:49.856007] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:06:57.745 [2024-07-15 20:20:49.856037] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: PTBdevFromMalloc3 00:06:57.745 PTBdevFromMalloc3 00:06:57.745 20:20:49 json_config -- json_config/json_config.sh@121 -- # tgt_rpc bdev_null_create Null0 32 512 00:06:57.745 20:20:49 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_null_create Null0 32 512 00:06:57.745 Null0 00:06:58.008 20:20:50 json_config -- json_config/json_config.sh@123 -- # tgt_rpc bdev_malloc_create 32 512 --name Malloc0 00:06:58.008 20:20:50 json_config -- json_config/common.sh@57 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 32 512 --name Malloc0 00:06:58.008 Malloc0 00:06:58.008 20:20:50 json_config -- json_config/json_config.sh@124 -- # tgt_rpc bdev_malloc_create 16 4096 --name Malloc1 00:06:58.008 20:20:50 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 16 4096 --name Malloc1 00:06:58.613 Malloc1 00:06:58.613 20:20:50 json_config -- json_config/json_config.sh@137 -- # expected_notifications+=(bdev_register:${lvol_store_base_bdev}p1 bdev_register:${lvol_store_base_bdev}p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1) 00:06:58.613 20:20:50 json_config -- json_config/json_config.sh@140 -- # dd if=/dev/zero of=/sample_aio bs=1024 count=102400 00:06:58.872 102400+0 records in 00:06:58.872 102400+0 records out 00:06:58.872 104857600 bytes (105 MB, 100 MiB) copied, 0.309493 s, 339 MB/s 00:06:58.872 20:20:51 json_config -- json_config/json_config.sh@141 -- # tgt_rpc bdev_aio_create /sample_aio aio_disk 1024 00:06:58.872 20:20:51 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_aio_create /sample_aio aio_disk 1024 00:06:59.438 aio_disk 00:06:59.438 20:20:51 json_config -- json_config/json_config.sh@142 -- # expected_notifications+=(bdev_register:aio_disk) 00:06:59.438 20:20:51 json_config -- json_config/json_config.sh@147 -- # tgt_rpc bdev_lvol_create_lvstore -c 1048576 Nvme0n1p0 lvs_test 00:06:59.438 20:20:51 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create_lvstore -c 1048576 Nvme0n1p0 lvs_test 00:07:04.697 de3e4819-85eb-48d3-84d0-490ab6bd0357 
00:07:04.697 20:20:56 json_config -- json_config/json_config.sh@154 -- # expected_notifications+=("bdev_register:$(tgt_rpc bdev_lvol_create -l lvs_test lvol0 32)" "bdev_register:$(tgt_rpc bdev_lvol_create -l lvs_test -t lvol1 32)" "bdev_register:$(tgt_rpc bdev_lvol_snapshot lvs_test/lvol0 snapshot0)" "bdev_register:$(tgt_rpc bdev_lvol_clone lvs_test/snapshot0 clone0)") 00:07:04.697 20:20:56 json_config -- json_config/json_config.sh@154 -- # tgt_rpc bdev_lvol_create -l lvs_test lvol0 32 00:07:04.697 20:20:56 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create -l lvs_test lvol0 32 00:07:04.697 20:20:56 json_config -- json_config/json_config.sh@154 -- # tgt_rpc bdev_lvol_create -l lvs_test -t lvol1 32 00:07:04.697 20:20:56 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create -l lvs_test -t lvol1 32 00:07:04.697 20:20:56 json_config -- json_config/json_config.sh@154 -- # tgt_rpc bdev_lvol_snapshot lvs_test/lvol0 snapshot0 00:07:04.697 20:20:56 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_snapshot lvs_test/lvol0 snapshot0 00:07:04.955 20:20:57 json_config -- json_config/json_config.sh@154 -- # tgt_rpc bdev_lvol_clone lvs_test/snapshot0 clone0 00:07:04.955 20:20:57 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_clone lvs_test/snapshot0 clone0 00:07:05.212 20:20:57 json_config -- json_config/json_config.sh@157 -- # [[ 1 -eq 1 ]] 00:07:05.212 20:20:57 json_config -- json_config/json_config.sh@158 -- # tgt_rpc bdev_malloc_create 8 1024 --name MallocForCryptoBdev 00:07:05.212 20:20:57 json_config -- json_config/common.sh@57 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 1024 --name MallocForCryptoBdev 00:07:05.471 MallocForCryptoBdev 00:07:05.471 20:20:57 json_config -- json_config/json_config.sh@159 -- # lspci -d:37c8 00:07:05.471 20:20:57 json_config -- json_config/json_config.sh@159 -- # wc -l 00:07:05.471 20:20:57 json_config -- json_config/json_config.sh@159 -- # [[ 3 -eq 0 ]] 00:07:05.471 20:20:57 json_config -- json_config/json_config.sh@162 -- # local crypto_driver=crypto_qat 00:07:05.471 20:20:57 json_config -- json_config/json_config.sh@165 -- # tgt_rpc bdev_crypto_create MallocForCryptoBdev CryptoMallocBdev -p crypto_qat -k 01234567891234560123456789123456 00:07:05.471 20:20:57 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_crypto_create MallocForCryptoBdev CryptoMallocBdev -p crypto_qat -k 01234567891234560123456789123456 00:07:05.729 [2024-07-15 20:20:57.862780] vbdev_crypto_rpc.c: 136:rpc_bdev_crypto_create: *WARNING*: "crypto_pmd" parameters is obsolete and ignored 00:07:05.729 CryptoMallocBdev 00:07:05.729 20:20:57 json_config -- json_config/json_config.sh@169 -- # expected_notifications+=(bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev) 00:07:05.729 20:20:57 json_config -- json_config/json_config.sh@172 -- # [[ 0 -eq 1 ]] 00:07:05.729 20:20:57 json_config -- json_config/json_config.sh@178 -- # tgt_check_notifications bdev_register:Nvme0n1 bdev_register:Nvme0n1p1 bdev_register:Nvme0n1p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1 bdev_register:aio_disk bdev_register:2477d119-3743-4083-9eb0-dd46bf03575d bdev_register:69264ed9-f7af-49cc-b300-54a863740815 bdev_register:0863a2b1-024a-4b9d-ac74-0efb7f813231 bdev_register:1c2fe8c2-60f9-40dc-8f42-f23828aad0c5 
bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev 00:07:05.729 20:20:57 json_config -- json_config/json_config.sh@67 -- # local events_to_check 00:07:05.729 20:20:57 json_config -- json_config/json_config.sh@68 -- # local recorded_events 00:07:05.729 20:20:57 json_config -- json_config/json_config.sh@71 -- # events_to_check=($(printf '%s\n' "$@" | sort)) 00:07:05.729 20:20:57 json_config -- json_config/json_config.sh@71 -- # printf '%s\n' bdev_register:Nvme0n1 bdev_register:Nvme0n1p1 bdev_register:Nvme0n1p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1 bdev_register:aio_disk bdev_register:2477d119-3743-4083-9eb0-dd46bf03575d bdev_register:69264ed9-f7af-49cc-b300-54a863740815 bdev_register:0863a2b1-024a-4b9d-ac74-0efb7f813231 bdev_register:1c2fe8c2-60f9-40dc-8f42-f23828aad0c5 bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev 00:07:05.729 20:20:57 json_config -- json_config/json_config.sh@71 -- # sort 00:07:05.729 20:20:57 json_config -- json_config/json_config.sh@72 -- # recorded_events=($(get_notifications | sort)) 00:07:05.729 20:20:57 json_config -- json_config/json_config.sh@72 -- # get_notifications 00:07:05.729 20:20:57 json_config -- json_config/json_config.sh@59 -- # local ev_type ev_ctx event_id 00:07:05.729 20:20:57 json_config -- json_config/json_config.sh@72 -- # sort 00:07:05.729 20:20:57 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:07:05.729 20:20:57 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:07:05.729 20:20:57 json_config -- json_config/json_config.sh@58 -- # tgt_rpc notify_get_notifications -i 0 00:07:05.729 20:20:57 json_config -- json_config/json_config.sh@58 -- # jq -r '.[] | "\(.type):\(.ctx):\(.id)"' 00:07:05.729 20:20:57 json_config -- json_config/common.sh@57 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_notifications -i 0 00:07:05.988 20:20:58 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1 00:07:05.988 20:20:58 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:07:05.988 20:20:58 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:07:05.988 20:20:58 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1p1 00:07:05.988 20:20:58 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:07:05.988 20:20:58 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:07:05.988 20:20:58 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1p0 00:07:05.988 20:20:58 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:07:05.988 20:20:58 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:07:05.988 20:20:58 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc3 00:07:05.988 20:20:58 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:07:05.988 20:20:58 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:07:05.988 20:20:58 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:PTBdevFromMalloc3 00:07:05.988 20:20:58 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:07:05.988 20:20:58 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:07:05.988 20:20:58 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Null0 00:07:05.988 20:20:58 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:07:05.988 20:20:58 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:07:05.988 20:20:58 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0 00:07:05.988 20:20:58 json_config -- json_config/json_config.sh@61 -- # IFS=: 
00:07:05.988 20:20:58 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:07:05.988 20:20:58 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0p2 00:07:05.988 20:20:58 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:07:05.988 20:20:58 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:07:05.988 20:20:58 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0p1 00:07:05.988 20:20:58 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:07:05.988 20:20:58 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:07:05.988 20:20:58 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0p0 00:07:05.988 20:20:58 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:07:05.988 20:20:58 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:07:05.988 20:20:58 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc1 00:07:05.988 20:20:58 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:07:05.988 20:20:58 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:07:05.988 20:20:58 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:aio_disk 00:07:05.988 20:20:58 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:07:05.988 20:20:58 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:07:05.988 20:20:58 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:2477d119-3743-4083-9eb0-dd46bf03575d 00:07:05.988 20:20:58 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:07:05.988 20:20:58 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:07:05.988 20:20:58 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:69264ed9-f7af-49cc-b300-54a863740815 00:07:05.988 20:20:58 json_config -- 
json_config/json_config.sh@61 -- # IFS=: 00:07:05.988 20:20:58 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:07:05.988 20:20:58 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:0863a2b1-024a-4b9d-ac74-0efb7f813231 00:07:05.988 20:20:58 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:07:05.988 20:20:58 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:07:05.988 20:20:58 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:1c2fe8c2-60f9-40dc-8f42-f23828aad0c5 00:07:05.988 20:20:58 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:07:05.988 20:20:58 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:07:05.988 20:20:58 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:MallocForCryptoBdev 00:07:05.988 20:20:58 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:07:05.988 20:20:58 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:07:05.988 20:20:58 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:CryptoMallocBdev 00:07:05.988 20:20:58 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:07:05.988 20:20:58 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:07:05.989 20:20:58 json_config -- json_config/json_config.sh@74 -- # [[ bdev_register:0863a2b1-024a-4b9d-ac74-0efb7f813231 bdev_register:1c2fe8c2-60f9-40dc-8f42-f23828aad0c5 bdev_register:2477d119-3743-4083-9eb0-dd46bf03575d bdev_register:69264ed9-f7af-49cc-b300-54a863740815 bdev_register:aio_disk bdev_register:CryptoMallocBdev bdev_register:Malloc0 bdev_register:Malloc0p0 bdev_register:Malloc0p1 bdev_register:Malloc0p2 bdev_register:Malloc1 bdev_register:Malloc3 bdev_register:MallocForCryptoBdev bdev_register:Null0 bdev_register:Nvme0n1 bdev_register:Nvme0n1p0 bdev_register:Nvme0n1p1 bdev_register:PTBdevFromMalloc3 != 
\b\d\e\v\_\r\e\g\i\s\t\e\r\:\0\8\6\3\a\2\b\1\-\0\2\4\a\-\4\b\9\d\-\a\c\7\4\-\0\e\f\b\7\f\8\1\3\2\3\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\1\c\2\f\e\8\c\2\-\6\0\f\9\-\4\0\d\c\-\8\f\4\2\-\f\2\3\8\2\8\a\a\d\0\c\5\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\2\4\7\7\d\1\1\9\-\3\7\4\3\-\4\0\8\3\-\9\e\b\0\-\d\d\4\6\b\f\0\3\5\7\5\d\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\6\9\2\6\4\e\d\9\-\f\7\a\f\-\4\9\c\c\-\b\3\0\0\-\5\4\a\8\6\3\7\4\0\8\1\5\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\a\i\o\_\d\i\s\k\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\C\r\y\p\t\o\M\a\l\l\o\c\B\d\e\v\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\2\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\3\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\F\o\r\C\r\y\p\t\o\B\d\e\v\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\u\l\l\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\p\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\p\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\P\T\B\d\e\v\F\r\o\m\M\a\l\l\o\c\3 ]] 00:07:05.989 20:20:58 json_config -- json_config/json_config.sh@86 -- # cat 00:07:05.989 20:20:58 json_config -- json_config/json_config.sh@86 -- # printf ' %s\n' bdev_register:0863a2b1-024a-4b9d-ac74-0efb7f813231 bdev_register:1c2fe8c2-60f9-40dc-8f42-f23828aad0c5 bdev_register:2477d119-3743-4083-9eb0-dd46bf03575d bdev_register:69264ed9-f7af-49cc-b300-54a863740815 bdev_register:aio_disk bdev_register:CryptoMallocBdev bdev_register:Malloc0 bdev_register:Malloc0p0 bdev_register:Malloc0p1 bdev_register:Malloc0p2 bdev_register:Malloc1 bdev_register:Malloc3 bdev_register:MallocForCryptoBdev bdev_register:Null0 bdev_register:Nvme0n1 bdev_register:Nvme0n1p0 bdev_register:Nvme0n1p1 bdev_register:PTBdevFromMalloc3 00:07:05.989 Expected events matched: 00:07:05.989 bdev_register:0863a2b1-024a-4b9d-ac74-0efb7f813231 00:07:05.989 bdev_register:1c2fe8c2-60f9-40dc-8f42-f23828aad0c5 00:07:05.989 
bdev_register:2477d119-3743-4083-9eb0-dd46bf03575d 00:07:05.989 bdev_register:69264ed9-f7af-49cc-b300-54a863740815 00:07:05.989 bdev_register:aio_disk 00:07:05.989 bdev_register:CryptoMallocBdev 00:07:05.989 bdev_register:Malloc0 00:07:05.989 bdev_register:Malloc0p0 00:07:05.989 bdev_register:Malloc0p1 00:07:05.989 bdev_register:Malloc0p2 00:07:05.989 bdev_register:Malloc1 00:07:05.989 bdev_register:Malloc3 00:07:05.989 bdev_register:MallocForCryptoBdev 00:07:05.989 bdev_register:Null0 00:07:05.989 bdev_register:Nvme0n1 00:07:05.989 bdev_register:Nvme0n1p0 00:07:05.989 bdev_register:Nvme0n1p1 00:07:05.989 bdev_register:PTBdevFromMalloc3 00:07:05.989 20:20:58 json_config -- json_config/json_config.sh@180 -- # timing_exit create_bdev_subsystem_config 00:07:05.989 20:20:58 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:07:05.989 20:20:58 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:05.989 20:20:58 json_config -- json_config/json_config.sh@282 -- # [[ 0 -eq 1 ]] 00:07:05.989 20:20:58 json_config -- json_config/json_config.sh@286 -- # [[ 0 -eq 1 ]] 00:07:05.989 20:20:58 json_config -- json_config/json_config.sh@290 -- # [[ 0 -eq 1 ]] 00:07:05.989 20:20:58 json_config -- json_config/json_config.sh@293 -- # timing_exit json_config_setup_target 00:07:05.989 20:20:58 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:07:05.989 20:20:58 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:05.989 20:20:58 json_config -- json_config/json_config.sh@295 -- # [[ 0 -eq 1 ]] 00:07:05.989 20:20:58 json_config -- json_config/json_config.sh@300 -- # tgt_rpc bdev_malloc_create 8 512 --name MallocBdevForConfigChangeCheck 00:07:05.989 20:20:58 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 512 --name MallocBdevForConfigChangeCheck 00:07:06.248 MallocBdevForConfigChangeCheck 00:07:06.248 20:20:58 json_config -- 
json_config/json_config.sh@302 -- # timing_exit json_config_test_init 00:07:06.248 20:20:58 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:07:06.248 20:20:58 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:06.248 20:20:58 json_config -- json_config/json_config.sh@359 -- # tgt_rpc save_config 00:07:06.248 20:20:58 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:07:06.506 20:20:58 json_config -- json_config/json_config.sh@361 -- # echo 'INFO: shutting down applications...' 00:07:06.506 INFO: shutting down applications... 00:07:06.506 20:20:58 json_config -- json_config/json_config.sh@362 -- # [[ 0 -eq 1 ]] 00:07:06.506 20:20:58 json_config -- json_config/json_config.sh@368 -- # json_config_clear target 00:07:06.506 20:20:58 json_config -- json_config/json_config.sh@332 -- # [[ -n 22 ]] 00:07:06.506 20:20:58 json_config -- json_config/json_config.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/clear_config.py -s /var/tmp/spdk_tgt.sock clear_config 00:07:06.765 [2024-07-15 20:20:59.054522] vbdev_lvol.c: 150:vbdev_lvs_hotremove_cb: *NOTICE*: bdev Nvme0n1p0 being removed: closing lvstore lvs_test 00:07:10.040 Calling clear_iscsi_subsystem 00:07:10.040 Calling clear_nvmf_subsystem 00:07:10.040 Calling clear_nbd_subsystem 00:07:10.040 Calling clear_ublk_subsystem 00:07:10.040 Calling clear_vhost_blk_subsystem 00:07:10.040 Calling clear_vhost_scsi_subsystem 00:07:10.040 Calling clear_bdev_subsystem 00:07:10.040 20:21:02 json_config -- json_config/json_config.sh@337 -- # local config_filter=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py 00:07:10.040 20:21:02 json_config -- json_config/json_config.sh@343 -- # count=100 00:07:10.040 20:21:02 json_config -- json_config/json_config.sh@344 -- # '[' 100 -gt 0 ']' 00:07:10.040 20:21:02 json_config -- json_config/json_config.sh@345 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:07:10.040 20:21:02 json_config -- json_config/json_config.sh@345 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method delete_global_parameters 00:07:10.040 20:21:02 json_config -- json_config/json_config.sh@345 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method check_empty 00:07:10.040 20:21:02 json_config -- json_config/json_config.sh@345 -- # break 00:07:10.040 20:21:02 json_config -- json_config/json_config.sh@350 -- # '[' 100 -eq 0 ']' 00:07:10.040 20:21:02 json_config -- json_config/json_config.sh@369 -- # json_config_test_shutdown_app target 00:07:10.040 20:21:02 json_config -- json_config/common.sh@31 -- # local app=target 00:07:10.040 20:21:02 json_config -- json_config/common.sh@34 -- # [[ -n 22 ]] 00:07:10.040 20:21:02 json_config -- json_config/common.sh@35 -- # [[ -n 1309464 ]] 00:07:10.040 20:21:02 json_config -- json_config/common.sh@38 -- # kill -SIGINT 1309464 00:07:10.040 20:21:02 json_config -- json_config/common.sh@40 -- # (( i = 0 )) 00:07:10.040 20:21:02 json_config -- json_config/common.sh@40 -- # (( i < 30 )) 00:07:10.040 20:21:02 json_config -- json_config/common.sh@41 -- # kill -0 1309464 00:07:10.040 20:21:02 json_config -- json_config/common.sh@45 -- # sleep 0.5 00:07:10.605 20:21:02 json_config -- json_config/common.sh@40 -- # (( i++ )) 00:07:10.605 20:21:02 json_config -- json_config/common.sh@40 -- # (( i < 30 )) 00:07:10.605 20:21:02 json_config -- json_config/common.sh@41 -- # kill -0 1309464 00:07:10.605 20:21:02 json_config -- json_config/common.sh@42 -- # app_pid["$app"]= 00:07:10.605 20:21:02 json_config -- json_config/common.sh@43 -- # break 00:07:10.605 20:21:02 json_config -- json_config/common.sh@48 -- # [[ -n '' ]] 00:07:10.605 20:21:02 json_config -- json_config/common.sh@53 -- # echo 'SPDK target shutdown done' 00:07:10.605 SPDK target 
shutdown done 00:07:10.605 20:21:02 json_config -- json_config/json_config.sh@371 -- # echo 'INFO: relaunching applications...' 00:07:10.605 INFO: relaunching applications... 00:07:10.605 20:21:02 json_config -- json_config/json_config.sh@372 -- # json_config_test_start_app target --json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:07:10.605 20:21:02 json_config -- json_config/common.sh@9 -- # local app=target 00:07:10.605 20:21:02 json_config -- json_config/common.sh@10 -- # shift 00:07:10.605 20:21:02 json_config -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:07:10.605 20:21:02 json_config -- json_config/common.sh@13 -- # [[ -z '' ]] 00:07:10.605 20:21:02 json_config -- json_config/common.sh@15 -- # local app_extra_params= 00:07:10.605 20:21:02 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:07:10.605 20:21:02 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:07:10.605 20:21:02 json_config -- json_config/common.sh@22 -- # app_pid["$app"]=1312380 00:07:10.605 20:21:02 json_config -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:07:10.605 Waiting for target to run... 00:07:10.606 20:21:02 json_config -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:07:10.606 20:21:02 json_config -- json_config/common.sh@25 -- # waitforlisten 1312380 /var/tmp/spdk_tgt.sock 00:07:10.606 20:21:02 json_config -- common/autotest_common.sh@829 -- # '[' -z 1312380 ']' 00:07:10.606 20:21:02 json_config -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:07:10.606 20:21:02 json_config -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:10.606 20:21:02 json_config -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 
00:07:10.606 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:07:10.606 20:21:02 json_config -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:10.606 20:21:02 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:10.863 [2024-07-15 20:21:02.987442] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 00:07:10.863 [2024-07-15 20:21:02.987526] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1312380 ] 00:07:11.429 [2024-07-15 20:21:03.635753] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:11.429 [2024-07-15 20:21:03.736203] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:11.429 [2024-07-15 20:21:03.790338] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:07:11.429 [2024-07-15 20:21:03.798377] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:07:11.429 [2024-07-15 20:21:03.806403] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:07:11.686 [2024-07-15 20:21:03.887609] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:07:14.208 [2024-07-15 20:21:06.098535] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:07:14.208 [2024-07-15 20:21:06.098605] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:07:14.208 [2024-07-15 20:21:06.098620] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:07:14.208 [2024-07-15 20:21:06.106549] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: 
Nvme0n1 00:07:14.208 [2024-07-15 20:21:06.106577] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Nvme0n1 00:07:14.208 [2024-07-15 20:21:06.114563] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:07:14.208 [2024-07-15 20:21:06.114587] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:07:14.208 [2024-07-15 20:21:06.122594] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "CryptoMallocBdev_AES_CBC" 00:07:14.208 [2024-07-15 20:21:06.122621] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: MallocForCryptoBdev 00:07:14.208 [2024-07-15 20:21:06.122634] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:07:14.208 [2024-07-15 20:21:06.495798] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:07:14.208 [2024-07-15 20:21:06.495848] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:07:14.208 [2024-07-15 20:21:06.495865] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x207db90 00:07:14.208 [2024-07-15 20:21:06.495878] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:07:14.208 [2024-07-15 20:21:06.496187] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:07:14.208 [2024-07-15 20:21:06.496206] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: PTBdevFromMalloc3 00:07:15.136 20:21:07 json_config -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:15.136 20:21:07 json_config -- common/autotest_common.sh@862 -- # return 0 00:07:15.136 20:21:07 json_config -- json_config/common.sh@26 -- # echo '' 00:07:15.136 00:07:15.136 20:21:07 json_config -- json_config/json_config.sh@373 -- # [[ 0 -eq 1 ]] 00:07:15.136 20:21:07 json_config -- json_config/json_config.sh@377 -- # echo 'INFO: Checking if target 
configuration is the same...' 00:07:15.136 INFO: Checking if target configuration is the same... 00:07:15.136 20:21:07 json_config -- json_config/json_config.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh /dev/fd/62 /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:07:15.136 20:21:07 json_config -- json_config/json_config.sh@378 -- # tgt_rpc save_config 00:07:15.136 20:21:07 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:07:15.136 + '[' 2 -ne 2 ']' 00:07:15.136 +++ dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh 00:07:15.136 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/../.. 00:07:15.136 + rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:07:15.136 +++ basename /dev/fd/62 00:07:15.136 ++ mktemp /tmp/62.XXX 00:07:15.136 + tmp_file_1=/tmp/62.GHF 00:07:15.136 +++ basename /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:07:15.136 ++ mktemp /tmp/spdk_tgt_config.json.XXX 00:07:15.137 + tmp_file_2=/tmp/spdk_tgt_config.json.Etr 00:07:15.137 + ret=0 00:07:15.137 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:07:15.700 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:07:15.957 + diff -u /tmp/62.GHF /tmp/spdk_tgt_config.json.Etr 00:07:15.957 + echo 'INFO: JSON config files are the same' 00:07:15.957 INFO: JSON config files are the same 00:07:15.957 + rm /tmp/62.GHF /tmp/spdk_tgt_config.json.Etr 00:07:15.957 + exit 0 00:07:15.957 20:21:08 json_config -- json_config/json_config.sh@379 -- # [[ 0 -eq 1 ]] 00:07:15.957 20:21:08 json_config -- json_config/json_config.sh@384 -- # echo 'INFO: changing configuration and checking if this can be detected...' 
00:07:15.957 INFO: changing configuration and checking if this can be detected... 00:07:15.957 20:21:08 json_config -- json_config/json_config.sh@386 -- # tgt_rpc bdev_malloc_delete MallocBdevForConfigChangeCheck 00:07:15.957 20:21:08 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_delete MallocBdevForConfigChangeCheck 00:07:16.215 20:21:08 json_config -- json_config/json_config.sh@387 -- # tgt_rpc save_config 00:07:16.215 20:21:08 json_config -- json_config/json_config.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh /dev/fd/62 /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:07:16.215 20:21:08 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:07:16.215 + '[' 2 -ne 2 ']' 00:07:16.215 +++ dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh 00:07:16.215 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/../.. 
00:07:16.215 + rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:07:16.215 +++ basename /dev/fd/62 00:07:16.215 ++ mktemp /tmp/62.XXX 00:07:16.215 + tmp_file_1=/tmp/62.ESm 00:07:16.215 +++ basename /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:07:16.215 ++ mktemp /tmp/spdk_tgt_config.json.XXX 00:07:16.215 + tmp_file_2=/tmp/spdk_tgt_config.json.GXw 00:07:16.215 + ret=0 00:07:16.215 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:07:16.507 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:07:16.507 + diff -u /tmp/62.ESm /tmp/spdk_tgt_config.json.GXw 00:07:16.507 + ret=1 00:07:16.507 + echo '=== Start of file: /tmp/62.ESm ===' 00:07:16.507 + cat /tmp/62.ESm 00:07:16.507 + echo '=== End of file: /tmp/62.ESm ===' 00:07:16.507 + echo '' 00:07:16.507 + echo '=== Start of file: /tmp/spdk_tgt_config.json.GXw ===' 00:07:16.507 + cat /tmp/spdk_tgt_config.json.GXw 00:07:16.507 + echo '=== End of file: /tmp/spdk_tgt_config.json.GXw ===' 00:07:16.507 + echo '' 00:07:16.507 + rm /tmp/62.ESm /tmp/spdk_tgt_config.json.GXw 00:07:16.507 + exit 1 00:07:16.507 20:21:08 json_config -- json_config/json_config.sh@391 -- # echo 'INFO: configuration change detected.' 00:07:16.507 INFO: configuration change detected. 
00:07:16.507 20:21:08 json_config -- json_config/json_config.sh@394 -- # json_config_test_fini 00:07:16.507 20:21:08 json_config -- json_config/json_config.sh@306 -- # timing_enter json_config_test_fini 00:07:16.507 20:21:08 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:07:16.507 20:21:08 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:16.507 20:21:08 json_config -- json_config/json_config.sh@307 -- # local ret=0 00:07:16.507 20:21:08 json_config -- json_config/json_config.sh@309 -- # [[ -n '' ]] 00:07:16.507 20:21:08 json_config -- json_config/json_config.sh@317 -- # [[ -n 1312380 ]] 00:07:16.507 20:21:08 json_config -- json_config/json_config.sh@320 -- # cleanup_bdev_subsystem_config 00:07:16.507 20:21:08 json_config -- json_config/json_config.sh@184 -- # timing_enter cleanup_bdev_subsystem_config 00:07:16.507 20:21:08 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:07:16.507 20:21:08 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:16.507 20:21:08 json_config -- json_config/json_config.sh@186 -- # [[ 1 -eq 1 ]] 00:07:16.507 20:21:08 json_config -- json_config/json_config.sh@187 -- # tgt_rpc bdev_lvol_delete lvs_test/clone0 00:07:16.507 20:21:08 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete lvs_test/clone0 00:07:16.764 20:21:09 json_config -- json_config/json_config.sh@188 -- # tgt_rpc bdev_lvol_delete lvs_test/lvol0 00:07:16.764 20:21:09 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete lvs_test/lvol0 00:07:17.021 20:21:09 json_config -- json_config/json_config.sh@189 -- # tgt_rpc bdev_lvol_delete lvs_test/snapshot0 00:07:17.021 20:21:09 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete 
lvs_test/snapshot0 00:07:17.278 20:21:09 json_config -- json_config/json_config.sh@190 -- # tgt_rpc bdev_lvol_delete_lvstore -l lvs_test 00:07:17.278 20:21:09 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete_lvstore -l lvs_test 00:07:17.536 20:21:09 json_config -- json_config/json_config.sh@193 -- # uname -s 00:07:17.536 20:21:09 json_config -- json_config/json_config.sh@193 -- # [[ Linux = Linux ]] 00:07:17.536 20:21:09 json_config -- json_config/json_config.sh@194 -- # rm -f /sample_aio 00:07:17.536 20:21:09 json_config -- json_config/json_config.sh@197 -- # [[ 0 -eq 1 ]] 00:07:17.536 20:21:09 json_config -- json_config/json_config.sh@201 -- # timing_exit cleanup_bdev_subsystem_config 00:07:17.536 20:21:09 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:07:17.536 20:21:09 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:17.536 20:21:09 json_config -- json_config/json_config.sh@323 -- # killprocess 1312380 00:07:17.536 20:21:09 json_config -- common/autotest_common.sh@948 -- # '[' -z 1312380 ']' 00:07:17.536 20:21:09 json_config -- common/autotest_common.sh@952 -- # kill -0 1312380 00:07:17.536 20:21:09 json_config -- common/autotest_common.sh@953 -- # uname 00:07:17.536 20:21:09 json_config -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:07:17.536 20:21:09 json_config -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1312380 00:07:17.536 20:21:09 json_config -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:07:17.536 20:21:09 json_config -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:07:17.536 20:21:09 json_config -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1312380' 00:07:17.536 killing process with pid 1312380 00:07:17.536 20:21:09 json_config -- common/autotest_common.sh@967 -- # kill 1312380 00:07:17.536 20:21:09 json_config -- 
common/autotest_common.sh@972 -- # wait 1312380 00:07:20.848 20:21:13 json_config -- json_config/json_config.sh@326 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_initiator_config.json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:07:20.848 20:21:13 json_config -- json_config/json_config.sh@327 -- # timing_exit json_config_test_fini 00:07:20.848 20:21:13 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:07:20.848 20:21:13 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:20.848 20:21:13 json_config -- json_config/json_config.sh@328 -- # return 0 00:07:20.848 20:21:13 json_config -- json_config/json_config.sh@396 -- # echo 'INFO: Success' 00:07:20.848 INFO: Success 00:07:20.848 00:07:20.848 real 0m30.296s 00:07:20.848 user 0m37.293s 00:07:20.848 sys 0m4.479s 00:07:20.848 20:21:13 json_config -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:20.848 20:21:13 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:20.848 ************************************ 00:07:20.848 END TEST json_config 00:07:20.848 ************************************ 00:07:20.848 20:21:13 -- common/autotest_common.sh@1142 -- # return 0 00:07:20.848 20:21:13 -- spdk/autotest.sh@173 -- # run_test json_config_extra_key /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:07:20.848 20:21:13 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:20.848 20:21:13 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:20.848 20:21:13 -- common/autotest_common.sh@10 -- # set +x 00:07:20.848 ************************************ 00:07:20.848 START TEST json_config_extra_key 00:07:20.848 ************************************ 00:07:20.848 20:21:13 json_config_extra_key -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:07:21.105 20:21:13 json_config_extra_key -- 
json_config/json_config_extra_key.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:07:21.105 20:21:13 json_config_extra_key -- nvmf/common.sh@7 -- # uname -s 00:07:21.105 20:21:13 json_config_extra_key -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:07:21.105 20:21:13 json_config_extra_key -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:07:21.105 20:21:13 json_config_extra_key -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:07:21.105 20:21:13 json_config_extra_key -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:07:21.105 20:21:13 json_config_extra_key -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:07:21.105 20:21:13 json_config_extra_key -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:07:21.105 20:21:13 json_config_extra_key -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:07:21.105 20:21:13 json_config_extra_key -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:07:21.105 20:21:13 json_config_extra_key -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:07:21.105 20:21:13 json_config_extra_key -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:07:21.105 20:21:13 json_config_extra_key -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562 00:07:21.105 20:21:13 json_config_extra_key -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562 00:07:21.105 20:21:13 json_config_extra_key -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:07:21.105 20:21:13 json_config_extra_key -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:07:21.105 20:21:13 json_config_extra_key -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:07:21.105 20:21:13 json_config_extra_key -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:07:21.105 20:21:13 json_config_extra_key -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:07:21.105 20:21:13 
json_config_extra_key -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:07:21.105 20:21:13 json_config_extra_key -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:07:21.105 20:21:13 json_config_extra_key -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:07:21.105 20:21:13 json_config_extra_key -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:21.105 20:21:13 json_config_extra_key -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:21.105 20:21:13 json_config_extra_key -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:21.105 20:21:13 json_config_extra_key -- paths/export.sh@5 -- # export PATH 00:07:21.105 20:21:13 json_config_extra_key -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:21.105 20:21:13 json_config_extra_key -- nvmf/common.sh@47 -- # : 0 00:07:21.105 20:21:13 json_config_extra_key -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:07:21.105 20:21:13 json_config_extra_key -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:07:21.105 20:21:13 json_config_extra_key -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:07:21.105 20:21:13 json_config_extra_key -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:07:21.105 20:21:13 json_config_extra_key -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:07:21.105 20:21:13 json_config_extra_key -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:07:21.105 20:21:13 json_config_extra_key -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:07:21.105 20:21:13 json_config_extra_key -- nvmf/common.sh@51 -- # have_pci_nics=0 00:07:21.105 20:21:13 json_config_extra_key -- json_config/json_config_extra_key.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/common.sh 00:07:21.105 20:21:13 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # app_pid=(['target']='') 00:07:21.105 20:21:13 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # declare -A app_pid 00:07:21.105 20:21:13 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock') 00:07:21.105 20:21:13 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # declare -A app_socket 00:07:21.105 20:21:13 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # app_params=(['target']='-m 0x1 -s 
1024') 00:07:21.105 20:21:13 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # declare -A app_params 00:07:21.105 20:21:13 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # configs_path=(['target']='/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json') 00:07:21.105 20:21:13 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # declare -A configs_path 00:07:21.105 20:21:13 json_config_extra_key -- json_config/json_config_extra_key.sh@22 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:07:21.105 20:21:13 json_config_extra_key -- json_config/json_config_extra_key.sh@24 -- # echo 'INFO: launching applications...' 00:07:21.105 INFO: launching applications... 00:07:21.105 20:21:13 json_config_extra_key -- json_config/json_config_extra_key.sh@25 -- # json_config_test_start_app target --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json 00:07:21.105 20:21:13 json_config_extra_key -- json_config/common.sh@9 -- # local app=target 00:07:21.105 20:21:13 json_config_extra_key -- json_config/common.sh@10 -- # shift 00:07:21.105 20:21:13 json_config_extra_key -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:07:21.105 20:21:13 json_config_extra_key -- json_config/common.sh@13 -- # [[ -z '' ]] 00:07:21.105 20:21:13 json_config_extra_key -- json_config/common.sh@15 -- # local app_extra_params= 00:07:21.105 20:21:13 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:07:21.105 20:21:13 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:07:21.105 20:21:13 json_config_extra_key -- json_config/common.sh@22 -- # app_pid["$app"]=1314131 00:07:21.105 20:21:13 json_config_extra_key -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:07:21.105 Waiting for target to run... 
00:07:21.105 20:21:13 json_config_extra_key -- json_config/common.sh@25 -- # waitforlisten 1314131 /var/tmp/spdk_tgt.sock 00:07:21.105 20:21:13 json_config_extra_key -- common/autotest_common.sh@829 -- # '[' -z 1314131 ']' 00:07:21.105 20:21:13 json_config_extra_key -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:07:21.105 20:21:13 json_config_extra_key -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json 00:07:21.105 20:21:13 json_config_extra_key -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:21.105 20:21:13 json_config_extra_key -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:07:21.105 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:07:21.106 20:21:13 json_config_extra_key -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:21.106 20:21:13 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:07:21.106 [2024-07-15 20:21:13.373308] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
00:07:21.106 [2024-07-15 20:21:13.373387] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1314131 ] 00:07:21.669 [2024-07-15 20:21:13.763054] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:21.669 [2024-07-15 20:21:13.854450] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:21.926 20:21:14 json_config_extra_key -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:21.926 20:21:14 json_config_extra_key -- common/autotest_common.sh@862 -- # return 0 00:07:21.926 20:21:14 json_config_extra_key -- json_config/common.sh@26 -- # echo '' 00:07:21.926 00:07:21.926 20:21:14 json_config_extra_key -- json_config/json_config_extra_key.sh@27 -- # echo 'INFO: shutting down applications...' 00:07:21.926 INFO: shutting down applications... 00:07:21.926 20:21:14 json_config_extra_key -- json_config/json_config_extra_key.sh@28 -- # json_config_test_shutdown_app target 00:07:21.926 20:21:14 json_config_extra_key -- json_config/common.sh@31 -- # local app=target 00:07:21.926 20:21:14 json_config_extra_key -- json_config/common.sh@34 -- # [[ -n 22 ]] 00:07:21.926 20:21:14 json_config_extra_key -- json_config/common.sh@35 -- # [[ -n 1314131 ]] 00:07:21.926 20:21:14 json_config_extra_key -- json_config/common.sh@38 -- # kill -SIGINT 1314131 00:07:21.926 20:21:14 json_config_extra_key -- json_config/common.sh@40 -- # (( i = 0 )) 00:07:21.926 20:21:14 json_config_extra_key -- json_config/common.sh@40 -- # (( i < 30 )) 00:07:21.926 20:21:14 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 1314131 00:07:21.926 20:21:14 json_config_extra_key -- json_config/common.sh@45 -- # sleep 0.5 00:07:22.493 20:21:14 json_config_extra_key -- json_config/common.sh@40 -- # (( i++ )) 00:07:22.493 20:21:14 json_config_extra_key -- 
json_config/common.sh@40 -- # (( i < 30 )) 00:07:22.493 20:21:14 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 1314131 00:07:22.494 20:21:14 json_config_extra_key -- json_config/common.sh@42 -- # app_pid["$app"]= 00:07:22.494 20:21:14 json_config_extra_key -- json_config/common.sh@43 -- # break 00:07:22.494 20:21:14 json_config_extra_key -- json_config/common.sh@48 -- # [[ -n '' ]] 00:07:22.494 20:21:14 json_config_extra_key -- json_config/common.sh@53 -- # echo 'SPDK target shutdown done' 00:07:22.494 SPDK target shutdown done 00:07:22.494 20:21:14 json_config_extra_key -- json_config/json_config_extra_key.sh@30 -- # echo Success 00:07:22.494 Success 00:07:22.494 00:07:22.494 real 0m1.615s 00:07:22.494 user 0m1.268s 00:07:22.494 sys 0m0.546s 00:07:22.494 20:21:14 json_config_extra_key -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:22.494 20:21:14 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:07:22.494 ************************************ 00:07:22.494 END TEST json_config_extra_key 00:07:22.494 ************************************ 00:07:22.494 20:21:14 -- common/autotest_common.sh@1142 -- # return 0 00:07:22.494 20:21:14 -- spdk/autotest.sh@174 -- # run_test alias_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:07:22.494 20:21:14 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:22.494 20:21:14 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:22.494 20:21:14 -- common/autotest_common.sh@10 -- # set +x 00:07:22.752 ************************************ 00:07:22.752 START TEST alias_rpc 00:07:22.752 ************************************ 00:07:22.752 20:21:14 alias_rpc -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:07:22.752 * Looking for test storage... 
00:07:22.752 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc 00:07:22.752 20:21:14 alias_rpc -- alias_rpc/alias_rpc.sh@10 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:07:22.752 20:21:14 alias_rpc -- alias_rpc/alias_rpc.sh@13 -- # spdk_tgt_pid=1314513 00:07:22.752 20:21:14 alias_rpc -- alias_rpc/alias_rpc.sh@14 -- # waitforlisten 1314513 00:07:22.752 20:21:14 alias_rpc -- alias_rpc/alias_rpc.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:07:22.752 20:21:14 alias_rpc -- common/autotest_common.sh@829 -- # '[' -z 1314513 ']' 00:07:22.752 20:21:14 alias_rpc -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:22.752 20:21:14 alias_rpc -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:22.752 20:21:14 alias_rpc -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:22.752 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:22.752 20:21:14 alias_rpc -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:22.752 20:21:14 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:22.752 [2024-07-15 20:21:15.067371] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
00:07:22.752 [2024-07-15 20:21:15.067447] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1314513 ] 00:07:23.009 [2024-07-15 20:21:15.195441] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:23.009 [2024-07-15 20:21:15.297719] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:23.941 20:21:15 alias_rpc -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:23.941 20:21:15 alias_rpc -- common/autotest_common.sh@862 -- # return 0 00:07:23.941 20:21:15 alias_rpc -- alias_rpc/alias_rpc.sh@17 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_config -i 00:07:23.942 20:21:16 alias_rpc -- alias_rpc/alias_rpc.sh@19 -- # killprocess 1314513 00:07:23.942 20:21:16 alias_rpc -- common/autotest_common.sh@948 -- # '[' -z 1314513 ']' 00:07:23.942 20:21:16 alias_rpc -- common/autotest_common.sh@952 -- # kill -0 1314513 00:07:23.942 20:21:16 alias_rpc -- common/autotest_common.sh@953 -- # uname 00:07:23.942 20:21:16 alias_rpc -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:07:23.942 20:21:16 alias_rpc -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1314513 00:07:23.942 20:21:16 alias_rpc -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:07:23.942 20:21:16 alias_rpc -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:07:23.942 20:21:16 alias_rpc -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1314513' 00:07:23.942 killing process with pid 1314513 00:07:23.942 20:21:16 alias_rpc -- common/autotest_common.sh@967 -- # kill 1314513 00:07:23.942 20:21:16 alias_rpc -- common/autotest_common.sh@972 -- # wait 1314513 00:07:24.507 00:07:24.507 real 0m1.776s 00:07:24.507 user 0m1.938s 00:07:24.507 sys 0m0.580s 00:07:24.507 20:21:16 alias_rpc -- 
common/autotest_common.sh@1124 -- # xtrace_disable 00:07:24.507 20:21:16 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:24.507 ************************************ 00:07:24.507 END TEST alias_rpc 00:07:24.507 ************************************ 00:07:24.507 20:21:16 -- common/autotest_common.sh@1142 -- # return 0 00:07:24.507 20:21:16 -- spdk/autotest.sh@176 -- # [[ 0 -eq 0 ]] 00:07:24.507 20:21:16 -- spdk/autotest.sh@177 -- # run_test spdkcli_tcp /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/tcp.sh 00:07:24.507 20:21:16 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:24.507 20:21:16 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:24.507 20:21:16 -- common/autotest_common.sh@10 -- # set +x 00:07:24.507 ************************************ 00:07:24.507 START TEST spdkcli_tcp 00:07:24.507 ************************************ 00:07:24.507 20:21:16 spdkcli_tcp -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/tcp.sh 00:07:24.507 * Looking for test storage... 
00:07:24.507 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli 00:07:24.507 20:21:16 spdkcli_tcp -- spdkcli/tcp.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/common.sh 00:07:24.507 20:21:16 spdkcli_tcp -- spdkcli/common.sh@6 -- # spdkcli_job=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/spdkcli_job.py 00:07:24.507 20:21:16 spdkcli_tcp -- spdkcli/common.sh@7 -- # spdk_clear_config_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/clear_config.py 00:07:24.507 20:21:16 spdkcli_tcp -- spdkcli/tcp.sh@18 -- # IP_ADDRESS=127.0.0.1 00:07:24.507 20:21:16 spdkcli_tcp -- spdkcli/tcp.sh@19 -- # PORT=9998 00:07:24.507 20:21:16 spdkcli_tcp -- spdkcli/tcp.sh@21 -- # trap 'err_cleanup; exit 1' SIGINT SIGTERM EXIT 00:07:24.507 20:21:16 spdkcli_tcp -- spdkcli/tcp.sh@23 -- # timing_enter run_spdk_tgt_tcp 00:07:24.507 20:21:16 spdkcli_tcp -- common/autotest_common.sh@722 -- # xtrace_disable 00:07:24.507 20:21:16 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:07:24.507 20:21:16 spdkcli_tcp -- spdkcli/tcp.sh@25 -- # spdk_tgt_pid=1314752 00:07:24.507 20:21:16 spdkcli_tcp -- spdkcli/tcp.sh@27 -- # waitforlisten 1314752 00:07:24.507 20:21:16 spdkcli_tcp -- spdkcli/tcp.sh@24 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x3 -p 0 00:07:24.508 20:21:16 spdkcli_tcp -- common/autotest_common.sh@829 -- # '[' -z 1314752 ']' 00:07:24.508 20:21:16 spdkcli_tcp -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:24.508 20:21:16 spdkcli_tcp -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:24.508 20:21:16 spdkcli_tcp -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:24.508 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:07:24.508 20:21:16 spdkcli_tcp -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:24.508 20:21:16 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:07:24.766 [2024-07-15 20:21:16.931768] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 00:07:24.766 [2024-07-15 20:21:16.931843] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1314752 ] 00:07:24.766 [2024-07-15 20:21:17.062379] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:25.024 [2024-07-15 20:21:17.171069] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:25.024 [2024-07-15 20:21:17.171075] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:25.592 20:21:17 spdkcli_tcp -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:25.592 20:21:17 spdkcli_tcp -- common/autotest_common.sh@862 -- # return 0 00:07:25.592 20:21:17 spdkcli_tcp -- spdkcli/tcp.sh@31 -- # socat_pid=1314929 00:07:25.592 20:21:17 spdkcli_tcp -- spdkcli/tcp.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods 00:07:25.592 20:21:17 spdkcli_tcp -- spdkcli/tcp.sh@30 -- # socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock 00:07:25.851 [ 00:07:25.851 "bdev_malloc_delete", 00:07:25.851 "bdev_malloc_create", 00:07:25.851 "bdev_null_resize", 00:07:25.851 "bdev_null_delete", 00:07:25.851 "bdev_null_create", 00:07:25.851 "bdev_nvme_cuse_unregister", 00:07:25.851 "bdev_nvme_cuse_register", 00:07:25.851 "bdev_opal_new_user", 00:07:25.851 "bdev_opal_set_lock_state", 00:07:25.851 "bdev_opal_delete", 00:07:25.851 "bdev_opal_get_info", 00:07:25.851 "bdev_opal_create", 00:07:25.851 "bdev_nvme_opal_revert", 00:07:25.851 "bdev_nvme_opal_init", 00:07:25.851 "bdev_nvme_send_cmd", 00:07:25.851 
"bdev_nvme_get_path_iostat", 00:07:25.851 "bdev_nvme_get_mdns_discovery_info", 00:07:25.851 "bdev_nvme_stop_mdns_discovery", 00:07:25.851 "bdev_nvme_start_mdns_discovery", 00:07:25.851 "bdev_nvme_set_multipath_policy", 00:07:25.851 "bdev_nvme_set_preferred_path", 00:07:25.851 "bdev_nvme_get_io_paths", 00:07:25.851 "bdev_nvme_remove_error_injection", 00:07:25.851 "bdev_nvme_add_error_injection", 00:07:25.851 "bdev_nvme_get_discovery_info", 00:07:25.851 "bdev_nvme_stop_discovery", 00:07:25.851 "bdev_nvme_start_discovery", 00:07:25.851 "bdev_nvme_get_controller_health_info", 00:07:25.851 "bdev_nvme_disable_controller", 00:07:25.851 "bdev_nvme_enable_controller", 00:07:25.851 "bdev_nvme_reset_controller", 00:07:25.851 "bdev_nvme_get_transport_statistics", 00:07:25.851 "bdev_nvme_apply_firmware", 00:07:25.851 "bdev_nvme_detach_controller", 00:07:25.851 "bdev_nvme_get_controllers", 00:07:25.851 "bdev_nvme_attach_controller", 00:07:25.851 "bdev_nvme_set_hotplug", 00:07:25.851 "bdev_nvme_set_options", 00:07:25.851 "bdev_passthru_delete", 00:07:25.851 "bdev_passthru_create", 00:07:25.851 "bdev_lvol_set_parent_bdev", 00:07:25.851 "bdev_lvol_set_parent", 00:07:25.851 "bdev_lvol_check_shallow_copy", 00:07:25.851 "bdev_lvol_start_shallow_copy", 00:07:25.851 "bdev_lvol_grow_lvstore", 00:07:25.851 "bdev_lvol_get_lvols", 00:07:25.851 "bdev_lvol_get_lvstores", 00:07:25.851 "bdev_lvol_delete", 00:07:25.851 "bdev_lvol_set_read_only", 00:07:25.851 "bdev_lvol_resize", 00:07:25.851 "bdev_lvol_decouple_parent", 00:07:25.851 "bdev_lvol_inflate", 00:07:25.851 "bdev_lvol_rename", 00:07:25.851 "bdev_lvol_clone_bdev", 00:07:25.851 "bdev_lvol_clone", 00:07:25.851 "bdev_lvol_snapshot", 00:07:25.851 "bdev_lvol_create", 00:07:25.851 "bdev_lvol_delete_lvstore", 00:07:25.851 "bdev_lvol_rename_lvstore", 00:07:25.851 "bdev_lvol_create_lvstore", 00:07:25.851 "bdev_raid_set_options", 00:07:25.851 "bdev_raid_remove_base_bdev", 00:07:25.851 "bdev_raid_add_base_bdev", 00:07:25.851 "bdev_raid_delete", 
00:07:25.851 "bdev_raid_create", 00:07:25.851 "bdev_raid_get_bdevs", 00:07:25.851 "bdev_error_inject_error", 00:07:25.851 "bdev_error_delete", 00:07:25.851 "bdev_error_create", 00:07:25.851 "bdev_split_delete", 00:07:25.851 "bdev_split_create", 00:07:25.851 "bdev_delay_delete", 00:07:25.851 "bdev_delay_create", 00:07:25.851 "bdev_delay_update_latency", 00:07:25.851 "bdev_zone_block_delete", 00:07:25.851 "bdev_zone_block_create", 00:07:25.851 "blobfs_create", 00:07:25.851 "blobfs_detect", 00:07:25.851 "blobfs_set_cache_size", 00:07:25.851 "bdev_crypto_delete", 00:07:25.851 "bdev_crypto_create", 00:07:25.851 "bdev_compress_delete", 00:07:25.851 "bdev_compress_create", 00:07:25.851 "bdev_compress_get_orphans", 00:07:25.851 "bdev_aio_delete", 00:07:25.851 "bdev_aio_rescan", 00:07:25.851 "bdev_aio_create", 00:07:25.851 "bdev_ftl_set_property", 00:07:25.851 "bdev_ftl_get_properties", 00:07:25.851 "bdev_ftl_get_stats", 00:07:25.851 "bdev_ftl_unmap", 00:07:25.851 "bdev_ftl_unload", 00:07:25.851 "bdev_ftl_delete", 00:07:25.851 "bdev_ftl_load", 00:07:25.851 "bdev_ftl_create", 00:07:25.851 "bdev_virtio_attach_controller", 00:07:25.851 "bdev_virtio_scsi_get_devices", 00:07:25.851 "bdev_virtio_detach_controller", 00:07:25.851 "bdev_virtio_blk_set_hotplug", 00:07:25.851 "bdev_iscsi_delete", 00:07:25.851 "bdev_iscsi_create", 00:07:25.851 "bdev_iscsi_set_options", 00:07:25.851 "accel_error_inject_error", 00:07:25.851 "ioat_scan_accel_module", 00:07:25.851 "dsa_scan_accel_module", 00:07:25.851 "iaa_scan_accel_module", 00:07:25.851 "dpdk_cryptodev_get_driver", 00:07:25.851 "dpdk_cryptodev_set_driver", 00:07:25.851 "dpdk_cryptodev_scan_accel_module", 00:07:25.851 "compressdev_scan_accel_module", 00:07:25.851 "keyring_file_remove_key", 00:07:25.851 "keyring_file_add_key", 00:07:25.851 "keyring_linux_set_options", 00:07:25.851 "iscsi_get_histogram", 00:07:25.851 "iscsi_enable_histogram", 00:07:25.851 "iscsi_set_options", 00:07:25.851 "iscsi_get_auth_groups", 00:07:25.851 
"iscsi_auth_group_remove_secret", 00:07:25.851 "iscsi_auth_group_add_secret", 00:07:25.851 "iscsi_delete_auth_group", 00:07:25.851 "iscsi_create_auth_group", 00:07:25.851 "iscsi_set_discovery_auth", 00:07:25.851 "iscsi_get_options", 00:07:25.851 "iscsi_target_node_request_logout", 00:07:25.851 "iscsi_target_node_set_redirect", 00:07:25.851 "iscsi_target_node_set_auth", 00:07:25.851 "iscsi_target_node_add_lun", 00:07:25.851 "iscsi_get_stats", 00:07:25.851 "iscsi_get_connections", 00:07:25.851 "iscsi_portal_group_set_auth", 00:07:25.851 "iscsi_start_portal_group", 00:07:25.851 "iscsi_delete_portal_group", 00:07:25.851 "iscsi_create_portal_group", 00:07:25.851 "iscsi_get_portal_groups", 00:07:25.851 "iscsi_delete_target_node", 00:07:25.851 "iscsi_target_node_remove_pg_ig_maps", 00:07:25.851 "iscsi_target_node_add_pg_ig_maps", 00:07:25.851 "iscsi_create_target_node", 00:07:25.851 "iscsi_get_target_nodes", 00:07:25.851 "iscsi_delete_initiator_group", 00:07:25.851 "iscsi_initiator_group_remove_initiators", 00:07:25.851 "iscsi_initiator_group_add_initiators", 00:07:25.851 "iscsi_create_initiator_group", 00:07:25.851 "iscsi_get_initiator_groups", 00:07:25.851 "nvmf_set_crdt", 00:07:25.851 "nvmf_set_config", 00:07:25.851 "nvmf_set_max_subsystems", 00:07:25.851 "nvmf_stop_mdns_prr", 00:07:25.851 "nvmf_publish_mdns_prr", 00:07:25.851 "nvmf_subsystem_get_listeners", 00:07:25.851 "nvmf_subsystem_get_qpairs", 00:07:25.851 "nvmf_subsystem_get_controllers", 00:07:25.851 "nvmf_get_stats", 00:07:25.851 "nvmf_get_transports", 00:07:25.851 "nvmf_create_transport", 00:07:25.851 "nvmf_get_targets", 00:07:25.851 "nvmf_delete_target", 00:07:25.851 "nvmf_create_target", 00:07:25.851 "nvmf_subsystem_allow_any_host", 00:07:25.851 "nvmf_subsystem_remove_host", 00:07:25.851 "nvmf_subsystem_add_host", 00:07:25.851 "nvmf_ns_remove_host", 00:07:25.851 "nvmf_ns_add_host", 00:07:25.851 "nvmf_subsystem_remove_ns", 00:07:25.851 "nvmf_subsystem_add_ns", 00:07:25.851 
"nvmf_subsystem_listener_set_ana_state", 00:07:25.851 "nvmf_discovery_get_referrals", 00:07:25.851 "nvmf_discovery_remove_referral", 00:07:25.851 "nvmf_discovery_add_referral", 00:07:25.851 "nvmf_subsystem_remove_listener", 00:07:25.851 "nvmf_subsystem_add_listener", 00:07:25.851 "nvmf_delete_subsystem", 00:07:25.851 "nvmf_create_subsystem", 00:07:25.851 "nvmf_get_subsystems", 00:07:25.851 "env_dpdk_get_mem_stats", 00:07:25.851 "nbd_get_disks", 00:07:25.851 "nbd_stop_disk", 00:07:25.851 "nbd_start_disk", 00:07:25.851 "ublk_recover_disk", 00:07:25.851 "ublk_get_disks", 00:07:25.851 "ublk_stop_disk", 00:07:25.851 "ublk_start_disk", 00:07:25.851 "ublk_destroy_target", 00:07:25.851 "ublk_create_target", 00:07:25.851 "virtio_blk_create_transport", 00:07:25.851 "virtio_blk_get_transports", 00:07:25.851 "vhost_controller_set_coalescing", 00:07:25.851 "vhost_get_controllers", 00:07:25.851 "vhost_delete_controller", 00:07:25.851 "vhost_create_blk_controller", 00:07:25.851 "vhost_scsi_controller_remove_target", 00:07:25.851 "vhost_scsi_controller_add_target", 00:07:25.851 "vhost_start_scsi_controller", 00:07:25.851 "vhost_create_scsi_controller", 00:07:25.851 "thread_set_cpumask", 00:07:25.851 "framework_get_governor", 00:07:25.852 "framework_get_scheduler", 00:07:25.852 "framework_set_scheduler", 00:07:25.852 "framework_get_reactors", 00:07:25.852 "thread_get_io_channels", 00:07:25.852 "thread_get_pollers", 00:07:25.852 "thread_get_stats", 00:07:25.852 "framework_monitor_context_switch", 00:07:25.852 "spdk_kill_instance", 00:07:25.852 "log_enable_timestamps", 00:07:25.852 "log_get_flags", 00:07:25.852 "log_clear_flag", 00:07:25.852 "log_set_flag", 00:07:25.852 "log_get_level", 00:07:25.852 "log_set_level", 00:07:25.852 "log_get_print_level", 00:07:25.852 "log_set_print_level", 00:07:25.852 "framework_enable_cpumask_locks", 00:07:25.852 "framework_disable_cpumask_locks", 00:07:25.852 "framework_wait_init", 00:07:25.852 "framework_start_init", 00:07:25.852 "scsi_get_devices", 
00:07:25.852 "bdev_get_histogram", 00:07:25.852 "bdev_enable_histogram", 00:07:25.852 "bdev_set_qos_limit", 00:07:25.852 "bdev_set_qd_sampling_period", 00:07:25.852 "bdev_get_bdevs", 00:07:25.852 "bdev_reset_iostat", 00:07:25.852 "bdev_get_iostat", 00:07:25.852 "bdev_examine", 00:07:25.852 "bdev_wait_for_examine", 00:07:25.852 "bdev_set_options", 00:07:25.852 "notify_get_notifications", 00:07:25.852 "notify_get_types", 00:07:25.852 "accel_get_stats", 00:07:25.852 "accel_set_options", 00:07:25.852 "accel_set_driver", 00:07:25.852 "accel_crypto_key_destroy", 00:07:25.852 "accel_crypto_keys_get", 00:07:25.852 "accel_crypto_key_create", 00:07:25.852 "accel_assign_opc", 00:07:25.852 "accel_get_module_info", 00:07:25.852 "accel_get_opc_assignments", 00:07:25.852 "vmd_rescan", 00:07:25.852 "vmd_remove_device", 00:07:25.852 "vmd_enable", 00:07:25.852 "sock_get_default_impl", 00:07:25.852 "sock_set_default_impl", 00:07:25.852 "sock_impl_set_options", 00:07:25.852 "sock_impl_get_options", 00:07:25.852 "iobuf_get_stats", 00:07:25.852 "iobuf_set_options", 00:07:25.852 "framework_get_pci_devices", 00:07:25.852 "framework_get_config", 00:07:25.852 "framework_get_subsystems", 00:07:25.852 "trace_get_info", 00:07:25.852 "trace_get_tpoint_group_mask", 00:07:25.852 "trace_disable_tpoint_group", 00:07:25.852 "trace_enable_tpoint_group", 00:07:25.852 "trace_clear_tpoint_mask", 00:07:25.852 "trace_set_tpoint_mask", 00:07:25.852 "keyring_get_keys", 00:07:25.852 "spdk_get_version", 00:07:25.852 "rpc_get_methods" 00:07:25.852 ] 00:07:25.852 20:21:18 spdkcli_tcp -- spdkcli/tcp.sh@35 -- # timing_exit run_spdk_tgt_tcp 00:07:25.852 20:21:18 spdkcli_tcp -- common/autotest_common.sh@728 -- # xtrace_disable 00:07:25.852 20:21:18 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:07:25.852 20:21:18 spdkcli_tcp -- spdkcli/tcp.sh@37 -- # trap - SIGINT SIGTERM EXIT 00:07:25.852 20:21:18 spdkcli_tcp -- spdkcli/tcp.sh@38 -- # killprocess 1314752 00:07:25.852 20:21:18 spdkcli_tcp -- 
common/autotest_common.sh@948 -- # '[' -z 1314752 ']' 00:07:25.852 20:21:18 spdkcli_tcp -- common/autotest_common.sh@952 -- # kill -0 1314752 00:07:25.852 20:21:18 spdkcli_tcp -- common/autotest_common.sh@953 -- # uname 00:07:25.852 20:21:18 spdkcli_tcp -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:07:25.852 20:21:18 spdkcli_tcp -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1314752 00:07:25.852 20:21:18 spdkcli_tcp -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:07:25.852 20:21:18 spdkcli_tcp -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:07:25.852 20:21:18 spdkcli_tcp -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1314752' 00:07:25.852 killing process with pid 1314752 00:07:25.852 20:21:18 spdkcli_tcp -- common/autotest_common.sh@967 -- # kill 1314752 00:07:25.852 20:21:18 spdkcli_tcp -- common/autotest_common.sh@972 -- # wait 1314752 00:07:26.419 00:07:26.419 real 0m1.883s 00:07:26.419 user 0m3.439s 00:07:26.419 sys 0m0.619s 00:07:26.419 20:21:18 spdkcli_tcp -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:26.419 20:21:18 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:07:26.419 ************************************ 00:07:26.419 END TEST spdkcli_tcp 00:07:26.419 ************************************ 00:07:26.419 20:21:18 -- common/autotest_common.sh@1142 -- # return 0 00:07:26.419 20:21:18 -- spdk/autotest.sh@180 -- # run_test dpdk_mem_utility /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:07:26.419 20:21:18 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:26.419 20:21:18 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:26.419 20:21:18 -- common/autotest_common.sh@10 -- # set +x 00:07:26.419 ************************************ 00:07:26.419 START TEST dpdk_mem_utility 00:07:26.419 ************************************ 00:07:26.419 20:21:18 dpdk_mem_utility -- 
common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:07:26.678 * Looking for test storage... 00:07:26.678 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility 00:07:26.678 20:21:18 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@10 -- # MEM_SCRIPT=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:07:26.678 20:21:18 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@13 -- # spdkpid=1315065 00:07:26.678 20:21:18 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@15 -- # waitforlisten 1315065 00:07:26.678 20:21:18 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:07:26.678 20:21:18 dpdk_mem_utility -- common/autotest_common.sh@829 -- # '[' -z 1315065 ']' 00:07:26.678 20:21:18 dpdk_mem_utility -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:26.678 20:21:18 dpdk_mem_utility -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:26.678 20:21:18 dpdk_mem_utility -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:26.678 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:26.678 20:21:18 dpdk_mem_utility -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:26.678 20:21:18 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:07:26.678 [2024-07-15 20:21:18.889969] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
00:07:26.678 [2024-07-15 20:21:18.890049] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1315065 ] 00:07:26.678 [2024-07-15 20:21:19.018772] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:26.937 [2024-07-15 20:21:19.122248] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:27.502 20:21:19 dpdk_mem_utility -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:27.502 20:21:19 dpdk_mem_utility -- common/autotest_common.sh@862 -- # return 0 00:07:27.502 20:21:19 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@17 -- # trap 'killprocess $spdkpid' SIGINT SIGTERM EXIT 00:07:27.502 20:21:19 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@19 -- # rpc_cmd env_dpdk_get_mem_stats 00:07:27.502 20:21:19 dpdk_mem_utility -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:27.502 20:21:19 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:07:27.502 { 00:07:27.502 "filename": "/tmp/spdk_mem_dump.txt" 00:07:27.502 } 00:07:27.502 20:21:19 dpdk_mem_utility -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:27.502 20:21:19 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:07:27.761 DPDK memory size 816.000000 MiB in 2 heap(s) 00:07:27.761 2 heaps totaling size 816.000000 MiB 00:07:27.761 size: 814.000000 MiB heap id: 0 00:07:27.761 size: 2.000000 MiB heap id: 1 00:07:27.761 end heaps---------- 00:07:27.761 8 mempools totaling size 598.116089 MiB 00:07:27.761 size: 212.674988 MiB name: PDU_immediate_data_Pool 00:07:27.761 size: 158.602051 MiB name: PDU_data_out_Pool 00:07:27.761 size: 84.521057 MiB name: bdev_io_1315065 00:07:27.761 size: 51.011292 MiB name: evtpool_1315065 00:07:27.761 size: 
50.003479 MiB name: msgpool_1315065 00:07:27.761 size: 21.763794 MiB name: PDU_Pool 00:07:27.761 size: 19.513306 MiB name: SCSI_TASK_Pool 00:07:27.761 size: 0.026123 MiB name: Session_Pool 00:07:27.761 end mempools------- 00:07:27.761 201 memzones totaling size 4.176453 MiB 00:07:27.761 size: 1.000366 MiB name: RG_ring_0_1315065 00:07:27.761 size: 1.000366 MiB name: RG_ring_1_1315065 00:07:27.761 size: 1.000366 MiB name: RG_ring_4_1315065 00:07:27.761 size: 1.000366 MiB name: RG_ring_5_1315065 00:07:27.761 size: 0.125366 MiB name: RG_ring_2_1315065 00:07:27.761 size: 0.015991 MiB name: RG_ring_3_1315065 00:07:27.761 size: 0.001160 MiB name: QAT_SYM_CAPA_GEN_1 00:07:27.761 size: 0.000305 MiB name: 0000:3d:01.0_qat 00:07:27.761 size: 0.000305 MiB name: 0000:3d:01.1_qat 00:07:27.761 size: 0.000305 MiB name: 0000:3d:01.2_qat 00:07:27.761 size: 0.000305 MiB name: 0000:3d:01.3_qat 00:07:27.761 size: 0.000305 MiB name: 0000:3d:01.4_qat 00:07:27.761 size: 0.000305 MiB name: 0000:3d:01.5_qat 00:07:27.761 size: 0.000305 MiB name: 0000:3d:01.6_qat 00:07:27.761 size: 0.000305 MiB name: 0000:3d:01.7_qat 00:07:27.761 size: 0.000305 MiB name: 0000:3d:02.0_qat 00:07:27.761 size: 0.000305 MiB name: 0000:3d:02.1_qat 00:07:27.761 size: 0.000305 MiB name: 0000:3d:02.2_qat 00:07:27.761 size: 0.000305 MiB name: 0000:3d:02.3_qat 00:07:27.761 size: 0.000305 MiB name: 0000:3d:02.4_qat 00:07:27.761 size: 0.000305 MiB name: 0000:3d:02.5_qat 00:07:27.761 size: 0.000305 MiB name: 0000:3d:02.6_qat 00:07:27.761 size: 0.000305 MiB name: 0000:3d:02.7_qat 00:07:27.761 size: 0.000305 MiB name: 0000:3f:01.0_qat 00:07:27.761 size: 0.000305 MiB name: 0000:3f:01.1_qat 00:07:27.761 size: 0.000305 MiB name: 0000:3f:01.2_qat 00:07:27.761 size: 0.000305 MiB name: 0000:3f:01.3_qat 00:07:27.761 size: 0.000305 MiB name: 0000:3f:01.4_qat 00:07:27.761 size: 0.000305 MiB name: 0000:3f:01.5_qat 00:07:27.761 size: 0.000305 MiB name: 0000:3f:01.6_qat 00:07:27.761 size: 0.000305 MiB name: 0000:3f:01.7_qat 
00:07:27.761 size: 0.000305 MiB name: 0000:3f:02.0_qat 00:07:27.761 size: 0.000305 MiB name: 0000:3f:02.1_qat 00:07:27.761 size: 0.000305 MiB name: 0000:3f:02.2_qat 00:07:27.761 size: 0.000305 MiB name: 0000:3f:02.3_qat 00:07:27.761 size: 0.000305 MiB name: 0000:3f:02.4_qat 00:07:27.761 size: 0.000305 MiB name: 0000:3f:02.5_qat 00:07:27.761 size: 0.000305 MiB name: 0000:3f:02.6_qat 00:07:27.761 size: 0.000305 MiB name: 0000:3f:02.7_qat 00:07:27.761 size: 0.000305 MiB name: 0000:da:01.0_qat 00:07:27.761 size: 0.000305 MiB name: 0000:da:01.1_qat 00:07:27.761 size: 0.000305 MiB name: 0000:da:01.2_qat 00:07:27.761 size: 0.000305 MiB name: 0000:da:01.3_qat 00:07:27.761 size: 0.000305 MiB name: 0000:da:01.4_qat 00:07:27.761 size: 0.000305 MiB name: 0000:da:01.5_qat 00:07:27.761 size: 0.000305 MiB name: 0000:da:01.6_qat 00:07:27.761 size: 0.000305 MiB name: 0000:da:01.7_qat 00:07:27.761 size: 0.000305 MiB name: 0000:da:02.0_qat 00:07:27.761 size: 0.000305 MiB name: 0000:da:02.1_qat 00:07:27.761 size: 0.000305 MiB name: 0000:da:02.2_qat 00:07:27.761 size: 0.000305 MiB name: 0000:da:02.3_qat 00:07:27.761 size: 0.000305 MiB name: 0000:da:02.4_qat 00:07:27.761 size: 0.000305 MiB name: 0000:da:02.5_qat 00:07:27.761 size: 0.000305 MiB name: 0000:da:02.6_qat 00:07:27.761 size: 0.000305 MiB name: 0000:da:02.7_qat 00:07:27.761 size: 0.000183 MiB name: QAT_ASYM_CAPA_GEN_1 00:07:27.761 size: 0.000122 MiB name: rte_cryptodev_data_0 00:07:27.762 size: 0.000122 MiB name: rte_cryptodev_data_1 00:07:27.762 size: 0.000122 MiB name: rte_compressdev_data_0 00:07:27.762 size: 0.000122 MiB name: rte_cryptodev_data_2 00:07:27.762 size: 0.000122 MiB name: rte_cryptodev_data_3 00:07:27.762 size: 0.000122 MiB name: rte_compressdev_data_1 00:07:27.762 size: 0.000122 MiB name: rte_cryptodev_data_4 00:07:27.762 size: 0.000122 MiB name: rte_cryptodev_data_5 00:07:27.762 size: 0.000122 MiB name: rte_compressdev_data_2 00:07:27.762 size: 0.000122 MiB name: rte_cryptodev_data_6 00:07:27.762 size: 
0.000122 MiB name: rte_cryptodev_data_7 00:07:27.762 size: 0.000122 MiB name: rte_compressdev_data_3 00:07:27.762 size: 0.000122 MiB name: rte_cryptodev_data_8 00:07:27.762 size: 0.000122 MiB name: rte_cryptodev_data_9 00:07:27.762 size: 0.000122 MiB name: rte_compressdev_data_4 00:07:27.762 size: 0.000122 MiB name: rte_cryptodev_data_10 00:07:27.762 size: 0.000122 MiB name: rte_cryptodev_data_11 00:07:27.762 size: 0.000122 MiB name: rte_compressdev_data_5 00:07:27.762 size: 0.000122 MiB name: rte_cryptodev_data_12 00:07:27.762 size: 0.000122 MiB name: rte_cryptodev_data_13 00:07:27.762 size: 0.000122 MiB name: rte_compressdev_data_6 00:07:27.762 size: 0.000122 MiB name: rte_cryptodev_data_14 00:07:27.762 size: 0.000122 MiB name: rte_cryptodev_data_15 00:07:27.762 size: 0.000122 MiB name: rte_compressdev_data_7 00:07:27.762 size: 0.000122 MiB name: rte_cryptodev_data_16 00:07:27.762 size: 0.000122 MiB name: rte_cryptodev_data_17 00:07:27.762 size: 0.000122 MiB name: rte_compressdev_data_8 00:07:27.762 size: 0.000122 MiB name: rte_cryptodev_data_18 00:07:27.762 size: 0.000122 MiB name: rte_cryptodev_data_19 00:07:27.762 size: 0.000122 MiB name: rte_compressdev_data_9 00:07:27.762 size: 0.000122 MiB name: rte_cryptodev_data_20 00:07:27.762 size: 0.000122 MiB name: rte_cryptodev_data_21 00:07:27.762 size: 0.000122 MiB name: rte_compressdev_data_10 00:07:27.762 size: 0.000122 MiB name: rte_cryptodev_data_22 00:07:27.762 size: 0.000122 MiB name: rte_cryptodev_data_23 00:07:27.762 size: 0.000122 MiB name: rte_compressdev_data_11 00:07:27.762 size: 0.000122 MiB name: rte_cryptodev_data_24 00:07:27.762 size: 0.000122 MiB name: rte_cryptodev_data_25 00:07:27.762 size: 0.000122 MiB name: rte_compressdev_data_12 00:07:27.762 size: 0.000122 MiB name: rte_cryptodev_data_26 00:07:27.762 size: 0.000122 MiB name: rte_cryptodev_data_27 00:07:27.762 size: 0.000122 MiB name: rte_compressdev_data_13 00:07:27.762 size: 0.000122 MiB name: rte_cryptodev_data_28 00:07:27.762 size: 
0.000122 MiB name: rte_cryptodev_data_29 00:07:27.762 size: 0.000122 MiB name: rte_compressdev_data_14 00:07:27.762 size: 0.000122 MiB name: rte_cryptodev_data_30 00:07:27.762 size: 0.000122 MiB name: rte_cryptodev_data_31 00:07:27.762 size: 0.000122 MiB name: rte_compressdev_data_15 00:07:27.762 size: 0.000122 MiB name: rte_cryptodev_data_32 00:07:27.762 size: 0.000122 MiB name: rte_cryptodev_data_33 00:07:27.762 size: 0.000122 MiB name: rte_compressdev_data_16 00:07:27.762 size: 0.000122 MiB name: rte_cryptodev_data_34 00:07:27.762 size: 0.000122 MiB name: rte_cryptodev_data_35 00:07:27.762 size: 0.000122 MiB name: rte_compressdev_data_17 00:07:27.762 size: 0.000122 MiB name: rte_cryptodev_data_36 00:07:27.762 size: 0.000122 MiB name: rte_cryptodev_data_37 00:07:27.762 size: 0.000122 MiB name: rte_compressdev_data_18 00:07:27.762 size: 0.000122 MiB name: rte_cryptodev_data_38 00:07:27.762 size: 0.000122 MiB name: rte_cryptodev_data_39 00:07:27.762 size: 0.000122 MiB name: rte_compressdev_data_19 00:07:27.762 size: 0.000122 MiB name: rte_cryptodev_data_40 00:07:27.762 size: 0.000122 MiB name: rte_cryptodev_data_41 00:07:27.762 size: 0.000122 MiB name: rte_compressdev_data_20 00:07:27.762 size: 0.000122 MiB name: rte_cryptodev_data_42 00:07:27.762 size: 0.000122 MiB name: rte_cryptodev_data_43 00:07:27.762 size: 0.000122 MiB name: rte_compressdev_data_21 00:07:27.762 size: 0.000122 MiB name: rte_cryptodev_data_44 00:07:27.762 size: 0.000122 MiB name: rte_cryptodev_data_45 00:07:27.762 size: 0.000122 MiB name: rte_compressdev_data_22 00:07:27.762 size: 0.000122 MiB name: rte_cryptodev_data_46 00:07:27.762 size: 0.000122 MiB name: rte_cryptodev_data_47 00:07:27.762 size: 0.000122 MiB name: rte_compressdev_data_23 00:07:27.762 size: 0.000122 MiB name: rte_cryptodev_data_48 00:07:27.762 size: 0.000122 MiB name: rte_cryptodev_data_49 00:07:27.762 size: 0.000122 MiB name: rte_compressdev_data_24 00:07:27.762 size: 0.000122 MiB name: rte_cryptodev_data_50 00:07:27.762 
size: 0.000122 MiB name: rte_cryptodev_data_51 00:07:27.762 size: 0.000122 MiB name: rte_compressdev_data_25 00:07:27.762 size: 0.000122 MiB name: rte_cryptodev_data_52 00:07:27.762 size: 0.000122 MiB name: rte_cryptodev_data_53 00:07:27.762 size: 0.000122 MiB name: rte_compressdev_data_26 00:07:27.762 size: 0.000122 MiB name: rte_cryptodev_data_54 00:07:27.762 size: 0.000122 MiB name: rte_cryptodev_data_55 00:07:27.762 size: 0.000122 MiB name: rte_compressdev_data_27 00:07:27.762 size: 0.000122 MiB name: rte_cryptodev_data_56 00:07:27.762 size: 0.000122 MiB name: rte_cryptodev_data_57 00:07:27.762 size: 0.000122 MiB name: rte_compressdev_data_28 00:07:27.762 size: 0.000122 MiB name: rte_cryptodev_data_58 00:07:27.762 size: 0.000122 MiB name: rte_cryptodev_data_59 00:07:27.762 size: 0.000122 MiB name: rte_compressdev_data_29 00:07:27.762 size: 0.000122 MiB name: rte_cryptodev_data_60 00:07:27.762 size: 0.000122 MiB name: rte_cryptodev_data_61 00:07:27.762 size: 0.000122 MiB name: rte_compressdev_data_30 00:07:27.762 size: 0.000122 MiB name: rte_cryptodev_data_62 00:07:27.762 size: 0.000122 MiB name: rte_cryptodev_data_63 00:07:27.762 size: 0.000122 MiB name: rte_compressdev_data_31 00:07:27.762 size: 0.000122 MiB name: rte_cryptodev_data_64 00:07:27.762 size: 0.000122 MiB name: rte_cryptodev_data_65 00:07:27.762 size: 0.000122 MiB name: rte_compressdev_data_32 00:07:27.762 size: 0.000122 MiB name: rte_cryptodev_data_66 00:07:27.762 size: 0.000122 MiB name: rte_cryptodev_data_67 00:07:27.762 size: 0.000122 MiB name: rte_compressdev_data_33 00:07:27.762 size: 0.000122 MiB name: rte_cryptodev_data_68 00:07:27.762 size: 0.000122 MiB name: rte_cryptodev_data_69 00:07:27.762 size: 0.000122 MiB name: rte_compressdev_data_34 00:07:27.762 size: 0.000122 MiB name: rte_cryptodev_data_70 00:07:27.762 size: 0.000122 MiB name: rte_cryptodev_data_71 00:07:27.762 size: 0.000122 MiB name: rte_compressdev_data_35 00:07:27.762 size: 0.000122 MiB name: rte_cryptodev_data_72 
00:07:27.762 size: 0.000122 MiB name: rte_cryptodev_data_73 00:07:27.762 size: 0.000122 MiB name: rte_compressdev_data_36 00:07:27.762 size: 0.000122 MiB name: rte_cryptodev_data_74 00:07:27.762 size: 0.000122 MiB name: rte_cryptodev_data_75 00:07:27.762 size: 0.000122 MiB name: rte_compressdev_data_37 00:07:27.762 size: 0.000122 MiB name: rte_cryptodev_data_76 00:07:27.762 size: 0.000122 MiB name: rte_cryptodev_data_77 00:07:27.762 size: 0.000122 MiB name: rte_compressdev_data_38 00:07:27.762 size: 0.000122 MiB name: rte_cryptodev_data_78 00:07:27.762 size: 0.000122 MiB name: rte_cryptodev_data_79 00:07:27.763 size: 0.000122 MiB name: rte_compressdev_data_39 00:07:27.763 size: 0.000122 MiB name: rte_cryptodev_data_80 00:07:27.763 size: 0.000122 MiB name: rte_cryptodev_data_81 00:07:27.763 size: 0.000122 MiB name: rte_compressdev_data_40 00:07:27.763 size: 0.000122 MiB name: rte_cryptodev_data_82 00:07:27.763 size: 0.000122 MiB name: rte_cryptodev_data_83 00:07:27.763 size: 0.000122 MiB name: rte_compressdev_data_41 00:07:27.763 size: 0.000122 MiB name: rte_cryptodev_data_84 00:07:27.763 size: 0.000122 MiB name: rte_cryptodev_data_85 00:07:27.763 size: 0.000122 MiB name: rte_compressdev_data_42 00:07:27.763 size: 0.000122 MiB name: rte_cryptodev_data_86 00:07:27.763 size: 0.000122 MiB name: rte_cryptodev_data_87 00:07:27.763 size: 0.000122 MiB name: rte_compressdev_data_43 00:07:27.763 size: 0.000122 MiB name: rte_cryptodev_data_88 00:07:27.763 size: 0.000122 MiB name: rte_cryptodev_data_89 00:07:27.763 size: 0.000122 MiB name: rte_compressdev_data_44 00:07:27.763 size: 0.000122 MiB name: rte_cryptodev_data_90 00:07:27.763 size: 0.000122 MiB name: rte_cryptodev_data_91 00:07:27.763 size: 0.000122 MiB name: rte_compressdev_data_45 00:07:27.763 size: 0.000122 MiB name: rte_cryptodev_data_92 00:07:27.763 size: 0.000122 MiB name: rte_cryptodev_data_93 00:07:27.763 size: 0.000122 MiB name: rte_compressdev_data_46 00:07:27.763 size: 0.000122 MiB name: 
rte_cryptodev_data_94 00:07:27.763 size: 0.000122 MiB name: rte_cryptodev_data_95 00:07:27.763 size: 0.000122 MiB name: rte_compressdev_data_47 00:07:27.763 size: 0.000061 MiB name: QAT_COMP_CAPA_GEN_1 00:07:27.763 end memzones------- 00:07:27.763 20:21:20 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py -m 0 00:07:28.026 heap id: 0 total size: 814.000000 MiB number of busy elements: 523 number of free elements: 14 00:07:28.026 list of free elements. size: 11.814087 MiB 00:07:28.026 element at address: 0x200000400000 with size: 1.999512 MiB 00:07:28.026 element at address: 0x200018e00000 with size: 0.999878 MiB 00:07:28.026 element at address: 0x200019000000 with size: 0.999878 MiB 00:07:28.026 element at address: 0x200003e00000 with size: 0.996460 MiB 00:07:28.026 element at address: 0x200031c00000 with size: 0.994446 MiB 00:07:28.026 element at address: 0x200013800000 with size: 0.978882 MiB 00:07:28.026 element at address: 0x200007000000 with size: 0.959839 MiB 00:07:28.026 element at address: 0x200019200000 with size: 0.937256 MiB 00:07:28.026 element at address: 0x20001aa00000 with size: 0.582703 MiB 00:07:28.026 element at address: 0x200003a00000 with size: 0.498535 MiB 00:07:28.026 element at address: 0x20000b200000 with size: 0.491272 MiB 00:07:28.026 element at address: 0x200000800000 with size: 0.487061 MiB 00:07:28.026 element at address: 0x200019400000 with size: 0.485840 MiB 00:07:28.026 element at address: 0x200027e00000 with size: 0.402527 MiB 00:07:28.026 list of standard malloc elements. 
size: 199.877625 MiB 00:07:28.026 element at address: 0x20000b3fff80 with size: 132.000122 MiB 00:07:28.026 element at address: 0x2000071fff80 with size: 64.000122 MiB 00:07:28.026 element at address: 0x200018efff80 with size: 1.000122 MiB 00:07:28.026 element at address: 0x2000190fff80 with size: 1.000122 MiB 00:07:28.026 element at address: 0x2000192fff80 with size: 1.000122 MiB 00:07:28.026 element at address: 0x2000003d9f00 with size: 0.140747 MiB 00:07:28.026 element at address: 0x2000192eff00 with size: 0.062622 MiB 00:07:28.026 element at address: 0x2000003fdf80 with size: 0.007935 MiB 00:07:28.026 element at address: 0x200000330b40 with size: 0.004395 MiB 00:07:28.026 element at address: 0x2000003340c0 with size: 0.004395 MiB 00:07:28.026 element at address: 0x200000337640 with size: 0.004395 MiB 00:07:28.026 element at address: 0x20000033abc0 with size: 0.004395 MiB 00:07:28.026 element at address: 0x20000033e140 with size: 0.004395 MiB 00:07:28.026 element at address: 0x2000003416c0 with size: 0.004395 MiB 00:07:28.026 element at address: 0x200000344c40 with size: 0.004395 MiB 00:07:28.026 element at address: 0x2000003481c0 with size: 0.004395 MiB 00:07:28.026 element at address: 0x20000034b740 with size: 0.004395 MiB 00:07:28.026 element at address: 0x20000034ecc0 with size: 0.004395 MiB 00:07:28.026 element at address: 0x200000352240 with size: 0.004395 MiB 00:07:28.026 element at address: 0x2000003557c0 with size: 0.004395 MiB 00:07:28.026 element at address: 0x200000358d40 with size: 0.004395 MiB 00:07:28.026 element at address: 0x20000035c2c0 with size: 0.004395 MiB 00:07:28.026 element at address: 0x20000035f840 with size: 0.004395 MiB 00:07:28.026 element at address: 0x200000362dc0 with size: 0.004395 MiB 00:07:28.026 element at address: 0x200000366880 with size: 0.004395 MiB 00:07:28.026 element at address: 0x20000036a340 with size: 0.004395 MiB 00:07:28.026 element at address: 0x20000036de00 with size: 0.004395 MiB 00:07:28.026 element at 
address: 0x2000003718c0 with size: 0.004395 MiB 00:07:28.026 element at address: 0x200000375380 with size: 0.004395 MiB 00:07:28.026 element at address: 0x200000378e40 with size: 0.004395 MiB 00:07:28.026 element at address: 0x20000037c900 with size: 0.004395 MiB 00:07:28.026 element at address: 0x2000003803c0 with size: 0.004395 MiB 00:07:28.026 element at address: 0x200000383e80 with size: 0.004395 MiB 00:07:28.026 element at address: 0x200000387940 with size: 0.004395 MiB 00:07:28.026 element at address: 0x20000038b400 with size: 0.004395 MiB 00:07:28.026 element at address: 0x20000038eec0 with size: 0.004395 MiB 00:07:28.026 element at address: 0x200000392980 with size: 0.004395 MiB 00:07:28.026 element at address: 0x200000396440 with size: 0.004395 MiB 00:07:28.026 element at address: 0x200000399f00 with size: 0.004395 MiB 00:07:28.026 element at address: 0x20000039d9c0 with size: 0.004395 MiB 00:07:28.026 element at address: 0x2000003a1480 with size: 0.004395 MiB 00:07:28.026 element at address: 0x2000003a4f40 with size: 0.004395 MiB 00:07:28.026 element at address: 0x2000003a8a00 with size: 0.004395 MiB 00:07:28.026 element at address: 0x2000003ac4c0 with size: 0.004395 MiB 00:07:28.026 element at address: 0x2000003aff80 with size: 0.004395 MiB 00:07:28.026 element at address: 0x2000003b3a40 with size: 0.004395 MiB 00:07:28.026 element at address: 0x2000003b7500 with size: 0.004395 MiB 00:07:28.026 element at address: 0x2000003bafc0 with size: 0.004395 MiB 00:07:28.026 element at address: 0x2000003bea80 with size: 0.004395 MiB 00:07:28.026 element at address: 0x2000003c2540 with size: 0.004395 MiB 00:07:28.026 element at address: 0x2000003c6000 with size: 0.004395 MiB 00:07:28.026 element at address: 0x2000003c9ac0 with size: 0.004395 MiB 00:07:28.026 element at address: 0x2000003cd580 with size: 0.004395 MiB 00:07:28.026 element at address: 0x2000003d1040 with size: 0.004395 MiB 00:07:28.026 element at address: 0x2000003d4b00 with size: 0.004395 MiB 
00:07:28.026 element at address: 0x2000003d8d00 with size: 0.004395 MiB 00:07:28.026 element at address: 0x20000032ea40 with size: 0.004028 MiB 00:07:28.026 element at address: 0x20000032fac0 with size: 0.004028 MiB 00:07:28.026 element at address: 0x200000331fc0 with size: 0.004028 MiB 00:07:28.026 element at address: 0x200000333040 with size: 0.004028 MiB 00:07:28.026 element at address: 0x200000335540 with size: 0.004028 MiB 00:07:28.026 element at address: 0x2000003365c0 with size: 0.004028 MiB 00:07:28.026 element at address: 0x200000338ac0 with size: 0.004028 MiB 00:07:28.026 element at address: 0x200000339b40 with size: 0.004028 MiB 00:07:28.026 element at address: 0x20000033c040 with size: 0.004028 MiB 00:07:28.026 element at address: 0x20000033d0c0 with size: 0.004028 MiB 00:07:28.026 element at address: 0x20000033f5c0 with size: 0.004028 MiB 00:07:28.026 element at address: 0x200000340640 with size: 0.004028 MiB 00:07:28.026 element at address: 0x200000342b40 with size: 0.004028 MiB 00:07:28.026 element at address: 0x200000343bc0 with size: 0.004028 MiB 00:07:28.026 element at address: 0x2000003460c0 with size: 0.004028 MiB 00:07:28.026 element at address: 0x200000347140 with size: 0.004028 MiB 00:07:28.026 element at address: 0x200000349640 with size: 0.004028 MiB 00:07:28.026 element at address: 0x20000034a6c0 with size: 0.004028 MiB 00:07:28.026 element at address: 0x20000034cbc0 with size: 0.004028 MiB 00:07:28.026 element at address: 0x20000034dc40 with size: 0.004028 MiB 00:07:28.026 element at address: 0x200000350140 with size: 0.004028 MiB 00:07:28.026 element at address: 0x2000003511c0 with size: 0.004028 MiB 00:07:28.026 element at address: 0x2000003536c0 with size: 0.004028 MiB 00:07:28.026 element at address: 0x200000354740 with size: 0.004028 MiB 00:07:28.026 element at address: 0x200000356c40 with size: 0.004028 MiB 00:07:28.026 element at address: 0x200000357cc0 with size: 0.004028 MiB 00:07:28.026 element at address: 0x20000035a1c0 with 
size: 0.004028 MiB 00:07:28.026 element at address: 0x20000035b240 with size: 0.004028 MiB 00:07:28.026 element at address: 0x20000035d740 with size: 0.004028 MiB 00:07:28.026 element at address: 0x20000035e7c0 with size: 0.004028 MiB 00:07:28.026 element at address: 0x200000360cc0 with size: 0.004028 MiB 00:07:28.026 element at address: 0x200000361d40 with size: 0.004028 MiB 00:07:28.026 element at address: 0x200000364780 with size: 0.004028 MiB 00:07:28.026 element at address: 0x200000365800 with size: 0.004028 MiB 00:07:28.026 element at address: 0x200000368240 with size: 0.004028 MiB 00:07:28.026 element at address: 0x2000003692c0 with size: 0.004028 MiB 00:07:28.026 element at address: 0x20000036bd00 with size: 0.004028 MiB 00:07:28.026 element at address: 0x20000036cd80 with size: 0.004028 MiB 00:07:28.026 element at address: 0x20000036f7c0 with size: 0.004028 MiB 00:07:28.026 element at address: 0x200000370840 with size: 0.004028 MiB 00:07:28.026 element at address: 0x200000373280 with size: 0.004028 MiB 00:07:28.026 element at address: 0x200000374300 with size: 0.004028 MiB 00:07:28.026 element at address: 0x200000376d40 with size: 0.004028 MiB 00:07:28.026 element at address: 0x200000377dc0 with size: 0.004028 MiB 00:07:28.026 element at address: 0x20000037a800 with size: 0.004028 MiB 00:07:28.026 element at address: 0x20000037b880 with size: 0.004028 MiB 00:07:28.026 element at address: 0x20000037e2c0 with size: 0.004028 MiB 00:07:28.026 element at address: 0x20000037f340 with size: 0.004028 MiB 00:07:28.026 element at address: 0x200000381d80 with size: 0.004028 MiB 00:07:28.026 element at address: 0x200000382e00 with size: 0.004028 MiB 00:07:28.026 element at address: 0x200000385840 with size: 0.004028 MiB 00:07:28.026 element at address: 0x2000003868c0 with size: 0.004028 MiB 00:07:28.026 element at address: 0x200000389300 with size: 0.004028 MiB 00:07:28.026 element at address: 0x20000038a380 with size: 0.004028 MiB 00:07:28.026 element at address: 
0x20000038cdc0 with size: 0.004028 MiB 00:07:28.026 element at address: 0x20000038de40 with size: 0.004028 MiB 00:07:28.026 element at address: 0x200000390880 with size: 0.004028 MiB 00:07:28.026 element at address: 0x200000391900 with size: 0.004028 MiB 00:07:28.026 element at address: 0x200000394340 with size: 0.004028 MiB 00:07:28.026 element at address: 0x2000003953c0 with size: 0.004028 MiB 00:07:28.026 element at address: 0x200000397e00 with size: 0.004028 MiB 00:07:28.026 element at address: 0x200000398e80 with size: 0.004028 MiB 00:07:28.026 element at address: 0x20000039b8c0 with size: 0.004028 MiB 00:07:28.026 element at address: 0x20000039c940 with size: 0.004028 MiB 00:07:28.026 element at address: 0x20000039f380 with size: 0.004028 MiB 00:07:28.026 element at address: 0x2000003a0400 with size: 0.004028 MiB 00:07:28.026 element at address: 0x2000003a2e40 with size: 0.004028 MiB 00:07:28.026 element at address: 0x2000003a3ec0 with size: 0.004028 MiB 00:07:28.026 element at address: 0x2000003a6900 with size: 0.004028 MiB 00:07:28.026 element at address: 0x2000003a7980 with size: 0.004028 MiB 00:07:28.026 element at address: 0x2000003aa3c0 with size: 0.004028 MiB 00:07:28.026 element at address: 0x2000003ab440 with size: 0.004028 MiB 00:07:28.026 element at address: 0x2000003ade80 with size: 0.004028 MiB 00:07:28.026 element at address: 0x2000003aef00 with size: 0.004028 MiB 00:07:28.026 element at address: 0x2000003b1940 with size: 0.004028 MiB 00:07:28.026 element at address: 0x2000003b29c0 with size: 0.004028 MiB 00:07:28.026 element at address: 0x2000003b5400 with size: 0.004028 MiB 00:07:28.026 element at address: 0x2000003b6480 with size: 0.004028 MiB 00:07:28.026 element at address: 0x2000003b8ec0 with size: 0.004028 MiB 00:07:28.026 element at address: 0x2000003b9f40 with size: 0.004028 MiB 00:07:28.026 element at address: 0x2000003bc980 with size: 0.004028 MiB 00:07:28.026 element at address: 0x2000003bda00 with size: 0.004028 MiB 00:07:28.026 
element at address: 0x2000003c0440 with size: 0.004028 MiB 00:07:28.026 element at address: 0x2000003c14c0 with size: 0.004028 MiB 00:07:28.026 element at address: 0x2000003c3f00 with size: 0.004028 MiB 00:07:28.026 element at address: 0x2000003c4f80 with size: 0.004028 MiB 00:07:28.026 element at address: 0x2000003c79c0 with size: 0.004028 MiB 00:07:28.026 element at address: 0x2000003c8a40 with size: 0.004028 MiB 00:07:28.026 element at address: 0x2000003cb480 with size: 0.004028 MiB 00:07:28.026 element at address: 0x2000003cc500 with size: 0.004028 MiB 00:07:28.026 element at address: 0x2000003cef40 with size: 0.004028 MiB 00:07:28.026 element at address: 0x2000003cffc0 with size: 0.004028 MiB 00:07:28.026 element at address: 0x2000003d2a00 with size: 0.004028 MiB 00:07:28.026 element at address: 0x2000003d3a80 with size: 0.004028 MiB 00:07:28.027 element at address: 0x2000003d6c00 with size: 0.004028 MiB 00:07:28.027 element at address: 0x2000003d7c80 with size: 0.004028 MiB 00:07:28.027 element at address: 0x200000204e00 with size: 0.000305 MiB 00:07:28.027 element at address: 0x200000200000 with size: 0.000183 MiB 00:07:28.027 element at address: 0x2000002000c0 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000200180 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000200240 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000200300 with size: 0.000183 MiB 00:07:28.027 element at address: 0x2000002003c0 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000200480 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000200540 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000200600 with size: 0.000183 MiB 00:07:28.027 element at address: 0x2000002006c0 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000200780 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000200840 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000200900 with size: 0.000183 
MiB 00:07:28.027 element at address: 0x2000002009c0 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000200a80 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000200b40 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000200c00 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000200cc0 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000200d80 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000200e40 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000200f00 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000200fc0 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000201080 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000201140 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000201200 with size: 0.000183 MiB 00:07:28.027 element at address: 0x2000002012c0 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000201380 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000201440 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000201500 with size: 0.000183 MiB 00:07:28.027 element at address: 0x2000002015c0 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000201680 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000201740 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000201800 with size: 0.000183 MiB 00:07:28.027 element at address: 0x2000002018c0 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000201980 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000201a40 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000201b00 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000201bc0 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000201c80 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000201d40 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000201e00 
with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000201ec0 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000201f80 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000202040 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000202100 with size: 0.000183 MiB 00:07:28.027 element at address: 0x2000002021c0 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000202280 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000202340 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000202400 with size: 0.000183 MiB 00:07:28.027 element at address: 0x2000002024c0 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000202580 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000202640 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000202700 with size: 0.000183 MiB 00:07:28.027 element at address: 0x2000002027c0 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000202880 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000202940 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000202a00 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000202ac0 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000202b80 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000202c40 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000202d00 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000202dc0 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000202e80 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000202f40 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000203000 with size: 0.000183 MiB 00:07:28.027 element at address: 0x2000002030c0 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000203180 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000203240 with size: 0.000183 MiB 00:07:28.027 element at 
address: 0x200000203300 with size: 0.000183 MiB 00:07:28.027 element at address: 0x2000002033c0 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000203480 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000203540 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000203600 with size: 0.000183 MiB 00:07:28.027 element at address: 0x2000002036c0 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000203780 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000203840 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000203900 with size: 0.000183 MiB 00:07:28.027 element at address: 0x2000002039c0 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000203a80 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000203b40 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000203c00 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000203cc0 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000203d80 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000203e40 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000203f00 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000203fc0 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000204080 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000204140 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000204200 with size: 0.000183 MiB 00:07:28.027 element at address: 0x2000002042c0 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000204380 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000204440 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000204500 with size: 0.000183 MiB 00:07:28.027 element at address: 0x2000002045c0 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000204680 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000204740 with size: 0.000183 MiB 
00:07:28.027 element at address: 0x200000204800 with size: 0.000183 MiB 00:07:28.027 element at address: 0x2000002048c0 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000204980 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000204a40 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000204b00 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000204bc0 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000204c80 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000204d40 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000204f40 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000205000 with size: 0.000183 MiB 00:07:28.027 element at address: 0x2000002050c0 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000205180 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000205240 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000205300 with size: 0.000183 MiB 00:07:28.027 element at address: 0x2000002053c0 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000205480 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000205540 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000205600 with size: 0.000183 MiB 00:07:28.027 element at address: 0x2000002056c0 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000205780 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000205840 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000205900 with size: 0.000183 MiB 00:07:28.027 element at address: 0x2000002059c0 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000205a80 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000205b40 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000205c00 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000205cc0 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000205d80 with 
size: 0.000183 MiB 00:07:28.027 element at address: 0x200000205e40 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000205f00 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000205fc0 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000206080 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000206140 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000206200 with size: 0.000183 MiB 00:07:28.027 element at address: 0x2000002062c0 with size: 0.000183 MiB 00:07:28.027 element at address: 0x2000002064c0 with size: 0.000183 MiB 00:07:28.027 element at address: 0x20000020a780 with size: 0.000183 MiB 00:07:28.027 element at address: 0x20000022aa40 with size: 0.000183 MiB 00:07:28.027 element at address: 0x20000022ab00 with size: 0.000183 MiB 00:07:28.027 element at address: 0x20000022abc0 with size: 0.000183 MiB 00:07:28.027 element at address: 0x20000022ac80 with size: 0.000183 MiB 00:07:28.027 element at address: 0x20000022ad40 with size: 0.000183 MiB 00:07:28.027 element at address: 0x20000022ae00 with size: 0.000183 MiB 00:07:28.027 element at address: 0x20000022aec0 with size: 0.000183 MiB 00:07:28.027 element at address: 0x20000022af80 with size: 0.000183 MiB 00:07:28.027 element at address: 0x20000022b040 with size: 0.000183 MiB 00:07:28.027 element at address: 0x20000022b100 with size: 0.000183 MiB 00:07:28.027 element at address: 0x20000022b1c0 with size: 0.000183 MiB 00:07:28.027 element at address: 0x20000022b280 with size: 0.000183 MiB 00:07:28.027 element at address: 0x20000022b340 with size: 0.000183 MiB 00:07:28.027 element at address: 0x20000022b400 with size: 0.000183 MiB 00:07:28.027 element at address: 0x20000022b4c0 with size: 0.000183 MiB 00:07:28.027 element at address: 0x20000022b580 with size: 0.000183 MiB 00:07:28.027 element at address: 0x20000022b640 with size: 0.000183 MiB 00:07:28.027 element at address: 0x20000022b700 with size: 0.000183 MiB 00:07:28.027 element at address: 
0x20000022b7c0 with size: 0.000183 MiB 00:07:28.027 element at address: 0x20000022b9c0 with size: 0.000183 MiB 00:07:28.027 element at address: 0x20000022ba80 with size: 0.000183 MiB 00:07:28.027 element at address: 0x20000022bb40 with size: 0.000183 MiB 00:07:28.027 element at address: 0x20000022bc00 with size: 0.000183 MiB 00:07:28.027 element at address: 0x20000022bcc0 with size: 0.000183 MiB 00:07:28.027 element at address: 0x20000022bd80 with size: 0.000183 MiB 00:07:28.027 element at address: 0x20000022be40 with size: 0.000183 MiB 00:07:28.027 element at address: 0x20000022bf00 with size: 0.000183 MiB 00:07:28.027 element at address: 0x20000022bfc0 with size: 0.000183 MiB 00:07:28.027 element at address: 0x20000022c080 with size: 0.000183 MiB 00:07:28.027 element at address: 0x20000022c140 with size: 0.000183 MiB 00:07:28.027 element at address: 0x20000022c200 with size: 0.000183 MiB 00:07:28.027 element at address: 0x20000022c2c0 with size: 0.000183 MiB 00:07:28.027 element at address: 0x20000022c380 with size: 0.000183 MiB 00:07:28.027 element at address: 0x20000022c440 with size: 0.000183 MiB 00:07:28.027 element at address: 0x20000022c500 with size: 0.000183 MiB 00:07:28.027 element at address: 0x20000032e700 with size: 0.000183 MiB 00:07:28.027 element at address: 0x20000032e7c0 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000331d40 with size: 0.000183 MiB 00:07:28.027 element at address: 0x2000003352c0 with size: 0.000183 MiB 00:07:28.027 element at address: 0x200000338840 with size: 0.000183 MiB 00:07:28.028 element at address: 0x20000033bdc0 with size: 0.000183 MiB 00:07:28.028 element at address: 0x20000033f340 with size: 0.000183 MiB 00:07:28.028 element at address: 0x2000003428c0 with size: 0.000183 MiB 00:07:28.028 element at address: 0x200000345e40 with size: 0.000183 MiB 00:07:28.028 element at address: 0x2000003493c0 with size: 0.000183 MiB 00:07:28.028 element at address: 0x20000034c940 with size: 0.000183 MiB 00:07:28.028 
element at address: 0x20000034fec0 with size: 0.000183 MiB 00:07:28.028 element at address: 0x200000353440 with size: 0.000183 MiB 00:07:28.028 element at address: 0x2000003569c0 with size: 0.000183 MiB 00:07:28.028 element at address: 0x200000359f40 with size: 0.000183 MiB 00:07:28.028 element at address: 0x20000035d4c0 with size: 0.000183 MiB 00:07:28.028 element at address: 0x200000360a40 with size: 0.000183 MiB 00:07:28.028 element at address: 0x200000363fc0 with size: 0.000183 MiB 00:07:28.028 element at address: 0x200000364180 with size: 0.000183 MiB 00:07:28.028 element at address: 0x200000364240 with size: 0.000183 MiB 00:07:28.028 element at address: 0x200000364400 with size: 0.000183 MiB 00:07:28.028 element at address: 0x200000367a80 with size: 0.000183 MiB 00:07:28.028 element at address: 0x200000367c40 with size: 0.000183 MiB 00:07:28.028 element at address: 0x200000367d00 with size: 0.000183 MiB 00:07:28.028 element at address: 0x200000367ec0 with size: 0.000183 MiB 00:07:28.028 element at address: 0x20000036b540 with size: 0.000183 MiB 00:07:28.028 element at address: 0x20000036b700 with size: 0.000183 MiB 00:07:28.028 element at address: 0x20000036b7c0 with size: 0.000183 MiB 00:07:28.028 element at address: 0x20000036b980 with size: 0.000183 MiB 00:07:28.028 element at address: 0x20000036f000 with size: 0.000183 MiB 00:07:28.028 element at address: 0x20000036f1c0 with size: 0.000183 MiB 00:07:28.028 element at address: 0x20000036f280 with size: 0.000183 MiB 00:07:28.028 element at address: 0x20000036f440 with size: 0.000183 MiB 00:07:28.028 element at address: 0x200000372ac0 with size: 0.000183 MiB 00:07:28.028 element at address: 0x200000372c80 with size: 0.000183 MiB 00:07:28.028 element at address: 0x200000372d40 with size: 0.000183 MiB 00:07:28.028 element at address: 0x200000372f00 with size: 0.000183 MiB 00:07:28.028 element at address: 0x200000376580 with size: 0.000183 MiB 00:07:28.028 element at address: 0x200000376740 with size: 0.000183 
MiB 00:07:28.028 element at address: 0x200000376800 with size: 0.000183 MiB 00:07:28.028 element at address: 0x2000003769c0 with size: 0.000183 MiB 00:07:28.028 element at address: 0x20000037a040 with size: 0.000183 MiB 00:07:28.028 element at address: 0x20000037a200 with size: 0.000183 MiB 00:07:28.028 element at address: 0x20000037a2c0 with size: 0.000183 MiB 00:07:28.028 element at address: 0x20000037a480 with size: 0.000183 MiB 00:07:28.028 element at address: 0x20000037db00 with size: 0.000183 MiB 00:07:28.028 element at address: 0x20000037dcc0 with size: 0.000183 MiB 00:07:28.028 element at address: 0x20000037dd80 with size: 0.000183 MiB 00:07:28.028 element at address: 0x20000037df40 with size: 0.000183 MiB 00:07:28.028 element at address: 0x2000003815c0 with size: 0.000183 MiB 00:07:28.028 element at address: 0x200000381780 with size: 0.000183 MiB 00:07:28.028 element at address: 0x200000381840 with size: 0.000183 MiB 00:07:28.028 element at address: 0x200000381a00 with size: 0.000183 MiB 00:07:28.028 element at address: 0x200000385080 with size: 0.000183 MiB 00:07:28.028 element at address: 0x200000385240 with size: 0.000183 MiB 00:07:28.028 element at address: 0x200000385300 with size: 0.000183 MiB 00:07:28.028 element at address: 0x2000003854c0 with size: 0.000183 MiB 00:07:28.028 element at address: 0x200000388b40 with size: 0.000183 MiB 00:07:28.028 element at address: 0x200000388d00 with size: 0.000183 MiB 00:07:28.028 element at address: 0x200000388dc0 with size: 0.000183 MiB 00:07:28.028 element at address: 0x200000388f80 with size: 0.000183 MiB 00:07:28.028 element at address: 0x20000038c600 with size: 0.000183 MiB 00:07:28.028 element at address: 0x20000038c7c0 with size: 0.000183 MiB 00:07:28.028 element at address: 0x20000038c880 with size: 0.000183 MiB 00:07:28.028 element at address: 0x20000038ca40 with size: 0.000183 MiB 00:07:28.028 element at address: 0x2000003900c0 with size: 0.000183 MiB 00:07:28.028 element at address: 0x200000390280 
with size: 0.000183 MiB 00:07:28.028 element at address: 0x200000390340 with size: 0.000183 MiB 00:07:28.028 element at address: 0x200000390500 with size: 0.000183 MiB 00:07:28.028 element at address: 0x200000393b80 with size: 0.000183 MiB 00:07:28.028 element at address: 0x200000393d40 with size: 0.000183 MiB 00:07:28.028 element at address: 0x200000393e00 with size: 0.000183 MiB 00:07:28.028 element at address: 0x200000393fc0 with size: 0.000183 MiB 00:07:28.028 element at address: 0x200000397640 with size: 0.000183 MiB 00:07:28.028 element at address: 0x200000397800 with size: 0.000183 MiB 00:07:28.028 element at address: 0x2000003978c0 with size: 0.000183 MiB 00:07:28.028 element at address: 0x200000397a80 with size: 0.000183 MiB 00:07:28.028 element at address: 0x20000039b100 with size: 0.000183 MiB 00:07:28.028 element at address: 0x20000039b2c0 with size: 0.000183 MiB 00:07:28.028 element at address: 0x20000039b380 with size: 0.000183 MiB 00:07:28.028 element at address: 0x20000039b540 with size: 0.000183 MiB 00:07:28.028 element at address: 0x20000039ebc0 with size: 0.000183 MiB 00:07:28.028 element at address: 0x20000039ed80 with size: 0.000183 MiB 00:07:28.028 element at address: 0x20000039ee40 with size: 0.000183 MiB 00:07:28.028 element at address: 0x20000039f000 with size: 0.000183 MiB 00:07:28.028 element at address: 0x2000003a2680 with size: 0.000183 MiB 00:07:28.028 element at address: 0x2000003a2840 with size: 0.000183 MiB 00:07:28.028 element at address: 0x2000003a2900 with size: 0.000183 MiB 00:07:28.028 element at address: 0x2000003a2ac0 with size: 0.000183 MiB 00:07:28.028 element at address: 0x2000003a6140 with size: 0.000183 MiB 00:07:28.028 element at address: 0x2000003a6300 with size: 0.000183 MiB 00:07:28.028 element at address: 0x2000003a63c0 with size: 0.000183 MiB 00:07:28.028 element at address: 0x2000003a6580 with size: 0.000183 MiB 00:07:28.028 element at address: 0x2000003a9c00 with size: 0.000183 MiB 00:07:28.028 element at 
address: 0x2000003a9dc0 with size: 0.000183 MiB 00:07:28.028 element at address: 0x2000003a9e80 with size: 0.000183 MiB 00:07:28.028 element at address: 0x2000003aa040 with size: 0.000183 MiB 00:07:28.028 element at address: 0x2000003ad6c0 with size: 0.000183 MiB 00:07:28.028 element at address: 0x2000003ad880 with size: 0.000183 MiB 00:07:28.028 element at address: 0x2000003ad940 with size: 0.000183 MiB 00:07:28.028 element at address: 0x2000003adb00 with size: 0.000183 MiB 00:07:28.028 element at address: 0x2000003b1180 with size: 0.000183 MiB 00:07:28.028 element at address: 0x2000003b1340 with size: 0.000183 MiB 00:07:28.028 element at address: 0x2000003b1400 with size: 0.000183 MiB 00:07:28.028 element at address: 0x2000003b15c0 with size: 0.000183 MiB 00:07:28.028 element at address: 0x2000003b4c40 with size: 0.000183 MiB 00:07:28.028 element at address: 0x2000003b4e00 with size: 0.000183 MiB 00:07:28.028 element at address: 0x2000003b4ec0 with size: 0.000183 MiB 00:07:28.028 element at address: 0x2000003b5080 with size: 0.000183 MiB 00:07:28.028 element at address: 0x2000003b8700 with size: 0.000183 MiB 00:07:28.028 element at address: 0x2000003b88c0 with size: 0.000183 MiB 00:07:28.028 element at address: 0x2000003b8980 with size: 0.000183 MiB 00:07:28.028 element at address: 0x2000003b8b40 with size: 0.000183 MiB 00:07:28.028 element at address: 0x2000003bc1c0 with size: 0.000183 MiB 00:07:28.028 element at address: 0x2000003bc380 with size: 0.000183 MiB 00:07:28.028 element at address: 0x2000003bc440 with size: 0.000183 MiB 00:07:28.028 element at address: 0x2000003bc600 with size: 0.000183 MiB 00:07:28.028 element at address: 0x2000003bfc80 with size: 0.000183 MiB 00:07:28.028 element at address: 0x2000003bfe40 with size: 0.000183 MiB 00:07:28.028 element at address: 0x2000003bff00 with size: 0.000183 MiB 00:07:28.028 element at address: 0x2000003c00c0 with size: 0.000183 MiB 00:07:28.028 element at address: 0x2000003c3740 with size: 0.000183 MiB 
00:07:28.028 element at address: 0x2000003c3900 with size: 0.000183 MiB 00:07:28.028 element at address: 0x2000003c39c0 with size: 0.000183 MiB 00:07:28.028 element at address: 0x2000003c3b80 with size: 0.000183 MiB 00:07:28.028 element at address: 0x2000003c7200 with size: 0.000183 MiB 00:07:28.028 element at address: 0x2000003c73c0 with size: 0.000183 MiB 00:07:28.028 element at address: 0x2000003c7480 with size: 0.000183 MiB 00:07:28.028 element at address: 0x2000003c7640 with size: 0.000183 MiB 00:07:28.028 element at address: 0x2000003cacc0 with size: 0.000183 MiB 00:07:28.028 element at address: 0x2000003cae80 with size: 0.000183 MiB 00:07:28.028 element at address: 0x2000003caf40 with size: 0.000183 MiB 00:07:28.028 element at address: 0x2000003cb100 with size: 0.000183 MiB 00:07:28.028 element at address: 0x2000003ce780 with size: 0.000183 MiB 00:07:28.028 element at address: 0x2000003ce940 with size: 0.000183 MiB 00:07:28.028 element at address: 0x2000003cea00 with size: 0.000183 MiB 00:07:28.028 element at address: 0x2000003cebc0 with size: 0.000183 MiB 00:07:28.028 element at address: 0x2000003d2240 with size: 0.000183 MiB 00:07:28.028 element at address: 0x2000003d2400 with size: 0.000183 MiB 00:07:28.028 element at address: 0x2000003d24c0 with size: 0.000183 MiB 00:07:28.028 element at address: 0x2000003d2680 with size: 0.000183 MiB 00:07:28.028 element at address: 0x2000003d5dc0 with size: 0.000183 MiB 00:07:28.028 element at address: 0x2000003d64c0 with size: 0.000183 MiB 00:07:28.028 element at address: 0x2000003d6580 with size: 0.000183 MiB 00:07:28.028 element at address: 0x2000003d6880 with size: 0.000183 MiB 00:07:28.028 element at address: 0x20000087cb00 with size: 0.000183 MiB 00:07:28.028 element at address: 0x20000087cbc0 with size: 0.000183 MiB 00:07:28.028 element at address: 0x20000087cc80 with size: 0.000183 MiB 00:07:28.028 element at address: 0x20000087cd40 with size: 0.000183 MiB 00:07:28.028 element at address: 0x20000087ce00 with 
size: 0.000183 MiB 00:07:28.028 element at address: 0x20000087cec0 with size: 0.000183 MiB 00:07:28.028 element at address: 0x2000008fd180 with size: 0.000183 MiB 00:07:28.028 element at address: 0x2000070fdd80 with size: 0.000183 MiB 00:07:28.028 element at address: 0x20001aa952c0 with size: 0.000183 MiB 00:07:28.028 element at address: 0x20001aa95380 with size: 0.000183 MiB 00:07:28.028 element at address: 0x20001aa95440 with size: 0.000183 MiB 00:07:28.028 element at address: 0x200027e670c0 with size: 0.000183 MiB 00:07:28.028 element at address: 0x200027e67180 with size: 0.000183 MiB 00:07:28.028 element at address: 0x200027e6dd80 with size: 0.000183 MiB 00:07:28.028 element at address: 0x200027e6df80 with size: 0.000183 MiB 00:07:28.028 element at address: 0x200027e6e040 with size: 0.000183 MiB 00:07:28.028 element at address: 0x200027e6e100 with size: 0.000183 MiB 00:07:28.028 element at address: 0x200027e6e1c0 with size: 0.000183 MiB 00:07:28.028 element at address: 0x200027e6e280 with size: 0.000183 MiB 00:07:28.028 element at address: 0x200027e6e340 with size: 0.000183 MiB 00:07:28.028 element at address: 0x200027e6e400 with size: 0.000183 MiB 00:07:28.028 element at address: 0x200027e6e4c0 with size: 0.000183 MiB 00:07:28.028 element at address: 0x200027e6e580 with size: 0.000183 MiB 00:07:28.028 element at address: 0x200027e6e640 with size: 0.000183 MiB 00:07:28.028 element at address: 0x200027e6e700 with size: 0.000183 MiB 00:07:28.028 element at address: 0x200027e6e7c0 with size: 0.000183 MiB 00:07:28.028 element at address: 0x200027e6e880 with size: 0.000183 MiB 00:07:28.028 element at address: 0x200027e6e940 with size: 0.000183 MiB 00:07:28.028 element at address: 0x200027e6ea00 with size: 0.000183 MiB 00:07:28.028 element at address: 0x200027e6eac0 with size: 0.000183 MiB 00:07:28.028 element at address: 0x200027e6eb80 with size: 0.000183 MiB 00:07:28.028 element at address: 0x200027e6ec40 with size: 0.000183 MiB 00:07:28.028 element at address: 
0x200027e6ed00 with size: 0.000183 MiB 00:07:28.028 element at address: 0x200027e6edc0 with size: 0.000183 MiB 00:07:28.028 element at address: 0x200027e6ee80 with size: 0.000183 MiB 00:07:28.028 element at address: 0x200027e6ef40 with size: 0.000183 MiB 00:07:28.028 element at address: 0x200027e6f000 with size: 0.000183 MiB 00:07:28.029 element at address: 0x200027e6f0c0 with size: 0.000183 MiB 00:07:28.029 element at address: 0x200027e6f180 with size: 0.000183 MiB 00:07:28.029 element at address: 0x200027e6f240 with size: 0.000183 MiB 00:07:28.029 element at address: 0x200027e6f300 with size: 0.000183 MiB 00:07:28.029 element at address: 0x200027e6f3c0 with size: 0.000183 MiB 00:07:28.029 element at address: 0x200027e6f480 with size: 0.000183 MiB 00:07:28.029 element at address: 0x200027e6f540 with size: 0.000183 MiB 00:07:28.029 element at address: 0x200027e6f600 with size: 0.000183 MiB 00:07:28.029 element at address: 0x200027e6f6c0 with size: 0.000183 MiB 00:07:28.029 element at address: 0x200027e6f780 with size: 0.000183 MiB 00:07:28.029 element at address: 0x200027e6f840 with size: 0.000183 MiB 00:07:28.029 element at address: 0x200027e6f900 with size: 0.000183 MiB 00:07:28.029 element at address: 0x200027e6f9c0 with size: 0.000183 MiB 00:07:28.029 element at address: 0x200027e6fa80 with size: 0.000183 MiB 00:07:28.029 element at address: 0x200027e6fb40 with size: 0.000183 MiB 00:07:28.029 element at address: 0x200027e6fc00 with size: 0.000183 MiB 00:07:28.029 element at address: 0x200027e6fcc0 with size: 0.000183 MiB 00:07:28.029 element at address: 0x200027e6fd80 with size: 0.000183 MiB 00:07:28.029 element at address: 0x200027e6fe40 with size: 0.000183 MiB 00:07:28.029 element at address: 0x200027e6ff00 with size: 0.000183 MiB 00:07:28.029 list of memzone associated elements. 
size: 602.308289 MiB 00:07:28.029 element at address: 0x20001aa95500 with size: 211.416748 MiB 00:07:28.029 associated memzone info: size: 211.416626 MiB name: MP_PDU_immediate_data_Pool_0 00:07:28.029 element at address: 0x200027e6ffc0 with size: 157.562561 MiB 00:07:28.029 associated memzone info: size: 157.562439 MiB name: MP_PDU_data_out_Pool_0 00:07:28.029 element at address: 0x2000139fab80 with size: 84.020630 MiB 00:07:28.029 associated memzone info: size: 84.020508 MiB name: MP_bdev_io_1315065_0 00:07:28.029 element at address: 0x2000009ff380 with size: 48.003052 MiB 00:07:28.029 associated memzone info: size: 48.002930 MiB name: MP_evtpool_1315065_0 00:07:28.029 element at address: 0x200003fff380 with size: 48.003052 MiB 00:07:28.029 associated memzone info: size: 48.002930 MiB name: MP_msgpool_1315065_0 00:07:28.029 element at address: 0x2000195be940 with size: 20.255554 MiB 00:07:28.029 associated memzone info: size: 20.255432 MiB name: MP_PDU_Pool_0 00:07:28.029 element at address: 0x200031dfeb40 with size: 18.005066 MiB 00:07:28.029 associated memzone info: size: 18.004944 MiB name: MP_SCSI_TASK_Pool_0 00:07:28.029 element at address: 0x2000005ffe00 with size: 2.000488 MiB 00:07:28.029 associated memzone info: size: 2.000366 MiB name: RG_MP_evtpool_1315065 00:07:28.029 element at address: 0x200003bffe00 with size: 2.000488 MiB 00:07:28.029 associated memzone info: size: 2.000366 MiB name: RG_MP_msgpool_1315065 00:07:28.029 element at address: 0x20000022c5c0 with size: 1.008118 MiB 00:07:28.029 associated memzone info: size: 1.007996 MiB name: MP_evtpool_1315065 00:07:28.029 element at address: 0x20000b2fde40 with size: 1.008118 MiB 00:07:28.029 associated memzone info: size: 1.007996 MiB name: MP_PDU_Pool 00:07:28.029 element at address: 0x2000194bc800 with size: 1.008118 MiB 00:07:28.029 associated memzone info: size: 1.007996 MiB name: MP_PDU_immediate_data_Pool 00:07:28.029 element at address: 0x2000070fde40 with size: 1.008118 MiB 00:07:28.029 
associated memzone info: size: 1.007996 MiB name: MP_PDU_data_out_Pool 00:07:28.029 element at address: 0x2000008fd240 with size: 1.008118 MiB 00:07:28.029 associated memzone info: size: 1.007996 MiB name: MP_SCSI_TASK_Pool 00:07:28.029 element at address: 0x200003eff180 with size: 1.000488 MiB 00:07:28.029 associated memzone info: size: 1.000366 MiB name: RG_ring_0_1315065 00:07:28.029 element at address: 0x200003affc00 with size: 1.000488 MiB 00:07:28.029 associated memzone info: size: 1.000366 MiB name: RG_ring_1_1315065 00:07:28.029 element at address: 0x2000138fa980 with size: 1.000488 MiB 00:07:28.029 associated memzone info: size: 1.000366 MiB name: RG_ring_4_1315065 00:07:28.029 element at address: 0x200031cfe940 with size: 1.000488 MiB 00:07:28.029 associated memzone info: size: 1.000366 MiB name: RG_ring_5_1315065 00:07:28.029 element at address: 0x200003a7fa00 with size: 0.500488 MiB 00:07:28.029 associated memzone info: size: 0.500366 MiB name: RG_MP_bdev_io_1315065 00:07:28.029 element at address: 0x20000b27dc40 with size: 0.500488 MiB 00:07:28.029 associated memzone info: size: 0.500366 MiB name: RG_MP_PDU_Pool 00:07:28.029 element at address: 0x20000087cf80 with size: 0.500488 MiB 00:07:28.029 associated memzone info: size: 0.500366 MiB name: RG_MP_SCSI_TASK_Pool 00:07:28.029 element at address: 0x20001947c600 with size: 0.250488 MiB 00:07:28.029 associated memzone info: size: 0.250366 MiB name: RG_MP_PDU_immediate_data_Pool 00:07:28.029 element at address: 0x20000020a840 with size: 0.125488 MiB 00:07:28.029 associated memzone info: size: 0.125366 MiB name: RG_ring_2_1315065 00:07:28.029 element at address: 0x2000070f5b80 with size: 0.031738 MiB 00:07:28.029 associated memzone info: size: 0.031616 MiB name: RG_MP_PDU_data_out_Pool 00:07:28.029 element at address: 0x200027e67240 with size: 0.023743 MiB 00:07:28.029 associated memzone info: size: 0.023621 MiB name: MP_Session_Pool_0 00:07:28.029 element at address: 0x200000206580 with size: 0.016113 
MiB 00:07:28.029 associated memzone info: size: 0.015991 MiB name: RG_ring_3_1315065 00:07:28.029 element at address: 0x200027e6d380 with size: 0.002441 MiB 00:07:28.029 associated memzone info: size: 0.002319 MiB name: RG_MP_Session_Pool 00:07:28.029 element at address: 0x2000003d5f80 with size: 0.001282 MiB 00:07:28.029 associated memzone info: size: 0.001160 MiB name: QAT_SYM_CAPA_GEN_1 00:07:28.029 element at address: 0x2000003d6a40 with size: 0.000427 MiB 00:07:28.029 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.0_qat 00:07:28.029 element at address: 0x2000003d2840 with size: 0.000427 MiB 00:07:28.029 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.1_qat 00:07:28.029 element at address: 0x2000003ced80 with size: 0.000427 MiB 00:07:28.029 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.2_qat 00:07:28.029 element at address: 0x2000003cb2c0 with size: 0.000427 MiB 00:07:28.029 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.3_qat 00:07:28.029 element at address: 0x2000003c7800 with size: 0.000427 MiB 00:07:28.029 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.4_qat 00:07:28.029 element at address: 0x2000003c3d40 with size: 0.000427 MiB 00:07:28.029 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.5_qat 00:07:28.029 element at address: 0x2000003c0280 with size: 0.000427 MiB 00:07:28.029 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.6_qat 00:07:28.029 element at address: 0x2000003bc7c0 with size: 0.000427 MiB 00:07:28.029 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.7_qat 00:07:28.029 element at address: 0x2000003b8d00 with size: 0.000427 MiB 00:07:28.029 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.0_qat 00:07:28.029 element at address: 0x2000003b5240 with size: 0.000427 MiB 00:07:28.029 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.1_qat 00:07:28.029 element at address: 0x2000003b1780 with size: 0.000427 MiB 00:07:28.029 
associated memzone info: size: 0.000305 MiB name: 0000:3d:02.2_qat 00:07:28.029 element at address: 0x2000003adcc0 with size: 0.000427 MiB 00:07:28.029 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.3_qat 00:07:28.029 element at address: 0x2000003aa200 with size: 0.000427 MiB 00:07:28.029 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.4_qat 00:07:28.029 element at address: 0x2000003a6740 with size: 0.000427 MiB 00:07:28.029 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.5_qat 00:07:28.029 element at address: 0x2000003a2c80 with size: 0.000427 MiB 00:07:28.029 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.6_qat 00:07:28.029 element at address: 0x20000039f1c0 with size: 0.000427 MiB 00:07:28.029 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.7_qat 00:07:28.029 element at address: 0x20000039b700 with size: 0.000427 MiB 00:07:28.029 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.0_qat 00:07:28.029 element at address: 0x200000397c40 with size: 0.000427 MiB 00:07:28.029 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.1_qat 00:07:28.029 element at address: 0x200000394180 with size: 0.000427 MiB 00:07:28.029 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.2_qat 00:07:28.029 element at address: 0x2000003906c0 with size: 0.000427 MiB 00:07:28.029 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.3_qat 00:07:28.029 element at address: 0x20000038cc00 with size: 0.000427 MiB 00:07:28.029 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.4_qat 00:07:28.029 element at address: 0x200000389140 with size: 0.000427 MiB 00:07:28.029 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.5_qat 00:07:28.029 element at address: 0x200000385680 with size: 0.000427 MiB 00:07:28.029 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.6_qat 00:07:28.029 element at address: 0x200000381bc0 with size: 0.000427 MiB 00:07:28.029 associated memzone 
info: size: 0.000305 MiB name: 0000:3f:01.7_qat 00:07:28.029 element at address: 0x20000037e100 with size: 0.000427 MiB 00:07:28.029 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.0_qat 00:07:28.029 element at address: 0x20000037a640 with size: 0.000427 MiB 00:07:28.029 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.1_qat 00:07:28.029 element at address: 0x200000376b80 with size: 0.000427 MiB 00:07:28.029 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.2_qat 00:07:28.029 element at address: 0x2000003730c0 with size: 0.000427 MiB 00:07:28.029 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.3_qat 00:07:28.029 element at address: 0x20000036f600 with size: 0.000427 MiB 00:07:28.029 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.4_qat 00:07:28.029 element at address: 0x20000036bb40 with size: 0.000427 MiB 00:07:28.029 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.5_qat 00:07:28.029 element at address: 0x200000368080 with size: 0.000427 MiB 00:07:28.029 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.6_qat 00:07:28.029 element at address: 0x2000003645c0 with size: 0.000427 MiB 00:07:28.029 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.7_qat 00:07:28.029 element at address: 0x200000360b00 with size: 0.000427 MiB 00:07:28.029 associated memzone info: size: 0.000305 MiB name: 0000:da:01.0_qat 00:07:28.029 element at address: 0x20000035d580 with size: 0.000427 MiB 00:07:28.029 associated memzone info: size: 0.000305 MiB name: 0000:da:01.1_qat 00:07:28.029 element at address: 0x20000035a000 with size: 0.000427 MiB 00:07:28.029 associated memzone info: size: 0.000305 MiB name: 0000:da:01.2_qat 00:07:28.029 element at address: 0x200000356a80 with size: 0.000427 MiB 00:07:28.029 associated memzone info: size: 0.000305 MiB name: 0000:da:01.3_qat 00:07:28.029 element at address: 0x200000353500 with size: 0.000427 MiB 00:07:28.029 associated memzone info: size: 0.000305 
MiB name: 0000:da:01.4_qat 00:07:28.029 element at address: 0x20000034ff80 with size: 0.000427 MiB 00:07:28.029 associated memzone info: size: 0.000305 MiB name: 0000:da:01.5_qat 00:07:28.029 element at address: 0x20000034ca00 with size: 0.000427 MiB 00:07:28.029 associated memzone info: size: 0.000305 MiB name: 0000:da:01.6_qat 00:07:28.029 element at address: 0x200000349480 with size: 0.000427 MiB 00:07:28.029 associated memzone info: size: 0.000305 MiB name: 0000:da:01.7_qat 00:07:28.029 element at address: 0x200000345f00 with size: 0.000427 MiB 00:07:28.029 associated memzone info: size: 0.000305 MiB name: 0000:da:02.0_qat 00:07:28.029 element at address: 0x200000342980 with size: 0.000427 MiB 00:07:28.029 associated memzone info: size: 0.000305 MiB name: 0000:da:02.1_qat 00:07:28.029 element at address: 0x20000033f400 with size: 0.000427 MiB 00:07:28.029 associated memzone info: size: 0.000305 MiB name: 0000:da:02.2_qat 00:07:28.029 element at address: 0x20000033be80 with size: 0.000427 MiB 00:07:28.029 associated memzone info: size: 0.000305 MiB name: 0000:da:02.3_qat 00:07:28.029 element at address: 0x200000338900 with size: 0.000427 MiB 00:07:28.029 associated memzone info: size: 0.000305 MiB name: 0000:da:02.4_qat 00:07:28.029 element at address: 0x200000335380 with size: 0.000427 MiB 00:07:28.029 associated memzone info: size: 0.000305 MiB name: 0000:da:02.5_qat 00:07:28.030 element at address: 0x200000331e00 with size: 0.000427 MiB 00:07:28.030 associated memzone info: size: 0.000305 MiB name: 0000:da:02.6_qat 00:07:28.030 element at address: 0x20000032e880 with size: 0.000427 MiB 00:07:28.030 associated memzone info: size: 0.000305 MiB name: 0000:da:02.7_qat 00:07:28.030 element at address: 0x2000003d6740 with size: 0.000305 MiB 00:07:28.030 associated memzone info: size: 0.000183 MiB name: QAT_ASYM_CAPA_GEN_1 00:07:28.030 element at address: 0x20000022b880 with size: 0.000305 MiB 00:07:28.030 associated memzone info: size: 0.000183 MiB name: 
MP_msgpool_1315065
00:07:28.030 element at address: 0x200000206380 with size: 0.000305 MiB
00:07:28.030 associated memzone info: size: 0.000183 MiB name: MP_bdev_io_1315065
00:07:28.030 element at address: 0x200027e6de40 with size: 0.000305 MiB
00:07:28.030 associated memzone info: size: 0.000183 MiB name: MP_Session_Pool
00:07:28.030 element at address: 0x2000003d6940 with size: 0.000244 MiB
00:07:28.030 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_0
00:07:28.030 element at address: 0x2000003d6640 with size: 0.000244 MiB
00:07:28.030 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_1
00:07:28.030 element at address: 0x2000003d5e80 with size: 0.000244 MiB
00:07:28.030 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_0
00:07:28.030 element at address: 0x2000003d2740 with size: 0.000244 MiB
00:07:28.030 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_2
00:07:28.030 element at address: 0x2000003d2580 with size: 0.000244 MiB
00:07:28.030 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_3
00:07:28.030 element at address: 0x2000003d2300 with size: 0.000244 MiB
00:07:28.030 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_1
00:07:28.030 element at address: 0x2000003cec80 with size: 0.000244 MiB
00:07:28.030 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_4
00:07:28.030 element at address: 0x2000003ceac0 with size: 0.000244 MiB
00:07:28.030 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_5
00:07:28.030 element at address: 0x2000003ce840 with size: 0.000244 MiB
00:07:28.030 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_2
00:07:28.030 element at address: 0x2000003cb1c0 with size: 0.000244 MiB
00:07:28.030 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_6
00:07:28.030 element at address: 0x2000003cb000 with size: 0.000244 MiB
00:07:28.030 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_7
00:07:28.030 element at address: 0x2000003cad80 with size: 0.000244 MiB
00:07:28.030 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_3
00:07:28.030 element at address: 0x2000003c7700 with size: 0.000244 MiB
00:07:28.030 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_8
00:07:28.030 element at address: 0x2000003c7540 with size: 0.000244 MiB
00:07:28.030 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_9
00:07:28.030 element at address: 0x2000003c72c0 with size: 0.000244 MiB
00:07:28.030 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_4
00:07:28.030 element at address: 0x2000003c3c40 with size: 0.000244 MiB
00:07:28.030 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_10
00:07:28.030 element at address: 0x2000003c3a80 with size: 0.000244 MiB
00:07:28.030 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_11
00:07:28.030 element at address: 0x2000003c3800 with size: 0.000244 MiB
00:07:28.030 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_5
00:07:28.030 element at address: 0x2000003c0180 with size: 0.000244 MiB
00:07:28.030 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_12
00:07:28.030 element at address: 0x2000003bffc0 with size: 0.000244 MiB
00:07:28.030 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_13
00:07:28.030 element at address: 0x2000003bfd40 with size: 0.000244 MiB
00:07:28.030 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_6
00:07:28.030 element at address: 0x2000003bc6c0 with size: 0.000244 MiB
00:07:28.030 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_14
00:07:28.030 element at address: 0x2000003bc500 with size: 0.000244 MiB
00:07:28.030 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_15
00:07:28.030 element at address: 0x2000003bc280 with size: 0.000244 MiB
00:07:28.030 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_7
00:07:28.030 element at address: 0x2000003b8c00 with size: 0.000244 MiB
00:07:28.030 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_16
00:07:28.030 element at address: 0x2000003b8a40 with size: 0.000244 MiB
00:07:28.030 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_17
00:07:28.030 element at address: 0x2000003b87c0 with size: 0.000244 MiB
00:07:28.030 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_8
00:07:28.030 element at address: 0x2000003b5140 with size: 0.000244 MiB
00:07:28.030 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_18
00:07:28.030 element at address: 0x2000003b4f80 with size: 0.000244 MiB
00:07:28.030 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_19
00:07:28.030 element at address: 0x2000003b4d00 with size: 0.000244 MiB
00:07:28.030 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_9
00:07:28.030 element at address: 0x2000003b1680 with size: 0.000244 MiB
00:07:28.030 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_20
00:07:28.030 element at address: 0x2000003b14c0 with size: 0.000244 MiB
00:07:28.030 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_21
00:07:28.030 element at address: 0x2000003b1240 with size: 0.000244 MiB
00:07:28.030 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_10
00:07:28.030 element at address: 0x2000003adbc0 with size: 0.000244 MiB
00:07:28.030 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_22
00:07:28.030 element at address: 0x2000003ada00 with size: 0.000244 MiB
00:07:28.030 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_23
00:07:28.030 element at address: 0x2000003ad780 with size: 0.000244 MiB
00:07:28.030 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_11
00:07:28.030 element at address: 0x2000003aa100 with size: 0.000244 MiB
00:07:28.030 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_24
00:07:28.030 element at address: 0x2000003a9f40 with size: 0.000244 MiB
00:07:28.030 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_25
00:07:28.030 element at address: 0x2000003a9cc0 with size: 0.000244 MiB
00:07:28.030 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_12
00:07:28.030 element at address: 0x2000003a6640 with size: 0.000244 MiB
00:07:28.030 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_26
00:07:28.030 element at address: 0x2000003a6480 with size: 0.000244 MiB
00:07:28.030 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_27
00:07:28.030 element at address: 0x2000003a6200 with size: 0.000244 MiB
00:07:28.030 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_13
00:07:28.030 element at address: 0x2000003a2b80 with size: 0.000244 MiB
00:07:28.030 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_28
00:07:28.030 element at address: 0x2000003a29c0 with size: 0.000244 MiB
00:07:28.030 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_29
00:07:28.030 element at address: 0x2000003a2740 with size: 0.000244 MiB
00:07:28.030 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_14
00:07:28.030 element at address: 0x20000039f0c0 with size: 0.000244 MiB
00:07:28.030 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_30
00:07:28.030 element at address: 0x20000039ef00 with size: 0.000244 MiB
00:07:28.030 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_31
00:07:28.030 element at address: 0x20000039ec80 with size: 0.000244 MiB
00:07:28.030 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_15
00:07:28.030 element at address: 0x20000039b600 with size: 0.000244 MiB
00:07:28.030 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_32
00:07:28.030 element at address: 0x20000039b440 with size: 0.000244 MiB
00:07:28.030 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_33
00:07:28.030 element at address: 0x20000039b1c0 with size: 0.000244 MiB
00:07:28.030 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_16
00:07:28.030 element at address: 0x200000397b40 with size: 0.000244 MiB
00:07:28.030 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_34
00:07:28.030 element at address: 0x200000397980 with size: 0.000244 MiB
00:07:28.030 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_35
00:07:28.030 element at address: 0x200000397700 with size: 0.000244 MiB
00:07:28.030 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_17
00:07:28.030 element at address: 0x200000394080 with size: 0.000244 MiB
00:07:28.030 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_36
00:07:28.030 element at address: 0x200000393ec0 with size: 0.000244 MiB
00:07:28.030 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_37
00:07:28.030 element at address: 0x200000393c40 with size: 0.000244 MiB
00:07:28.030 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_18
00:07:28.030 element at address: 0x2000003905c0 with size: 0.000244 MiB
00:07:28.030 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_38
00:07:28.030 element at address: 0x200000390400 with size: 0.000244 MiB
00:07:28.030 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_39
00:07:28.030 element at address: 0x200000390180 with size: 0.000244 MiB
00:07:28.030 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_19
00:07:28.030 element at address: 0x20000038cb00 with size: 0.000244 MiB
00:07:28.030 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_40
00:07:28.030 element at address: 0x20000038c940 with size: 0.000244 MiB
00:07:28.031 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_41
00:07:28.031 element at address: 0x20000038c6c0 with size: 0.000244 MiB
00:07:28.031 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_20
00:07:28.031 element at address: 0x200000389040 with size: 0.000244 MiB
00:07:28.031 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_42
00:07:28.031 element at address: 0x200000388e80 with size: 0.000244 MiB
00:07:28.031 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_43
00:07:28.031 element at address: 0x200000388c00 with size: 0.000244 MiB
00:07:28.031 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_21
00:07:28.031 element at address: 0x200000385580 with size: 0.000244 MiB
00:07:28.031 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_44
00:07:28.031 element at address: 0x2000003853c0 with size: 0.000244 MiB
00:07:28.031 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_45
00:07:28.031 element at address: 0x200000385140 with size: 0.000244 MiB
00:07:28.031 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_22
00:07:28.031 element at address: 0x200000381ac0 with size: 0.000244 MiB
00:07:28.031 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_46
00:07:28.031 element at address: 0x200000381900 with size: 0.000244 MiB
00:07:28.031 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_47
00:07:28.031 element at address: 0x200000381680 with size: 0.000244 MiB
00:07:28.031 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_23
00:07:28.031 element at address: 0x20000037e000 with size: 0.000244 MiB
00:07:28.031 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_48
00:07:28.031 element at address: 0x20000037de40 with size: 0.000244 MiB
00:07:28.031 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_49
00:07:28.031 element at address: 0x20000037dbc0 with size: 0.000244 MiB
00:07:28.031 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_24
00:07:28.031 element at address: 0x20000037a540 with size: 0.000244 MiB
00:07:28.031 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_50
00:07:28.031 element at address: 0x20000037a380 with size: 0.000244 MiB
00:07:28.031 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_51
00:07:28.031 element at address: 0x20000037a100 with size: 0.000244 MiB
00:07:28.031 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_25
00:07:28.031 element at address: 0x200000376a80 with size: 0.000244 MiB
00:07:28.031 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_52
00:07:28.031 element at address: 0x2000003768c0 with size: 0.000244 MiB
00:07:28.031 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_53
00:07:28.031 element at address: 0x200000376640 with size: 0.000244 MiB
00:07:28.031 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_26
00:07:28.031 element at address: 0x200000372fc0 with size: 0.000244 MiB
00:07:28.031 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_54
00:07:28.031 element at address: 0x200000372e00 with size: 0.000244 MiB
00:07:28.031 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_55
00:07:28.031 element at address: 0x200000372b80 with size: 0.000244 MiB
00:07:28.031 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_27
00:07:28.031 element at address: 0x20000036f500 with size: 0.000244 MiB
00:07:28.031 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_56
00:07:28.031 element at address: 0x20000036f340 with size: 0.000244 MiB
00:07:28.031 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_57
00:07:28.031 element at address: 0x20000036f0c0 with size: 0.000244 MiB
00:07:28.031 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_28
00:07:28.031 element at address: 0x20000036ba40 with size: 0.000244 MiB
00:07:28.031 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_58
00:07:28.031 element at address: 0x20000036b880 with size: 0.000244 MiB
00:07:28.031 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_59
00:07:28.031 element at address: 0x20000036b600 with size: 0.000244 MiB
00:07:28.031 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_29
00:07:28.031 element at address: 0x200000367f80 with size: 0.000244 MiB
00:07:28.031 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_60
00:07:28.031 element at address: 0x200000367dc0 with size: 0.000244 MiB
00:07:28.031 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_61
00:07:28.031 element at address: 0x200000367b40 with size: 0.000244 MiB
00:07:28.031 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_30
00:07:28.031 element at address: 0x2000003644c0 with size: 0.000244 MiB
00:07:28.031 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_62
00:07:28.031 element at address: 0x200000364300 with size: 0.000244 MiB
00:07:28.031 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_63
00:07:28.031 element at address: 0x200000364080 with size: 0.000244 MiB
00:07:28.031 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_31
00:07:28.031 element at address: 0x2000003d5d00 with size: 0.000183 MiB
00:07:28.031 associated memzone info: size: 0.000061 MiB name: QAT_COMP_CAPA_GEN_1
00:07:28.031 20:21:20 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@25 -- # trap - SIGINT SIGTERM EXIT
00:07:28.031 20:21:20 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@26 -- # killprocess 1315065
00:07:28.031 20:21:20 dpdk_mem_utility -- common/autotest_common.sh@948 -- # '[' -z 1315065 ']'
00:07:28.031 20:21:20 dpdk_mem_utility -- common/autotest_common.sh@952 -- # kill -0 1315065
00:07:28.031 20:21:20 dpdk_mem_utility -- common/autotest_common.sh@953 -- # uname
00:07:28.031 20:21:20 dpdk_mem_utility -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']'
00:07:28.031 20:21:20 dpdk_mem_utility -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1315065
00:07:28.031 20:21:20 dpdk_mem_utility -- common/autotest_common.sh@954 -- # process_name=reactor_0
00:07:28.031 20:21:20 dpdk_mem_utility -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']'
00:07:28.031 20:21:20 dpdk_mem_utility -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1315065'
killing process with pid 1315065
00:07:28.031 20:21:20 dpdk_mem_utility -- common/autotest_common.sh@967 -- # kill 1315065
00:07:28.031 20:21:20 dpdk_mem_utility -- common/autotest_common.sh@972 -- # wait 1315065
00:07:28.597
00:07:28.597 real 0m2.021s
00:07:28.597 user 0m2.406s
00:07:28.597 sys 0m0.583s
00:07:28.597 20:21:20 dpdk_mem_utility -- common/autotest_common.sh@1124 -- # xtrace_disable
00:07:28.597 20:21:20 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x
00:07:28.597 ************************************
00:07:28.597 END TEST dpdk_mem_utility
00:07:28.597 ************************************
00:07:28.597 20:21:20 -- common/autotest_common.sh@1142 -- # return 0
00:07:28.597 20:21:20 -- spdk/autotest.sh@181 -- # run_test event /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event.sh
00:07:28.597 20:21:20 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:07:28.597 20:21:20 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:07:28.597 20:21:20 -- common/autotest_common.sh@10 -- # set +x
00:07:28.597 ************************************
00:07:28.597 START TEST event
00:07:28.597 ************************************
00:07:28.597 20:21:20 event -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event.sh
00:07:28.597 * Looking for test storage...
00:07:28.597 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event
00:07:28.597 20:21:20 event -- event/event.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh
00:07:28.597 20:21:20 event -- bdev/nbd_common.sh@6 -- # set -e
00:07:28.598 20:21:20 event -- event/event.sh@45 -- # run_test event_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1
00:07:28.598 20:21:20 event -- common/autotest_common.sh@1099 -- # '[' 6 -le 1 ']'
00:07:28.598 20:21:20 event -- common/autotest_common.sh@1105 -- # xtrace_disable
00:07:28.598 20:21:20 event -- common/autotest_common.sh@10 -- # set +x
00:07:28.598 ************************************
00:07:28.598 START TEST event_perf
00:07:28.598 ************************************
00:07:28.598 20:21:20 event.event_perf -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1
00:07:28.856 Running I/O for 1 seconds...[2024-07-15 20:21:20.982365] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization...
00:07:28.856 [2024-07-15 20:21:20.982431] [ DPDK EAL parameters: event_perf --no-shconf -c 0xF --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1315406 ]
00:07:28.856 [2024-07-15 20:21:21.114181] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4
00:07:28.856 [2024-07-15 20:21:21.219243] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1
00:07:28.856 [2024-07-15 20:21:21.219345] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2
00:07:28.856 [2024-07-15 20:21:21.219449] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:07:28.856 [2024-07-15 20:21:21.219447] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3
00:07:30.230 Running I/O for 1 seconds...
00:07:30.230 lcore 0: 102724
00:07:30.230 lcore 1: 102726
00:07:30.230 lcore 2: 102729
00:07:30.230 lcore 3: 102725
00:07:30.230 done.
00:07:30.230
00:07:30.230 real 0m1.358s
00:07:30.230 user 0m4.199s
00:07:30.230 sys 0m0.146s
00:07:30.231 20:21:22 event.event_perf -- common/autotest_common.sh@1124 -- # xtrace_disable
00:07:30.231 20:21:22 event.event_perf -- common/autotest_common.sh@10 -- # set +x
00:07:30.231 ************************************
00:07:30.231 END TEST event_perf
00:07:30.231 ************************************
00:07:30.231 20:21:22 event -- common/autotest_common.sh@1142 -- # return 0
00:07:30.231 20:21:22 event -- event/event.sh@46 -- # run_test event_reactor /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor/reactor -t 1
00:07:30.231 20:21:22 event -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']'
00:07:30.231 20:21:22 event -- common/autotest_common.sh@1105 -- # xtrace_disable
00:07:30.231 20:21:22 event -- common/autotest_common.sh@10 -- # set +x
00:07:30.231 ************************************
00:07:30.231 START TEST event_reactor
00:07:30.231 ************************************
00:07:30.231 20:21:22 event.event_reactor -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor/reactor -t 1
00:07:30.231 [2024-07-15 20:21:22.431081] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization...
00:07:30.231 [2024-07-15 20:21:22.431142] [ DPDK EAL parameters: reactor --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1315601 ]
00:07:30.231 [2024-07-15 20:21:22.558210] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:30.489 [2024-07-15 20:21:22.659210] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:07:31.424 test_start
00:07:31.424 oneshot
00:07:31.424 tick 100
00:07:31.424 tick 100
00:07:31.424 tick 250
00:07:31.424 tick 100
00:07:31.424 tick 100
00:07:31.424 tick 100
00:07:31.424 tick 250
00:07:31.424 tick 500
00:07:31.424 tick 100
00:07:31.424 tick 100
00:07:31.424 tick 250
00:07:31.424 tick 100
00:07:31.424 tick 100
00:07:31.424 test_end
00:07:31.424
00:07:31.424 real 0m1.351s
00:07:31.424 user 0m1.206s
00:07:31.424 sys 0m0.139s
00:07:31.424 20:21:23 event.event_reactor -- common/autotest_common.sh@1124 -- # xtrace_disable
00:07:31.424 20:21:23 event.event_reactor -- common/autotest_common.sh@10 -- # set +x
00:07:31.424 ************************************
00:07:31.424 END TEST event_reactor
00:07:31.424 ************************************
00:07:31.424 20:21:23 event -- common/autotest_common.sh@1142 -- # return 0
00:07:31.424 20:21:23 event -- event/event.sh@47 -- # run_test event_reactor_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1
00:07:31.424 20:21:23 event -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']'
00:07:31.424 20:21:23 event -- common/autotest_common.sh@1105 -- # xtrace_disable
00:07:31.424 20:21:23 event -- common/autotest_common.sh@10 -- # set +x
00:07:31.696 ************************************
00:07:31.696 START TEST event_reactor_perf
00:07:31.696 ************************************
00:07:31.696 20:21:23 event.event_reactor_perf -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1
00:07:31.696 [2024-07-15 20:21:23.865587] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization...
00:07:31.696 [2024-07-15 20:21:23.865648] [ DPDK EAL parameters: reactor_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1315795 ]
00:07:31.696 [2024-07-15 20:21:23.994176] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:31.959 [2024-07-15 20:21:24.098683] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:07:32.893 test_start
00:07:32.893 test_end
00:07:32.893 Performance: 328261 events per second
00:07:32.893
00:07:32.893 real 0m1.353s
00:07:32.893 user 0m1.204s
00:07:32.893 sys 0m0.142s
00:07:32.893 20:21:25 event.event_reactor_perf -- common/autotest_common.sh@1124 -- # xtrace_disable
00:07:32.893 20:21:25 event.event_reactor_perf -- common/autotest_common.sh@10 -- # set +x
00:07:32.893 ************************************
00:07:32.893 END TEST event_reactor_perf
00:07:32.893 ************************************
00:07:32.893 20:21:25 event -- common/autotest_common.sh@1142 -- # return 0
00:07:32.893 20:21:25 event -- event/event.sh@49 -- # uname -s
00:07:32.893 20:21:25 event -- event/event.sh@49 -- # '[' Linux = Linux ']'
00:07:32.893 20:21:25 event -- event/event.sh@50 -- # run_test event_scheduler /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler.sh
00:07:32.893 20:21:25 event -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:07:32.893 20:21:25 event -- common/autotest_common.sh@1105 -- # xtrace_disable
00:07:32.893 20:21:25 event -- common/autotest_common.sh@10 -- # set +x
00:07:33.150 ************************************
00:07:33.150 START TEST event_scheduler
00:07:33.150 ************************************
00:07:33.150 20:21:25 event.event_scheduler -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler.sh
00:07:33.150 * Looking for test storage...
00:07:33.150 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler
00:07:33.150 20:21:25 event.event_scheduler -- scheduler/scheduler.sh@29 -- # rpc=rpc_cmd
00:07:33.150 20:21:25 event.event_scheduler -- scheduler/scheduler.sh@35 -- # scheduler_pid=1316031
00:07:33.150 20:21:25 event.event_scheduler -- scheduler/scheduler.sh@36 -- # trap 'killprocess $scheduler_pid; exit 1' SIGINT SIGTERM EXIT
00:07:33.150 20:21:25 event.event_scheduler -- scheduler/scheduler.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler -m 0xF -p 0x2 --wait-for-rpc -f
00:07:33.150 20:21:25 event.event_scheduler -- scheduler/scheduler.sh@37 -- # waitforlisten 1316031
00:07:33.150 20:21:25 event.event_scheduler -- common/autotest_common.sh@829 -- # '[' -z 1316031 ']'
00:07:33.150 20:21:25 event.event_scheduler -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock
00:07:33.150 20:21:25 event.event_scheduler -- common/autotest_common.sh@834 -- # local max_retries=100
00:07:33.150 20:21:25 event.event_scheduler -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...'
Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...
00:07:33.150 20:21:25 event.event_scheduler -- common/autotest_common.sh@838 -- # xtrace_disable
00:07:33.150 20:21:25 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x
00:07:33.151 [2024-07-15 20:21:25.456919] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization...
00:07:33.151 [2024-07-15 20:21:25.456997] [ DPDK EAL parameters: scheduler --no-shconf -c 0xF --main-lcore=2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1316031 ]
00:07:33.409 [2024-07-15 20:21:25.653127] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4
00:07:33.667 [2024-07-15 20:21:25.841008] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:07:33.667 [2024-07-15 20:21:25.841098] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1
00:07:33.667 [2024-07-15 20:21:25.841200] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3
00:07:33.667 [2024-07-15 20:21:25.841212] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2
00:07:34.239 20:21:26 event.event_scheduler -- common/autotest_common.sh@858 -- # (( i == 0 ))
00:07:34.240 20:21:26 event.event_scheduler -- common/autotest_common.sh@862 -- # return 0
00:07:34.240 20:21:26 event.event_scheduler -- scheduler/scheduler.sh@39 -- # rpc_cmd framework_set_scheduler dynamic
00:07:34.240 20:21:26 event.event_scheduler -- common/autotest_common.sh@559 -- # xtrace_disable
00:07:34.240 20:21:26 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x
00:07:34.240 [2024-07-15 20:21:26.340298] dpdk_governor.c: 173:_init: *ERROR*: App core mask contains some but not all of a set of SMT siblings
00:07:34.240 [2024-07-15 20:21:26.340363] scheduler_dynamic.c: 270:init: *NOTICE*: Unable to initialize dpdk governor
00:07:34.240 [2024-07-15 20:21:26.340410] scheduler_dynamic.c: 416:set_opts: *NOTICE*: Setting scheduler load limit to 20
00:07:34.240 [2024-07-15 20:21:26.340443] scheduler_dynamic.c: 418:set_opts: *NOTICE*: Setting scheduler core limit to 80
00:07:34.240 [2024-07-15 20:21:26.340475] scheduler_dynamic.c: 420:set_opts: *NOTICE*: Setting scheduler core busy to 95
00:07:34.240 20:21:26 event.event_scheduler -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:07:34.240 20:21:26 event.event_scheduler -- scheduler/scheduler.sh@40 -- # rpc_cmd framework_start_init
00:07:34.240 20:21:26 event.event_scheduler -- common/autotest_common.sh@559 -- # xtrace_disable
00:07:34.240 20:21:26 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x
00:07:34.240 [2024-07-15 20:21:26.479253] scheduler.c: 382:test_start: *NOTICE*: Scheduler test application started.
00:07:34.240 20:21:26 event.event_scheduler -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:07:34.240 20:21:26 event.event_scheduler -- scheduler/scheduler.sh@43 -- # run_test scheduler_create_thread scheduler_create_thread
00:07:34.240 20:21:26 event.event_scheduler -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:07:34.240 20:21:26 event.event_scheduler -- common/autotest_common.sh@1105 -- # xtrace_disable
00:07:34.240 20:21:26 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x
00:07:34.240 ************************************
00:07:34.240 START TEST scheduler_create_thread
00:07:34.240 ************************************
00:07:34.240 20:21:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1123 -- # scheduler_create_thread
00:07:34.240 20:21:26 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@12 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100
00:07:34.240 20:21:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable
00:07:34.240 20:21:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x
00:07:34.240 2
00:07:34.240 20:21:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:07:34.240 20:21:26 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@13 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x2 -a 100
00:07:34.240 20:21:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable
00:07:34.240 20:21:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x
00:07:34.240 3
00:07:34.240 20:21:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:07:34.240 20:21:26 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@14 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x4 -a 100
00:07:34.240 20:21:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable
00:07:34.240 20:21:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x
00:07:34.240 4
00:07:34.240 20:21:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:07:34.240 20:21:26 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@15 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x8 -a 100
00:07:34.240 20:21:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable
00:07:34.240 20:21:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x
00:07:34.240 5
00:07:34.240 20:21:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:07:34.240 20:21:26 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@16 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x1 -a 0
00:07:34.240 20:21:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable
00:07:34.240 20:21:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x
00:07:34.240 6
00:07:34.240 20:21:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:07:34.240 20:21:26 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@17 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x2 -a 0
00:07:34.240 20:21:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable
00:07:34.240 20:21:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x
00:07:34.240 7
00:07:34.240 20:21:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:07:34.240 20:21:26 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@18 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x4 -a 0
00:07:34.240 20:21:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable
00:07:34.240 20:21:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x
00:07:34.240 8
00:07:34.240 20:21:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:07:34.240 20:21:26 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@19 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x8 -a 0
00:07:34.240 20:21:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable
00:07:34.240 20:21:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x
00:07:34.240 9
00:07:34.240 20:21:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:07:34.240 20:21:26 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@21 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n one_third_active -a 30
00:07:34.240 20:21:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable
00:07:34.240 20:21:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x
00:07:34.240 10
00:07:34.240 20:21:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:07:34.498 20:21:26 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0
00:07:34.498 20:21:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable
00:07:34.498 20:21:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x
00:07:34.498 20:21:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:07:34.498 20:21:26 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # thread_id=11
00:07:34.498 20:21:26 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@23 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_set_active 11 50
00:07:34.498 20:21:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable
00:07:34.498 20:21:26 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x
00:07:35.432 20:21:27 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:07:35.433 20:21:27 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n deleted -a 100
00:07:35.433 20:21:27 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable
00:07:35.433 20:21:27 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x
00:07:36.801 20:21:28 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:07:36.801 20:21:28 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # thread_id=12
00:07:36.801 20:21:28 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@26 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_delete 12
00:07:36.801 20:21:28 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable
00:07:36.801 20:21:28 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x
00:07:37.731 20:21:29 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:07:37.731
00:07:37.731 real 0m3.386s
00:07:37.731 user 0m0.024s
00:07:37.731 sys 0m0.008s
00:07:37.731 20:21:29 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1124 -- # xtrace_disable
00:07:37.731 20:21:29 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x
00:07:37.731 ************************************
00:07:37.731 END TEST scheduler_create_thread
00:07:37.731 ************************************
00:07:37.731 20:21:29 event.event_scheduler -- common/autotest_common.sh@1142 -- # return 0
00:07:37.731 20:21:29 event.event_scheduler -- scheduler/scheduler.sh@45 -- # trap - SIGINT SIGTERM EXIT
00:07:37.731 20:21:29 event.event_scheduler -- scheduler/scheduler.sh@46 -- # killprocess 1316031
00:07:37.731 20:21:29 event.event_scheduler -- common/autotest_common.sh@948 -- # '[' -z 1316031 ']'
00:07:37.731 20:21:29 event.event_scheduler -- common/autotest_common.sh@952 -- # kill -0 1316031
00:07:37.731 20:21:29 event.event_scheduler -- common/autotest_common.sh@953 -- # uname
00:07:37.731 20:21:29 event.event_scheduler -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']'
00:07:37.731 20:21:29 event.event_scheduler -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1316031
00:07:37.731 20:21:29 event.event_scheduler --
common/autotest_common.sh@954 -- # process_name=reactor_2 00:07:37.731 20:21:29 event.event_scheduler -- common/autotest_common.sh@958 -- # '[' reactor_2 = sudo ']' 00:07:37.731 20:21:29 event.event_scheduler -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1316031' 00:07:37.731 killing process with pid 1316031 00:07:37.731 20:21:29 event.event_scheduler -- common/autotest_common.sh@967 -- # kill 1316031 00:07:37.731 20:21:29 event.event_scheduler -- common/autotest_common.sh@972 -- # wait 1316031 00:07:37.990 [2024-07-15 20:21:30.286487] scheduler.c: 360:test_shutdown: *NOTICE*: Scheduler test application stopped. 00:07:38.559 00:07:38.559 real 0m5.375s 00:07:38.559 user 0m10.164s 00:07:38.559 sys 0m0.623s 00:07:38.559 20:21:30 event.event_scheduler -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:38.559 20:21:30 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:07:38.559 ************************************ 00:07:38.559 END TEST event_scheduler 00:07:38.559 ************************************ 00:07:38.559 20:21:30 event -- common/autotest_common.sh@1142 -- # return 0 00:07:38.559 20:21:30 event -- event/event.sh@51 -- # modprobe -n nbd 00:07:38.559 20:21:30 event -- event/event.sh@52 -- # run_test app_repeat app_repeat_test 00:07:38.559 20:21:30 event -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:38.559 20:21:30 event -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:38.559 20:21:30 event -- common/autotest_common.sh@10 -- # set +x 00:07:38.559 ************************************ 00:07:38.559 START TEST app_repeat 00:07:38.559 ************************************ 00:07:38.559 20:21:30 event.app_repeat -- common/autotest_common.sh@1123 -- # app_repeat_test 00:07:38.559 20:21:30 event.app_repeat -- event/event.sh@12 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:38.559 20:21:30 event.app_repeat -- event/event.sh@13 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:38.559 20:21:30 
event.app_repeat -- event/event.sh@13 -- # local nbd_list 00:07:38.559 20:21:30 event.app_repeat -- event/event.sh@14 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:38.559 20:21:30 event.app_repeat -- event/event.sh@14 -- # local bdev_list 00:07:38.559 20:21:30 event.app_repeat -- event/event.sh@15 -- # local repeat_times=4 00:07:38.559 20:21:30 event.app_repeat -- event/event.sh@17 -- # modprobe nbd 00:07:38.559 20:21:30 event.app_repeat -- event/event.sh@19 -- # repeat_pid=1316782 00:07:38.559 20:21:30 event.app_repeat -- event/event.sh@20 -- # trap 'killprocess $repeat_pid; exit 1' SIGINT SIGTERM EXIT 00:07:38.559 20:21:30 event.app_repeat -- event/event.sh@18 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/app_repeat/app_repeat -r /var/tmp/spdk-nbd.sock -m 0x3 -t 4 00:07:38.559 20:21:30 event.app_repeat -- event/event.sh@21 -- # echo 'Process app_repeat pid: 1316782' 00:07:38.559 Process app_repeat pid: 1316782 00:07:38.559 20:21:30 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:07:38.559 20:21:30 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 0' 00:07:38.559 spdk_app_start Round 0 00:07:38.559 20:21:30 event.app_repeat -- event/event.sh@25 -- # waitforlisten 1316782 /var/tmp/spdk-nbd.sock 00:07:38.559 20:21:30 event.app_repeat -- common/autotest_common.sh@829 -- # '[' -z 1316782 ']' 00:07:38.559 20:21:30 event.app_repeat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:38.559 20:21:30 event.app_repeat -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:38.559 20:21:30 event.app_repeat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:38.559 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
00:07:38.559 20:21:30 event.app_repeat -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:38.559 20:21:30 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:38.559 [2024-07-15 20:21:30.801682] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 00:07:38.559 [2024-07-15 20:21:30.801755] [ DPDK EAL parameters: app_repeat --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1316782 ] 00:07:38.559 [2024-07-15 20:21:30.922586] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:38.818 [2024-07-15 20:21:31.033369] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:38.818 [2024-07-15 20:21:31.033374] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:39.076 20:21:31 event.app_repeat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:39.076 20:21:31 event.app_repeat -- common/autotest_common.sh@862 -- # return 0 00:07:39.076 20:21:31 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:39.335 Malloc0 00:07:39.335 20:21:31 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:39.594 Malloc1 00:07:39.594 20:21:31 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:39.594 20:21:31 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:39.594 20:21:31 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:39.594 20:21:31 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:07:39.594 20:21:31 event.app_repeat -- bdev/nbd_common.sh@92 
-- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:39.594 20:21:31 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:07:39.594 20:21:31 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:39.594 20:21:31 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:39.594 20:21:31 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:39.594 20:21:31 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:07:39.594 20:21:31 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:39.594 20:21:31 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:07:39.594 20:21:31 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:07:39.594 20:21:31 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:07:39.594 20:21:31 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:39.594 20:21:31 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:07:39.854 /dev/nbd0 00:07:39.854 20:21:32 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:07:39.854 20:21:32 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:07:39.854 20:21:32 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:07:39.854 20:21:32 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:07:39.854 20:21:32 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:39.854 20:21:32 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:39.854 20:21:32 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:07:39.854 20:21:32 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:07:39.854 20:21:32 event.app_repeat -- 
common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:39.854 20:21:32 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:39.854 20:21:32 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:39.854 1+0 records in 00:07:39.854 1+0 records out 00:07:39.854 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00022998 s, 17.8 MB/s 00:07:39.854 20:21:32 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:39.854 20:21:32 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:07:39.854 20:21:32 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:39.854 20:21:32 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:39.854 20:21:32 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:07:39.854 20:21:32 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:39.854 20:21:32 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:39.854 20:21:32 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:07:40.112 /dev/nbd1 00:07:40.112 20:21:32 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:07:40.112 20:21:32 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:07:40.112 20:21:32 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:07:40.112 20:21:32 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:07:40.112 20:21:32 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:40.112 20:21:32 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:40.112 20:21:32 event.app_repeat -- 
common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:07:40.112 20:21:32 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:07:40.112 20:21:32 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:40.112 20:21:32 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:40.113 20:21:32 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:40.113 1+0 records in 00:07:40.113 1+0 records out 00:07:40.113 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000271343 s, 15.1 MB/s 00:07:40.113 20:21:32 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:40.113 20:21:32 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:07:40.113 20:21:32 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:40.113 20:21:32 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:40.113 20:21:32 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:07:40.113 20:21:32 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:40.113 20:21:32 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:40.113 20:21:32 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:40.113 20:21:32 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:40.113 20:21:32 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:40.371 20:21:32 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:07:40.371 { 00:07:40.371 "nbd_device": "/dev/nbd0", 00:07:40.371 "bdev_name": "Malloc0" 00:07:40.371 }, 00:07:40.371 { 00:07:40.371 
"nbd_device": "/dev/nbd1", 00:07:40.371 "bdev_name": "Malloc1" 00:07:40.371 } 00:07:40.371 ]' 00:07:40.371 20:21:32 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:07:40.371 { 00:07:40.371 "nbd_device": "/dev/nbd0", 00:07:40.371 "bdev_name": "Malloc0" 00:07:40.371 }, 00:07:40.371 { 00:07:40.371 "nbd_device": "/dev/nbd1", 00:07:40.371 "bdev_name": "Malloc1" 00:07:40.371 } 00:07:40.371 ]' 00:07:40.371 20:21:32 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:40.371 20:21:32 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:07:40.371 /dev/nbd1' 00:07:40.371 20:21:32 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:07:40.371 /dev/nbd1' 00:07:40.371 20:21:32 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:40.371 20:21:32 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:07:40.371 20:21:32 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:07:40.371 20:21:32 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:07:40.371 20:21:32 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:07:40.371 20:21:32 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:07:40.371 20:21:32 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:40.371 20:21:32 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:40.371 20:21:32 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:07:40.371 20:21:32 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:40.371 20:21:32 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:07:40.371 20:21:32 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:07:40.371 256+0 records in 00:07:40.371 256+0 
records out 00:07:40.371 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0107918 s, 97.2 MB/s 00:07:40.371 20:21:32 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:40.371 20:21:32 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:07:40.371 256+0 records in 00:07:40.371 256+0 records out 00:07:40.371 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.030258 s, 34.7 MB/s 00:07:40.371 20:21:32 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:40.371 20:21:32 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:07:40.630 256+0 records in 00:07:40.630 256+0 records out 00:07:40.630 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0317443 s, 33.0 MB/s 00:07:40.630 20:21:32 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:07:40.630 20:21:32 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:40.630 20:21:32 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:40.630 20:21:32 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:07:40.630 20:21:32 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:40.630 20:21:32 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:07:40.630 20:21:32 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:07:40.630 20:21:32 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:40.630 20:21:32 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:07:40.630 20:21:32 event.app_repeat -- bdev/nbd_common.sh@82 -- # for 
i in "${nbd_list[@]}" 00:07:40.630 20:21:32 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:07:40.630 20:21:32 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:40.630 20:21:32 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:07:40.630 20:21:32 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:40.630 20:21:32 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:40.630 20:21:32 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:40.630 20:21:32 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:07:40.630 20:21:32 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:40.630 20:21:32 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:40.888 20:21:33 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:40.888 20:21:33 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:40.888 20:21:33 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:40.888 20:21:33 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:40.888 20:21:33 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:40.888 20:21:33 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:40.888 20:21:33 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:40.888 20:21:33 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:40.888 20:21:33 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:40.888 20:21:33 event.app_repeat -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:07:41.146 20:21:33 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:41.146 20:21:33 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:41.146 20:21:33 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:41.146 20:21:33 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:41.146 20:21:33 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:41.146 20:21:33 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:41.146 20:21:33 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:41.146 20:21:33 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:41.146 20:21:33 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:41.146 20:21:33 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:41.146 20:21:33 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:41.404 20:21:33 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:41.404 20:21:33 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:41.404 20:21:33 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:41.404 20:21:33 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:41.404 20:21:33 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:07:41.404 20:21:33 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:41.404 20:21:33 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:07:41.404 20:21:33 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:07:41.404 20:21:33 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:07:41.404 20:21:33 event.app_repeat -- bdev/nbd_common.sh@104 -- # 
count=0 00:07:41.404 20:21:33 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:07:41.404 20:21:33 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:07:41.404 20:21:33 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:07:41.662 20:21:33 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:07:41.662 [2024-07-15 20:21:34.040554] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:41.919 [2024-07-15 20:21:34.139344] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:41.919 [2024-07-15 20:21:34.139348] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:41.919 [2024-07-15 20:21:34.191383] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:07:41.919 [2024-07-15 20:21:34.191432] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:07:44.462 20:21:36 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:07:44.462 20:21:36 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 1' 00:07:44.462 spdk_app_start Round 1 00:07:44.462 20:21:36 event.app_repeat -- event/event.sh@25 -- # waitforlisten 1316782 /var/tmp/spdk-nbd.sock 00:07:44.462 20:21:36 event.app_repeat -- common/autotest_common.sh@829 -- # '[' -z 1316782 ']' 00:07:44.462 20:21:36 event.app_repeat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:44.462 20:21:36 event.app_repeat -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:44.462 20:21:36 event.app_repeat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:44.462 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
00:07:44.462 20:21:36 event.app_repeat -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:44.462 20:21:36 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:44.720 20:21:37 event.app_repeat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:44.720 20:21:37 event.app_repeat -- common/autotest_common.sh@862 -- # return 0 00:07:44.720 20:21:37 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:44.978 Malloc0 00:07:44.978 20:21:37 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:45.237 Malloc1 00:07:45.237 20:21:37 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:45.237 20:21:37 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:45.496 20:21:37 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:45.496 20:21:37 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:07:45.496 20:21:37 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:45.496 20:21:37 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:07:45.496 20:21:37 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:45.496 20:21:37 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:45.496 20:21:37 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:45.496 20:21:37 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:07:45.496 20:21:37 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:45.496 20:21:37 event.app_repeat -- bdev/nbd_common.sh@11 -- 
# local nbd_list 00:07:45.496 20:21:37 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:07:45.496 20:21:37 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:07:45.496 20:21:37 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:45.496 20:21:37 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:07:45.496 /dev/nbd0 00:07:45.755 20:21:37 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:07:45.755 20:21:37 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:07:45.755 20:21:37 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:07:45.755 20:21:37 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:07:45.755 20:21:37 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:45.755 20:21:37 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:45.755 20:21:37 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:07:45.755 20:21:37 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:07:45.755 20:21:37 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:45.755 20:21:37 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:45.755 20:21:37 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:45.755 1+0 records in 00:07:45.755 1+0 records out 00:07:45.755 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000238635 s, 17.2 MB/s 00:07:45.755 20:21:37 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:45.755 20:21:37 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:07:45.755 20:21:37 event.app_repeat -- 
common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:45.755 20:21:37 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:45.755 20:21:37 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:07:45.755 20:21:37 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:45.755 20:21:37 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:45.755 20:21:37 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:07:46.014 /dev/nbd1 00:07:46.014 20:21:38 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:07:46.014 20:21:38 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:07:46.014 20:21:38 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:07:46.014 20:21:38 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:07:46.014 20:21:38 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:46.014 20:21:38 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:46.014 20:21:38 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:07:46.014 20:21:38 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:07:46.014 20:21:38 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:46.014 20:21:38 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:46.014 20:21:38 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:46.014 1+0 records in 00:07:46.014 1+0 records out 00:07:46.014 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000269166 s, 15.2 MB/s 00:07:46.014 20:21:38 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:46.014 20:21:38 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:07:46.014 20:21:38 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:46.014 20:21:38 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:46.014 20:21:38 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:07:46.014 20:21:38 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:46.014 20:21:38 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:46.014 20:21:38 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:46.014 20:21:38 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:46.014 20:21:38 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:46.272 20:21:38 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:07:46.272 { 00:07:46.272 "nbd_device": "/dev/nbd0", 00:07:46.272 "bdev_name": "Malloc0" 00:07:46.272 }, 00:07:46.272 { 00:07:46.272 "nbd_device": "/dev/nbd1", 00:07:46.272 "bdev_name": "Malloc1" 00:07:46.272 } 00:07:46.272 ]' 00:07:46.272 20:21:38 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:07:46.272 { 00:07:46.272 "nbd_device": "/dev/nbd0", 00:07:46.272 "bdev_name": "Malloc0" 00:07:46.272 }, 00:07:46.272 { 00:07:46.272 "nbd_device": "/dev/nbd1", 00:07:46.272 "bdev_name": "Malloc1" 00:07:46.272 } 00:07:46.272 ]' 00:07:46.272 20:21:38 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:46.272 20:21:38 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:07:46.272 /dev/nbd1' 00:07:46.272 20:21:38 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:07:46.272 /dev/nbd1' 00:07:46.272 
20:21:38 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:46.272 20:21:38 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:07:46.272 20:21:38 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:07:46.272 20:21:38 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:07:46.272 20:21:38 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:07:46.272 20:21:38 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:07:46.272 20:21:38 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:46.272 20:21:38 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:46.272 20:21:38 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:07:46.272 20:21:38 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:46.272 20:21:38 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:07:46.272 20:21:38 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:07:46.272 256+0 records in 00:07:46.272 256+0 records out 00:07:46.272 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00673452 s, 156 MB/s 00:07:46.272 20:21:38 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:46.272 20:21:38 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:07:46.272 256+0 records in 00:07:46.272 256+0 records out 00:07:46.272 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0301136 s, 34.8 MB/s 00:07:46.272 20:21:38 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:46.272 20:21:38 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd 
if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:07:46.272 256+0 records in 00:07:46.272 256+0 records out 00:07:46.272 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0314202 s, 33.4 MB/s 00:07:46.272 20:21:38 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:07:46.272 20:21:38 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:46.272 20:21:38 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:46.272 20:21:38 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:07:46.272 20:21:38 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:46.272 20:21:38 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:07:46.272 20:21:38 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:07:46.272 20:21:38 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:46.272 20:21:38 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:07:46.272 20:21:38 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:46.272 20:21:38 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:07:46.272 20:21:38 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:46.272 20:21:38 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:07:46.272 20:21:38 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:46.273 20:21:38 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 
00:07:46.273 20:21:38 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:46.273 20:21:38 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:07:46.273 20:21:38 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:46.273 20:21:38 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:46.531 20:21:38 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:46.531 20:21:38 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:46.531 20:21:38 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:46.531 20:21:38 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:46.531 20:21:38 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:46.531 20:21:38 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:46.531 20:21:38 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:46.531 20:21:38 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:46.531 20:21:38 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:46.531 20:21:38 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:07:46.789 20:21:39 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:46.789 20:21:39 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:46.789 20:21:39 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:46.789 20:21:39 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:46.789 20:21:39 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:46.789 20:21:39 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:46.789 20:21:39 event.app_repeat -- bdev/nbd_common.sh@41 
-- # break 00:07:46.789 20:21:39 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:46.789 20:21:39 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:46.789 20:21:39 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:46.789 20:21:39 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:47.047 20:21:39 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:47.047 20:21:39 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:47.047 20:21:39 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:47.047 20:21:39 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:47.047 20:21:39 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:07:47.047 20:21:39 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:47.047 20:21:39 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:07:47.047 20:21:39 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:07:47.047 20:21:39 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:07:47.047 20:21:39 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:07:47.047 20:21:39 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:07:47.047 20:21:39 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:07:47.047 20:21:39 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:07:47.305 20:21:39 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:07:47.564 [2024-07-15 20:21:39.876999] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:47.823 [2024-07-15 20:21:39.976772] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:47.823 [2024-07-15 20:21:39.976776] reactor.c: 941:reactor_run: 
*NOTICE*: Reactor started on core 0 00:07:47.823 [2024-07-15 20:21:40.030469] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:07:47.823 [2024-07-15 20:21:40.030517] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:07:50.350 20:21:42 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:07:50.350 20:21:42 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 2' 00:07:50.350 spdk_app_start Round 2 00:07:50.350 20:21:42 event.app_repeat -- event/event.sh@25 -- # waitforlisten 1316782 /var/tmp/spdk-nbd.sock 00:07:50.350 20:21:42 event.app_repeat -- common/autotest_common.sh@829 -- # '[' -z 1316782 ']' 00:07:50.350 20:21:42 event.app_repeat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:50.350 20:21:42 event.app_repeat -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:50.350 20:21:42 event.app_repeat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:50.350 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
00:07:50.350 20:21:42 event.app_repeat -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:50.350 20:21:42 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:50.609 20:21:42 event.app_repeat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:50.609 20:21:42 event.app_repeat -- common/autotest_common.sh@862 -- # return 0 00:07:50.609 20:21:42 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:50.952 Malloc0 00:07:50.952 20:21:43 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:51.209 Malloc1 00:07:51.209 20:21:43 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:51.209 20:21:43 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:51.209 20:21:43 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:51.209 20:21:43 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:07:51.209 20:21:43 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:51.209 20:21:43 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:07:51.209 20:21:43 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:51.209 20:21:43 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:51.209 20:21:43 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:51.209 20:21:43 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:07:51.209 20:21:43 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:51.209 20:21:43 event.app_repeat -- bdev/nbd_common.sh@11 -- 
# local nbd_list 00:07:51.209 20:21:43 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:07:51.209 20:21:43 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:07:51.209 20:21:43 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:51.209 20:21:43 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:07:51.467 /dev/nbd0 00:07:51.467 20:21:43 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:07:51.467 20:21:43 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:07:51.467 20:21:43 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:07:51.467 20:21:43 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:07:51.467 20:21:43 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:51.467 20:21:43 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:51.467 20:21:43 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:07:51.467 20:21:43 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:07:51.467 20:21:43 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:51.467 20:21:43 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:51.467 20:21:43 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:51.467 1+0 records in 00:07:51.467 1+0 records out 00:07:51.467 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000242375 s, 16.9 MB/s 00:07:51.467 20:21:43 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:51.467 20:21:43 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:07:51.467 20:21:43 event.app_repeat -- 
common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:51.467 20:21:43 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:51.467 20:21:43 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:07:51.467 20:21:43 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:51.467 20:21:43 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:51.467 20:21:43 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:07:51.724 /dev/nbd1 00:07:51.724 20:21:43 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:07:51.724 20:21:43 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:07:51.724 20:21:43 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:07:51.724 20:21:43 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:07:51.724 20:21:43 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:51.724 20:21:43 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:51.724 20:21:43 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:07:51.724 20:21:43 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:07:51.724 20:21:43 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:51.724 20:21:43 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:51.724 20:21:43 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:51.724 1+0 records in 00:07:51.724 1+0 records out 00:07:51.724 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000284506 s, 14.4 MB/s 00:07:51.724 20:21:43 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:51.724 20:21:43 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:07:51.724 20:21:43 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:51.724 20:21:43 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:51.724 20:21:43 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:07:51.724 20:21:43 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:51.724 20:21:43 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:51.724 20:21:43 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:51.724 20:21:43 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:51.724 20:21:43 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:51.981 20:21:44 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:07:51.981 { 00:07:51.981 "nbd_device": "/dev/nbd0", 00:07:51.981 "bdev_name": "Malloc0" 00:07:51.981 }, 00:07:51.981 { 00:07:51.981 "nbd_device": "/dev/nbd1", 00:07:51.981 "bdev_name": "Malloc1" 00:07:51.981 } 00:07:51.981 ]' 00:07:51.981 20:21:44 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:51.981 20:21:44 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:07:51.981 { 00:07:51.981 "nbd_device": "/dev/nbd0", 00:07:51.981 "bdev_name": "Malloc0" 00:07:51.981 }, 00:07:51.981 { 00:07:51.981 "nbd_device": "/dev/nbd1", 00:07:51.981 "bdev_name": "Malloc1" 00:07:51.981 } 00:07:51.981 ]' 00:07:51.981 20:21:44 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:07:51.981 /dev/nbd1' 00:07:51.981 20:21:44 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:07:51.981 /dev/nbd1' 00:07:51.981 
20:21:44 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:51.981 20:21:44 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:07:51.981 20:21:44 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:07:51.981 20:21:44 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:07:51.981 20:21:44 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:07:51.981 20:21:44 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:07:51.981 20:21:44 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:51.981 20:21:44 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:51.981 20:21:44 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:07:51.981 20:21:44 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:51.981 20:21:44 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:07:51.981 20:21:44 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:07:51.981 256+0 records in 00:07:51.981 256+0 records out 00:07:51.981 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0103673 s, 101 MB/s 00:07:51.981 20:21:44 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:51.981 20:21:44 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:07:51.981 256+0 records in 00:07:51.981 256+0 records out 00:07:51.981 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0334186 s, 31.4 MB/s 00:07:51.981 20:21:44 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:51.981 20:21:44 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd 
if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:07:51.981 256+0 records in 00:07:51.981 256+0 records out 00:07:51.981 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0316443 s, 33.1 MB/s 00:07:51.981 20:21:44 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:07:52.238 20:21:44 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:52.238 20:21:44 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:52.238 20:21:44 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:07:52.238 20:21:44 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:52.238 20:21:44 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:07:52.238 20:21:44 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:07:52.238 20:21:44 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:52.238 20:21:44 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:07:52.238 20:21:44 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:52.238 20:21:44 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:07:52.238 20:21:44 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:52.238 20:21:44 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:07:52.238 20:21:44 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:52.238 20:21:44 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 
00:07:52.238 20:21:44 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:52.238 20:21:44 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:07:52.238 20:21:44 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:52.238 20:21:44 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:52.496 20:21:44 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:52.496 20:21:44 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:52.496 20:21:44 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:52.496 20:21:44 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:52.496 20:21:44 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:52.496 20:21:44 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:52.496 20:21:44 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:52.496 20:21:44 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:52.496 20:21:44 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:52.496 20:21:44 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:07:52.752 20:21:44 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:52.752 20:21:44 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:52.752 20:21:44 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:52.752 20:21:44 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:52.752 20:21:44 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:52.752 20:21:44 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:52.752 20:21:44 event.app_repeat -- bdev/nbd_common.sh@41 
-- # break 00:07:52.752 20:21:44 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:52.752 20:21:44 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:52.752 20:21:44 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:52.752 20:21:44 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:53.009 20:21:45 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:53.009 20:21:45 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:53.009 20:21:45 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:53.009 20:21:45 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:53.009 20:21:45 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:07:53.009 20:21:45 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:53.009 20:21:45 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:07:53.009 20:21:45 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:07:53.009 20:21:45 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:07:53.009 20:21:45 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:07:53.009 20:21:45 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:07:53.009 20:21:45 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:07:53.009 20:21:45 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:07:53.267 20:21:45 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:07:53.526 [2024-07-15 20:21:45.775192] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:53.526 [2024-07-15 20:21:45.879939] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:53.526 [2024-07-15 20:21:45.879942] reactor.c: 941:reactor_run: 
*NOTICE*: Reactor started on core 0 00:07:53.784 [2024-07-15 20:21:45.925339] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:07:53.784 [2024-07-15 20:21:45.925390] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:07:56.312 20:21:48 event.app_repeat -- event/event.sh@38 -- # waitforlisten 1316782 /var/tmp/spdk-nbd.sock 00:07:56.312 20:21:48 event.app_repeat -- common/autotest_common.sh@829 -- # '[' -z 1316782 ']' 00:07:56.312 20:21:48 event.app_repeat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:56.312 20:21:48 event.app_repeat -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:56.312 20:21:48 event.app_repeat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:56.312 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
00:07:56.312 20:21:48 event.app_repeat -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:56.312 20:21:48 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:56.571 20:21:48 event.app_repeat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:56.571 20:21:48 event.app_repeat -- common/autotest_common.sh@862 -- # return 0 00:07:56.571 20:21:48 event.app_repeat -- event/event.sh@39 -- # killprocess 1316782 00:07:56.571 20:21:48 event.app_repeat -- common/autotest_common.sh@948 -- # '[' -z 1316782 ']' 00:07:56.571 20:21:48 event.app_repeat -- common/autotest_common.sh@952 -- # kill -0 1316782 00:07:56.571 20:21:48 event.app_repeat -- common/autotest_common.sh@953 -- # uname 00:07:56.571 20:21:48 event.app_repeat -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:07:56.571 20:21:48 event.app_repeat -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1316782 00:07:56.571 20:21:48 event.app_repeat -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:07:56.571 20:21:48 event.app_repeat -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:07:56.571 20:21:48 event.app_repeat -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1316782' 00:07:56.571 killing process with pid 1316782 00:07:56.571 20:21:48 event.app_repeat -- common/autotest_common.sh@967 -- # kill 1316782 00:07:56.571 20:21:48 event.app_repeat -- common/autotest_common.sh@972 -- # wait 1316782 00:07:56.829 spdk_app_start is called in Round 0. 00:07:56.829 Shutdown signal received, stop current app iteration 00:07:56.829 Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 reinitialization... 00:07:56.829 spdk_app_start is called in Round 1. 00:07:56.829 Shutdown signal received, stop current app iteration 00:07:56.829 Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 reinitialization... 00:07:56.829 spdk_app_start is called in Round 2. 
00:07:56.829 Shutdown signal received, stop current app iteration 00:07:56.829 Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 reinitialization... 00:07:56.829 spdk_app_start is called in Round 3. 00:07:56.829 Shutdown signal received, stop current app iteration 00:07:56.829 20:21:49 event.app_repeat -- event/event.sh@40 -- # trap - SIGINT SIGTERM EXIT 00:07:56.829 20:21:49 event.app_repeat -- event/event.sh@42 -- # return 0 00:07:56.829 00:07:56.829 real 0m18.315s 00:07:56.829 user 0m39.731s 00:07:56.829 sys 0m3.852s 00:07:56.829 20:21:49 event.app_repeat -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:56.829 20:21:49 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:56.829 ************************************ 00:07:56.830 END TEST app_repeat 00:07:56.830 ************************************ 00:07:56.830 20:21:49 event -- common/autotest_common.sh@1142 -- # return 0 00:07:56.830 20:21:49 event -- event/event.sh@54 -- # (( SPDK_TEST_CRYPTO == 0 )) 00:07:56.830 00:07:56.830 real 0m28.307s 00:07:56.830 user 0m56.718s 00:07:56.830 sys 0m5.285s 00:07:56.830 20:21:49 event -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:56.830 20:21:49 event -- common/autotest_common.sh@10 -- # set +x 00:07:56.830 ************************************ 00:07:56.830 END TEST event 00:07:56.830 ************************************ 00:07:56.830 20:21:49 -- common/autotest_common.sh@1142 -- # return 0 00:07:56.830 20:21:49 -- spdk/autotest.sh@182 -- # run_test thread /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/thread.sh 00:07:56.830 20:21:49 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:56.830 20:21:49 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:56.830 20:21:49 -- common/autotest_common.sh@10 -- # set +x 00:07:56.830 ************************************ 00:07:56.830 START TEST thread 00:07:56.830 ************************************ 00:07:56.830 20:21:49 thread -- common/autotest_common.sh@1123 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/thread.sh 00:07:57.089 * Looking for test storage... 00:07:57.089 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread 00:07:57.089 20:21:49 thread -- thread/thread.sh@11 -- # run_test thread_poller_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:07:57.089 20:21:49 thread -- common/autotest_common.sh@1099 -- # '[' 8 -le 1 ']' 00:07:57.089 20:21:49 thread -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:57.089 20:21:49 thread -- common/autotest_common.sh@10 -- # set +x 00:07:57.089 ************************************ 00:07:57.089 START TEST thread_poller_perf 00:07:57.089 ************************************ 00:07:57.089 20:21:49 thread.thread_poller_perf -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:07:57.089 [2024-07-15 20:21:49.379270] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 00:07:57.089 [2024-07-15 20:21:49.379336] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1319480 ] 00:07:57.348 [2024-07-15 20:21:49.510393] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:57.348 [2024-07-15 20:21:49.616155] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:57.348 Running 1000 pollers for 1 seconds with 1 microseconds period. 
00:07:58.721 ====================================== 00:07:58.721 busy:2311372856 (cyc) 00:07:58.721 total_run_count: 261000 00:07:58.721 tsc_hz: 2300000000 (cyc) 00:07:58.721 ====================================== 00:07:58.721 poller_cost: 8855 (cyc), 3850 (nsec) 00:07:58.721 00:07:58.721 real 0m1.369s 00:07:58.721 user 0m1.215s 00:07:58.721 sys 0m0.147s 00:07:58.721 20:21:50 thread.thread_poller_perf -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:58.721 20:21:50 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x 00:07:58.722 ************************************ 00:07:58.722 END TEST thread_poller_perf 00:07:58.722 ************************************ 00:07:58.722 20:21:50 thread -- common/autotest_common.sh@1142 -- # return 0 00:07:58.722 20:21:50 thread -- thread/thread.sh@12 -- # run_test thread_poller_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:07:58.722 20:21:50 thread -- common/autotest_common.sh@1099 -- # '[' 8 -le 1 ']' 00:07:58.722 20:21:50 thread -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:58.722 20:21:50 thread -- common/autotest_common.sh@10 -- # set +x 00:07:58.722 ************************************ 00:07:58.722 START TEST thread_poller_perf 00:07:58.722 ************************************ 00:07:58.722 20:21:50 thread.thread_poller_perf -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:07:58.722 [2024-07-15 20:21:50.831963] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
00:07:58.722 [2024-07-15 20:21:50.832039] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1319685 ] 00:07:58.722 [2024-07-15 20:21:50.960852] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:58.722 [2024-07-15 20:21:51.066460] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:58.722 Running 1000 pollers for 1 seconds with 0 microseconds period. 00:08:00.097 ====================================== 00:08:00.097 busy:2302617162 (cyc) 00:08:00.097 total_run_count: 3495000 00:08:00.097 tsc_hz: 2300000000 (cyc) 00:08:00.097 ====================================== 00:08:00.097 poller_cost: 658 (cyc), 286 (nsec) 00:08:00.097 00:08:00.097 real 0m1.353s 00:08:00.097 user 0m1.196s 00:08:00.097 sys 0m0.150s 00:08:00.097 20:21:52 thread.thread_poller_perf -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:00.097 20:21:52 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x 00:08:00.097 ************************************ 00:08:00.097 END TEST thread_poller_perf 00:08:00.097 ************************************ 00:08:00.097 20:21:52 thread -- common/autotest_common.sh@1142 -- # return 0 00:08:00.097 20:21:52 thread -- thread/thread.sh@17 -- # [[ y != \y ]] 00:08:00.097 00:08:00.097 real 0m2.994s 00:08:00.097 user 0m2.514s 00:08:00.097 sys 0m0.490s 00:08:00.098 20:21:52 thread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:00.098 20:21:52 thread -- common/autotest_common.sh@10 -- # set +x 00:08:00.098 ************************************ 00:08:00.098 END TEST thread 00:08:00.098 ************************************ 00:08:00.098 20:21:52 -- common/autotest_common.sh@1142 -- # return 0 00:08:00.098 20:21:52 -- spdk/autotest.sh@183 -- # run_test accel 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel.sh 00:08:00.098 20:21:52 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:00.098 20:21:52 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:00.098 20:21:52 -- common/autotest_common.sh@10 -- # set +x 00:08:00.098 ************************************ 00:08:00.098 START TEST accel 00:08:00.098 ************************************ 00:08:00.098 20:21:52 accel -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel.sh 00:08:00.098 * Looking for test storage... 00:08:00.098 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel 00:08:00.098 20:21:52 accel -- accel/accel.sh@81 -- # declare -A expected_opcs 00:08:00.098 20:21:52 accel -- accel/accel.sh@82 -- # get_expected_opcs 00:08:00.098 20:21:52 accel -- accel/accel.sh@60 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:08:00.098 20:21:52 accel -- accel/accel.sh@62 -- # spdk_tgt_pid=1319927 00:08:00.098 20:21:52 accel -- accel/accel.sh@63 -- # waitforlisten 1319927 00:08:00.098 20:21:52 accel -- common/autotest_common.sh@829 -- # '[' -z 1319927 ']' 00:08:00.098 20:21:52 accel -- accel/accel.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:08:00.098 20:21:52 accel -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:00.098 20:21:52 accel -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:00.098 20:21:52 accel -- accel/accel.sh@61 -- # build_accel_config 00:08:00.098 20:21:52 accel -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:00.098 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:08:00.098 20:21:52 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:00.098 20:21:52 accel -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:00.098 20:21:52 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:00.098 20:21:52 accel -- common/autotest_common.sh@10 -- # set +x 00:08:00.098 20:21:52 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:00.098 20:21:52 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:00.098 20:21:52 accel -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:00.098 20:21:52 accel -- accel/accel.sh@40 -- # local IFS=, 00:08:00.098 20:21:52 accel -- accel/accel.sh@41 -- # jq -r . 00:08:00.098 [2024-07-15 20:21:52.456650] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 00:08:00.098 [2024-07-15 20:21:52.456712] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1319927 ] 00:08:00.356 [2024-07-15 20:21:52.561078] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:00.356 [2024-07-15 20:21:52.661879] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:00.923 20:21:53 accel -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:00.923 20:21:53 accel -- common/autotest_common.sh@862 -- # return 0 00:08:00.923 20:21:53 accel -- accel/accel.sh@65 -- # [[ 0 -gt 0 ]] 00:08:00.923 20:21:53 accel -- accel/accel.sh@66 -- # [[ 0 -gt 0 ]] 00:08:00.923 20:21:53 accel -- accel/accel.sh@67 -- # [[ 0 -gt 0 ]] 00:08:00.923 20:21:53 accel -- accel/accel.sh@68 -- # [[ -n '' ]] 00:08:00.923 20:21:53 accel -- accel/accel.sh@70 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". | to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:08:00.923 20:21:53 accel -- accel/accel.sh@70 -- # rpc_cmd accel_get_opc_assignments 00:08:00.923 20:21:53 accel -- accel/accel.sh@70 -- # jq -r '. 
| to_entries | map("\(.key)=\(.value)") | .[]' 00:08:00.923 20:21:53 accel -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:00.923 20:21:53 accel -- common/autotest_common.sh@10 -- # set +x 00:08:00.923 20:21:53 accel -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:01.181 20:21:53 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:01.181 20:21:53 accel -- accel/accel.sh@72 -- # IFS== 00:08:01.181 20:21:53 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:01.181 20:21:53 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:01.181 20:21:53 accel -- accel/accel.sh@75 -- # killprocess 1319927 00:08:01.181 20:21:53 accel -- common/autotest_common.sh@948 -- # '[' -z 1319927 ']' 00:08:01.181 20:21:53 accel -- common/autotest_common.sh@952 -- # kill -0 1319927 00:08:01.181 20:21:53 accel -- common/autotest_common.sh@953 -- # uname 00:08:01.181 20:21:53 accel -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:08:01.181 20:21:53 accel -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1319927 00:08:01.181 20:21:53 accel -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:08:01.181 20:21:53 accel -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:08:01.181 20:21:53 accel -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1319927' 00:08:01.181 killing process with pid 1319927 00:08:01.181 20:21:53 accel -- common/autotest_common.sh@967 -- # kill 1319927 00:08:01.181 20:21:53 accel -- common/autotest_common.sh@972 -- # wait 1319927 00:08:01.440 20:21:53 accel -- accel/accel.sh@76 -- # trap - ERR 00:08:01.440 20:21:53 accel -- accel/accel.sh@89 -- # run_test accel_help accel_perf -h 00:08:01.440
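The opcode-assignment loop traced above consumes RPC output that jq has flattened into `key=value` lines, then splits each line on `=`. A minimal standalone sketch of that parsing, using the same jq filter and read loop shown in the trace but with a hypothetical two-opcode JSON document standing in for the live accel_get_opc_assignments RPC:

```shell
#!/usr/bin/env bash
# Flatten a JSON map into key=value lines with the jq filter from the
# trace (accel.sh@70), then split each line on '=' the way the traced
# "IFS== read -r opc module" loop does. The JSON document here is a
# stand-in for real accel_get_opc_assignments output.
json='{"copy":"software","crc32c":"software"}'

declare -A expected_opcs
exp_opcs=($(printf '%s\n' "$json" | jq -r '. | to_entries | map("\(.key)=\(.value)") | .[]'))
for opc_opt in "${exp_opcs[@]}"; do
    IFS== read -r opc module <<< "$opc_opt"  # "copy=software" -> opc=copy, module=software
    expected_opcs["$opc"]=$module
done

echo "copy=${expected_opcs[copy]} crc32c=${expected_opcs[crc32c]}"
```

Setting `IFS==` only for the `read` keeps the rest of the script's word splitting untouched, which is why the trace shows the assignment and the `read` as a single compound step.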
20:21:53 accel -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:08:01.440 20:21:53 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:01.440 20:21:53 accel -- common/autotest_common.sh@10 -- # set +x 00:08:01.440 20:21:53 accel.accel_help -- common/autotest_common.sh@1123 -- # accel_perf -h 00:08:01.440 20:21:53 accel.accel_help -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -h 00:08:01.440 20:21:53 accel.accel_help -- accel/accel.sh@12 -- # build_accel_config 00:08:01.440 20:21:53 accel.accel_help -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:01.440 20:21:53 accel.accel_help -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:01.440 20:21:53 accel.accel_help -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:01.440 20:21:53 accel.accel_help -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:01.440 20:21:53 accel.accel_help -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:01.440 20:21:53 accel.accel_help -- accel/accel.sh@40 -- # local IFS=, 00:08:01.440 20:21:53 accel.accel_help -- accel/accel.sh@41 -- # jq -r . 
00:08:01.440 20:21:53 accel.accel_help -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:01.698 20:21:53 accel.accel_help -- common/autotest_common.sh@10 -- # set +x 00:08:01.698 20:21:53 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:01.698 20:21:53 accel -- accel/accel.sh@91 -- # run_test accel_missing_filename NOT accel_perf -t 1 -w compress 00:08:01.698 20:21:53 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:08:01.698 20:21:53 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:01.698 20:21:53 accel -- common/autotest_common.sh@10 -- # set +x 00:08:01.698 ************************************ 00:08:01.698 START TEST accel_missing_filename 00:08:01.699 ************************************ 00:08:01.699 20:21:53 accel.accel_missing_filename -- common/autotest_common.sh@1123 -- # NOT accel_perf -t 1 -w compress 00:08:01.699 20:21:53 accel.accel_missing_filename -- common/autotest_common.sh@648 -- # local es=0 00:08:01.699 20:21:53 accel.accel_missing_filename -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w compress 00:08:01.699 20:21:53 accel.accel_missing_filename -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:08:01.699 20:21:53 accel.accel_missing_filename -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:08:01.699 20:21:53 accel.accel_missing_filename -- common/autotest_common.sh@640 -- # type -t accel_perf 00:08:01.699 20:21:53 accel.accel_missing_filename -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:08:01.699 20:21:53 accel.accel_missing_filename -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w compress 00:08:01.699 20:21:53 accel.accel_missing_filename -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress 00:08:01.699 20:21:53 accel.accel_missing_filename -- accel/accel.sh@12 -- # build_accel_config 00:08:01.699 20:21:53 
accel.accel_missing_filename -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:01.699 20:21:53 accel.accel_missing_filename -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:01.699 20:21:53 accel.accel_missing_filename -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:01.699 20:21:53 accel.accel_missing_filename -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:01.699 20:21:53 accel.accel_missing_filename -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:01.699 20:21:53 accel.accel_missing_filename -- accel/accel.sh@40 -- # local IFS=, 00:08:01.699 20:21:53 accel.accel_missing_filename -- accel/accel.sh@41 -- # jq -r . 00:08:01.699 [2024-07-15 20:21:53.931994] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 00:08:01.699 [2024-07-15 20:21:53.932060] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1320221 ] 00:08:01.699 [2024-07-15 20:21:54.065309] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:01.957 [2024-07-15 20:21:54.171366] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:01.957 [2024-07-15 20:21:54.239921] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:08:01.957 [2024-07-15 20:21:54.313570] accel_perf.c:1463:main: *ERROR*: ERROR starting application 00:08:02.216 A filename is required. 
00:08:02.216 20:21:54 accel.accel_missing_filename -- common/autotest_common.sh@651 -- # es=234 00:08:02.216 20:21:54 accel.accel_missing_filename -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:08:02.216 20:21:54 accel.accel_missing_filename -- common/autotest_common.sh@660 -- # es=106 00:08:02.216 20:21:54 accel.accel_missing_filename -- common/autotest_common.sh@661 -- # case "$es" in 00:08:02.216 20:21:54 accel.accel_missing_filename -- common/autotest_common.sh@668 -- # es=1 00:08:02.216 20:21:54 accel.accel_missing_filename -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:08:02.216 00:08:02.216 real 0m0.514s 00:08:02.216 user 0m0.347s 00:08:02.216 sys 0m0.196s 00:08:02.216 20:21:54 accel.accel_missing_filename -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:02.216 20:21:54 accel.accel_missing_filename -- common/autotest_common.sh@10 -- # set +x 00:08:02.216 ************************************ 00:08:02.216 END TEST accel_missing_filename 00:08:02.216 ************************************ 00:08:02.216 20:21:54 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:02.216 20:21:54 accel -- accel/accel.sh@93 -- # run_test accel_compress_verify NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:02.216 20:21:54 accel -- common/autotest_common.sh@1099 -- # '[' 10 -le 1 ']' 00:08:02.216 20:21:54 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:02.216 20:21:54 accel -- common/autotest_common.sh@10 -- # set +x 00:08:02.216 ************************************ 00:08:02.216 START TEST accel_compress_verify 00:08:02.216 ************************************ 00:08:02.216 20:21:54 accel.accel_compress_verify -- common/autotest_common.sh@1123 -- # NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:02.216 20:21:54 accel.accel_compress_verify -- common/autotest_common.sh@648 -- # local es=0 00:08:02.216 20:21:54 
accel.accel_compress_verify -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:02.216 20:21:54 accel.accel_compress_verify -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:08:02.216 20:21:54 accel.accel_compress_verify -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:08:02.216 20:21:54 accel.accel_compress_verify -- common/autotest_common.sh@640 -- # type -t accel_perf 00:08:02.216 20:21:54 accel.accel_compress_verify -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:08:02.216 20:21:54 accel.accel_compress_verify -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:02.216 20:21:54 accel.accel_compress_verify -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:02.216 20:21:54 accel.accel_compress_verify -- accel/accel.sh@12 -- # build_accel_config 00:08:02.216 20:21:54 accel.accel_compress_verify -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:02.216 20:21:54 accel.accel_compress_verify -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:02.216 20:21:54 accel.accel_compress_verify -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:02.216 20:21:54 accel.accel_compress_verify -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:02.216 20:21:54 accel.accel_compress_verify -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:02.216 20:21:54 accel.accel_compress_verify -- accel/accel.sh@40 -- # local IFS=, 00:08:02.216 20:21:54 accel.accel_compress_verify -- accel/accel.sh@41 -- # jq -r . 00:08:02.216 [2024-07-15 20:21:54.530723] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
00:08:02.216 [2024-07-15 20:21:54.530791] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1320319 ] 00:08:02.475 [2024-07-15 20:21:54.662314] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:02.475 [2024-07-15 20:21:54.765644] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:02.475 [2024-07-15 20:21:54.825682] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:08:02.735 [2024-07-15 20:21:54.890102] accel_perf.c:1463:main: *ERROR*: ERROR starting application 00:08:02.735 00:08:02.735 Compression does not support the verify option, aborting. 00:08:02.735 20:21:54 accel.accel_compress_verify -- common/autotest_common.sh@651 -- # es=161 00:08:02.735 20:21:54 accel.accel_compress_verify -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:08:02.735 20:21:54 accel.accel_compress_verify -- common/autotest_common.sh@660 -- # es=33 00:08:02.735 20:21:54 accel.accel_compress_verify -- common/autotest_common.sh@661 -- # case "$es" in 00:08:02.735 20:21:54 accel.accel_compress_verify -- common/autotest_common.sh@668 -- # es=1 00:08:02.735 20:21:54 accel.accel_compress_verify -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:08:02.735 00:08:02.735 real 0m0.491s 00:08:02.735 user 0m0.329s 00:08:02.735 sys 0m0.191s 00:08:02.735 20:21:54 accel.accel_compress_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:02.735 20:21:54 accel.accel_compress_verify -- common/autotest_common.sh@10 -- # set +x 00:08:02.735 ************************************ 00:08:02.735 END TEST accel_compress_verify 00:08:02.735 ************************************ 00:08:02.735 20:21:55 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:02.735 20:21:55 accel -- accel/accel.sh@95 -- # run_test accel_wrong_workload NOT 
accel_perf -t 1 -w foobar 00:08:02.735 20:21:55 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:08:02.735 20:21:55 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:02.735 20:21:55 accel -- common/autotest_common.sh@10 -- # set +x 00:08:02.735 ************************************ 00:08:02.735 START TEST accel_wrong_workload 00:08:02.735 ************************************ 00:08:02.735 20:21:55 accel.accel_wrong_workload -- common/autotest_common.sh@1123 -- # NOT accel_perf -t 1 -w foobar 00:08:02.736 20:21:55 accel.accel_wrong_workload -- common/autotest_common.sh@648 -- # local es=0 00:08:02.736 20:21:55 accel.accel_wrong_workload -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w foobar 00:08:02.736 20:21:55 accel.accel_wrong_workload -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:08:02.736 20:21:55 accel.accel_wrong_workload -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:08:02.736 20:21:55 accel.accel_wrong_workload -- common/autotest_common.sh@640 -- # type -t accel_perf 00:08:02.736 20:21:55 accel.accel_wrong_workload -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:08:02.736 20:21:55 accel.accel_wrong_workload -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w foobar 00:08:02.736 20:21:55 accel.accel_wrong_workload -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w foobar 00:08:02.736 20:21:55 accel.accel_wrong_workload -- accel/accel.sh@12 -- # build_accel_config 00:08:02.736 20:21:55 accel.accel_wrong_workload -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:02.736 20:21:55 accel.accel_wrong_workload -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:02.736 20:21:55 accel.accel_wrong_workload -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:02.736 20:21:55 accel.accel_wrong_workload -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:02.736 20:21:55 accel.accel_wrong_workload -- 
accel/accel.sh@36 -- # [[ -n '' ]] 00:08:02.736 20:21:55 accel.accel_wrong_workload -- accel/accel.sh@40 -- # local IFS=, 00:08:02.736 20:21:55 accel.accel_wrong_workload -- accel/accel.sh@41 -- # jq -r . 00:08:02.736 Unsupported workload type: foobar 00:08:02.736 [2024-07-15 20:21:55.101342] app.c:1451:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'w' failed: 1 00:08:02.736 accel_perf options: 00:08:02.736 [-h help message] 00:08:02.736 [-q queue depth per core] 00:08:02.736 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:08:02.736 [-T number of threads per core 00:08:02.736 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:08:02.736 [-t time in seconds] 00:08:02.736 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:08:02.736 [ dif_verify, dif_verify_copy, dif_generate, dif_generate_copy 00:08:02.736 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:08:02.736 [-l for compress/decompress workloads, name of uncompressed input file 00:08:02.736 [-S for crc32c workload, use this seed value (default 0) 00:08:02.736 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:08:02.736 [-f for fill workload, use this BYTE value (default 255) 00:08:02.736 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:08:02.736 [-y verify result if this switch is on] 00:08:02.736 [-a tasks to allocate per core (default: same value as -q)] 00:08:02.736 Can be used to spread operations across a wider range of memory. 
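The `es=` trace lines that appear around the NOT-wrapped tests in this log implement an exit-status normalization: a status above 128 (death by signal) has the 128 bias removed, the recognized code is collapsed to a plain failure, and NOT succeeds only when the wrapped command failed. A minimal sketch of that chain; only the observed 234 -> 106 -> 1 values are taken from the log, and the case arm is an assumption about how autotest_common.sh maps statuses:

```shell
# Sketch of the NOT helper's status normalization as traced in this log.
# The 234 -> 106 -> 1 chain is copied from the accel_missing_filename
# trace; the case arm is an assumption, not the script's full mapping.
es=234                                # raw status from the wrapped accel_perf run
(( es > 128 )) && es=$(( es - 128 ))  # 234 -> 106: strip the killed-by-signal bias
case "$es" in
    106) es=1 ;;                      # collapse the known code to a plain failure
esac
if (( !es == 0 )); then               # NOT exits 0 (success) because es != 0
    echo "NOT: wrapped command failed as expected (es=$es)"
fi
```

This is why a test like accel_wrong_workload passes even though accel_perf itself errors out: the non-zero status is exactly what NOT requires.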
00:08:02.736 20:21:55 accel.accel_wrong_workload -- common/autotest_common.sh@651 -- # es=1 00:08:02.736 20:21:55 accel.accel_wrong_workload -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:08:02.736 20:21:55 accel.accel_wrong_workload -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:08:02.736 20:21:55 accel.accel_wrong_workload -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:08:02.736 00:08:02.736 real 0m0.043s 00:08:02.736 user 0m0.021s 00:08:02.736 sys 0m0.022s 00:08:02.736 20:21:55 accel.accel_wrong_workload -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:02.736 20:21:55 accel.accel_wrong_workload -- common/autotest_common.sh@10 -- # set +x 00:08:02.736 ************************************ 00:08:02.736 END TEST accel_wrong_workload 00:08:02.736 ************************************ 00:08:02.736 Error: writing output failed: Broken pipe 00:08:02.995 20:21:55 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:02.995 20:21:55 accel -- accel/accel.sh@97 -- # run_test accel_negative_buffers NOT accel_perf -t 1 -w xor -y -x -1 00:08:02.995 20:21:55 accel -- common/autotest_common.sh@1099 -- # '[' 10 -le 1 ']' 00:08:02.995 20:21:55 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:02.995 20:21:55 accel -- common/autotest_common.sh@10 -- # set +x 00:08:02.995 ************************************ 00:08:02.995 START TEST accel_negative_buffers 00:08:02.995 ************************************ 00:08:02.995 20:21:55 accel.accel_negative_buffers -- common/autotest_common.sh@1123 -- # NOT accel_perf -t 1 -w xor -y -x -1 00:08:02.995 20:21:55 accel.accel_negative_buffers -- common/autotest_common.sh@648 -- # local es=0 00:08:02.995 20:21:55 accel.accel_negative_buffers -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w xor -y -x -1 00:08:02.995 20:21:55 accel.accel_negative_buffers -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:08:02.995 20:21:55 accel.accel_negative_buffers -- 
common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:08:02.995 20:21:55 accel.accel_negative_buffers -- common/autotest_common.sh@640 -- # type -t accel_perf 00:08:02.995 20:21:55 accel.accel_negative_buffers -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:08:02.995 20:21:55 accel.accel_negative_buffers -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w xor -y -x -1 00:08:02.995 20:21:55 accel.accel_negative_buffers -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x -1 00:08:02.995 20:21:55 accel.accel_negative_buffers -- accel/accel.sh@12 -- # build_accel_config 00:08:02.995 20:21:55 accel.accel_negative_buffers -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:02.995 20:21:55 accel.accel_negative_buffers -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:02.995 20:21:55 accel.accel_negative_buffers -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:02.995 20:21:55 accel.accel_negative_buffers -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:02.995 20:21:55 accel.accel_negative_buffers -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:02.995 20:21:55 accel.accel_negative_buffers -- accel/accel.sh@40 -- # local IFS=, 00:08:02.995 20:21:55 accel.accel_negative_buffers -- accel/accel.sh@41 -- # jq -r . 00:08:02.995 -x option must be non-negative. 00:08:02.995 [2024-07-15 20:21:55.223247] app.c:1451:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'x' failed: 1 00:08:02.995 accel_perf options: 00:08:02.995 [-h help message] 00:08:02.995 [-q queue depth per core] 00:08:02.995 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:08:02.995 [-T number of threads per core 00:08:02.995 [-o transfer size in bytes (default: 4KiB. 
For compress/decompress, 0 means the input file size)] 00:08:02.995 [-t time in seconds] 00:08:02.995 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:08:02.995 [ dif_verify, dif_verify_copy, dif_generate, dif_generate_copy 00:08:02.995 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:08:02.995 [-l for compress/decompress workloads, name of uncompressed input file 00:08:02.995 [-S for crc32c workload, use this seed value (default 0) 00:08:02.995 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:08:02.995 [-f for fill workload, use this BYTE value (default 255) 00:08:02.995 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:08:02.995 [-y verify result if this switch is on] 00:08:02.995 [-a tasks to allocate per core (default: same value as -q)] 00:08:02.995 Can be used to spread operations across a wider range of memory. 
00:08:02.995 20:21:55 accel.accel_negative_buffers -- common/autotest_common.sh@651 -- # es=1 00:08:02.995 20:21:55 accel.accel_negative_buffers -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:08:02.995 20:21:55 accel.accel_negative_buffers -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:08:02.995 20:21:55 accel.accel_negative_buffers -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:08:02.995 00:08:02.995 real 0m0.040s 00:08:02.995 user 0m0.023s 00:08:02.995 sys 0m0.017s 00:08:02.995 20:21:55 accel.accel_negative_buffers -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:02.995 20:21:55 accel.accel_negative_buffers -- common/autotest_common.sh@10 -- # set +x 00:08:02.995 ************************************ 00:08:02.995 END TEST accel_negative_buffers 00:08:02.995 ************************************ 00:08:02.995 Error: writing output failed: Broken pipe 00:08:02.996 20:21:55 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:02.996 20:21:55 accel -- accel/accel.sh@101 -- # run_test accel_crc32c accel_test -t 1 -w crc32c -S 32 -y 00:08:02.996 20:21:55 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:08:02.996 20:21:55 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:02.996 20:21:55 accel -- common/autotest_common.sh@10 -- # set +x 00:08:02.996 ************************************ 00:08:02.996 START TEST accel_crc32c 00:08:02.996 ************************************ 00:08:02.996 20:21:55 accel.accel_crc32c -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w crc32c -S 32 -y 00:08:02.996 20:21:55 accel.accel_crc32c -- accel/accel.sh@16 -- # local accel_opc 00:08:02.996 20:21:55 accel.accel_crc32c -- accel/accel.sh@17 -- # local accel_module 00:08:02.996 20:21:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:02.996 20:21:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:02.996 20:21:55 accel.accel_crc32c -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -S 32 -y 
00:08:02.996 20:21:55 accel.accel_crc32c -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -S 32 -y 00:08:02.996 20:21:55 accel.accel_crc32c -- accel/accel.sh@12 -- # build_accel_config 00:08:02.996 20:21:55 accel.accel_crc32c -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:02.996 20:21:55 accel.accel_crc32c -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:02.996 20:21:55 accel.accel_crc32c -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:02.996 20:21:55 accel.accel_crc32c -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:02.996 20:21:55 accel.accel_crc32c -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:02.996 20:21:55 accel.accel_crc32c -- accel/accel.sh@40 -- # local IFS=, 00:08:02.996 20:21:55 accel.accel_crc32c -- accel/accel.sh@41 -- # jq -r . 00:08:02.996 [2024-07-15 20:21:55.343410] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 00:08:02.996 [2024-07-15 20:21:55.343476] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1320513 ] 00:08:03.255 [2024-07-15 20:21:55.473457] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:03.255 [2024-07-15 20:21:55.579551] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:03.514 20:21:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:08:03.514 20:21:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:03.514 20:21:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:03.514 20:21:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:03.514 20:21:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:08:03.514 20:21:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:03.514 20:21:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 
00:08:03.514 20:21:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:03.514 20:21:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val=0x1 00:08:03.514 20:21:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:03.514 20:21:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:03.514 20:21:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:03.514 20:21:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:08:03.514 20:21:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:03.514 20:21:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:03.514 20:21:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:03.514 20:21:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:08:03.514 20:21:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:03.514 20:21:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:03.514 20:21:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:03.514 20:21:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val=crc32c 00:08:03.514 20:21:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:03.514 20:21:55 accel.accel_crc32c -- accel/accel.sh@23 -- # accel_opc=crc32c 00:08:03.514 20:21:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:03.514 20:21:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:03.514 20:21:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:08:03.514 20:21:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:03.514 20:21:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:03.514 20:21:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:03.514 20:21:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:03.514 20:21:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:03.514 20:21:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:03.514 20:21:55 
accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:03.514 20:21:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:08:03.514 20:21:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:03.514 20:21:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:03.514 20:21:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:03.514 20:21:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val=software 00:08:03.514 20:21:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:03.514 20:21:55 accel.accel_crc32c -- accel/accel.sh@22 -- # accel_module=software 00:08:03.514 20:21:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:03.514 20:21:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:03.514 20:21:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:08:03.514 20:21:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:03.514 20:21:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:03.514 20:21:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:03.514 20:21:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:08:03.514 20:21:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:03.514 20:21:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:03.514 20:21:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:03.514 20:21:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val=1 00:08:03.514 20:21:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:03.514 20:21:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:03.514 20:21:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:03.514 20:21:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val='1 seconds' 00:08:03.514 20:21:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:03.514 20:21:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:03.514 20:21:55 accel.accel_crc32c -- 
accel/accel.sh@19 -- # read -r var val 00:08:03.514 20:21:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val=Yes 00:08:03.514 20:21:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:03.514 20:21:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:03.514 20:21:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:03.514 20:21:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:08:03.514 20:21:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:03.514 20:21:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:03.515 20:21:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:03.515 20:21:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:08:03.515 20:21:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:03.515 20:21:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:03.515 20:21:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:04.448 20:21:56 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:08:04.448 20:21:56 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:04.448 20:21:56 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:04.448 20:21:56 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:04.448 20:21:56 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:08:04.448 20:21:56 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:04.448 20:21:56 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:04.448 20:21:56 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:04.448 20:21:56 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:08:04.448 20:21:56 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:04.448 20:21:56 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:04.448 20:21:56 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:04.448 20:21:56 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:08:04.448 20:21:56 
accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:04.448 20:21:56 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:04.448 20:21:56 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:04.708 20:21:56 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:08:04.708 20:21:56 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:04.708 20:21:56 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:04.708 20:21:56 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:04.708 20:21:56 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:08:04.708 20:21:56 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:04.708 20:21:56 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:04.708 20:21:56 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:04.708 20:21:56 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:04.708 20:21:56 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ -n crc32c ]] 00:08:04.708 20:21:56 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:04.708 00:08:04.708 real 0m1.521s 00:08:04.708 user 0m1.331s 00:08:04.708 sys 0m0.191s 00:08:04.708 20:21:56 accel.accel_crc32c -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:04.708 20:21:56 accel.accel_crc32c -- common/autotest_common.sh@10 -- # set +x 00:08:04.708 ************************************ 00:08:04.708 END TEST accel_crc32c 00:08:04.708 ************************************ 00:08:04.708 20:21:56 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:04.708 20:21:56 accel -- accel/accel.sh@102 -- # run_test accel_crc32c_C2 accel_test -t 1 -w crc32c -y -C 2 00:08:04.708 20:21:56 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:08:04.708 20:21:56 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:04.708 20:21:56 accel -- common/autotest_common.sh@10 -- # set +x 00:08:04.708 ************************************ 
00:08:04.708 START TEST accel_crc32c_C2 00:08:04.708 ************************************ 00:08:04.708 20:21:56 accel.accel_crc32c_C2 -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w crc32c -y -C 2 00:08:04.708 20:21:56 accel.accel_crc32c_C2 -- accel/accel.sh@16 -- # local accel_opc 00:08:04.708 20:21:56 accel.accel_crc32c_C2 -- accel/accel.sh@17 -- # local accel_module 00:08:04.708 20:21:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:04.708 20:21:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:04.708 20:21:56 accel.accel_crc32c_C2 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -y -C 2 00:08:04.708 20:21:56 accel.accel_crc32c_C2 -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -y -C 2 00:08:04.708 20:21:56 accel.accel_crc32c_C2 -- accel/accel.sh@12 -- # build_accel_config 00:08:04.708 20:21:56 accel.accel_crc32c_C2 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:04.708 20:21:56 accel.accel_crc32c_C2 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:04.708 20:21:56 accel.accel_crc32c_C2 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:04.708 20:21:56 accel.accel_crc32c_C2 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:04.708 20:21:56 accel.accel_crc32c_C2 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:04.708 20:21:56 accel.accel_crc32c_C2 -- accel/accel.sh@40 -- # local IFS=, 00:08:04.708 20:21:56 accel.accel_crc32c_C2 -- accel/accel.sh@41 -- # jq -r . 00:08:04.708 [2024-07-15 20:21:56.953082] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
00:08:04.708 [2024-07-15 20:21:56.953145] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1320748 ] 00:08:04.708 [2024-07-15 20:21:57.081687] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:04.967 [2024-07-15 20:21:57.183083] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:04.967 20:21:57 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:04.967 20:21:57 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:04.967 20:21:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:04.967 20:21:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:04.967 20:21:57 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:04.967 20:21:57 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:04.967 20:21:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:04.967 20:21:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:04.967 20:21:57 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=0x1 00:08:04.967 20:21:57 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:04.967 20:21:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:04.967 20:21:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:04.967 20:21:57 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:04.967 20:21:57 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:04.967 20:21:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:04.967 20:21:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:04.967 20:21:57 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:04.967 20:21:57 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:04.967 20:21:57 
accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:04.967 20:21:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:04.967 20:21:57 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=crc32c 00:08:04.967 20:21:57 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:04.967 20:21:57 accel.accel_crc32c_C2 -- accel/accel.sh@23 -- # accel_opc=crc32c 00:08:04.967 20:21:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:04.967 20:21:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:04.967 20:21:57 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=0 00:08:04.967 20:21:57 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:04.967 20:21:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:04.967 20:21:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:04.967 20:21:57 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:04.967 20:21:57 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:04.967 20:21:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:04.967 20:21:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:04.967 20:21:57 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:04.967 20:21:57 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:04.967 20:21:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:04.967 20:21:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:04.967 20:21:57 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=software 00:08:04.967 20:21:57 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:04.967 20:21:57 accel.accel_crc32c_C2 -- accel/accel.sh@22 -- # accel_module=software 00:08:04.967 20:21:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:04.967 20:21:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:04.967 20:21:57 accel.accel_crc32c_C2 
-- accel/accel.sh@20 -- # val=32 00:08:04.967 20:21:57 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:04.967 20:21:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:04.967 20:21:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:04.967 20:21:57 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:08:04.967 20:21:57 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:04.967 20:21:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:04.967 20:21:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:04.967 20:21:57 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=1 00:08:04.967 20:21:57 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:04.967 20:21:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:04.967 20:21:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:04.967 20:21:57 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val='1 seconds' 00:08:04.967 20:21:57 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:04.967 20:21:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:04.967 20:21:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:04.967 20:21:57 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=Yes 00:08:04.967 20:21:57 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:04.967 20:21:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:04.967 20:21:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:04.967 20:21:57 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:04.967 20:21:57 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:04.967 20:21:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:04.967 20:21:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:04.967 20:21:57 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:04.967 
20:21:57 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:04.967 20:21:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:04.967 20:21:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:06.389 20:21:58 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:06.389 20:21:58 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:06.389 20:21:58 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:06.389 20:21:58 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:06.389 20:21:58 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:06.389 20:21:58 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:06.389 20:21:58 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:06.389 20:21:58 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:06.389 20:21:58 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:06.389 20:21:58 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:06.389 20:21:58 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:06.389 20:21:58 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:06.389 20:21:58 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:06.389 20:21:58 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:06.389 20:21:58 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:06.389 20:21:58 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:06.389 20:21:58 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:06.389 20:21:58 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:06.389 20:21:58 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:06.389 20:21:58 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:06.389 20:21:58 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:06.389 20:21:58 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case 
"$var" in 00:08:06.389 20:21:58 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:06.389 20:21:58 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:06.389 20:21:58 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:06.389 20:21:58 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n crc32c ]] 00:08:06.389 20:21:58 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:06.389 00:08:06.389 real 0m1.505s 00:08:06.389 user 0m1.315s 00:08:06.389 sys 0m0.194s 00:08:06.389 20:21:58 accel.accel_crc32c_C2 -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:06.389 20:21:58 accel.accel_crc32c_C2 -- common/autotest_common.sh@10 -- # set +x 00:08:06.389 ************************************ 00:08:06.389 END TEST accel_crc32c_C2 00:08:06.389 ************************************ 00:08:06.389 20:21:58 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:06.389 20:21:58 accel -- accel/accel.sh@103 -- # run_test accel_copy accel_test -t 1 -w copy -y 00:08:06.389 20:21:58 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:08:06.389 20:21:58 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:06.389 20:21:58 accel -- common/autotest_common.sh@10 -- # set +x 00:08:06.389 ************************************ 00:08:06.389 START TEST accel_copy 00:08:06.389 ************************************ 00:08:06.389 20:21:58 accel.accel_copy -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w copy -y 00:08:06.389 20:21:58 accel.accel_copy -- accel/accel.sh@16 -- # local accel_opc 00:08:06.389 20:21:58 accel.accel_copy -- accel/accel.sh@17 -- # local accel_module 00:08:06.389 20:21:58 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:06.389 20:21:58 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:06.389 20:21:58 accel.accel_copy -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy -y 00:08:06.389 20:21:58 accel.accel_copy -- accel/accel.sh@12 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy -y 00:08:06.389 20:21:58 accel.accel_copy -- accel/accel.sh@12 -- # build_accel_config 00:08:06.389 20:21:58 accel.accel_copy -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:06.389 20:21:58 accel.accel_copy -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:06.389 20:21:58 accel.accel_copy -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:06.389 20:21:58 accel.accel_copy -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:06.389 20:21:58 accel.accel_copy -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:06.389 20:21:58 accel.accel_copy -- accel/accel.sh@40 -- # local IFS=, 00:08:06.389 20:21:58 accel.accel_copy -- accel/accel.sh@41 -- # jq -r . 00:08:06.389 [2024-07-15 20:21:58.540998] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 00:08:06.389 [2024-07-15 20:21:58.541064] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1320943 ] 00:08:06.389 [2024-07-15 20:21:58.669486] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:06.677 [2024-07-15 20:21:58.771019] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:06.677 20:21:58 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:08:06.677 20:21:58 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:06.677 20:21:58 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:06.677 20:21:58 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:06.677 20:21:58 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:08:06.677 20:21:58 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:06.677 20:21:58 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:06.677 20:21:58 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:06.677 20:21:58 
accel.accel_copy -- accel/accel.sh@20 -- # val=0x1 00:08:06.677 20:21:58 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:06.677 20:21:58 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:06.677 20:21:58 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:06.677 20:21:58 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:08:06.677 20:21:58 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:06.677 20:21:58 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:06.677 20:21:58 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:06.677 20:21:58 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:08:06.677 20:21:58 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:06.677 20:21:58 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:06.677 20:21:58 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:06.677 20:21:58 accel.accel_copy -- accel/accel.sh@20 -- # val=copy 00:08:06.677 20:21:58 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:06.677 20:21:58 accel.accel_copy -- accel/accel.sh@23 -- # accel_opc=copy 00:08:06.677 20:21:58 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:06.677 20:21:58 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:06.677 20:21:58 accel.accel_copy -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:06.677 20:21:58 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:06.677 20:21:58 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:06.677 20:21:58 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:06.677 20:21:58 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:08:06.677 20:21:58 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:06.677 20:21:58 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:06.677 20:21:58 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:06.677 20:21:58 accel.accel_copy -- accel/accel.sh@20 -- # val=software 00:08:06.677 20:21:58 
accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:06.677 20:21:58 accel.accel_copy -- accel/accel.sh@22 -- # accel_module=software 00:08:06.677 20:21:58 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:06.677 20:21:58 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:06.677 20:21:58 accel.accel_copy -- accel/accel.sh@20 -- # val=32 00:08:06.677 20:21:58 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:06.677 20:21:58 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:06.677 20:21:58 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:06.677 20:21:58 accel.accel_copy -- accel/accel.sh@20 -- # val=32 00:08:06.677 20:21:58 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:06.677 20:21:58 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:06.677 20:21:58 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:06.677 20:21:58 accel.accel_copy -- accel/accel.sh@20 -- # val=1 00:08:06.677 20:21:58 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:06.677 20:21:58 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:06.677 20:21:58 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:06.677 20:21:58 accel.accel_copy -- accel/accel.sh@20 -- # val='1 seconds' 00:08:06.677 20:21:58 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:06.677 20:21:58 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:06.677 20:21:58 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:06.677 20:21:58 accel.accel_copy -- accel/accel.sh@20 -- # val=Yes 00:08:06.677 20:21:58 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:06.677 20:21:58 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:06.677 20:21:58 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:06.677 20:21:58 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:08:06.677 20:21:58 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:06.677 20:21:58 
accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:06.677 20:21:58 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:06.677 20:21:58 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:08:06.677 20:21:58 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:06.677 20:21:58 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:06.677 20:21:58 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:07.645 20:21:59 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:08:07.645 20:21:59 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:07.645 20:21:59 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:07.645 20:21:59 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:07.645 20:21:59 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:08:07.645 20:21:59 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:07.645 20:21:59 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:07.645 20:21:59 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:07.645 20:21:59 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:08:07.645 20:21:59 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:07.645 20:21:59 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:07.645 20:21:59 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:07.645 20:21:59 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:08:07.645 20:21:59 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:07.645 20:21:59 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:07.645 20:21:59 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:07.645 20:21:59 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:08:07.645 20:21:59 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:07.645 20:21:59 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:07.645 20:21:59 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:07.645 20:21:59 accel.accel_copy -- 
accel/accel.sh@20 -- # val= 00:08:07.645 20:21:59 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:07.645 20:21:59 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:07.645 20:21:59 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:07.645 20:21:59 accel.accel_copy -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:07.645 20:21:59 accel.accel_copy -- accel/accel.sh@27 -- # [[ -n copy ]] 00:08:07.645 20:21:59 accel.accel_copy -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:07.645 00:08:07.645 real 0m1.495s 00:08:07.645 user 0m1.321s 00:08:07.645 sys 0m0.178s 00:08:07.645 20:21:59 accel.accel_copy -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:07.645 20:21:59 accel.accel_copy -- common/autotest_common.sh@10 -- # set +x 00:08:07.645 ************************************ 00:08:07.645 END TEST accel_copy 00:08:07.645 ************************************ 00:08:07.903 20:22:00 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:07.903 20:22:00 accel -- accel/accel.sh@104 -- # run_test accel_fill accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:08:07.903 20:22:00 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:08:07.903 20:22:00 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:07.903 20:22:00 accel -- common/autotest_common.sh@10 -- # set +x 00:08:07.903 ************************************ 00:08:07.903 START TEST accel_fill 00:08:07.903 ************************************ 00:08:07.903 20:22:00 accel.accel_fill -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:08:07.903 20:22:00 accel.accel_fill -- accel/accel.sh@16 -- # local accel_opc 00:08:07.903 20:22:00 accel.accel_fill -- accel/accel.sh@17 -- # local accel_module 00:08:07.903 20:22:00 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:07.903 20:22:00 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:07.903 20:22:00 accel.accel_fill -- accel/accel.sh@15 -- # 
accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y 00:08:07.903 20:22:00 accel.accel_fill -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w fill -f 128 -q 64 -a 64 -y 00:08:07.903 20:22:00 accel.accel_fill -- accel/accel.sh@12 -- # build_accel_config 00:08:07.903 20:22:00 accel.accel_fill -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:07.903 20:22:00 accel.accel_fill -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:07.903 20:22:00 accel.accel_fill -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:07.903 20:22:00 accel.accel_fill -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:07.903 20:22:00 accel.accel_fill -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:07.903 20:22:00 accel.accel_fill -- accel/accel.sh@40 -- # local IFS=, 00:08:07.903 20:22:00 accel.accel_fill -- accel/accel.sh@41 -- # jq -r . 00:08:07.903 [2024-07-15 20:22:00.120337] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 00:08:07.903 [2024-07-15 20:22:00.120399] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1321148 ] 00:08:07.903 [2024-07-15 20:22:00.248935] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:08.162 [2024-07-15 20:22:00.350537] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:08.162 20:22:00 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:08:08.162 20:22:00 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:08.162 20:22:00 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:08.162 20:22:00 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:08.162 20:22:00 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:08:08.162 20:22:00 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:08.162 20:22:00 accel.accel_fill -- 
accel/accel.sh@19 -- # IFS=: 00:08:08.162 20:22:00 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:08.162 20:22:00 accel.accel_fill -- accel/accel.sh@20 -- # val=0x1 00:08:08.162 20:22:00 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:08.162 20:22:00 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:08.162 20:22:00 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:08.162 20:22:00 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:08:08.162 20:22:00 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:08.162 20:22:00 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:08.162 20:22:00 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:08.162 20:22:00 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:08:08.162 20:22:00 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:08.162 20:22:00 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:08.162 20:22:00 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:08.162 20:22:00 accel.accel_fill -- accel/accel.sh@20 -- # val=fill 00:08:08.162 20:22:00 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:08.162 20:22:00 accel.accel_fill -- accel/accel.sh@23 -- # accel_opc=fill 00:08:08.162 20:22:00 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:08.162 20:22:00 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:08.162 20:22:00 accel.accel_fill -- accel/accel.sh@20 -- # val=0x80 00:08:08.162 20:22:00 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:08.162 20:22:00 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:08.162 20:22:00 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:08.162 20:22:00 accel.accel_fill -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:08.162 20:22:00 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:08.162 20:22:00 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:08.162 20:22:00 accel.accel_fill -- 
accel/accel.sh@19 -- # read -r var val 00:08:08.162 20:22:00 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:08:08.162 20:22:00 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:08.162 20:22:00 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:08.162 20:22:00 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:08.162 20:22:00 accel.accel_fill -- accel/accel.sh@20 -- # val=software 00:08:08.162 20:22:00 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:08.162 20:22:00 accel.accel_fill -- accel/accel.sh@22 -- # accel_module=software 00:08:08.162 20:22:00 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:08.162 20:22:00 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:08.162 20:22:00 accel.accel_fill -- accel/accel.sh@20 -- # val=64 00:08:08.162 20:22:00 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:08.162 20:22:00 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:08.162 20:22:00 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:08.162 20:22:00 accel.accel_fill -- accel/accel.sh@20 -- # val=64 00:08:08.162 20:22:00 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:08.162 20:22:00 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:08.163 20:22:00 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:08.163 20:22:00 accel.accel_fill -- accel/accel.sh@20 -- # val=1 00:08:08.163 20:22:00 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:08.163 20:22:00 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:08.163 20:22:00 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:08.163 20:22:00 accel.accel_fill -- accel/accel.sh@20 -- # val='1 seconds' 00:08:08.163 20:22:00 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:08.163 20:22:00 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:08.163 20:22:00 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:08.163 20:22:00 
accel.accel_fill -- accel/accel.sh@20 -- # val=Yes 00:08:08.163 20:22:00 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:08.163 20:22:00 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:08.163 20:22:00 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:08.163 20:22:00 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:08:08.163 20:22:00 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:08.163 20:22:00 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:08.163 20:22:00 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:08.163 20:22:00 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:08:08.163 20:22:00 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:08.163 20:22:00 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:08.163 20:22:00 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:09.538 20:22:01 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:08:09.538 20:22:01 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:09.538 20:22:01 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:09.538 20:22:01 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:09.538 20:22:01 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:08:09.538 20:22:01 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:09.538 20:22:01 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:09.538 20:22:01 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:09.538 20:22:01 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:08:09.538 20:22:01 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:09.538 20:22:01 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:09.538 20:22:01 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:09.538 20:22:01 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:08:09.538 20:22:01 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:09.538 20:22:01 accel.accel_fill -- 
accel/accel.sh@19 -- # IFS=: 00:08:09.538 20:22:01 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:09.538 20:22:01 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:08:09.538 20:22:01 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:09.538 20:22:01 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:09.538 20:22:01 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:09.538 20:22:01 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:08:09.538 20:22:01 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:09.538 20:22:01 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:09.538 20:22:01 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:09.538 20:22:01 accel.accel_fill -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:09.538 20:22:01 accel.accel_fill -- accel/accel.sh@27 -- # [[ -n fill ]] 00:08:09.538 20:22:01 accel.accel_fill -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:09.538 00:08:09.538 real 0m1.508s 00:08:09.538 user 0m1.310s 00:08:09.538 sys 0m0.202s 00:08:09.538 20:22:01 accel.accel_fill -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:09.538 20:22:01 accel.accel_fill -- common/autotest_common.sh@10 -- # set +x 00:08:09.538 ************************************ 00:08:09.538 END TEST accel_fill 00:08:09.538 ************************************ 00:08:09.538 20:22:01 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:09.538 20:22:01 accel -- accel/accel.sh@105 -- # run_test accel_copy_crc32c accel_test -t 1 -w copy_crc32c -y 00:08:09.538 20:22:01 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:08:09.538 20:22:01 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:09.538 20:22:01 accel -- common/autotest_common.sh@10 -- # set +x 00:08:09.538 ************************************ 00:08:09.538 START TEST accel_copy_crc32c 00:08:09.538 ************************************ 00:08:09.538 20:22:01 accel.accel_copy_crc32c -- 
common/autotest_common.sh@1123 -- # accel_test -t 1 -w copy_crc32c -y 00:08:09.538 20:22:01 accel.accel_copy_crc32c -- accel/accel.sh@16 -- # local accel_opc 00:08:09.538 20:22:01 accel.accel_copy_crc32c -- accel/accel.sh@17 -- # local accel_module 00:08:09.538 20:22:01 accel.accel_copy_crc32c -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y 00:08:09.538 20:22:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:09.538 20:22:01 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:09.538 20:22:01 accel.accel_copy_crc32c -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 00:08:09.538 20:22:01 accel.accel_copy_crc32c -- accel/accel.sh@12 -- # build_accel_config 00:08:09.538 20:22:01 accel.accel_copy_crc32c -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:09.538 20:22:01 accel.accel_copy_crc32c -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:09.538 20:22:01 accel.accel_copy_crc32c -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:09.538 20:22:01 accel.accel_copy_crc32c -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:09.538 20:22:01 accel.accel_copy_crc32c -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:09.538 20:22:01 accel.accel_copy_crc32c -- accel/accel.sh@40 -- # local IFS=, 00:08:09.538 20:22:01 accel.accel_copy_crc32c -- accel/accel.sh@41 -- # jq -r . 00:08:09.538 [2024-07-15 20:22:01.708825] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
00:08:09.538 [2024-07-15 20:22:01.708907] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1321342 ] 00:08:09.538 [2024-07-15 20:22:01.854248] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:09.797 [2024-07-15 20:22:01.961219] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:09.797 20:22:02 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:08:09.797 20:22:02 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:09.797 20:22:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:09.797 20:22:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:09.797 20:22:02 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:08:09.797 20:22:02 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:09.797 20:22:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:09.797 20:22:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:09.797 20:22:02 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=0x1 00:08:09.797 20:22:02 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:09.797 20:22:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:09.797 20:22:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:09.797 20:22:02 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:08:09.797 20:22:02 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:09.797 20:22:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:09.797 20:22:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:09.797 20:22:02 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:08:09.797 20:22:02 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 
00:08:09.797 20:22:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:09.797 20:22:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:09.797 20:22:02 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=copy_crc32c 00:08:09.797 20:22:02 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:09.797 20:22:02 accel.accel_copy_crc32c -- accel/accel.sh@23 -- # accel_opc=copy_crc32c 00:08:09.797 20:22:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:09.797 20:22:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:09.797 20:22:02 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=0 00:08:09.797 20:22:02 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:09.797 20:22:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:09.797 20:22:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:09.797 20:22:02 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:09.797 20:22:02 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:09.797 20:22:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:09.797 20:22:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:09.797 20:22:02 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:09.797 20:22:02 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:09.797 20:22:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:09.797 20:22:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:09.797 20:22:02 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:08:09.797 20:22:02 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:09.797 20:22:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:09.797 20:22:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:09.797 20:22:02 accel.accel_copy_crc32c -- 
accel/accel.sh@20 -- # val=software 00:08:09.797 20:22:02 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:09.797 20:22:02 accel.accel_copy_crc32c -- accel/accel.sh@22 -- # accel_module=software 00:08:09.797 20:22:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:09.797 20:22:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:09.797 20:22:02 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=32 00:08:09.797 20:22:02 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:09.797 20:22:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:09.797 20:22:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:09.797 20:22:02 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=32 00:08:09.797 20:22:02 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:09.797 20:22:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:09.797 20:22:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:09.797 20:22:02 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=1 00:08:09.797 20:22:02 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:09.797 20:22:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:09.797 20:22:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:09.797 20:22:02 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='1 seconds' 00:08:09.797 20:22:02 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:09.797 20:22:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:09.797 20:22:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:09.797 20:22:02 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=Yes 00:08:09.797 20:22:02 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:09.797 20:22:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:09.797 20:22:02 
accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:09.797 20:22:02 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:08:09.797 20:22:02 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:09.797 20:22:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:09.797 20:22:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:09.797 20:22:02 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:08:09.797 20:22:02 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:09.797 20:22:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:09.797 20:22:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:11.169 20:22:03 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:08:11.169 20:22:03 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:11.169 20:22:03 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:11.169 20:22:03 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:11.169 20:22:03 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:08:11.169 20:22:03 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:11.169 20:22:03 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:11.169 20:22:03 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:11.169 20:22:03 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:08:11.169 20:22:03 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:11.169 20:22:03 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:11.169 20:22:03 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:11.169 20:22:03 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:08:11.169 20:22:03 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:11.169 20:22:03 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:11.169 20:22:03 
accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:11.169 20:22:03 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:08:11.169 20:22:03 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:11.169 20:22:03 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:11.169 20:22:03 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:11.169 20:22:03 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:08:11.169 20:22:03 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:11.169 20:22:03 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:11.169 20:22:03 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:11.169 20:22:03 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:11.169 20:22:03 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ -n copy_crc32c ]] 00:08:11.169 20:22:03 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:11.169 00:08:11.169 real 0m1.527s 00:08:11.169 user 0m1.332s 00:08:11.169 sys 0m0.198s 00:08:11.169 20:22:03 accel.accel_copy_crc32c -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:11.169 20:22:03 accel.accel_copy_crc32c -- common/autotest_common.sh@10 -- # set +x 00:08:11.169 ************************************ 00:08:11.169 END TEST accel_copy_crc32c 00:08:11.169 ************************************ 00:08:11.169 20:22:03 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:11.169 20:22:03 accel -- accel/accel.sh@106 -- # run_test accel_copy_crc32c_C2 accel_test -t 1 -w copy_crc32c -y -C 2 00:08:11.169 20:22:03 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:08:11.169 20:22:03 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:11.169 20:22:03 accel -- common/autotest_common.sh@10 -- # set +x 00:08:11.169 ************************************ 00:08:11.169 START TEST accel_copy_crc32c_C2 00:08:11.169 
************************************ 00:08:11.169 20:22:03 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w copy_crc32c -y -C 2 00:08:11.169 20:22:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@16 -- # local accel_opc 00:08:11.169 20:22:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@17 -- # local accel_module 00:08:11.169 20:22:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:11.169 20:22:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:11.169 20:22:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y -C 2 00:08:11.169 20:22:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@12 -- # build_accel_config 00:08:11.169 20:22:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y -C 2 00:08:11.169 20:22:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:11.169 20:22:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:11.169 20:22:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:11.169 20:22:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:11.169 20:22:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:11.169 20:22:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@40 -- # local IFS=, 00:08:11.169 20:22:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@41 -- # jq -r . 00:08:11.169 [2024-07-15 20:22:03.324989] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
00:08:11.169 [2024-07-15 20:22:03.325051] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1321587 ] 00:08:11.169 [2024-07-15 20:22:03.450085] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:11.427 [2024-07-15 20:22:03.552749] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:11.427 20:22:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:11.427 20:22:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:11.427 20:22:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:11.427 20:22:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:11.427 20:22:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:11.427 20:22:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:11.427 20:22:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:11.427 20:22:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:11.427 20:22:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=0x1 00:08:11.427 20:22:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:11.427 20:22:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:11.427 20:22:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:11.427 20:22:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:11.427 20:22:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:11.427 20:22:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:11.427 20:22:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:11.427 20:22:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:11.427 20:22:03 
accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:11.427 20:22:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:11.427 20:22:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:11.427 20:22:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=copy_crc32c 00:08:11.427 20:22:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:11.428 20:22:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@23 -- # accel_opc=copy_crc32c 00:08:11.428 20:22:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:11.428 20:22:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:11.428 20:22:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=0 00:08:11.428 20:22:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:11.428 20:22:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:11.428 20:22:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:11.428 20:22:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:11.428 20:22:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:11.428 20:22:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:11.428 20:22:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:11.428 20:22:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='8192 bytes' 00:08:11.428 20:22:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:11.428 20:22:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:11.428 20:22:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:11.428 20:22:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:11.428 20:22:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:11.428 20:22:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 
00:08:11.428 20:22:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:11.428 20:22:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=software 00:08:11.428 20:22:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:11.428 20:22:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@22 -- # accel_module=software 00:08:11.428 20:22:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:11.428 20:22:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:11.428 20:22:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:08:11.428 20:22:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:11.428 20:22:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:11.428 20:22:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:11.428 20:22:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:08:11.428 20:22:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:11.428 20:22:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:11.428 20:22:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:11.428 20:22:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=1 00:08:11.428 20:22:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:11.428 20:22:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:11.428 20:22:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:11.428 20:22:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='1 seconds' 00:08:11.428 20:22:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:11.428 20:22:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:11.428 20:22:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:11.428 20:22:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # 
val=Yes 00:08:11.428 20:22:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:11.428 20:22:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:11.428 20:22:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:11.428 20:22:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:11.428 20:22:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:11.428 20:22:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:11.428 20:22:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:11.428 20:22:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:11.428 20:22:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:11.428 20:22:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:11.428 20:22:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:12.798 20:22:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:12.798 20:22:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:12.798 20:22:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:12.798 20:22:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:12.798 20:22:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:12.798 20:22:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:12.798 20:22:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:12.798 20:22:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:12.798 20:22:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:12.798 20:22:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:12.798 20:22:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:12.798 20:22:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:12.798 
20:22:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:12.798 20:22:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:12.798 20:22:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:12.798 20:22:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:12.798 20:22:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:12.798 20:22:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:12.798 20:22:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:12.798 20:22:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:12.798 20:22:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:12.798 20:22:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:12.798 20:22:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:12.798 20:22:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:12.798 20:22:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:12.798 20:22:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n copy_crc32c ]] 00:08:12.798 20:22:04 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:12.798 00:08:12.798 real 0m1.503s 00:08:12.798 user 0m1.329s 00:08:12.798 sys 0m0.172s 00:08:12.798 20:22:04 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:12.798 20:22:04 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@10 -- # set +x 00:08:12.798 ************************************ 00:08:12.798 END TEST accel_copy_crc32c_C2 00:08:12.798 ************************************ 00:08:12.798 20:22:04 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:12.798 20:22:04 accel -- accel/accel.sh@107 -- # run_test accel_dualcast accel_test -t 1 -w dualcast -y 00:08:12.798 20:22:04 accel -- common/autotest_common.sh@1099 -- # 
'[' 7 -le 1 ']' 00:08:12.798 20:22:04 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:12.798 20:22:04 accel -- common/autotest_common.sh@10 -- # set +x 00:08:12.798 ************************************ 00:08:12.798 START TEST accel_dualcast 00:08:12.798 ************************************ 00:08:12.798 20:22:04 accel.accel_dualcast -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w dualcast -y 00:08:12.798 20:22:04 accel.accel_dualcast -- accel/accel.sh@16 -- # local accel_opc 00:08:12.798 20:22:04 accel.accel_dualcast -- accel/accel.sh@17 -- # local accel_module 00:08:12.798 20:22:04 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:12.798 20:22:04 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:12.798 20:22:04 accel.accel_dualcast -- accel/accel.sh@15 -- # accel_perf -t 1 -w dualcast -y 00:08:12.798 20:22:04 accel.accel_dualcast -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dualcast -y 00:08:12.798 20:22:04 accel.accel_dualcast -- accel/accel.sh@12 -- # build_accel_config 00:08:12.798 20:22:04 accel.accel_dualcast -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:12.798 20:22:04 accel.accel_dualcast -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:12.798 20:22:04 accel.accel_dualcast -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:12.798 20:22:04 accel.accel_dualcast -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:12.798 20:22:04 accel.accel_dualcast -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:12.798 20:22:04 accel.accel_dualcast -- accel/accel.sh@40 -- # local IFS=, 00:08:12.798 20:22:04 accel.accel_dualcast -- accel/accel.sh@41 -- # jq -r . 00:08:12.798 [2024-07-15 20:22:04.906859] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
00:08:12.798 [2024-07-15 20:22:04.906921] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1321903 ] 00:08:12.798 [2024-07-15 20:22:05.035786] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:12.798 [2024-07-15 20:22:05.136195] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:13.056 20:22:05 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:08:13.056 20:22:05 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:13.056 20:22:05 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:13.056 20:22:05 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:13.056 20:22:05 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:08:13.056 20:22:05 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:13.056 20:22:05 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:13.056 20:22:05 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:13.056 20:22:05 accel.accel_dualcast -- accel/accel.sh@20 -- # val=0x1 00:08:13.056 20:22:05 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:13.056 20:22:05 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:13.056 20:22:05 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:13.056 20:22:05 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:08:13.056 20:22:05 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:13.056 20:22:05 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:13.056 20:22:05 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:13.056 20:22:05 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:08:13.056 20:22:05 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:13.056 20:22:05 accel.accel_dualcast -- 
accel/accel.sh@19 -- # IFS=: 00:08:13.056 20:22:05 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:13.056 20:22:05 accel.accel_dualcast -- accel/accel.sh@20 -- # val=dualcast 00:08:13.056 20:22:05 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:13.056 20:22:05 accel.accel_dualcast -- accel/accel.sh@23 -- # accel_opc=dualcast 00:08:13.056 20:22:05 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:13.056 20:22:05 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:13.056 20:22:05 accel.accel_dualcast -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:13.056 20:22:05 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:13.056 20:22:05 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:13.056 20:22:05 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:13.056 20:22:05 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:08:13.056 20:22:05 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:13.056 20:22:05 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:13.056 20:22:05 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:13.056 20:22:05 accel.accel_dualcast -- accel/accel.sh@20 -- # val=software 00:08:13.056 20:22:05 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:13.056 20:22:05 accel.accel_dualcast -- accel/accel.sh@22 -- # accel_module=software 00:08:13.056 20:22:05 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:13.056 20:22:05 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:13.056 20:22:05 accel.accel_dualcast -- accel/accel.sh@20 -- # val=32 00:08:13.057 20:22:05 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:13.057 20:22:05 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:13.057 20:22:05 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:13.057 20:22:05 accel.accel_dualcast -- accel/accel.sh@20 -- # val=32 
00:08:13.057 20:22:05 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:13.057 20:22:05 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:13.057 20:22:05 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:13.057 20:22:05 accel.accel_dualcast -- accel/accel.sh@20 -- # val=1 00:08:13.057 20:22:05 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:13.057 20:22:05 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:13.057 20:22:05 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:13.057 20:22:05 accel.accel_dualcast -- accel/accel.sh@20 -- # val='1 seconds' 00:08:13.057 20:22:05 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:13.057 20:22:05 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:13.057 20:22:05 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:13.057 20:22:05 accel.accel_dualcast -- accel/accel.sh@20 -- # val=Yes 00:08:13.057 20:22:05 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:13.057 20:22:05 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:13.057 20:22:05 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:13.057 20:22:05 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:08:13.057 20:22:05 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:13.057 20:22:05 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:13.057 20:22:05 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:13.057 20:22:05 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:08:13.057 20:22:05 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:13.057 20:22:05 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:13.057 20:22:05 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:13.991 20:22:06 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:08:13.991 20:22:06 accel.accel_dualcast -- accel/accel.sh@21 -- # case 
"$var" in 00:08:13.991 20:22:06 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:13.991 20:22:06 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:13.991 20:22:06 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:08:13.991 20:22:06 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:13.991 20:22:06 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:14.249 20:22:06 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:14.249 20:22:06 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:08:14.249 20:22:06 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:14.249 20:22:06 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:14.249 20:22:06 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:14.249 20:22:06 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:08:14.249 20:22:06 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:14.249 20:22:06 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:14.249 20:22:06 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:14.249 20:22:06 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:08:14.249 20:22:06 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:14.249 20:22:06 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:14.249 20:22:06 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:14.249 20:22:06 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:08:14.249 20:22:06 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:14.249 20:22:06 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:14.249 20:22:06 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:14.249 20:22:06 accel.accel_dualcast -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:14.249 20:22:06 accel.accel_dualcast -- accel/accel.sh@27 -- # [[ -n dualcast ]] 00:08:14.249 20:22:06 accel.accel_dualcast -- accel/accel.sh@27 
-- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:14.249 00:08:14.249 real 0m1.503s 00:08:14.249 user 0m1.324s 00:08:14.249 sys 0m0.182s 00:08:14.249 20:22:06 accel.accel_dualcast -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:14.249 20:22:06 accel.accel_dualcast -- common/autotest_common.sh@10 -- # set +x 00:08:14.249 ************************************ 00:08:14.249 END TEST accel_dualcast 00:08:14.249 ************************************ 00:08:14.249 20:22:06 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:14.249 20:22:06 accel -- accel/accel.sh@108 -- # run_test accel_compare accel_test -t 1 -w compare -y 00:08:14.249 20:22:06 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:08:14.249 20:22:06 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:14.249 20:22:06 accel -- common/autotest_common.sh@10 -- # set +x 00:08:14.249 ************************************ 00:08:14.249 START TEST accel_compare 00:08:14.249 ************************************ 00:08:14.249 20:22:06 accel.accel_compare -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w compare -y 00:08:14.249 20:22:06 accel.accel_compare -- accel/accel.sh@16 -- # local accel_opc 00:08:14.249 20:22:06 accel.accel_compare -- accel/accel.sh@17 -- # local accel_module 00:08:14.249 20:22:06 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:14.249 20:22:06 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:14.249 20:22:06 accel.accel_compare -- accel/accel.sh@15 -- # accel_perf -t 1 -w compare -y 00:08:14.250 20:22:06 accel.accel_compare -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compare -y 00:08:14.250 20:22:06 accel.accel_compare -- accel/accel.sh@12 -- # build_accel_config 00:08:14.250 20:22:06 accel.accel_compare -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:14.250 20:22:06 accel.accel_compare -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:14.250 
20:22:06 accel.accel_compare -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:14.250 20:22:06 accel.accel_compare -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:14.250 20:22:06 accel.accel_compare -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:14.250 20:22:06 accel.accel_compare -- accel/accel.sh@40 -- # local IFS=, 00:08:14.250 20:22:06 accel.accel_compare -- accel/accel.sh@41 -- # jq -r . 00:08:14.250 [2024-07-15 20:22:06.493285] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 00:08:14.250 [2024-07-15 20:22:06.493351] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1322098 ] 00:08:14.250 [2024-07-15 20:22:06.621762] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:14.508 [2024-07-15 20:22:06.722999] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:14.508 20:22:06 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:08:14.508 20:22:06 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:14.508 20:22:06 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:14.508 20:22:06 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:14.508 20:22:06 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:08:14.508 20:22:06 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:14.508 20:22:06 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:14.508 20:22:06 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:14.508 20:22:06 accel.accel_compare -- accel/accel.sh@20 -- # val=0x1 00:08:14.508 20:22:06 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:14.508 20:22:06 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:14.508 20:22:06 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:14.508 20:22:06 
accel.accel_compare -- accel/accel.sh@20 -- # val= 00:08:14.508 20:22:06 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:14.508 20:22:06 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:14.508 20:22:06 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:14.508 20:22:06 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:08:14.508 20:22:06 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:14.508 20:22:06 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:14.508 20:22:06 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:14.508 20:22:06 accel.accel_compare -- accel/accel.sh@20 -- # val=compare 00:08:14.508 20:22:06 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:14.508 20:22:06 accel.accel_compare -- accel/accel.sh@23 -- # accel_opc=compare 00:08:14.508 20:22:06 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:14.508 20:22:06 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:14.508 20:22:06 accel.accel_compare -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:14.508 20:22:06 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:14.508 20:22:06 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:14.508 20:22:06 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:14.508 20:22:06 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:08:14.508 20:22:06 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:14.508 20:22:06 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:14.508 20:22:06 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:14.508 20:22:06 accel.accel_compare -- accel/accel.sh@20 -- # val=software 00:08:14.508 20:22:06 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:14.508 20:22:06 accel.accel_compare -- accel/accel.sh@22 -- # accel_module=software 00:08:14.508 20:22:06 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:14.508 
20:22:06 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:14.508 20:22:06 accel.accel_compare -- accel/accel.sh@20 -- # val=32 00:08:14.508 20:22:06 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:14.508 20:22:06 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:14.508 20:22:06 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:14.508 20:22:06 accel.accel_compare -- accel/accel.sh@20 -- # val=32 00:08:14.508 20:22:06 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:14.508 20:22:06 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:14.508 20:22:06 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:14.508 20:22:06 accel.accel_compare -- accel/accel.sh@20 -- # val=1 00:08:14.508 20:22:06 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:14.508 20:22:06 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:14.508 20:22:06 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:14.508 20:22:06 accel.accel_compare -- accel/accel.sh@20 -- # val='1 seconds' 00:08:14.508 20:22:06 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:14.508 20:22:06 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:14.508 20:22:06 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:14.508 20:22:06 accel.accel_compare -- accel/accel.sh@20 -- # val=Yes 00:08:14.508 20:22:06 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:14.508 20:22:06 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:14.508 20:22:06 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:14.508 20:22:06 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:08:14.508 20:22:06 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:14.508 20:22:06 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:14.508 20:22:06 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:14.508 20:22:06 
accel.accel_compare -- accel/accel.sh@20 -- # val= 00:08:14.508 20:22:06 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:14.509 20:22:06 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:14.509 20:22:06 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:15.883 20:22:07 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:08:15.883 20:22:07 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:15.883 20:22:07 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:15.883 20:22:07 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:15.883 20:22:07 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:08:15.883 20:22:07 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:15.883 20:22:07 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:15.883 20:22:07 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:15.883 20:22:07 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:08:15.883 20:22:07 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:15.883 20:22:07 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:15.883 20:22:07 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:15.883 20:22:07 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:08:15.883 20:22:07 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:15.883 20:22:07 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:15.883 20:22:07 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:15.883 20:22:07 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:08:15.883 20:22:07 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:15.883 20:22:07 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:15.883 20:22:07 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:15.883 20:22:07 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:08:15.883 20:22:07 accel.accel_compare -- accel/accel.sh@21 
-- # case "$var" in 00:08:15.883 20:22:07 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:15.883 20:22:07 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:15.883 20:22:07 accel.accel_compare -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:15.883 20:22:07 accel.accel_compare -- accel/accel.sh@27 -- # [[ -n compare ]] 00:08:15.883 20:22:07 accel.accel_compare -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:15.883 00:08:15.883 real 0m1.513s 00:08:15.883 user 0m1.317s 00:08:15.883 sys 0m0.195s 00:08:15.883 20:22:07 accel.accel_compare -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:15.883 20:22:07 accel.accel_compare -- common/autotest_common.sh@10 -- # set +x 00:08:15.883 ************************************ 00:08:15.883 END TEST accel_compare 00:08:15.883 ************************************ 00:08:15.883 20:22:08 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:15.883 20:22:08 accel -- accel/accel.sh@109 -- # run_test accel_xor accel_test -t 1 -w xor -y 00:08:15.883 20:22:08 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:08:15.883 20:22:08 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:15.883 20:22:08 accel -- common/autotest_common.sh@10 -- # set +x 00:08:15.883 ************************************ 00:08:15.883 START TEST accel_xor 00:08:15.883 ************************************ 00:08:15.883 20:22:08 accel.accel_xor -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w xor -y 00:08:15.883 20:22:08 accel.accel_xor -- accel/accel.sh@16 -- # local accel_opc 00:08:15.883 20:22:08 accel.accel_xor -- accel/accel.sh@17 -- # local accel_module 00:08:15.883 20:22:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:15.883 20:22:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:15.883 20:22:08 accel.accel_xor -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y 00:08:15.883 20:22:08 accel.accel_xor -- accel/accel.sh@12 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y 00:08:15.883 20:22:08 accel.accel_xor -- accel/accel.sh@12 -- # build_accel_config 00:08:15.883 20:22:08 accel.accel_xor -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:15.883 20:22:08 accel.accel_xor -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:15.883 20:22:08 accel.accel_xor -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:15.883 20:22:08 accel.accel_xor -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:15.883 20:22:08 accel.accel_xor -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:15.883 20:22:08 accel.accel_xor -- accel/accel.sh@40 -- # local IFS=, 00:08:15.883 20:22:08 accel.accel_xor -- accel/accel.sh@41 -- # jq -r . 00:08:15.883 [2024-07-15 20:22:08.085378] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 00:08:15.883 [2024-07-15 20:22:08.085440] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1322297 ] 00:08:15.883 [2024-07-15 20:22:08.204087] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:16.142 [2024-07-15 20:22:08.305682] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:16.142 20:22:08 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:16.142 20:22:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:16.142 20:22:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:16.142 20:22:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:16.142 20:22:08 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:16.142 20:22:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:16.142 20:22:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:16.142 20:22:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:16.142 20:22:08 accel.accel_xor -- 
accel/accel.sh@20 -- # val=0x1 00:08:16.142 20:22:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:16.142 20:22:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:16.142 20:22:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:16.142 20:22:08 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:16.142 20:22:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:16.142 20:22:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:16.142 20:22:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:16.142 20:22:08 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:16.142 20:22:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:16.142 20:22:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:16.142 20:22:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:16.142 20:22:08 accel.accel_xor -- accel/accel.sh@20 -- # val=xor 00:08:16.142 20:22:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:16.142 20:22:08 accel.accel_xor -- accel/accel.sh@23 -- # accel_opc=xor 00:08:16.142 20:22:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:16.142 20:22:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:16.142 20:22:08 accel.accel_xor -- accel/accel.sh@20 -- # val=2 00:08:16.142 20:22:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:16.142 20:22:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:16.142 20:22:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:16.142 20:22:08 accel.accel_xor -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:16.142 20:22:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:16.142 20:22:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:16.142 20:22:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:16.142 20:22:08 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:16.142 20:22:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 
00:08:16.142 20:22:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:16.142 20:22:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:16.142 20:22:08 accel.accel_xor -- accel/accel.sh@20 -- # val=software 00:08:16.142 20:22:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:16.142 20:22:08 accel.accel_xor -- accel/accel.sh@22 -- # accel_module=software 00:08:16.142 20:22:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:16.142 20:22:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:16.142 20:22:08 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:08:16.142 20:22:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:16.142 20:22:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:16.142 20:22:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:16.142 20:22:08 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:08:16.142 20:22:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:16.142 20:22:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:16.142 20:22:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:16.142 20:22:08 accel.accel_xor -- accel/accel.sh@20 -- # val=1 00:08:16.142 20:22:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:16.142 20:22:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:16.142 20:22:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:16.142 20:22:08 accel.accel_xor -- accel/accel.sh@20 -- # val='1 seconds' 00:08:16.142 20:22:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:16.142 20:22:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:16.142 20:22:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:16.142 20:22:08 accel.accel_xor -- accel/accel.sh@20 -- # val=Yes 00:08:16.142 20:22:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:16.142 20:22:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:16.142 20:22:08 
accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:16.142 20:22:08 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:16.142 20:22:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:16.142 20:22:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:16.142 20:22:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:16.142 20:22:08 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:16.142 20:22:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:16.142 20:22:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:16.142 20:22:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:17.516 20:22:09 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:17.516 20:22:09 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:17.516 20:22:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:17.516 20:22:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:17.516 20:22:09 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:17.516 20:22:09 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:17.516 20:22:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:17.516 20:22:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:17.516 20:22:09 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:17.516 20:22:09 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:17.516 20:22:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:17.516 20:22:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:17.516 20:22:09 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:17.516 20:22:09 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:17.516 20:22:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:17.516 20:22:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:17.516 20:22:09 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:17.516 20:22:09 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 
00:08:17.516 20:22:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:17.516 20:22:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:17.516 20:22:09 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:17.516 20:22:09 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:17.516 20:22:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:17.516 20:22:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:17.516 20:22:09 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:17.516 20:22:09 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n xor ]] 00:08:17.516 20:22:09 accel.accel_xor -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:17.516 00:08:17.516 real 0m1.504s 00:08:17.516 user 0m1.314s 00:08:17.516 sys 0m0.187s 00:08:17.516 20:22:09 accel.accel_xor -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:17.516 20:22:09 accel.accel_xor -- common/autotest_common.sh@10 -- # set +x 00:08:17.516 ************************************ 00:08:17.516 END TEST accel_xor 00:08:17.516 ************************************ 00:08:17.516 20:22:09 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:17.516 20:22:09 accel -- accel/accel.sh@110 -- # run_test accel_xor accel_test -t 1 -w xor -y -x 3 00:08:17.516 20:22:09 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:08:17.516 20:22:09 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:17.516 20:22:09 accel -- common/autotest_common.sh@10 -- # set +x 00:08:17.516 ************************************ 00:08:17.516 START TEST accel_xor 00:08:17.516 ************************************ 00:08:17.516 20:22:09 accel.accel_xor -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w xor -y -x 3 00:08:17.516 20:22:09 accel.accel_xor -- accel/accel.sh@16 -- # local accel_opc 00:08:17.516 20:22:09 accel.accel_xor -- accel/accel.sh@17 -- # local accel_module 00:08:17.516 20:22:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 
00:08:17.516 20:22:09 accel.accel_xor -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y -x 3 00:08:17.516 20:22:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:17.516 20:22:09 accel.accel_xor -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x 3 00:08:17.516 20:22:09 accel.accel_xor -- accel/accel.sh@12 -- # build_accel_config 00:08:17.516 20:22:09 accel.accel_xor -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:17.516 20:22:09 accel.accel_xor -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:17.516 20:22:09 accel.accel_xor -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:17.516 20:22:09 accel.accel_xor -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:17.516 20:22:09 accel.accel_xor -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:17.516 20:22:09 accel.accel_xor -- accel/accel.sh@40 -- # local IFS=, 00:08:17.516 20:22:09 accel.accel_xor -- accel/accel.sh@41 -- # jq -r . 00:08:17.516 [2024-07-15 20:22:09.666325] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
00:08:17.516 [2024-07-15 20:22:09.666399] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1322495 ] 00:08:17.516 [2024-07-15 20:22:09.809528] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:17.774 [2024-07-15 20:22:09.912525] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:17.774 20:22:09 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:17.774 20:22:09 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:17.774 20:22:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:17.774 20:22:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:17.774 20:22:09 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:17.774 20:22:09 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:17.774 20:22:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:17.774 20:22:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:17.774 20:22:09 accel.accel_xor -- accel/accel.sh@20 -- # val=0x1 00:08:17.774 20:22:09 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:17.774 20:22:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:17.774 20:22:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:17.774 20:22:09 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:17.774 20:22:09 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:17.774 20:22:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:17.774 20:22:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:17.774 20:22:09 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:17.774 20:22:09 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:17.774 20:22:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:17.774 20:22:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var 
val 00:08:17.774 20:22:09 accel.accel_xor -- accel/accel.sh@20 -- # val=xor 00:08:17.774 20:22:09 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:17.774 20:22:09 accel.accel_xor -- accel/accel.sh@23 -- # accel_opc=xor 00:08:17.774 20:22:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:17.774 20:22:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:17.774 20:22:09 accel.accel_xor -- accel/accel.sh@20 -- # val=3 00:08:17.774 20:22:09 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:17.774 20:22:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:17.774 20:22:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:17.774 20:22:09 accel.accel_xor -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:17.774 20:22:09 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:17.774 20:22:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:17.774 20:22:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:17.774 20:22:09 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:17.774 20:22:09 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:17.774 20:22:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:17.774 20:22:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:17.774 20:22:09 accel.accel_xor -- accel/accel.sh@20 -- # val=software 00:08:17.774 20:22:09 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:17.774 20:22:09 accel.accel_xor -- accel/accel.sh@22 -- # accel_module=software 00:08:17.774 20:22:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:17.774 20:22:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:17.774 20:22:09 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:08:17.774 20:22:09 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:17.774 20:22:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:17.774 20:22:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:17.774 
20:22:09 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:08:17.774 20:22:09 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:17.774 20:22:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:17.774 20:22:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:17.774 20:22:09 accel.accel_xor -- accel/accel.sh@20 -- # val=1 00:08:17.774 20:22:09 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:17.774 20:22:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:17.774 20:22:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:17.774 20:22:09 accel.accel_xor -- accel/accel.sh@20 -- # val='1 seconds' 00:08:17.774 20:22:09 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:17.774 20:22:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:17.774 20:22:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:17.774 20:22:09 accel.accel_xor -- accel/accel.sh@20 -- # val=Yes 00:08:17.774 20:22:09 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:17.774 20:22:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:17.774 20:22:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:17.774 20:22:09 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:17.774 20:22:09 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:17.774 20:22:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:17.774 20:22:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:17.774 20:22:09 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:17.774 20:22:09 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:17.774 20:22:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:17.774 20:22:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:19.147 20:22:11 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:19.147 20:22:11 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:19.147 20:22:11 accel.accel_xor -- accel/accel.sh@19 
-- # IFS=: 00:08:19.147 20:22:11 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:19.147 20:22:11 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:19.147 20:22:11 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:19.147 20:22:11 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:19.147 20:22:11 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:19.147 20:22:11 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:19.147 20:22:11 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:19.147 20:22:11 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:19.147 20:22:11 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:19.147 20:22:11 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:19.147 20:22:11 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:19.147 20:22:11 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:19.147 20:22:11 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:19.147 20:22:11 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:19.147 20:22:11 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:19.147 20:22:11 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:19.147 20:22:11 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:19.147 20:22:11 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:19.147 20:22:11 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:19.147 20:22:11 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:19.147 20:22:11 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:19.147 20:22:11 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:19.147 20:22:11 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n xor ]] 00:08:19.147 20:22:11 accel.accel_xor -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:19.147 00:08:19.147 real 0m1.507s 00:08:19.148 user 0m1.309s 00:08:19.148 sys 0m0.198s 00:08:19.148 20:22:11 accel.accel_xor -- 
common/autotest_common.sh@1124 -- # xtrace_disable 00:08:19.148 20:22:11 accel.accel_xor -- common/autotest_common.sh@10 -- # set +x 00:08:19.148 ************************************ 00:08:19.148 END TEST accel_xor 00:08:19.148 ************************************ 00:08:19.148 20:22:11 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:19.148 20:22:11 accel -- accel/accel.sh@111 -- # run_test accel_dif_verify accel_test -t 1 -w dif_verify 00:08:19.148 20:22:11 accel -- common/autotest_common.sh@1099 -- # '[' 6 -le 1 ']' 00:08:19.148 20:22:11 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:19.148 20:22:11 accel -- common/autotest_common.sh@10 -- # set +x 00:08:19.148 ************************************ 00:08:19.148 START TEST accel_dif_verify 00:08:19.148 ************************************ 00:08:19.148 20:22:11 accel.accel_dif_verify -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w dif_verify 00:08:19.148 20:22:11 accel.accel_dif_verify -- accel/accel.sh@16 -- # local accel_opc 00:08:19.148 20:22:11 accel.accel_dif_verify -- accel/accel.sh@17 -- # local accel_module 00:08:19.148 20:22:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:19.148 20:22:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:19.148 20:22:11 accel.accel_dif_verify -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_verify 00:08:19.148 20:22:11 accel.accel_dif_verify -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify 00:08:19.148 20:22:11 accel.accel_dif_verify -- accel/accel.sh@12 -- # build_accel_config 00:08:19.148 20:22:11 accel.accel_dif_verify -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:19.148 20:22:11 accel.accel_dif_verify -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:19.148 20:22:11 accel.accel_dif_verify -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:19.148 20:22:11 accel.accel_dif_verify -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 
00:08:19.148 20:22:11 accel.accel_dif_verify -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:19.148 20:22:11 accel.accel_dif_verify -- accel/accel.sh@40 -- # local IFS=, 00:08:19.148 20:22:11 accel.accel_dif_verify -- accel/accel.sh@41 -- # jq -r . 00:08:19.148 [2024-07-15 20:22:11.275451] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 00:08:19.148 [2024-07-15 20:22:11.275577] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1322696 ] 00:08:19.148 [2024-07-15 20:22:11.474215] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:19.406 [2024-07-15 20:22:11.580264] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:19.406 20:22:11 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:08:19.406 20:22:11 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:19.406 20:22:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:19.406 20:22:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:19.406 20:22:11 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:08:19.406 20:22:11 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:19.406 20:22:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:19.406 20:22:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:19.406 20:22:11 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=0x1 00:08:19.406 20:22:11 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:19.406 20:22:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:19.406 20:22:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:19.406 20:22:11 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:08:19.406 20:22:11 accel.accel_dif_verify -- accel/accel.sh@21 
-- # case "$var" in 00:08:19.406 20:22:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:19.406 20:22:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:19.406 20:22:11 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:08:19.406 20:22:11 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:19.406 20:22:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:19.406 20:22:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:19.406 20:22:11 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=dif_verify 00:08:19.406 20:22:11 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:19.406 20:22:11 accel.accel_dif_verify -- accel/accel.sh@23 -- # accel_opc=dif_verify 00:08:19.406 20:22:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:19.406 20:22:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:19.406 20:22:11 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:19.406 20:22:11 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:19.406 20:22:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:19.406 20:22:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:19.406 20:22:11 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:19.406 20:22:11 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:19.406 20:22:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:19.406 20:22:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:19.406 20:22:11 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='512 bytes' 00:08:19.406 20:22:11 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:19.406 20:22:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:19.406 20:22:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:19.406 20:22:11 accel.accel_dif_verify -- 
accel/accel.sh@20 -- # val='8 bytes' 00:08:19.406 20:22:11 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:19.406 20:22:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:19.406 20:22:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:19.406 20:22:11 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:08:19.406 20:22:11 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:19.406 20:22:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:19.406 20:22:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:19.406 20:22:11 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=software 00:08:19.406 20:22:11 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:19.406 20:22:11 accel.accel_dif_verify -- accel/accel.sh@22 -- # accel_module=software 00:08:19.406 20:22:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:19.406 20:22:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:19.406 20:22:11 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=32 00:08:19.406 20:22:11 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:19.406 20:22:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:19.406 20:22:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:19.406 20:22:11 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=32 00:08:19.406 20:22:11 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:19.406 20:22:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:19.406 20:22:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:19.406 20:22:11 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=1 00:08:19.406 20:22:11 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:19.406 20:22:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:19.406 20:22:11 accel.accel_dif_verify -- 
accel/accel.sh@19 -- # read -r var val 00:08:19.406 20:22:11 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='1 seconds' 00:08:19.406 20:22:11 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:19.406 20:22:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:19.406 20:22:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:19.406 20:22:11 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=No 00:08:19.406 20:22:11 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:19.406 20:22:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:19.406 20:22:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:19.406 20:22:11 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:08:19.406 20:22:11 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:19.406 20:22:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:19.406 20:22:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:19.406 20:22:11 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:08:19.406 20:22:11 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:19.406 20:22:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:19.406 20:22:11 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:20.779 20:22:12 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:08:20.779 20:22:12 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:20.779 20:22:12 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:20.779 20:22:12 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:20.779 20:22:12 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:08:20.779 20:22:12 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:20.779 20:22:12 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:20.779 20:22:12 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r 
var val 00:08:20.779 20:22:12 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:08:20.780 20:22:12 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:20.780 20:22:12 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:20.780 20:22:12 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:20.780 20:22:12 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:08:20.780 20:22:12 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:20.780 20:22:12 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:20.780 20:22:12 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:20.780 20:22:12 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:08:20.780 20:22:12 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:20.780 20:22:12 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:20.780 20:22:12 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:20.780 20:22:12 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:08:20.780 20:22:12 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:20.780 20:22:12 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:20.780 20:22:12 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:20.780 20:22:12 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:20.780 20:22:12 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ -n dif_verify ]] 00:08:20.780 20:22:12 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:20.780 00:08:20.780 real 0m1.601s 00:08:20.780 user 0m1.353s 00:08:20.780 sys 0m0.249s 00:08:20.780 20:22:12 accel.accel_dif_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:20.780 20:22:12 accel.accel_dif_verify -- common/autotest_common.sh@10 -- # set +x 00:08:20.780 ************************************ 00:08:20.780 END TEST accel_dif_verify 00:08:20.780 
************************************ 00:08:20.780 20:22:12 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:20.780 20:22:12 accel -- accel/accel.sh@112 -- # run_test accel_dif_generate accel_test -t 1 -w dif_generate 00:08:20.780 20:22:12 accel -- common/autotest_common.sh@1099 -- # '[' 6 -le 1 ']' 00:08:20.780 20:22:12 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:20.780 20:22:12 accel -- common/autotest_common.sh@10 -- # set +x 00:08:20.780 ************************************ 00:08:20.780 START TEST accel_dif_generate 00:08:20.780 ************************************ 00:08:20.780 20:22:12 accel.accel_dif_generate -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w dif_generate 00:08:20.780 20:22:12 accel.accel_dif_generate -- accel/accel.sh@16 -- # local accel_opc 00:08:20.780 20:22:12 accel.accel_dif_generate -- accel/accel.sh@17 -- # local accel_module 00:08:20.780 20:22:12 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:20.780 20:22:12 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:20.780 20:22:12 accel.accel_dif_generate -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate 00:08:20.780 20:22:12 accel.accel_dif_generate -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate 00:08:20.780 20:22:12 accel.accel_dif_generate -- accel/accel.sh@12 -- # build_accel_config 00:08:20.780 20:22:12 accel.accel_dif_generate -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:20.780 20:22:12 accel.accel_dif_generate -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:20.780 20:22:12 accel.accel_dif_generate -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:20.780 20:22:12 accel.accel_dif_generate -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:20.780 20:22:12 accel.accel_dif_generate -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:20.780 20:22:12 accel.accel_dif_generate -- accel/accel.sh@40 -- # local IFS=, 00:08:20.780 20:22:12 
accel.accel_dif_generate -- accel/accel.sh@41 -- # jq -r . 00:08:20.780 [2024-07-15 20:22:12.945788] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 00:08:20.780 [2024-07-15 20:22:12.945854] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1322998 ] 00:08:20.780 [2024-07-15 20:22:13.078048] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:21.038 [2024-07-15 20:22:13.184116] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:21.038 20:22:13 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:21.038 20:22:13 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:21.038 20:22:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:21.038 20:22:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:21.038 20:22:13 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:21.038 20:22:13 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:21.038 20:22:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:21.038 20:22:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:21.038 20:22:13 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=0x1 00:08:21.038 20:22:13 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:21.038 20:22:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:21.038 20:22:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:21.038 20:22:13 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:21.038 20:22:13 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:21.038 20:22:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:21.038 20:22:13 accel.accel_dif_generate -- 
accel/accel.sh@19 -- # read -r var val 00:08:21.038 20:22:13 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:21.038 20:22:13 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:21.038 20:22:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:21.038 20:22:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:21.038 20:22:13 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=dif_generate 00:08:21.038 20:22:13 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:21.038 20:22:13 accel.accel_dif_generate -- accel/accel.sh@23 -- # accel_opc=dif_generate 00:08:21.038 20:22:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:21.038 20:22:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:21.038 20:22:13 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:21.038 20:22:13 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:21.038 20:22:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:21.038 20:22:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:21.038 20:22:13 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:21.038 20:22:13 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:21.038 20:22:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:21.038 20:22:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:21.038 20:22:13 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='512 bytes' 00:08:21.038 20:22:13 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:21.038 20:22:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:21.038 20:22:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:21.038 20:22:13 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='8 bytes' 00:08:21.038 20:22:13 accel.accel_dif_generate -- 
accel/accel.sh@21 -- # case "$var" in 00:08:21.038 20:22:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:21.038 20:22:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:21.038 20:22:13 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:21.038 20:22:13 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:21.038 20:22:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:21.038 20:22:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:21.038 20:22:13 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=software 00:08:21.038 20:22:13 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:21.039 20:22:13 accel.accel_dif_generate -- accel/accel.sh@22 -- # accel_module=software 00:08:21.039 20:22:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:21.039 20:22:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:21.039 20:22:13 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=32 00:08:21.039 20:22:13 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:21.039 20:22:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:21.039 20:22:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:21.039 20:22:13 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=32 00:08:21.039 20:22:13 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:21.039 20:22:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:21.039 20:22:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:21.039 20:22:13 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=1 00:08:21.039 20:22:13 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:21.039 20:22:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:21.039 20:22:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:21.039 
20:22:13 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='1 seconds' 00:08:21.039 20:22:13 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:21.039 20:22:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:21.039 20:22:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:21.039 20:22:13 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=No 00:08:21.039 20:22:13 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:21.039 20:22:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:21.039 20:22:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:21.039 20:22:13 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:21.039 20:22:13 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:21.039 20:22:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:21.039 20:22:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:21.039 20:22:13 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:21.039 20:22:13 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:21.039 20:22:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:21.039 20:22:13 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:22.409 20:22:14 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:22.409 20:22:14 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:22.409 20:22:14 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:22.409 20:22:14 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:22.409 20:22:14 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:22.409 20:22:14 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:22.409 20:22:14 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:22.409 20:22:14 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var 
val 00:08:22.409 20:22:14 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:22.409 20:22:14 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:22.409 20:22:14 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:22.409 20:22:14 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:22.409 20:22:14 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:22.409 20:22:14 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:22.409 20:22:14 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:22.409 20:22:14 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:22.409 20:22:14 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:22.409 20:22:14 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:22.409 20:22:14 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:22.409 20:22:14 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:22.409 20:22:14 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:22.409 20:22:14 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:22.409 20:22:14 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:22.409 20:22:14 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:22.409 20:22:14 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:22.409 20:22:14 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ -n dif_generate ]] 00:08:22.409 20:22:14 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:22.409 00:08:22.409 real 0m1.520s 00:08:22.409 user 0m1.334s 00:08:22.409 sys 0m0.191s 00:08:22.409 20:22:14 accel.accel_dif_generate -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:22.409 20:22:14 accel.accel_dif_generate -- common/autotest_common.sh@10 -- # set +x 00:08:22.409 ************************************ 00:08:22.409 END TEST 
accel_dif_generate 00:08:22.409 ************************************ 00:08:22.409 20:22:14 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:22.409 20:22:14 accel -- accel/accel.sh@113 -- # run_test accel_dif_generate_copy accel_test -t 1 -w dif_generate_copy 00:08:22.409 20:22:14 accel -- common/autotest_common.sh@1099 -- # '[' 6 -le 1 ']' 00:08:22.409 20:22:14 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:22.409 20:22:14 accel -- common/autotest_common.sh@10 -- # set +x 00:08:22.409 ************************************ 00:08:22.409 START TEST accel_dif_generate_copy 00:08:22.409 ************************************ 00:08:22.409 20:22:14 accel.accel_dif_generate_copy -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w dif_generate_copy 00:08:22.409 20:22:14 accel.accel_dif_generate_copy -- accel/accel.sh@16 -- # local accel_opc 00:08:22.409 20:22:14 accel.accel_dif_generate_copy -- accel/accel.sh@17 -- # local accel_module 00:08:22.409 20:22:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:22.409 20:22:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:22.409 20:22:14 accel.accel_dif_generate_copy -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate_copy 00:08:22.409 20:22:14 accel.accel_dif_generate_copy -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate_copy 00:08:22.409 20:22:14 accel.accel_dif_generate_copy -- accel/accel.sh@12 -- # build_accel_config 00:08:22.409 20:22:14 accel.accel_dif_generate_copy -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:22.409 20:22:14 accel.accel_dif_generate_copy -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:22.409 20:22:14 accel.accel_dif_generate_copy -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:22.409 20:22:14 accel.accel_dif_generate_copy -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:22.409 20:22:14 accel.accel_dif_generate_copy -- 
accel/accel.sh@36 -- # [[ -n '' ]] 00:08:22.409 20:22:14 accel.accel_dif_generate_copy -- accel/accel.sh@40 -- # local IFS=, 00:08:22.409 20:22:14 accel.accel_dif_generate_copy -- accel/accel.sh@41 -- # jq -r . 00:08:22.409 [2024-07-15 20:22:14.551610] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 00:08:22.410 [2024-07-15 20:22:14.551673] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1323246 ] 00:08:22.410 [2024-07-15 20:22:14.681605] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:22.410 [2024-07-15 20:22:14.778791] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:22.668 20:22:14 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:22.668 20:22:14 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:22.668 20:22:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:22.668 20:22:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:22.668 20:22:14 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:22.668 20:22:14 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:22.668 20:22:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:22.668 20:22:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:22.668 20:22:14 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=0x1 00:08:22.668 20:22:14 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:22.668 20:22:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:22.668 20:22:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:22.668 20:22:14 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 
00:08:22.668 20:22:14 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:22.668 20:22:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:22.668 20:22:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:22.668 20:22:14 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:22.668 20:22:14 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:22.668 20:22:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:22.668 20:22:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:22.668 20:22:14 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=dif_generate_copy 00:08:22.668 20:22:14 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:22.668 20:22:14 accel.accel_dif_generate_copy -- accel/accel.sh@23 -- # accel_opc=dif_generate_copy 00:08:22.668 20:22:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:22.668 20:22:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:22.668 20:22:14 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:22.668 20:22:14 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:22.668 20:22:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:22.668 20:22:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:22.668 20:22:14 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:22.668 20:22:14 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:22.668 20:22:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:22.668 20:22:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:22.668 20:22:14 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:22.668 20:22:14 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case 
"$var" in 00:08:22.668 20:22:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:22.668 20:22:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:22.668 20:22:14 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=software 00:08:22.668 20:22:14 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:22.668 20:22:14 accel.accel_dif_generate_copy -- accel/accel.sh@22 -- # accel_module=software 00:08:22.668 20:22:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:22.668 20:22:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:22.668 20:22:14 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=32 00:08:22.668 20:22:14 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:22.668 20:22:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:22.668 20:22:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:22.668 20:22:14 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=32 00:08:22.668 20:22:14 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:22.668 20:22:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:22.668 20:22:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:22.668 20:22:14 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=1 00:08:22.668 20:22:14 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:22.668 20:22:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:22.668 20:22:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:22.668 20:22:14 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='1 seconds' 00:08:22.668 20:22:14 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:22.668 20:22:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 
00:08:22.668 20:22:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:22.668 20:22:14 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=No 00:08:22.668 20:22:14 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:22.668 20:22:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:22.668 20:22:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:22.668 20:22:14 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:22.668 20:22:14 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:22.668 20:22:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:22.668 20:22:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:22.668 20:22:14 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:22.668 20:22:14 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:22.668 20:22:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:22.668 20:22:14 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:23.670 20:22:16 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:23.670 20:22:16 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:23.670 20:22:16 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:23.670 20:22:16 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:23.670 20:22:16 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:23.670 20:22:16 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:23.670 20:22:16 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:23.670 20:22:16 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:23.670 20:22:16 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:23.670 20:22:16 
accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:23.670 20:22:16 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:23.670 20:22:16 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:23.670 20:22:16 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:23.670 20:22:16 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:23.670 20:22:16 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:23.670 20:22:16 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:23.670 20:22:16 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:23.670 20:22:16 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:23.670 20:22:16 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:23.670 20:22:16 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:23.670 20:22:16 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:23.670 20:22:16 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:23.670 20:22:16 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:23.670 20:22:16 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:23.670 20:22:16 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:23.670 20:22:16 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ -n dif_generate_copy ]] 00:08:23.670 20:22:16 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:23.670 00:08:23.670 real 0m1.497s 00:08:23.670 user 0m1.312s 00:08:23.670 sys 0m0.183s 00:08:23.670 20:22:16 accel.accel_dif_generate_copy -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:23.670 20:22:16 accel.accel_dif_generate_copy -- common/autotest_common.sh@10 -- # set +x 00:08:23.670 ************************************ 00:08:23.670 END TEST 
accel_dif_generate_copy 00:08:23.670 ************************************ 00:08:23.929 20:22:16 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:23.929 20:22:16 accel -- accel/accel.sh@115 -- # [[ y == y ]] 00:08:23.929 20:22:16 accel -- accel/accel.sh@116 -- # run_test accel_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:23.929 20:22:16 accel -- common/autotest_common.sh@1099 -- # '[' 8 -le 1 ']' 00:08:23.929 20:22:16 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:23.929 20:22:16 accel -- common/autotest_common.sh@10 -- # set +x 00:08:23.929 ************************************ 00:08:23.929 START TEST accel_comp 00:08:23.929 ************************************ 00:08:23.929 20:22:16 accel.accel_comp -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:23.929 20:22:16 accel.accel_comp -- accel/accel.sh@16 -- # local accel_opc 00:08:23.929 20:22:16 accel.accel_comp -- accel/accel.sh@17 -- # local accel_module 00:08:23.929 20:22:16 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:23.929 20:22:16 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:23.929 20:22:16 accel.accel_comp -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:23.929 20:22:16 accel.accel_comp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:23.929 20:22:16 accel.accel_comp -- accel/accel.sh@12 -- # build_accel_config 00:08:23.929 20:22:16 accel.accel_comp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:23.929 20:22:16 accel.accel_comp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:23.929 20:22:16 accel.accel_comp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:23.929 20:22:16 
accel.accel_comp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:23.929 20:22:16 accel.accel_comp -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:23.929 20:22:16 accel.accel_comp -- accel/accel.sh@40 -- # local IFS=, 00:08:23.929 20:22:16 accel.accel_comp -- accel/accel.sh@41 -- # jq -r . 00:08:23.929 [2024-07-15 20:22:16.141699] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 00:08:23.929 [2024-07-15 20:22:16.141826] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1323454 ] 00:08:24.187 [2024-07-15 20:22:16.337830] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:24.187 [2024-07-15 20:22:16.442872] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:24.187 20:22:16 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:24.187 20:22:16 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:24.187 20:22:16 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:24.187 20:22:16 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:24.187 20:22:16 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:24.187 20:22:16 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:24.187 20:22:16 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:24.187 20:22:16 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:24.187 20:22:16 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:24.187 20:22:16 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:24.187 20:22:16 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:24.187 20:22:16 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:24.187 20:22:16 accel.accel_comp -- accel/accel.sh@20 -- # val=0x1 00:08:24.187 20:22:16 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:24.187 20:22:16 
accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:24.187 20:22:16 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:24.187 20:22:16 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:24.187 20:22:16 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:24.187 20:22:16 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:24.187 20:22:16 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:24.187 20:22:16 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:24.187 20:22:16 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:24.187 20:22:16 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:24.187 20:22:16 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:24.187 20:22:16 accel.accel_comp -- accel/accel.sh@20 -- # val=compress 00:08:24.187 20:22:16 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:24.187 20:22:16 accel.accel_comp -- accel/accel.sh@23 -- # accel_opc=compress 00:08:24.187 20:22:16 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:24.187 20:22:16 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:24.187 20:22:16 accel.accel_comp -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:24.187 20:22:16 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:24.187 20:22:16 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:24.187 20:22:16 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:24.187 20:22:16 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:24.187 20:22:16 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:24.187 20:22:16 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:24.187 20:22:16 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:24.187 20:22:16 accel.accel_comp -- accel/accel.sh@20 -- # val=software 00:08:24.187 20:22:16 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:24.187 20:22:16 accel.accel_comp -- accel/accel.sh@22 -- # accel_module=software 
00:08:24.187 20:22:16 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:24.187 20:22:16 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:24.187 20:22:16 accel.accel_comp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:24.187 20:22:16 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:24.187 20:22:16 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:24.187 20:22:16 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:24.187 20:22:16 accel.accel_comp -- accel/accel.sh@20 -- # val=32 00:08:24.187 20:22:16 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:24.187 20:22:16 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:24.187 20:22:16 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:24.187 20:22:16 accel.accel_comp -- accel/accel.sh@20 -- # val=32 00:08:24.187 20:22:16 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:24.187 20:22:16 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:24.187 20:22:16 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:24.187 20:22:16 accel.accel_comp -- accel/accel.sh@20 -- # val=1 00:08:24.187 20:22:16 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:24.187 20:22:16 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:24.187 20:22:16 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:24.187 20:22:16 accel.accel_comp -- accel/accel.sh@20 -- # val='1 seconds' 00:08:24.187 20:22:16 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:24.187 20:22:16 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:24.187 20:22:16 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:24.187 20:22:16 accel.accel_comp -- accel/accel.sh@20 -- # val=No 00:08:24.187 20:22:16 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:24.188 20:22:16 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:24.188 20:22:16 
accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:24.188 20:22:16 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:24.188 20:22:16 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:24.188 20:22:16 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:24.188 20:22:16 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:24.188 20:22:16 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:24.188 20:22:16 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:24.188 20:22:16 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:24.188 20:22:16 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:25.570 20:22:17 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:25.570 20:22:17 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:25.570 20:22:17 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:25.570 20:22:17 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:25.570 20:22:17 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:25.570 20:22:17 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:25.570 20:22:17 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:25.570 20:22:17 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:25.570 20:22:17 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:25.570 20:22:17 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:25.570 20:22:17 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:25.570 20:22:17 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:25.570 20:22:17 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:25.570 20:22:17 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:25.570 20:22:17 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:25.570 20:22:17 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:25.570 20:22:17 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:25.570 20:22:17 accel.accel_comp -- 
accel/accel.sh@21 -- # case "$var" in 00:08:25.570 20:22:17 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:25.570 20:22:17 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:25.570 20:22:17 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:25.570 20:22:17 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:25.570 20:22:17 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:25.570 20:22:17 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:25.570 20:22:17 accel.accel_comp -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:25.570 20:22:17 accel.accel_comp -- accel/accel.sh@27 -- # [[ -n compress ]] 00:08:25.570 20:22:17 accel.accel_comp -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:25.570 00:08:25.570 real 0m1.600s 00:08:25.570 user 0m1.351s 00:08:25.570 sys 0m0.252s 00:08:25.570 20:22:17 accel.accel_comp -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:25.570 20:22:17 accel.accel_comp -- common/autotest_common.sh@10 -- # set +x 00:08:25.570 ************************************ 00:08:25.570 END TEST accel_comp 00:08:25.570 ************************************ 00:08:25.570 20:22:17 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:25.570 20:22:17 accel -- accel/accel.sh@117 -- # run_test accel_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:25.570 20:22:17 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:08:25.570 20:22:17 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:25.570 20:22:17 accel -- common/autotest_common.sh@10 -- # set +x 00:08:25.570 ************************************ 00:08:25.570 START TEST accel_decomp 00:08:25.570 ************************************ 00:08:25.570 20:22:17 accel.accel_decomp -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:25.570 20:22:17 
accel.accel_decomp -- accel/accel.sh@16 -- # local accel_opc 00:08:25.570 20:22:17 accel.accel_decomp -- accel/accel.sh@17 -- # local accel_module 00:08:25.570 20:22:17 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:25.570 20:22:17 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:25.570 20:22:17 accel.accel_decomp -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:25.570 20:22:17 accel.accel_decomp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:25.570 20:22:17 accel.accel_decomp -- accel/accel.sh@12 -- # build_accel_config 00:08:25.570 20:22:17 accel.accel_decomp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:25.570 20:22:17 accel.accel_decomp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:25.570 20:22:17 accel.accel_decomp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:25.570 20:22:17 accel.accel_decomp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:25.570 20:22:17 accel.accel_decomp -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:25.570 20:22:17 accel.accel_decomp -- accel/accel.sh@40 -- # local IFS=, 00:08:25.570 20:22:17 accel.accel_decomp -- accel/accel.sh@41 -- # jq -r . 00:08:25.570 [2024-07-15 20:22:17.811563] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
00:08:25.570 [2024-07-15 20:22:17.811629] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1323647 ] 00:08:25.570 [2024-07-15 20:22:17.941459] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:25.828 [2024-07-15 20:22:18.042654] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:25.828 20:22:18 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:25.828 20:22:18 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:25.828 20:22:18 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:25.828 20:22:18 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:25.828 20:22:18 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:25.828 20:22:18 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:25.829 20:22:18 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:25.829 20:22:18 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:25.829 20:22:18 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:25.829 20:22:18 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:25.829 20:22:18 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:25.829 20:22:18 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:25.829 20:22:18 accel.accel_decomp -- accel/accel.sh@20 -- # val=0x1 00:08:25.829 20:22:18 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:25.829 20:22:18 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:25.829 20:22:18 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:25.829 20:22:18 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:25.829 20:22:18 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:25.829 20:22:18 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:25.829 
20:22:18 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:25.829 20:22:18 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:25.829 20:22:18 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:25.829 20:22:18 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:25.829 20:22:18 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:25.829 20:22:18 accel.accel_decomp -- accel/accel.sh@20 -- # val=decompress 00:08:25.829 20:22:18 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:25.829 20:22:18 accel.accel_decomp -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:25.829 20:22:18 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:25.829 20:22:18 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:25.829 20:22:18 accel.accel_decomp -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:25.829 20:22:18 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:25.829 20:22:18 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:25.829 20:22:18 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:25.829 20:22:18 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:25.829 20:22:18 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:25.829 20:22:18 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:25.829 20:22:18 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:25.829 20:22:18 accel.accel_decomp -- accel/accel.sh@20 -- # val=software 00:08:25.829 20:22:18 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:25.829 20:22:18 accel.accel_decomp -- accel/accel.sh@22 -- # accel_module=software 00:08:25.829 20:22:18 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:25.829 20:22:18 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:25.829 20:22:18 accel.accel_decomp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:25.829 20:22:18 
accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:25.829 20:22:18 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:25.829 20:22:18 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:25.829 20:22:18 accel.accel_decomp -- accel/accel.sh@20 -- # val=32 00:08:25.829 20:22:18 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:25.829 20:22:18 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:25.829 20:22:18 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:25.829 20:22:18 accel.accel_decomp -- accel/accel.sh@20 -- # val=32 00:08:25.829 20:22:18 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:25.829 20:22:18 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:25.829 20:22:18 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:25.829 20:22:18 accel.accel_decomp -- accel/accel.sh@20 -- # val=1 00:08:25.829 20:22:18 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:25.829 20:22:18 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:25.829 20:22:18 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:25.829 20:22:18 accel.accel_decomp -- accel/accel.sh@20 -- # val='1 seconds' 00:08:25.829 20:22:18 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:25.829 20:22:18 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:25.829 20:22:18 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:25.829 20:22:18 accel.accel_decomp -- accel/accel.sh@20 -- # val=Yes 00:08:25.829 20:22:18 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:25.829 20:22:18 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:25.829 20:22:18 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:25.829 20:22:18 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:25.829 20:22:18 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:25.829 20:22:18 accel.accel_decomp -- 
accel/accel.sh@19 -- # IFS=: 00:08:25.829 20:22:18 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:25.829 20:22:18 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:25.829 20:22:18 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:25.829 20:22:18 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:25.829 20:22:18 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:27.204 20:22:19 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:27.204 20:22:19 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:27.204 20:22:19 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:27.204 20:22:19 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:27.204 20:22:19 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:27.204 20:22:19 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:27.204 20:22:19 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:27.204 20:22:19 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:27.204 20:22:19 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:27.204 20:22:19 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:27.204 20:22:19 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:27.204 20:22:19 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:27.204 20:22:19 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:27.204 20:22:19 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:27.204 20:22:19 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:27.204 20:22:19 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:27.204 20:22:19 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:27.204 20:22:19 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:27.204 20:22:19 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:27.204 20:22:19 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:27.204 20:22:19 
accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:27.204 20:22:19 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:27.204 20:22:19 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:27.204 20:22:19 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:27.204 20:22:19 accel.accel_decomp -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:27.204 20:22:19 accel.accel_decomp -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:27.204 20:22:19 accel.accel_decomp -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:27.204 00:08:27.204 real 0m1.508s 00:08:27.204 user 0m1.318s 00:08:27.204 sys 0m0.195s 00:08:27.204 20:22:19 accel.accel_decomp -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:27.204 20:22:19 accel.accel_decomp -- common/autotest_common.sh@10 -- # set +x 00:08:27.204 ************************************ 00:08:27.204 END TEST accel_decomp 00:08:27.204 ************************************ 00:08:27.204 20:22:19 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:27.204 20:22:19 accel -- accel/accel.sh@118 -- # run_test accel_decomp_full accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:27.204 20:22:19 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:08:27.204 20:22:19 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:27.204 20:22:19 accel -- common/autotest_common.sh@10 -- # set +x 00:08:27.204 ************************************ 00:08:27.204 START TEST accel_decomp_full 00:08:27.204 ************************************ 00:08:27.204 20:22:19 accel.accel_decomp_full -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:27.204 20:22:19 accel.accel_decomp_full -- accel/accel.sh@16 -- # local accel_opc 00:08:27.204 20:22:19 accel.accel_decomp_full -- accel/accel.sh@17 -- # local accel_module 00:08:27.204 
20:22:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:27.204 20:22:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:27.204 20:22:19 accel.accel_decomp_full -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:27.204 20:22:19 accel.accel_decomp_full -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:27.204 20:22:19 accel.accel_decomp_full -- accel/accel.sh@12 -- # build_accel_config 00:08:27.204 20:22:19 accel.accel_decomp_full -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:27.204 20:22:19 accel.accel_decomp_full -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:27.204 20:22:19 accel.accel_decomp_full -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:27.204 20:22:19 accel.accel_decomp_full -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:27.204 20:22:19 accel.accel_decomp_full -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:27.204 20:22:19 accel.accel_decomp_full -- accel/accel.sh@40 -- # local IFS=, 00:08:27.204 20:22:19 accel.accel_decomp_full -- accel/accel.sh@41 -- # jq -r . 00:08:27.204 [2024-07-15 20:22:19.395596] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
00:08:27.204 [2024-07-15 20:22:19.395658] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1323847 ] 00:08:27.204 [2024-07-15 20:22:19.525355] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:27.464 [2024-07-15 20:22:19.631021] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:27.464 20:22:19 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:27.464 20:22:19 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:27.464 20:22:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:27.464 20:22:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:27.464 20:22:19 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:27.464 20:22:19 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:27.464 20:22:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:27.464 20:22:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:27.464 20:22:19 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:27.464 20:22:19 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:27.464 20:22:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:27.464 20:22:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:27.464 20:22:19 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=0x1 00:08:27.464 20:22:19 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:27.464 20:22:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:27.464 20:22:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:27.464 20:22:19 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:27.464 20:22:19 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 
00:08:27.464 20:22:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:27.464 20:22:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:27.464 20:22:19 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:27.464 20:22:19 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:27.464 20:22:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:27.464 20:22:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:27.464 20:22:19 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=decompress 00:08:27.464 20:22:19 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:27.464 20:22:19 accel.accel_decomp_full -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:27.464 20:22:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:27.464 20:22:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:27.464 20:22:19 accel.accel_decomp_full -- accel/accel.sh@20 -- # val='111250 bytes' 00:08:27.464 20:22:19 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:27.464 20:22:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:27.464 20:22:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:27.464 20:22:19 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:27.464 20:22:19 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:27.464 20:22:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:27.464 20:22:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:27.464 20:22:19 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=software 00:08:27.464 20:22:19 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:27.464 20:22:19 accel.accel_decomp_full -- accel/accel.sh@22 -- # accel_module=software 00:08:27.464 20:22:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:27.464 20:22:19 accel.accel_decomp_full -- 
accel/accel.sh@19 -- # read -r var val 00:08:27.464 20:22:19 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:27.464 20:22:19 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:27.464 20:22:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:27.464 20:22:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:27.464 20:22:19 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=32 00:08:27.464 20:22:19 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:27.464 20:22:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:27.464 20:22:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:27.464 20:22:19 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=32 00:08:27.464 20:22:19 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:27.464 20:22:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:27.464 20:22:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:27.464 20:22:19 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=1 00:08:27.464 20:22:19 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:27.464 20:22:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:27.464 20:22:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:27.464 20:22:19 accel.accel_decomp_full -- accel/accel.sh@20 -- # val='1 seconds' 00:08:27.464 20:22:19 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:27.464 20:22:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:27.464 20:22:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:27.464 20:22:19 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=Yes 00:08:27.464 20:22:19 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:27.464 20:22:19 accel.accel_decomp_full -- 
accel/accel.sh@19 -- # IFS=: 00:08:27.464 20:22:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:27.464 20:22:19 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:27.464 20:22:19 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:27.464 20:22:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:27.464 20:22:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:27.464 20:22:19 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:27.464 20:22:19 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:27.464 20:22:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:27.464 20:22:19 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:28.844 20:22:20 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:28.844 20:22:20 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:28.844 20:22:20 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:28.844 20:22:20 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:28.844 20:22:20 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:28.844 20:22:20 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:28.844 20:22:20 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:28.844 20:22:20 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:28.844 20:22:20 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:28.844 20:22:20 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:28.844 20:22:20 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:28.844 20:22:20 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:28.844 20:22:20 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:28.844 20:22:20 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:28.844 20:22:20 accel.accel_decomp_full -- accel/accel.sh@19 
-- # IFS=: 00:08:28.844 20:22:20 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:28.844 20:22:20 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:28.844 20:22:20 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:28.844 20:22:20 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:28.844 20:22:20 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:28.844 20:22:20 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:28.844 20:22:20 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:28.845 20:22:20 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:28.845 20:22:20 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:28.845 20:22:20 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:28.845 20:22:20 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:28.845 20:22:20 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:28.845 00:08:28.845 real 0m1.525s 00:08:28.845 user 0m1.344s 00:08:28.845 sys 0m0.184s 00:08:28.845 20:22:20 accel.accel_decomp_full -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:28.845 20:22:20 accel.accel_decomp_full -- common/autotest_common.sh@10 -- # set +x 00:08:28.845 ************************************ 00:08:28.845 END TEST accel_decomp_full 00:08:28.845 ************************************ 00:08:28.845 20:22:20 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:28.845 20:22:20 accel -- accel/accel.sh@119 -- # run_test accel_decomp_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:28.845 20:22:20 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:08:28.845 20:22:20 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:28.845 20:22:20 accel -- common/autotest_common.sh@10 -- # set +x 00:08:28.845 
************************************ 00:08:28.845 START TEST accel_decomp_mcore 00:08:28.845 ************************************ 00:08:28.845 20:22:20 accel.accel_decomp_mcore -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:28.845 20:22:20 accel.accel_decomp_mcore -- accel/accel.sh@16 -- # local accel_opc 00:08:28.845 20:22:20 accel.accel_decomp_mcore -- accel/accel.sh@17 -- # local accel_module 00:08:28.845 20:22:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:28.845 20:22:20 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:28.845 20:22:20 accel.accel_decomp_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:28.845 20:22:20 accel.accel_decomp_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:28.845 20:22:20 accel.accel_decomp_mcore -- accel/accel.sh@12 -- # build_accel_config 00:08:28.845 20:22:20 accel.accel_decomp_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:28.845 20:22:20 accel.accel_decomp_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:28.845 20:22:20 accel.accel_decomp_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:28.845 20:22:20 accel.accel_decomp_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:28.845 20:22:20 accel.accel_decomp_mcore -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:28.845 20:22:20 accel.accel_decomp_mcore -- accel/accel.sh@40 -- # local IFS=, 00:08:28.845 20:22:20 accel.accel_decomp_mcore -- accel/accel.sh@41 -- # jq -r . 00:08:28.845 [2024-07-15 20:22:21.002402] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
00:08:28.845 [2024-07-15 20:22:21.002465] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1324121 ] 00:08:28.845 [2024-07-15 20:22:21.131603] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:08:29.105 [2024-07-15 20:22:21.236802] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:29.105 [2024-07-15 20:22:21.236907] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:08:29.105 [2024-07-15 20:22:21.237010] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:08:29.105 [2024-07-15 20:22:21.237012] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:29.105 20:22:21 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:29.105 20:22:21 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:29.105 20:22:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:29.105 20:22:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:29.105 20:22:21 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:29.105 20:22:21 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:29.105 20:22:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:29.105 20:22:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:29.105 20:22:21 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:29.105 20:22:21 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:29.105 20:22:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:29.105 20:22:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:29.105 20:22:21 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=0xf 00:08:29.105 20:22:21 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 
00:08:29.105 20:22:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:29.105 20:22:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:29.105 20:22:21 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:29.105 20:22:21 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:29.105 20:22:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:29.105 20:22:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:29.105 20:22:21 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:29.105 20:22:21 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:29.105 20:22:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:29.105 20:22:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:29.105 20:22:21 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=decompress 00:08:29.105 20:22:21 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:29.105 20:22:21 accel.accel_decomp_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:29.105 20:22:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:29.105 20:22:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:29.105 20:22:21 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:29.105 20:22:21 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:29.105 20:22:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:29.105 20:22:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:29.105 20:22:21 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:29.105 20:22:21 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:29.105 20:22:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:29.105 20:22:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:29.105 20:22:21 
accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=software 00:08:29.105 20:22:21 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:29.105 20:22:21 accel.accel_decomp_mcore -- accel/accel.sh@22 -- # accel_module=software 00:08:29.105 20:22:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:29.105 20:22:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:29.105 20:22:21 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:29.105 20:22:21 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:29.105 20:22:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:29.105 20:22:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:29.105 20:22:21 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:08:29.105 20:22:21 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:29.105 20:22:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:29.105 20:22:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:29.105 20:22:21 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:08:29.105 20:22:21 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:29.105 20:22:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:29.105 20:22:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:29.105 20:22:21 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=1 00:08:29.105 20:22:21 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:29.105 20:22:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:29.105 20:22:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:29.105 20:22:21 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:08:29.105 20:22:21 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 
00:08:29.105 20:22:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:29.105 20:22:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:29.105 20:22:21 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=Yes 00:08:29.105 20:22:21 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:29.105 20:22:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:29.105 20:22:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:29.105 20:22:21 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:29.105 20:22:21 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:29.105 20:22:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:29.106 20:22:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:29.106 20:22:21 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:29.106 20:22:21 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:29.106 20:22:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:29.106 20:22:21 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:30.501 20:22:22 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:30.501 20:22:22 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:30.501 20:22:22 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:30.501 20:22:22 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:30.501 20:22:22 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:30.501 20:22:22 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:30.501 20:22:22 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:30.501 20:22:22 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:30.501 20:22:22 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:30.501 20:22:22 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case 
"$var" in 00:08:30.501 20:22:22 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:30.501 20:22:22 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:30.501 20:22:22 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:30.501 20:22:22 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:30.501 20:22:22 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:30.501 20:22:22 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:30.501 20:22:22 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:30.501 20:22:22 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:30.501 20:22:22 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:30.501 20:22:22 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:30.501 20:22:22 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:30.501 20:22:22 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:30.502 20:22:22 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:30.502 20:22:22 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:30.502 20:22:22 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:30.502 20:22:22 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:30.502 20:22:22 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:30.502 20:22:22 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:30.502 20:22:22 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:30.502 20:22:22 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:30.502 20:22:22 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:30.502 20:22:22 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:30.502 20:22:22 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:30.502 20:22:22 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # 
case "$var" in 00:08:30.502 20:22:22 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:30.502 20:22:22 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:30.502 20:22:22 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:30.502 20:22:22 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:30.502 20:22:22 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:30.502 00:08:30.502 real 0m1.521s 00:08:30.502 user 0m4.753s 00:08:30.502 sys 0m0.215s 00:08:30.502 20:22:22 accel.accel_decomp_mcore -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:30.502 20:22:22 accel.accel_decomp_mcore -- common/autotest_common.sh@10 -- # set +x 00:08:30.502 ************************************ 00:08:30.502 END TEST accel_decomp_mcore 00:08:30.502 ************************************ 00:08:30.502 20:22:22 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:30.502 20:22:22 accel -- accel/accel.sh@120 -- # run_test accel_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:30.502 20:22:22 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:08:30.502 20:22:22 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:30.502 20:22:22 accel -- common/autotest_common.sh@10 -- # set +x 00:08:30.502 ************************************ 00:08:30.502 START TEST accel_decomp_full_mcore 00:08:30.502 ************************************ 00:08:30.502 20:22:22 accel.accel_decomp_full_mcore -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:30.502 20:22:22 accel.accel_decomp_full_mcore -- accel/accel.sh@16 -- # local accel_opc 00:08:30.502 20:22:22 accel.accel_decomp_full_mcore -- accel/accel.sh@17 -- # local accel_module 00:08:30.502 20:22:22 
accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:30.502 20:22:22 accel.accel_decomp_full_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:30.502 20:22:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:30.502 20:22:22 accel.accel_decomp_full_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:30.502 20:22:22 accel.accel_decomp_full_mcore -- accel/accel.sh@12 -- # build_accel_config 00:08:30.502 20:22:22 accel.accel_decomp_full_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:30.502 20:22:22 accel.accel_decomp_full_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:30.502 20:22:22 accel.accel_decomp_full_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:30.502 20:22:22 accel.accel_decomp_full_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:30.502 20:22:22 accel.accel_decomp_full_mcore -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:30.502 20:22:22 accel.accel_decomp_full_mcore -- accel/accel.sh@40 -- # local IFS=, 00:08:30.502 20:22:22 accel.accel_decomp_full_mcore -- accel/accel.sh@41 -- # jq -r . 00:08:30.502 [2024-07-15 20:22:22.603531] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
00:08:30.502 [2024-07-15 20:22:22.603596] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1324405 ] 00:08:30.502 [2024-07-15 20:22:22.735861] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:08:30.502 [2024-07-15 20:22:22.845684] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:30.502 [2024-07-15 20:22:22.845786] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:08:30.502 [2024-07-15 20:22:22.845886] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:08:30.502 [2024-07-15 20:22:22.845887] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:30.762 20:22:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:30.762 20:22:22 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:30.762 20:22:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:30.762 20:22:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:30.762 20:22:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:30.762 20:22:22 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:30.762 20:22:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:30.762 20:22:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:30.762 20:22:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:30.762 20:22:22 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:30.762 20:22:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:30.762 20:22:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:30.762 20:22:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=0xf 00:08:30.762 20:22:22 
accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:30.762 20:22:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:30.762 20:22:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:30.762 20:22:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:30.762 20:22:22 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:30.762 20:22:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:30.762 20:22:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:30.762 20:22:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:30.762 20:22:22 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:30.762 20:22:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:30.762 20:22:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:30.762 20:22:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=decompress 00:08:30.762 20:22:22 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:30.762 20:22:22 accel.accel_decomp_full_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:30.762 20:22:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:30.762 20:22:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:30.762 20:22:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val='111250 bytes' 00:08:30.762 20:22:22 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:30.762 20:22:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:30.762 20:22:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:30.762 20:22:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:30.762 20:22:22 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:30.762 20:22:22 
accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:30.762 20:22:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:30.762 20:22:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=software 00:08:30.762 20:22:22 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:30.762 20:22:22 accel.accel_decomp_full_mcore -- accel/accel.sh@22 -- # accel_module=software 00:08:30.762 20:22:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:30.762 20:22:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:30.762 20:22:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:30.762 20:22:22 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:30.762 20:22:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:30.762 20:22:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:30.762 20:22:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:08:30.762 20:22:22 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:30.762 20:22:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:30.762 20:22:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:30.762 20:22:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:08:30.762 20:22:22 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:30.762 20:22:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:30.762 20:22:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:30.762 20:22:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=1 00:08:30.762 20:22:22 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:30.762 20:22:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- 
# IFS=: 00:08:30.762 20:22:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:30.762 20:22:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:08:30.762 20:22:22 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:30.762 20:22:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:30.762 20:22:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:30.762 20:22:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=Yes 00:08:30.762 20:22:22 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:30.762 20:22:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:30.762 20:22:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:30.762 20:22:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:30.762 20:22:22 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:30.762 20:22:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:30.762 20:22:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:30.762 20:22:22 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:30.762 20:22:22 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:30.762 20:22:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:30.762 20:22:22 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:32.142 20:22:24 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:32.142 20:22:24 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:32.142 20:22:24 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:32.142 20:22:24 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:32.142 20:22:24 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:32.142 20:22:24 
accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:32.142 20:22:24 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:32.142 20:22:24 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:32.142 20:22:24 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:32.142 20:22:24 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:32.142 20:22:24 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:32.142 20:22:24 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:32.142 20:22:24 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:32.142 20:22:24 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:32.142 20:22:24 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:32.142 20:22:24 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:32.142 20:22:24 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:32.142 20:22:24 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:32.142 20:22:24 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:32.142 20:22:24 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:32.142 20:22:24 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:32.142 20:22:24 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:32.142 20:22:24 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:32.142 20:22:24 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:32.142 20:22:24 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:32.142 20:22:24 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:32.142 20:22:24 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:32.143 20:22:24 accel.accel_decomp_full_mcore -- accel/accel.sh@19 
-- # read -r var val 00:08:32.143 20:22:24 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:32.143 20:22:24 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:32.143 20:22:24 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:32.143 20:22:24 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:32.143 20:22:24 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:32.143 20:22:24 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:32.143 20:22:24 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:32.143 20:22:24 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:32.143 20:22:24 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:32.143 20:22:24 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:32.143 20:22:24 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:32.143 00:08:32.143 real 0m1.564s 00:08:32.143 user 0m4.890s 00:08:32.143 sys 0m0.223s 00:08:32.143 20:22:24 accel.accel_decomp_full_mcore -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:32.143 20:22:24 accel.accel_decomp_full_mcore -- common/autotest_common.sh@10 -- # set +x 00:08:32.143 ************************************ 00:08:32.143 END TEST accel_decomp_full_mcore 00:08:32.143 ************************************ 00:08:32.143 20:22:24 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:32.143 20:22:24 accel -- accel/accel.sh@121 -- # run_test accel_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:32.143 20:22:24 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:08:32.143 20:22:24 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:32.143 20:22:24 accel -- common/autotest_common.sh@10 -- # set +x 00:08:32.143 
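The `IFS=:` / `read -r var val` pairs repeated throughout the trace above come from accel.sh's test-case parser, which walks colon-separated `key:value` lines through a `case` statement (the `val=software`, `val=32`, `val=decompress` lines are the values being assigned). A minimal standalone sketch of that pattern — the function name and key names below are illustrative, not the actual accel.sh code:

```shell
#!/bin/sh
# Sketch of the "IFS=: read -r var val" + case pattern seen in the trace.
# parse_config reads key:value lines and assigns matching shell variables;
# the keys (opc, module, qd) are illustrative stand-ins.
parse_config() {
    while IFS=: read -r var val; do
        case "$var" in
            opc) accel_opc=$val ;;          # e.g. decompress
            module) accel_module=$val ;;    # e.g. software
            qd) queue_depth=$val ;;         # e.g. 32
        esac
    done
}

parse_config <<'EOF'
opc:decompress
module:software
qd:32
EOF

echo "$accel_opc $accel_module $queue_depth"
```

Because the here-document is redirected into the function rather than piped, the `while` loop runs in the current shell and the variables survive after `parse_config` returns.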
************************************ 00:08:32.143 START TEST accel_decomp_mthread 00:08:32.143 ************************************ 00:08:32.143 20:22:24 accel.accel_decomp_mthread -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:32.143 20:22:24 accel.accel_decomp_mthread -- accel/accel.sh@16 -- # local accel_opc 00:08:32.143 20:22:24 accel.accel_decomp_mthread -- accel/accel.sh@17 -- # local accel_module 00:08:32.143 20:22:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:32.143 20:22:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:32.143 20:22:24 accel.accel_decomp_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:32.143 20:22:24 accel.accel_decomp_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:32.143 20:22:24 accel.accel_decomp_mthread -- accel/accel.sh@12 -- # build_accel_config 00:08:32.143 20:22:24 accel.accel_decomp_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:32.143 20:22:24 accel.accel_decomp_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:32.143 20:22:24 accel.accel_decomp_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:32.143 20:22:24 accel.accel_decomp_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:32.143 20:22:24 accel.accel_decomp_mthread -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:32.143 20:22:24 accel.accel_decomp_mthread -- accel/accel.sh@40 -- # local IFS=, 00:08:32.143 20:22:24 accel.accel_decomp_mthread -- accel/accel.sh@41 -- # jq -r . 00:08:32.143 [2024-07-15 20:22:24.259190] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
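The `accel_perf -c /dev/fd/62 ...` command echoed above shows how the harness passes its JSON accel configuration: `build_accel_config` collects entries in `accel_json_cfg`, joins them with `local IFS=,`, and hands the result to `accel_perf` through a process-substitution file descriptor. A rough sketch of that wiring, with `cat` standing in for `accel_perf` and a pipe standing in for `/dev/fd/62` (the JSON entry is the one visible later in this log; the wrapper shape is an assumption):

```shell
#!/bin/sh
# Sketch: accel.sh joins accel_json_cfg entries with commas and feeds the
# resulting JSON to accel_perf via "-c /dev/fd/62". Here a plain pipe and
# cat stand in for the fd plumbing and the accel_perf binary.
accel_json_cfg='{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}'
printf '{"subsystems":[{"subsystem":"accel","config":[%s]}]}\n' "$accel_json_cfg" | cat
```

With an empty `accel_json_cfg` (the `[[ 0 -gt 0 ]]` checks in the trace all failing), the same template simply emits an empty `config` array.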
00:08:32.143 [2024-07-15 20:22:24.259315] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1324607 ] 00:08:32.143 [2024-07-15 20:22:24.456799] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:32.403 [2024-07-15 20:22:24.563333] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:32.403 20:22:24 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:32.403 20:22:24 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:32.403 20:22:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:32.403 20:22:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:32.403 20:22:24 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:32.403 20:22:24 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:32.403 20:22:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:32.403 20:22:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:32.403 20:22:24 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:32.403 20:22:24 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:32.403 20:22:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:32.403 20:22:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:32.403 20:22:24 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=0x1 00:08:32.403 20:22:24 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:32.403 20:22:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:32.403 20:22:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:32.403 20:22:24 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:32.403 20:22:24 
accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:32.403 20:22:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:32.403 20:22:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:32.403 20:22:24 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:32.403 20:22:24 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:32.403 20:22:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:32.403 20:22:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:32.403 20:22:24 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=decompress 00:08:32.403 20:22:24 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:32.403 20:22:24 accel.accel_decomp_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:32.403 20:22:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:32.403 20:22:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:32.403 20:22:24 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:32.403 20:22:24 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:32.403 20:22:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:32.403 20:22:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:32.403 20:22:24 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:32.403 20:22:24 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:32.403 20:22:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:32.403 20:22:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:32.403 20:22:24 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=software 00:08:32.403 20:22:24 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:32.403 20:22:24 accel.accel_decomp_mthread -- accel/accel.sh@22 -- # 
accel_module=software 00:08:32.403 20:22:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:32.403 20:22:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:32.403 20:22:24 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:32.403 20:22:24 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:32.403 20:22:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:32.403 20:22:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:32.403 20:22:24 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:08:32.403 20:22:24 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:32.403 20:22:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:32.403 20:22:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:32.403 20:22:24 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:08:32.403 20:22:24 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:32.403 20:22:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:32.403 20:22:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:32.403 20:22:24 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=2 00:08:32.403 20:22:24 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:32.403 20:22:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:32.403 20:22:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:32.403 20:22:24 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:08:32.403 20:22:24 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:32.403 20:22:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:32.403 20:22:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:32.403 
20:22:24 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=Yes 00:08:32.403 20:22:24 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:32.403 20:22:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:32.403 20:22:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:32.403 20:22:24 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:32.403 20:22:24 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:32.403 20:22:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:32.403 20:22:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:32.403 20:22:24 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:32.404 20:22:24 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:32.404 20:22:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:32.404 20:22:24 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:33.782 20:22:25 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:33.782 20:22:25 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:33.782 20:22:25 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:33.782 20:22:25 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:33.782 20:22:25 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:33.782 20:22:25 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:33.782 20:22:25 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:33.782 20:22:25 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:33.782 20:22:25 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:33.783 20:22:25 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:33.783 20:22:25 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:33.783 20:22:25 accel.accel_decomp_mthread 
-- accel/accel.sh@19 -- # read -r var val 00:08:33.783 20:22:25 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:33.783 20:22:25 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:33.783 20:22:25 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:33.783 20:22:25 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:33.783 20:22:25 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:33.783 20:22:25 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:33.783 20:22:25 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:33.783 20:22:25 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:33.783 20:22:25 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:33.783 20:22:25 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:33.783 20:22:25 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:33.783 20:22:25 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:33.783 20:22:25 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:33.783 20:22:25 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:33.783 20:22:25 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:33.783 20:22:25 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:33.783 20:22:25 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:33.783 20:22:25 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:33.783 20:22:25 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:33.783 00:08:33.783 real 0m1.596s 00:08:33.783 user 0m1.337s 00:08:33.783 sys 0m0.265s 00:08:33.783 20:22:25 accel.accel_decomp_mthread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:33.783 20:22:25 accel.accel_decomp_mthread -- common/autotest_common.sh@10 -- # set +x 
00:08:33.783 ************************************ 00:08:33.783 END TEST accel_decomp_mthread 00:08:33.783 ************************************ 00:08:33.783 20:22:25 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:33.783 20:22:25 accel -- accel/accel.sh@122 -- # run_test accel_decomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:33.783 20:22:25 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:08:33.783 20:22:25 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:33.783 20:22:25 accel -- common/autotest_common.sh@10 -- # set +x 00:08:33.783 ************************************ 00:08:33.783 START TEST accel_decomp_full_mthread 00:08:33.783 ************************************ 00:08:33.783 20:22:25 accel.accel_decomp_full_mthread -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:33.783 20:22:25 accel.accel_decomp_full_mthread -- accel/accel.sh@16 -- # local accel_opc 00:08:33.783 20:22:25 accel.accel_decomp_full_mthread -- accel/accel.sh@17 -- # local accel_module 00:08:33.783 20:22:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:33.783 20:22:25 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:33.783 20:22:25 accel.accel_decomp_full_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:33.783 20:22:25 accel.accel_decomp_full_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:33.783 20:22:25 accel.accel_decomp_full_mthread -- accel/accel.sh@12 -- # build_accel_config 00:08:33.783 20:22:25 accel.accel_decomp_full_mthread -- 
accel/accel.sh@31 -- # accel_json_cfg=() 00:08:33.783 20:22:25 accel.accel_decomp_full_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:33.783 20:22:25 accel.accel_decomp_full_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:33.783 20:22:25 accel.accel_decomp_full_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:33.783 20:22:25 accel.accel_decomp_full_mthread -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:33.783 20:22:25 accel.accel_decomp_full_mthread -- accel/accel.sh@40 -- # local IFS=, 00:08:33.783 20:22:25 accel.accel_decomp_full_mthread -- accel/accel.sh@41 -- # jq -r . 00:08:33.783 [2024-07-15 20:22:25.923144] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 00:08:33.783 [2024-07-15 20:22:25.923208] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1324805 ] 00:08:33.783 [2024-07-15 20:22:26.051209] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:33.783 [2024-07-15 20:22:26.148399] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:34.042 20:22:26 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:34.042 20:22:26 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.042 20:22:26 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.042 20:22:26 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.042 20:22:26 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:34.042 20:22:26 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.042 20:22:26 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.042 20:22:26 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.042 20:22:26 
accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:34.042 20:22:26 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.042 20:22:26 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.042 20:22:26 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.042 20:22:26 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=0x1 00:08:34.042 20:22:26 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.042 20:22:26 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.042 20:22:26 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.042 20:22:26 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:34.042 20:22:26 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.042 20:22:26 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.042 20:22:26 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.042 20:22:26 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:34.042 20:22:26 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.042 20:22:26 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.042 20:22:26 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.042 20:22:26 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=decompress 00:08:34.042 20:22:26 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.042 20:22:26 accel.accel_decomp_full_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:34.042 20:22:26 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.042 20:22:26 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.042 20:22:26 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val='111250 
bytes' 00:08:34.042 20:22:26 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.042 20:22:26 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.042 20:22:26 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.042 20:22:26 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:34.042 20:22:26 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.042 20:22:26 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.042 20:22:26 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.042 20:22:26 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=software 00:08:34.042 20:22:26 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.042 20:22:26 accel.accel_decomp_full_mthread -- accel/accel.sh@22 -- # accel_module=software 00:08:34.042 20:22:26 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.042 20:22:26 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.042 20:22:26 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:34.042 20:22:26 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.043 20:22:26 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.043 20:22:26 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.043 20:22:26 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:08:34.043 20:22:26 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.043 20:22:26 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.043 20:22:26 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.043 20:22:26 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 
00:08:34.043 20:22:26 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.043 20:22:26 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.043 20:22:26 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.043 20:22:26 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=2 00:08:34.043 20:22:26 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.043 20:22:26 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.043 20:22:26 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.043 20:22:26 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:08:34.043 20:22:26 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.043 20:22:26 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.043 20:22:26 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.043 20:22:26 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=Yes 00:08:34.043 20:22:26 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.043 20:22:26 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.043 20:22:26 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.043 20:22:26 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:34.043 20:22:26 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.043 20:22:26 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.043 20:22:26 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.043 20:22:26 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:34.043 20:22:26 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.043 20:22:26 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # 
IFS=: 00:08:34.043 20:22:26 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:35.421 20:22:27 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:35.421 20:22:27 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:35.421 20:22:27 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:35.421 20:22:27 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:35.421 20:22:27 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:35.421 20:22:27 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:35.421 20:22:27 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:35.421 20:22:27 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:35.421 20:22:27 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:35.421 20:22:27 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:35.421 20:22:27 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:35.421 20:22:27 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:35.421 20:22:27 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:35.421 20:22:27 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:35.421 20:22:27 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:35.421 20:22:27 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:35.421 20:22:27 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:35.421 20:22:27 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:35.421 20:22:27 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:35.421 20:22:27 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:35.421 20:22:27 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 
00:08:35.421 20:22:27 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:35.421 20:22:27 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:35.421 20:22:27 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:35.421 20:22:27 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:35.421 20:22:27 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:35.421 20:22:27 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:35.421 20:22:27 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:35.421 20:22:27 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:35.421 20:22:27 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:35.421 20:22:27 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:35.421 00:08:35.421 real 0m1.522s 00:08:35.421 user 0m1.325s 00:08:35.421 sys 0m0.202s 00:08:35.421 20:22:27 accel.accel_decomp_full_mthread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:35.421 20:22:27 accel.accel_decomp_full_mthread -- common/autotest_common.sh@10 -- # set +x 00:08:35.421 ************************************ 00:08:35.421 END TEST accel_decomp_full_mthread 00:08:35.421 ************************************ 00:08:35.421 20:22:27 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:35.421 20:22:27 accel -- accel/accel.sh@124 -- # [[ y == y ]] 00:08:35.421 20:22:27 accel -- accel/accel.sh@125 -- # COMPRESSDEV=1 00:08:35.421 20:22:27 accel -- accel/accel.sh@126 -- # get_expected_opcs 00:08:35.421 20:22:27 accel -- accel/accel.sh@60 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:08:35.421 20:22:27 accel -- accel/accel.sh@62 -- # spdk_tgt_pid=1325002 00:08:35.421 20:22:27 accel -- accel/accel.sh@63 -- # waitforlisten 1325002 00:08:35.421 20:22:27 accel -- common/autotest_common.sh@829 -- 
# '[' -z 1325002 ']' 00:08:35.421 20:22:27 accel -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:35.421 20:22:27 accel -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:35.421 20:22:27 accel -- accel/accel.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:08:35.421 20:22:27 accel -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:35.421 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:35.421 20:22:27 accel -- accel/accel.sh@61 -- # build_accel_config 00:08:35.421 20:22:27 accel -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:35.421 20:22:27 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:35.421 20:22:27 accel -- common/autotest_common.sh@10 -- # set +x 00:08:35.421 20:22:27 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:35.421 20:22:27 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:35.421 20:22:27 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:35.421 20:22:27 accel -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:35.421 20:22:27 accel -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:35.421 20:22:27 accel -- accel/accel.sh@40 -- # local IFS=, 00:08:35.421 20:22:27 accel -- accel/accel.sh@41 -- # jq -r . 00:08:35.421 [2024-07-15 20:22:27.522902] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
00:08:35.421 [2024-07-15 20:22:27.522976] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1325002 ] 00:08:35.421 [2024-07-15 20:22:27.640858] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:35.421 [2024-07-15 20:22:27.738784] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:36.358 [2024-07-15 20:22:28.501758] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:36.358 20:22:28 accel -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:36.358 20:22:28 accel -- common/autotest_common.sh@862 -- # return 0 00:08:36.358 20:22:28 accel -- accel/accel.sh@65 -- # [[ 0 -gt 0 ]] 00:08:36.358 20:22:28 accel -- accel/accel.sh@66 -- # [[ 0 -gt 0 ]] 00:08:36.358 20:22:28 accel -- accel/accel.sh@67 -- # [[ 0 -gt 0 ]] 00:08:36.358 20:22:28 accel -- accel/accel.sh@68 -- # [[ -n 1 ]] 00:08:36.358 20:22:28 accel -- accel/accel.sh@68 -- # check_save_config compressdev_scan_accel_module 00:08:36.358 20:22:28 accel -- accel/accel.sh@56 -- # rpc_cmd save_config 00:08:36.358 20:22:28 accel -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:36.358 20:22:28 accel -- accel/accel.sh@56 -- # jq -r '.subsystems[] | select(.subsystem=="accel").config[]' 00:08:36.358 20:22:28 accel -- common/autotest_common.sh@10 -- # set +x 00:08:36.358 20:22:28 accel -- accel/accel.sh@56 -- # grep compressdev_scan_accel_module 00:08:36.617 20:22:28 accel -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:36.617 "method": "compressdev_scan_accel_module", 00:08:36.617 20:22:28 accel -- accel/accel.sh@70 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". 
| to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:08:36.617 20:22:28 accel -- accel/accel.sh@70 -- # rpc_cmd accel_get_opc_assignments 00:08:36.617 20:22:28 accel -- accel/accel.sh@70 -- # jq -r '. | to_entries | map("\(.key)=\(.value)") | .[]' 00:08:36.617 20:22:28 accel -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:36.617 20:22:28 accel -- common/autotest_common.sh@10 -- # set +x 00:08:36.617 20:22:28 accel -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:36.617 20:22:28 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:36.617 20:22:28 accel -- accel/accel.sh@72 -- # IFS== 00:08:36.617 20:22:28 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:36.617 20:22:28 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:36.617 20:22:28 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:36.617 20:22:28 accel -- accel/accel.sh@72 -- # IFS== 00:08:36.617 20:22:28 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:36.617 20:22:28 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:36.617 20:22:28 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:36.617 20:22:28 accel -- accel/accel.sh@72 -- # IFS== 00:08:36.617 20:22:28 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:36.617 20:22:28 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:36.617 20:22:28 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:36.617 20:22:28 accel -- accel/accel.sh@72 -- # IFS== 00:08:36.617 20:22:28 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:36.617 20:22:28 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:36.617 20:22:28 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:36.617 20:22:28 accel -- accel/accel.sh@72 -- # IFS== 00:08:36.617 20:22:28 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:36.617 20:22:28 accel -- accel/accel.sh@73 -- # 
expected_opcs["$opc"]=software 00:08:36.617 20:22:28 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:36.617 20:22:28 accel -- accel/accel.sh@72 -- # IFS== 00:08:36.617 20:22:28 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:36.617 20:22:28 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:36.617 20:22:28 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:36.617 20:22:28 accel -- accel/accel.sh@72 -- # IFS== 00:08:36.617 20:22:28 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:36.617 20:22:28 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=dpdk_compressdev 00:08:36.617 20:22:28 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:36.617 20:22:28 accel -- accel/accel.sh@72 -- # IFS== 00:08:36.617 20:22:28 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:36.617 20:22:28 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=dpdk_compressdev 00:08:36.617 20:22:28 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:36.617 20:22:28 accel -- accel/accel.sh@72 -- # IFS== 00:08:36.617 20:22:28 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:36.617 20:22:28 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:36.617 20:22:28 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:36.617 20:22:28 accel -- accel/accel.sh@72 -- # IFS== 00:08:36.617 20:22:28 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:36.617 20:22:28 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:36.617 20:22:28 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:36.617 20:22:28 accel -- accel/accel.sh@72 -- # IFS== 00:08:36.617 20:22:28 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:36.617 20:22:28 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:36.617 20:22:28 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:36.617 20:22:28 accel -- 
accel/accel.sh@72 -- # IFS== 00:08:36.617 20:22:28 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:36.617 20:22:28 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:36.617 20:22:28 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:36.617 20:22:28 accel -- accel/accel.sh@72 -- # IFS== 00:08:36.617 20:22:28 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:36.617 20:22:28 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:36.617 20:22:28 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:36.617 20:22:28 accel -- accel/accel.sh@72 -- # IFS== 00:08:36.617 20:22:28 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:36.617 20:22:28 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:36.617 20:22:28 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:36.617 20:22:28 accel -- accel/accel.sh@72 -- # IFS== 00:08:36.617 20:22:28 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:36.617 20:22:28 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:36.617 20:22:28 accel -- accel/accel.sh@75 -- # killprocess 1325002 00:08:36.617 20:22:28 accel -- common/autotest_common.sh@948 -- # '[' -z 1325002 ']' 00:08:36.617 20:22:28 accel -- common/autotest_common.sh@952 -- # kill -0 1325002 00:08:36.617 20:22:28 accel -- common/autotest_common.sh@953 -- # uname 00:08:36.617 20:22:28 accel -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:08:36.617 20:22:28 accel -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1325002 00:08:36.875 20:22:29 accel -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:08:36.876 20:22:29 accel -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:08:36.876 20:22:29 accel -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1325002' 00:08:36.876 killing process with pid 1325002 00:08:36.876 20:22:29 accel -- common/autotest_common.sh@967 -- # 
kill 1325002 00:08:36.876 20:22:29 accel -- common/autotest_common.sh@972 -- # wait 1325002 00:08:37.134 20:22:29 accel -- accel/accel.sh@76 -- # trap - ERR 00:08:37.134 20:22:29 accel -- accel/accel.sh@127 -- # run_test accel_cdev_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:37.134 20:22:29 accel -- common/autotest_common.sh@1099 -- # '[' 8 -le 1 ']' 00:08:37.134 20:22:29 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:37.134 20:22:29 accel -- common/autotest_common.sh@10 -- # set +x 00:08:37.134 ************************************ 00:08:37.134 START TEST accel_cdev_comp 00:08:37.134 ************************************ 00:08:37.134 20:22:29 accel.accel_cdev_comp -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:37.134 20:22:29 accel.accel_cdev_comp -- accel/accel.sh@16 -- # local accel_opc 00:08:37.134 20:22:29 accel.accel_cdev_comp -- accel/accel.sh@17 -- # local accel_module 00:08:37.134 20:22:29 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:37.134 20:22:29 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:37.134 20:22:29 accel.accel_cdev_comp -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:37.134 20:22:29 accel.accel_cdev_comp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:37.134 20:22:29 accel.accel_cdev_comp -- accel/accel.sh@12 -- # build_accel_config 00:08:37.134 20:22:29 accel.accel_cdev_comp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:37.134 20:22:29 accel.accel_cdev_comp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:37.134 20:22:29 accel.accel_cdev_comp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:37.134 20:22:29 
accel.accel_cdev_comp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:37.134 20:22:29 accel.accel_cdev_comp -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:37.134 20:22:29 accel.accel_cdev_comp -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:37.134 20:22:29 accel.accel_cdev_comp -- accel/accel.sh@40 -- # local IFS=, 00:08:37.134 20:22:29 accel.accel_cdev_comp -- accel/accel.sh@41 -- # jq -r . 00:08:37.134 [2024-07-15 20:22:29.485606] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 00:08:37.134 [2024-07-15 20:22:29.485664] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1325362 ] 00:08:37.393 [2024-07-15 20:22:29.612853] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:37.393 [2024-07-15 20:22:29.712428] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:38.330 [2024-07-15 20:22:30.492525] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:38.330 [2024-07-15 20:22:30.495155] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2725080 PMD being used: compress_qat 00:08:38.330 20:22:30 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:38.330 20:22:30 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:38.330 20:22:30 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:38.330 20:22:30 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:38.330 20:22:30 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:38.330 20:22:30 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:38.330 20:22:30 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:38.330 [2024-07-15 20:22:30.499261] accel_dpdk_compressdev.c: 690:_set_pmd: 
*NOTICE*: Channel 0x2729e60 PMD being used: compress_qat 00:08:38.330 20:22:30 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:38.330 20:22:30 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:38.330 20:22:30 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:38.330 20:22:30 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:38.330 20:22:30 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:38.330 20:22:30 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=0x1 00:08:38.330 20:22:30 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:38.330 20:22:30 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:38.330 20:22:30 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:38.330 20:22:30 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:38.330 20:22:30 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:38.330 20:22:30 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:38.330 20:22:30 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:38.330 20:22:30 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:38.330 20:22:30 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:38.330 20:22:30 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:38.330 20:22:30 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:38.330 20:22:30 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=compress 00:08:38.330 20:22:30 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:38.330 20:22:30 accel.accel_cdev_comp -- accel/accel.sh@23 -- # accel_opc=compress 00:08:38.330 20:22:30 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:38.330 20:22:30 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:38.330 20:22:30 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:38.330 20:22:30 accel.accel_cdev_comp -- 
accel/accel.sh@21 -- # case "$var" in 00:08:38.330 20:22:30 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:38.330 20:22:30 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:38.330 20:22:30 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:38.330 20:22:30 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:38.330 20:22:30 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:38.330 20:22:30 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:38.330 20:22:30 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:38.330 20:22:30 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:38.330 20:22:30 accel.accel_cdev_comp -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:38.330 20:22:30 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:38.330 20:22:30 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:38.330 20:22:30 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:38.330 20:22:30 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:38.330 20:22:30 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:38.330 20:22:30 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:38.330 20:22:30 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=32 00:08:38.330 20:22:30 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:38.330 20:22:30 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:38.330 20:22:30 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:38.330 20:22:30 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=32 00:08:38.330 20:22:30 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:38.330 20:22:30 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:38.330 20:22:30 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 
00:08:38.330 20:22:30 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=1 00:08:38.330 20:22:30 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:38.330 20:22:30 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:38.330 20:22:30 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:38.330 20:22:30 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val='1 seconds' 00:08:38.330 20:22:30 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:38.330 20:22:30 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:38.330 20:22:30 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:38.330 20:22:30 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=No 00:08:38.330 20:22:30 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:38.330 20:22:30 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:38.330 20:22:30 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:38.330 20:22:30 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:38.330 20:22:30 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:38.330 20:22:30 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:38.330 20:22:30 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:38.330 20:22:30 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:38.330 20:22:30 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:38.330 20:22:30 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:38.330 20:22:30 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:39.708 20:22:31 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:39.708 20:22:31 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:39.708 20:22:31 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:39.708 20:22:31 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:39.708 20:22:31 accel.accel_cdev_comp -- 
accel/accel.sh@20 -- # val= 00:08:39.708 20:22:31 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:39.708 20:22:31 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:39.708 20:22:31 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:39.708 20:22:31 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:39.708 20:22:31 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:39.708 20:22:31 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:39.708 20:22:31 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:39.708 20:22:31 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:39.708 20:22:31 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:39.708 20:22:31 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:39.708 20:22:31 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:39.708 20:22:31 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:39.708 20:22:31 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:39.708 20:22:31 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:39.708 20:22:31 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:39.708 20:22:31 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:39.708 20:22:31 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:39.708 20:22:31 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:39.708 20:22:31 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:39.708 20:22:31 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:39.708 20:22:31 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ -n compress ]] 00:08:39.708 20:22:31 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:39.708 00:08:39.708 real 0m2.229s 00:08:39.708 user 0m1.644s 00:08:39.708 sys 0m0.587s 00:08:39.708 20:22:31 
accel.accel_cdev_comp -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:39.708 20:22:31 accel.accel_cdev_comp -- common/autotest_common.sh@10 -- # set +x 00:08:39.708 ************************************ 00:08:39.708 END TEST accel_cdev_comp 00:08:39.708 ************************************ 00:08:39.708 20:22:31 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:39.708 20:22:31 accel -- accel/accel.sh@128 -- # run_test accel_cdev_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:39.708 20:22:31 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:08:39.708 20:22:31 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:39.708 20:22:31 accel -- common/autotest_common.sh@10 -- # set +x 00:08:39.708 ************************************ 00:08:39.708 START TEST accel_cdev_decomp 00:08:39.708 ************************************ 00:08:39.708 20:22:31 accel.accel_cdev_decomp -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:39.708 20:22:31 accel.accel_cdev_decomp -- accel/accel.sh@16 -- # local accel_opc 00:08:39.708 20:22:31 accel.accel_cdev_decomp -- accel/accel.sh@17 -- # local accel_module 00:08:39.708 20:22:31 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:39.708 20:22:31 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:39.708 20:22:31 accel.accel_cdev_decomp -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:39.708 20:22:31 accel.accel_cdev_decomp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:39.708 20:22:31 accel.accel_cdev_decomp -- accel/accel.sh@12 -- # build_accel_config 00:08:39.708 20:22:31 
accel.accel_cdev_decomp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:39.708 20:22:31 accel.accel_cdev_decomp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:39.709 20:22:31 accel.accel_cdev_decomp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:39.709 20:22:31 accel.accel_cdev_decomp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:39.709 20:22:31 accel.accel_cdev_decomp -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:39.709 20:22:31 accel.accel_cdev_decomp -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:39.709 20:22:31 accel.accel_cdev_decomp -- accel/accel.sh@40 -- # local IFS=, 00:08:39.709 20:22:31 accel.accel_cdev_decomp -- accel/accel.sh@41 -- # jq -r . 00:08:39.709 [2024-07-15 20:22:31.800405] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 00:08:39.709 [2024-07-15 20:22:31.800469] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1325602 ] 00:08:39.709 [2024-07-15 20:22:31.931293] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:39.709 [2024-07-15 20:22:32.032786] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:40.737 [2024-07-15 20:22:32.812446] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:40.737 [2024-07-15 20:22:32.815072] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1497080 PMD being used: compress_qat 00:08:40.737 20:22:32 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:40.737 20:22:32 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:40.737 20:22:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:40.737 20:22:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:40.737 20:22:32 
accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:40.737 20:22:32 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:40.737 20:22:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:40.737 [2024-07-15 20:22:32.819273] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x149be60 PMD being used: compress_qat 00:08:40.737 20:22:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:40.737 20:22:32 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:40.737 20:22:32 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:40.737 20:22:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:40.737 20:22:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:40.737 20:22:32 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=0x1 00:08:40.737 20:22:32 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:40.737 20:22:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:40.737 20:22:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:40.737 20:22:32 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:40.737 20:22:32 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:40.737 20:22:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:40.737 20:22:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:40.737 20:22:32 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:40.737 20:22:32 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:40.737 20:22:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:40.737 20:22:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:40.737 20:22:32 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=decompress 00:08:40.737 20:22:32 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:40.737 20:22:32 accel.accel_cdev_decomp -- 
accel/accel.sh@23 -- # accel_opc=decompress 00:08:40.737 20:22:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:40.737 20:22:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:40.737 20:22:32 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:40.737 20:22:32 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:40.737 20:22:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:40.737 20:22:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:40.737 20:22:32 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:40.737 20:22:32 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:40.737 20:22:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:40.737 20:22:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:40.737 20:22:32 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:40.737 20:22:32 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:40.737 20:22:32 accel.accel_cdev_decomp -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:40.737 20:22:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:40.737 20:22:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:40.737 20:22:32 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:40.737 20:22:32 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:40.737 20:22:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:40.737 20:22:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:40.737 20:22:32 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=32 00:08:40.737 20:22:32 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:40.737 20:22:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:40.737 20:22:32 
accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:40.737 20:22:32 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=32 00:08:40.737 20:22:32 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:40.737 20:22:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:40.737 20:22:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:40.737 20:22:32 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=1 00:08:40.737 20:22:32 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:40.737 20:22:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:40.737 20:22:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:40.737 20:22:32 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val='1 seconds' 00:08:40.737 20:22:32 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:40.737 20:22:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:40.737 20:22:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:40.737 20:22:32 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=Yes 00:08:40.737 20:22:32 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:40.737 20:22:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:40.737 20:22:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:40.737 20:22:32 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:40.737 20:22:32 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:40.737 20:22:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:40.737 20:22:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:40.737 20:22:32 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:40.737 20:22:32 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:40.737 20:22:32 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:40.737 20:22:32 
accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:41.676 20:22:33 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:41.676 20:22:33 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:41.676 20:22:33 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:41.676 20:22:33 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:41.676 20:22:33 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:41.676 20:22:33 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:41.676 20:22:33 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:41.676 20:22:33 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:41.676 20:22:33 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:41.676 20:22:33 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:41.676 20:22:33 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:41.676 20:22:33 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:41.676 20:22:33 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:41.676 20:22:33 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:41.676 20:22:33 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:41.676 20:22:33 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:41.676 20:22:33 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:41.676 20:22:33 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:41.676 20:22:33 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:41.676 20:22:33 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:41.676 20:22:33 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:41.676 20:22:33 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:41.676 20:22:33 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:41.676 20:22:34 
accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:41.676 20:22:34 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:41.676 20:22:34 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:41.676 20:22:34 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:41.676 00:08:41.676 real 0m2.237s 00:08:41.676 user 0m1.640s 00:08:41.676 sys 0m0.592s 00:08:41.676 20:22:34 accel.accel_cdev_decomp -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:41.676 20:22:34 accel.accel_cdev_decomp -- common/autotest_common.sh@10 -- # set +x 00:08:41.676 ************************************ 00:08:41.676 END TEST accel_cdev_decomp 00:08:41.676 ************************************ 00:08:41.676 20:22:34 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:41.676 20:22:34 accel -- accel/accel.sh@129 -- # run_test accel_cdev_decomp_full accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:41.676 20:22:34 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:08:41.676 20:22:34 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:41.676 20:22:34 accel -- common/autotest_common.sh@10 -- # set +x 00:08:41.936 ************************************ 00:08:41.936 START TEST accel_cdev_decomp_full 00:08:41.936 ************************************ 00:08:41.936 20:22:34 accel.accel_cdev_decomp_full -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:41.936 20:22:34 accel.accel_cdev_decomp_full -- accel/accel.sh@16 -- # local accel_opc 00:08:41.936 20:22:34 accel.accel_cdev_decomp_full -- accel/accel.sh@17 -- # local accel_module 00:08:41.936 20:22:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:41.936 20:22:34 accel.accel_cdev_decomp_full -- accel/accel.sh@19 
-- # read -r var val 00:08:41.936 20:22:34 accel.accel_cdev_decomp_full -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:41.936 20:22:34 accel.accel_cdev_decomp_full -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:41.936 20:22:34 accel.accel_cdev_decomp_full -- accel/accel.sh@12 -- # build_accel_config 00:08:41.936 20:22:34 accel.accel_cdev_decomp_full -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:41.936 20:22:34 accel.accel_cdev_decomp_full -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:41.936 20:22:34 accel.accel_cdev_decomp_full -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:41.936 20:22:34 accel.accel_cdev_decomp_full -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:41.936 20:22:34 accel.accel_cdev_decomp_full -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:41.936 20:22:34 accel.accel_cdev_decomp_full -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:41.936 20:22:34 accel.accel_cdev_decomp_full -- accel/accel.sh@40 -- # local IFS=, 00:08:41.936 20:22:34 accel.accel_cdev_decomp_full -- accel/accel.sh@41 -- # jq -r . 00:08:41.936 [2024-07-15 20:22:34.119492] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
00:08:41.936 [2024-07-15 20:22:34.119553] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1325933 ] 00:08:41.936 [2024-07-15 20:22:34.249011] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:42.195 [2024-07-15 20:22:34.350819] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:42.762 [2024-07-15 20:22:35.129005] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:42.762 [2024-07-15 20:22:35.131611] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1313080 PMD being used: compress_qat 00:08:42.762 20:22:35 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:42.762 20:22:35 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:42.762 20:22:35 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:42.762 20:22:35 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:42.762 20:22:35 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:42.762 20:22:35 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:42.762 20:22:35 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:42.762 [2024-07-15 20:22:35.134980] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1312ce0 PMD being used: compress_qat 00:08:42.762 20:22:35 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:42.762 20:22:35 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:42.762 20:22:35 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:42.762 20:22:35 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:42.762 20:22:35 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:42.762 20:22:35 
accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=0x1 00:08:42.762 20:22:35 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:42.762 20:22:35 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:42.762 20:22:35 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:42.762 20:22:35 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:42.762 20:22:35 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:42.762 20:22:35 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:42.762 20:22:35 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:42.762 20:22:35 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:42.762 20:22:35 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:42.762 20:22:35 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:42.762 20:22:35 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:42.762 20:22:35 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=decompress 00:08:42.762 20:22:35 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:42.762 20:22:35 accel.accel_cdev_decomp_full -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:42.762 20:22:35 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:42.762 20:22:35 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:42.762 20:22:35 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val='111250 bytes' 00:08:42.762 20:22:35 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:42.762 20:22:35 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:42.762 20:22:35 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:43.021 20:22:35 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:43.021 20:22:35 accel.accel_cdev_decomp_full -- 
accel/accel.sh@21 -- # case "$var" in 00:08:43.021 20:22:35 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:43.021 20:22:35 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:43.021 20:22:35 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:43.021 20:22:35 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:43.021 20:22:35 accel.accel_cdev_decomp_full -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:43.021 20:22:35 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:43.021 20:22:35 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:43.021 20:22:35 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:43.021 20:22:35 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:43.021 20:22:35 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:43.021 20:22:35 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:43.021 20:22:35 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=32 00:08:43.021 20:22:35 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:43.021 20:22:35 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:43.021 20:22:35 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:43.021 20:22:35 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=32 00:08:43.021 20:22:35 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:43.021 20:22:35 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:43.021 20:22:35 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:43.021 20:22:35 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=1 00:08:43.021 20:22:35 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:43.021 20:22:35 
accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:43.021 20:22:35 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:43.021 20:22:35 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val='1 seconds' 00:08:43.021 20:22:35 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:43.021 20:22:35 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:43.021 20:22:35 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:43.021 20:22:35 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=Yes 00:08:43.021 20:22:35 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:43.021 20:22:35 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:43.021 20:22:35 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:43.021 20:22:35 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:43.021 20:22:35 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:43.021 20:22:35 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:43.021 20:22:35 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:43.021 20:22:35 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:43.021 20:22:35 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:43.021 20:22:35 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:43.021 20:22:35 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:43.957 20:22:36 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:43.957 20:22:36 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:43.957 20:22:36 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:43.957 20:22:36 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:43.957 20:22:36 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 
00:08:43.957 20:22:36 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:43.957 20:22:36 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:43.957 20:22:36 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:43.957 20:22:36 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:43.957 20:22:36 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:43.957 20:22:36 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:43.957 20:22:36 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:43.957 20:22:36 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:43.957 20:22:36 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:43.957 20:22:36 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:43.957 20:22:36 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:43.957 20:22:36 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:43.957 20:22:36 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:43.957 20:22:36 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:43.957 20:22:36 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:43.957 20:22:36 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:43.957 20:22:36 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:43.957 20:22:36 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:43.957 20:22:36 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:43.957 20:22:36 accel.accel_cdev_decomp_full -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:43.957 20:22:36 accel.accel_cdev_decomp_full -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:43.957 20:22:36 accel.accel_cdev_decomp_full -- accel/accel.sh@27 -- # [[ dpdk_compressdev == 
\d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:43.957 00:08:43.957 real 0m2.234s 00:08:43.957 user 0m1.649s 00:08:43.957 sys 0m0.583s 00:08:43.957 20:22:36 accel.accel_cdev_decomp_full -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:43.957 20:22:36 accel.accel_cdev_decomp_full -- common/autotest_common.sh@10 -- # set +x 00:08:43.957 ************************************ 00:08:43.957 END TEST accel_cdev_decomp_full 00:08:43.957 ************************************ 00:08:44.215 20:22:36 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:44.215 20:22:36 accel -- accel/accel.sh@130 -- # run_test accel_cdev_decomp_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:44.215 20:22:36 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:08:44.215 20:22:36 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:44.215 20:22:36 accel -- common/autotest_common.sh@10 -- # set +x 00:08:44.215 ************************************ 00:08:44.215 START TEST accel_cdev_decomp_mcore 00:08:44.215 ************************************ 00:08:44.215 20:22:36 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:44.215 20:22:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@16 -- # local accel_opc 00:08:44.215 20:22:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@17 -- # local accel_module 00:08:44.215 20:22:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:44.215 20:22:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:44.215 20:22:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:44.215 20:22:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@12 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:44.215 20:22:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@12 -- # build_accel_config 00:08:44.215 20:22:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:44.215 20:22:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:44.215 20:22:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:44.215 20:22:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:44.215 20:22:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:44.215 20:22:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:44.215 20:22:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@40 -- # local IFS=, 00:08:44.215 20:22:36 accel.accel_cdev_decomp_mcore -- accel/accel.sh@41 -- # jq -r . 00:08:44.215 [2024-07-15 20:22:36.437836] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
00:08:44.215 [2024-07-15 20:22:36.437905] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1326296 ] 00:08:44.215 [2024-07-15 20:22:36.571178] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:08:44.474 [2024-07-15 20:22:36.684650] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:44.474 [2024-07-15 20:22:36.684751] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:08:44.474 [2024-07-15 20:22:36.684854] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:08:44.474 [2024-07-15 20:22:36.684854] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:45.410 [2024-07-15 20:22:37.441862] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:45.410 [2024-07-15 20:22:37.444499] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1b68720 PMD being used: compress_qat 00:08:45.410 20:22:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:45.410 20:22:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:45.410 20:22:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:45.410 20:22:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:45.410 20:22:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:45.410 20:22:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:45.410 20:22:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:45.410 20:22:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:45.410 20:22:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:45.410 20:22:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:45.410 
20:22:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:45.410 20:22:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:45.410 20:22:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=0xf 00:08:45.410 [2024-07-15 20:22:37.450529] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f8f1c19b8b0 PMD being used: compress_qat 00:08:45.410 20:22:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:45.410 20:22:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:45.410 20:22:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:45.410 20:22:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:45.410 20:22:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:45.410 20:22:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:45.410 20:22:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:45.410 20:22:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:45.410 [2024-07-15 20:22:37.452342] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1b6d9f0 PMD being used: compress_qat 00:08:45.410 20:22:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:45.410 20:22:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:45.410 20:22:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:45.410 20:22:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=decompress 00:08:45.410 20:22:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:45.410 20:22:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:45.410 20:22:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:45.410 20:22:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:45.410 20:22:37 
accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:45.410 20:22:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:45.410 20:22:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:45.410 20:22:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:45.410 20:22:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:45.410 [2024-07-15 20:22:37.456171] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f8f1419b8b0 PMD being used: compress_qat 00:08:45.410 20:22:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:45.410 20:22:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:45.410 [2024-07-15 20:22:37.456420] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f8f0c19b8b0 PMD being used: compress_qat 00:08:45.410 20:22:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:45.410 20:22:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:45.410 20:22:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:45.410 20:22:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:45.410 20:22:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:45.410 20:22:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:45.410 20:22:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:45.410 20:22:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:45.410 20:22:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:45.410 20:22:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:45.410 20:22:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:08:45.410 20:22:37 
accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:45.410 20:22:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:45.410 20:22:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:45.410 20:22:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:08:45.410 20:22:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:45.410 20:22:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:45.410 20:22:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:45.410 20:22:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=1 00:08:45.410 20:22:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:45.410 20:22:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:45.410 20:22:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:45.410 20:22:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:08:45.410 20:22:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:45.410 20:22:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:45.410 20:22:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:45.410 20:22:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=Yes 00:08:45.410 20:22:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:45.410 20:22:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:45.410 20:22:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:45.410 20:22:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:45.410 20:22:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:45.410 20:22:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:45.410 20:22:37 accel.accel_cdev_decomp_mcore -- 
accel/accel.sh@19 -- # read -r var val 00:08:45.410 20:22:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:45.410 20:22:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:45.410 20:22:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:45.410 20:22:37 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:46.347 20:22:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:46.347 20:22:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:46.347 20:22:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:46.347 20:22:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:46.347 20:22:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:46.347 20:22:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:46.347 20:22:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:46.347 20:22:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:46.347 20:22:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:46.347 20:22:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:46.347 20:22:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:46.347 20:22:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:46.347 20:22:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:46.347 20:22:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:46.347 20:22:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:46.347 20:22:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:46.347 20:22:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:46.347 20:22:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:46.347 
20:22:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:46.347 20:22:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:46.347 20:22:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:46.347 20:22:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:46.347 20:22:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:46.347 20:22:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:46.347 20:22:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:46.347 20:22:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:46.347 20:22:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:46.347 20:22:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:46.347 20:22:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:46.347 20:22:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:46.347 20:22:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:46.347 20:22:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:46.347 20:22:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:46.347 20:22:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:46.347 20:22:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:46.347 20:22:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:46.347 20:22:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:46.347 20:22:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:46.347 20:22:38 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:46.347 00:08:46.347 real 0m2.252s 00:08:46.347 user 0m7.221s 00:08:46.347 
sys 0m0.614s 00:08:46.347 20:22:38 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:46.347 20:22:38 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@10 -- # set +x 00:08:46.347 ************************************ 00:08:46.347 END TEST accel_cdev_decomp_mcore 00:08:46.347 ************************************ 00:08:46.347 20:22:38 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:46.347 20:22:38 accel -- accel/accel.sh@131 -- # run_test accel_cdev_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:46.347 20:22:38 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:08:46.347 20:22:38 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:46.347 20:22:38 accel -- common/autotest_common.sh@10 -- # set +x 00:08:46.607 ************************************ 00:08:46.607 START TEST accel_cdev_decomp_full_mcore 00:08:46.607 ************************************ 00:08:46.607 20:22:38 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:46.607 20:22:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@16 -- # local accel_opc 00:08:46.607 20:22:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@17 -- # local accel_module 00:08:46.607 20:22:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:46.607 20:22:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:46.607 20:22:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:46.607 20:22:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 
-w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:46.607 20:22:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@12 -- # build_accel_config 00:08:46.607 20:22:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:46.607 20:22:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:46.607 20:22:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:46.607 20:22:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:46.607 20:22:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:46.607 20:22:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:46.607 20:22:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@40 -- # local IFS=, 00:08:46.607 20:22:38 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@41 -- # jq -r . 00:08:46.607 [2024-07-15 20:22:38.770040] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
00:08:46.607 [2024-07-15 20:22:38.770101] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1326611 ] 00:08:46.607 [2024-07-15 20:22:38.900381] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:08:46.866 [2024-07-15 20:22:39.005919] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:46.866 [2024-07-15 20:22:39.006020] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:08:46.866 [2024-07-15 20:22:39.006062] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:46.866 [2024-07-15 20:22:39.006061] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:08:47.436 [2024-07-15 20:22:39.763673] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:47.436 [2024-07-15 20:22:39.766281] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xf72720 PMD being used: compress_qat 00:08:47.436 20:22:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:47.436 20:22:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:47.436 20:22:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:47.436 20:22:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:47.436 20:22:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:47.436 20:22:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:47.436 20:22:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:47.436 20:22:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:47.436 20:22:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:47.436 20:22:39 accel.accel_cdev_decomp_full_mcore -- 
accel/accel.sh@21 -- # case "$var" in 00:08:47.436 20:22:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:47.436 20:22:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:47.436 20:22:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=0xf 00:08:47.436 [2024-07-15 20:22:39.771388] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f329c19b8b0 PMD being used: compress_qat 00:08:47.436 20:22:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:47.436 20:22:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:47.436 20:22:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:47.436 20:22:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:47.436 20:22:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:47.436 20:22:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:47.436 [2024-07-15 20:22:39.773168] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xf75a30 PMD being used: compress_qat 00:08:47.436 20:22:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:47.436 20:22:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:47.436 20:22:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:47.436 20:22:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:47.436 20:22:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:47.436 20:22:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=decompress 00:08:47.436 20:22:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:47.436 20:22:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:47.436 20:22:39 accel.accel_cdev_decomp_full_mcore -- 
accel/accel.sh@19 -- # IFS=: 00:08:47.436 20:22:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:47.436 20:22:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val='111250 bytes' 00:08:47.436 20:22:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:47.436 20:22:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:47.436 20:22:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:47.436 [2024-07-15 20:22:39.777008] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f329419b8b0 PMD being used: compress_qat 00:08:47.436 20:22:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:47.436 [2024-07-15 20:22:39.777284] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f328c19b8b0 PMD being used: compress_qat 00:08:47.436 20:22:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:47.436 20:22:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:47.436 20:22:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:47.436 20:22:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:47.436 20:22:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:47.436 20:22:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:47.436 20:22:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:47.436 20:22:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:47.436 20:22:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:47.436 20:22:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:47.436 20:22:39 accel.accel_cdev_decomp_full_mcore -- 
accel/accel.sh@19 -- # IFS=: 00:08:47.436 20:22:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:47.436 20:22:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:08:47.436 20:22:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:47.436 20:22:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:47.436 20:22:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:47.436 20:22:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:08:47.436 20:22:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:47.436 20:22:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:47.436 20:22:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:47.436 20:22:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=1 00:08:47.436 20:22:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:47.436 20:22:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:47.436 20:22:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:47.436 20:22:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:08:47.436 20:22:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:47.436 20:22:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:47.436 20:22:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:47.436 20:22:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=Yes 00:08:47.436 20:22:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:47.436 20:22:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:47.436 20:22:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- 
# read -r var val 00:08:47.436 20:22:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:47.436 20:22:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:47.436 20:22:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:47.436 20:22:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:47.436 20:22:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:47.436 20:22:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:47.436 20:22:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:47.436 20:22:39 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:48.814 20:22:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:48.814 20:22:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:48.814 20:22:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:48.814 20:22:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:48.814 20:22:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:48.814 20:22:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:48.814 20:22:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:48.814 20:22:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:48.814 20:22:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:48.814 20:22:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:48.814 20:22:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:48.814 20:22:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:48.814 20:22:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:48.814 20:22:40 
accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:48.814 20:22:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:48.814 20:22:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:48.814 20:22:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:48.814 20:22:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:48.814 20:22:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:48.814 20:22:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:48.814 20:22:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:48.814 20:22:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:48.814 20:22:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:48.814 20:22:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:48.814 20:22:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:48.814 20:22:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:48.814 20:22:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:48.814 20:22:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:48.814 20:22:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:48.814 20:22:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:48.814 20:22:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:48.814 20:22:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:48.814 20:22:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:48.814 20:22:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:48.814 20:22:40 
accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=:
00:08:48.814 20:22:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val
00:08:48.814 20:22:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]]
00:08:48.814 20:22:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]]
00:08:48.814 20:22:40 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]]
00:08:48.814
00:08:48.814 real 0m2.235s
00:08:48.814 user 0m7.201s
00:08:48.814 sys 0m0.597s
00:08:48.815 20:22:40 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@1124 -- # xtrace_disable
00:08:48.815 20:22:40 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@10 -- # set +x
00:08:48.815 ************************************
00:08:48.815 END TEST accel_cdev_decomp_full_mcore
00:08:48.815 ************************************
00:08:48.815 20:22:41 accel -- common/autotest_common.sh@1142 -- # return 0
00:08:48.815 20:22:41 accel -- accel/accel.sh@132 -- # run_test accel_cdev_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2
00:08:48.815 20:22:41 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']'
00:08:48.815 20:22:41 accel -- common/autotest_common.sh@1105 -- # xtrace_disable
00:08:48.815 20:22:41 accel -- common/autotest_common.sh@10 -- # set +x
00:08:48.815 ************************************
00:08:48.815 START TEST accel_cdev_decomp_mthread
00:08:48.815 ************************************
00:08:48.815 20:22:41 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2
00:08:48.815 20:22:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@16 -- # local accel_opc
00:08:48.815 20:22:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@17 -- # local accel_module
00:08:48.815 20:22:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=:
00:08:48.815 20:22:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val
00:08:48.815 20:22:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2
00:08:48.815 20:22:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2
00:08:48.815 20:22:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@12 -- # build_accel_config
00:08:48.815 20:22:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@31 -- # accel_json_cfg=()
00:08:48.815 20:22:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]]
00:08:48.815 20:22:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:08:48.815 20:22:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:08:48.815 20:22:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@36 -- # [[ -n 1 ]]
00:08:48.815 20:22:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}')
00:08:48.815 20:22:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@40 -- # local IFS=,
00:08:48.815 20:22:41 accel.accel_cdev_decomp_mthread -- accel/accel.sh@41 -- # jq -r .
[2024-07-15 20:22:41.087150] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization...
00:08:48.815 [2024-07-15 20:22:41.087213] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1326878 ] 00:08:49.074 [2024-07-15 20:22:41.215907] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:49.074 [2024-07-15 20:22:41.317037] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:50.010 [2024-07-15 20:22:42.084079] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:50.010 [2024-07-15 20:22:42.086629] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xe6b080 PMD being used: compress_qat 00:08:50.010 20:22:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:50.010 20:22:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:50.010 20:22:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:50.010 20:22:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:50.010 20:22:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:50.010 20:22:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:50.010 20:22:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:50.010 20:22:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:50.010 20:22:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:50.010 [2024-07-15 20:22:42.091533] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xe702a0 PMD being used: compress_qat 00:08:50.010 20:22:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:50.010 20:22:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:50.010 20:22:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var 
val 00:08:50.010 20:22:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=0x1 00:08:50.010 20:22:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:50.010 20:22:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:50.010 20:22:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:50.011 20:22:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:50.011 [2024-07-15 20:22:42.094045] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xf930f0 PMD being used: compress_qat 00:08:50.011 20:22:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:50.011 20:22:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:50.011 20:22:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:50.011 20:22:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:50.011 20:22:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:50.011 20:22:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:50.011 20:22:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:50.011 20:22:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=decompress 00:08:50.011 20:22:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:50.011 20:22:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:50.011 20:22:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:50.011 20:22:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:50.011 20:22:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:50.011 20:22:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:50.011 20:22:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:50.011 
20:22:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:50.011 20:22:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:50.011 20:22:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:50.011 20:22:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:50.011 20:22:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:50.011 20:22:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:50.011 20:22:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:50.011 20:22:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:50.011 20:22:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:50.011 20:22:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:50.011 20:22:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:50.011 20:22:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:50.011 20:22:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:50.011 20:22:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:50.011 20:22:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:08:50.011 20:22:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:50.011 20:22:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:50.011 20:22:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:50.011 20:22:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:08:50.011 20:22:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:50.011 20:22:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 
00:08:50.011 20:22:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:50.011 20:22:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=2 00:08:50.011 20:22:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:50.011 20:22:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:50.011 20:22:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:50.011 20:22:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:08:50.011 20:22:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:50.011 20:22:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:50.011 20:22:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:50.011 20:22:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=Yes 00:08:50.011 20:22:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:50.011 20:22:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:50.011 20:22:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:50.011 20:22:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:50.011 20:22:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:50.011 20:22:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:50.011 20:22:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:50.011 20:22:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:50.011 20:22:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:50.011 20:22:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:50.011 20:22:42 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:50.948 20:22:43 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- 
# val= 00:08:50.948 20:22:43 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:50.948 20:22:43 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:50.948 20:22:43 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:50.948 20:22:43 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:50.948 20:22:43 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:50.948 20:22:43 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:50.948 20:22:43 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:50.948 20:22:43 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:50.948 20:22:43 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:50.948 20:22:43 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:50.948 20:22:43 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:50.948 20:22:43 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:50.948 20:22:43 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:50.948 20:22:43 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:50.948 20:22:43 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:50.948 20:22:43 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:50.948 20:22:43 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:50.948 20:22:43 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:50.948 20:22:43 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:50.949 20:22:43 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:50.949 20:22:43 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:50.949 20:22:43 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 
00:08:50.949 20:22:43 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val
00:08:50.949 20:22:43 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=
00:08:50.949 20:22:43 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in
00:08:50.949 20:22:43 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=:
00:08:50.949 20:22:43 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val
00:08:50.949 20:22:43 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]]
00:08:50.949 20:22:43 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]]
00:08:50.949 20:22:43 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]]
00:08:50.949
00:08:50.949 real 0m2.210s
00:08:50.949 user 0m1.619s
00:08:50.949 sys 0m0.585s
00:08:50.949 20:22:43 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@1124 -- # xtrace_disable
00:08:50.949 20:22:43 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@10 -- # set +x
00:08:50.949 ************************************
00:08:50.949 END TEST accel_cdev_decomp_mthread
00:08:50.949 ************************************
00:08:50.949 20:22:43 accel -- common/autotest_common.sh@1142 -- # return 0
00:08:50.949 20:22:43 accel -- accel/accel.sh@133 -- # run_test accel_cdev_decomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2
00:08:50.949 20:22:43 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:08:50.949 20:22:43 accel -- common/autotest_common.sh@1105 -- # xtrace_disable
00:08:50.949 20:22:43 accel -- common/autotest_common.sh@10 -- # set +x
00:08:51.207 ************************************
00:08:51.207 START TEST accel_cdev_decomp_full_mthread
00:08:51.207 ************************************
00:08:51.207 20:22:43 accel.accel_cdev_decomp_full_mthread -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2
00:08:51.207 20:22:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@16 -- # local accel_opc
00:08:51.207 20:22:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@17 -- # local accel_module
00:08:51.207 20:22:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=:
00:08:51.207 20:22:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2
00:08:51.207 20:22:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val
00:08:51.207 20:22:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2
00:08:51.207 20:22:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@12 -- # build_accel_config
00:08:51.207 20:22:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@31 -- # accel_json_cfg=()
00:08:51.207 20:22:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]]
00:08:51.207 20:22:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:08:51.207 20:22:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:08:51.207 20:22:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@36 -- # [[ -n 1 ]]
00:08:51.207 20:22:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}')
00:08:51.207 20:22:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@40 -- # local IFS=,
00:08:51.207 20:22:43 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@41 -- # jq -r .
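The `IFS=:` / `read -r var val` / `case "$var"` lines that dominate the xtrace above are the test harness looping over colon-separated `var:val` pairs emitted by `accel_perf`'s config dialogue on a file descriptor. A minimal standalone sketch of that parsing pattern, assuming nothing beyond what the xtrace shows (the function name `parse_accel_vars` and the key names `w`/`module` are hypothetical, chosen only for this illustration):

```shell
# Sketch of the var:val loop visible in the xtrace (accel.sh@19-23).
# Hypothetical reduction, not the real accel.sh: it scans "var:value"
# lines and keeps the opcode and module, the two values the log later
# checks with [[ -n ... ]].
parse_accel_vars() {
  accel_opc='' accel_module=''
  while IFS=: read -r var val; do   # split each line on ':' as in the log
    case "$var" in
      w) accel_opc=$val ;;          # hypothetical key names for this sketch
      module) accel_module=$val ;;
      *) : ;;                       # other vars are read and ignored
    esac
  done
  printf '%s %s\n' "$accel_opc" "$accel_module"
}

printf 'w:decompress\nmodule:dpdk_compressdev\nq:32\n' | parse_accel_vars
```

Fed a stream like the one above, the sketch prints `decompress dpdk_compressdev`, mirroring the `accel_opc=decompress` and `accel_module=dpdk_compressdev` assignments visible at accel.sh@23 and accel.sh@22 in the log.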
00:08:51.207 [2024-07-15 20:22:43.394174] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 00:08:51.207 [2024-07-15 20:22:43.394301] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1327240 ] 00:08:51.466 [2024-07-15 20:22:43.592601] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:51.466 [2024-07-15 20:22:43.697633] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:52.404 [2024-07-15 20:22:44.465877] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:52.404 [2024-07-15 20:22:44.468425] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xa84080 PMD being used: compress_qat 00:08:52.404 20:22:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:52.404 20:22:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:52.404 20:22:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:52.404 20:22:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:52.404 20:22:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:52.404 20:22:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:52.404 20:22:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:52.404 20:22:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:52.404 20:22:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:52.404 [2024-07-15 20:22:44.472433] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xa873b0 PMD being used: compress_qat 00:08:52.404 20:22:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 
00:08:52.404 20:22:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:52.404 20:22:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:52.404 20:22:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=0x1 00:08:52.404 20:22:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:52.404 20:22:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:52.404 20:22:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:52.404 20:22:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:52.404 20:22:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:52.404 20:22:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:52.404 20:22:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:52.404 20:22:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:52.404 [2024-07-15 20:22:44.475280] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xbabcc0 PMD being used: compress_qat 00:08:52.404 20:22:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:52.404 20:22:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:52.404 20:22:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:52.404 20:22:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=decompress 00:08:52.404 20:22:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:52.404 20:22:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:52.404 20:22:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:52.404 20:22:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:52.404 20:22:44 
accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val='111250 bytes' 00:08:52.404 20:22:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:52.404 20:22:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:52.404 20:22:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:52.404 20:22:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:52.404 20:22:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:52.404 20:22:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:52.404 20:22:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:52.404 20:22:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:52.404 20:22:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:52.404 20:22:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:52.404 20:22:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:52.404 20:22:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:52.404 20:22:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:52.404 20:22:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:52.404 20:22:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:52.404 20:22:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:52.404 20:22:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:08:52.404 20:22:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:52.404 20:22:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 
00:08:52.404 20:22:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:52.404 20:22:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:08:52.404 20:22:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:52.404 20:22:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:52.404 20:22:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:52.404 20:22:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=2 00:08:52.404 20:22:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:52.404 20:22:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:52.404 20:22:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:52.404 20:22:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:08:52.404 20:22:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:52.404 20:22:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:52.404 20:22:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:52.404 20:22:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=Yes 00:08:52.404 20:22:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:52.404 20:22:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:52.404 20:22:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:52.404 20:22:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:52.404 20:22:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:52.404 20:22:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:52.404 20:22:44 accel.accel_cdev_decomp_full_mthread -- 
accel/accel.sh@19 -- # read -r var val 00:08:52.404 20:22:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:52.404 20:22:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:52.404 20:22:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:52.404 20:22:44 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:53.343 20:22:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:53.343 20:22:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:53.343 20:22:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:53.343 20:22:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:53.343 20:22:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:53.343 20:22:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:53.343 20:22:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:53.343 20:22:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:53.343 20:22:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:53.343 20:22:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:53.343 20:22:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:53.343 20:22:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:53.343 20:22:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:53.343 20:22:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:53.343 20:22:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:53.343 20:22:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:53.343 20:22:45 
accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:53.343 20:22:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:53.343 20:22:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:53.343 20:22:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:53.343 20:22:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:53.343 20:22:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:53.343 20:22:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:53.343 20:22:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:53.343 20:22:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:53.343 20:22:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:53.343 20:22:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:53.343 20:22:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:53.343 20:22:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:53.343 20:22:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:53.343 20:22:45 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:53.343 00:08:53.343 real 0m2.317s 00:08:53.343 user 0m1.682s 00:08:53.343 sys 0m0.635s 00:08:53.343 20:22:45 accel.accel_cdev_decomp_full_mthread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:53.343 20:22:45 accel.accel_cdev_decomp_full_mthread -- common/autotest_common.sh@10 -- # set +x 00:08:53.343 ************************************ 00:08:53.343 END TEST accel_cdev_decomp_full_mthread 00:08:53.343 ************************************ 00:08:53.343 20:22:45 accel -- 
common/autotest_common.sh@1142 -- # return 0 00:08:53.343 20:22:45 accel -- accel/accel.sh@134 -- # unset COMPRESSDEV 00:08:53.343 20:22:45 accel -- accel/accel.sh@137 -- # run_test accel_dif_functional_tests /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:08:53.343 20:22:45 accel -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:08:53.343 20:22:45 accel -- accel/accel.sh@137 -- # build_accel_config 00:08:53.343 20:22:45 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:53.343 20:22:45 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:53.343 20:22:45 accel -- common/autotest_common.sh@10 -- # set +x 00:08:53.343 20:22:45 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:53.343 20:22:45 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:53.343 20:22:45 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:53.343 20:22:45 accel -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:53.343 20:22:45 accel -- accel/accel.sh@40 -- # local IFS=, 00:08:53.343 20:22:45 accel -- accel/accel.sh@41 -- # jq -r . 00:08:53.602 ************************************ 00:08:53.602 START TEST accel_dif_functional_tests 00:08:53.602 ************************************ 00:08:53.602 20:22:45 accel.accel_dif_functional_tests -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:08:53.602 [2024-07-15 20:22:45.787372] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
00:08:53.602 [2024-07-15 20:22:45.787417] [ DPDK EAL parameters: DIF --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1327611 ] 00:08:53.602 [2024-07-15 20:22:45.901820] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:08:53.861 [2024-07-15 20:22:46.013193] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:53.861 [2024-07-15 20:22:46.013297] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:08:53.861 [2024-07-15 20:22:46.013299] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:53.861 00:08:53.861 00:08:53.861 CUnit - A unit testing framework for C - Version 2.1-3 00:08:53.861 http://cunit.sourceforge.net/ 00:08:53.861 00:08:53.861 00:08:53.861 Suite: accel_dif 00:08:53.861 Test: verify: DIF generated, GUARD check ...passed 00:08:53.861 Test: verify: DIF generated, APPTAG check ...passed 00:08:53.861 Test: verify: DIF generated, REFTAG check ...passed 00:08:53.861 Test: verify: DIF not generated, GUARD check ...[2024-07-15 20:22:46.114954] dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:08:53.861 passed 00:08:53.861 Test: verify: DIF not generated, APPTAG check ...[2024-07-15 20:22:46.115031] dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:08:53.861 passed 00:08:53.861 Test: verify: DIF not generated, REFTAG check ...[2024-07-15 20:22:46.115069] dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:08:53.861 passed 00:08:53.861 Test: verify: APPTAG correct, APPTAG check ...passed 00:08:53.861 Test: verify: APPTAG incorrect, APPTAG check ...[2024-07-15 20:22:46.115150] dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=30, Expected=28, Actual=14 00:08:53.861 passed 
00:08:53.861 Test: verify: APPTAG incorrect, no APPTAG check ...passed 00:08:53.861 Test: verify: REFTAG incorrect, REFTAG ignore ...passed 00:08:53.861 Test: verify: REFTAG_INIT correct, REFTAG check ...passed 00:08:53.861 Test: verify: REFTAG_INIT incorrect, REFTAG check ...[2024-07-15 20:22:46.115320] dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=10 00:08:53.861 passed 00:08:53.861 Test: verify copy: DIF generated, GUARD check ...passed 00:08:53.861 Test: verify copy: DIF generated, APPTAG check ...passed 00:08:53.861 Test: verify copy: DIF generated, REFTAG check ...passed 00:08:53.861 Test: verify copy: DIF not generated, GUARD check ...[2024-07-15 20:22:46.115503] dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:08:53.861 passed 00:08:53.861 Test: verify copy: DIF not generated, APPTAG check ...[2024-07-15 20:22:46.115546] dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:08:53.861 passed 00:08:53.861 Test: verify copy: DIF not generated, REFTAG check ...[2024-07-15 20:22:46.115587] dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:08:53.861 passed 00:08:53.861 Test: generate copy: DIF generated, GUARD check ...passed 00:08:53.861 Test: generate copy: DIF generated, APTTAG check ...passed 00:08:53.861 Test: generate copy: DIF generated, REFTAG check ...passed 00:08:53.861 Test: generate copy: DIF generated, no GUARD check flag set ...passed 00:08:53.861 Test: generate copy: DIF generated, no APPTAG check flag set ...passed 00:08:53.861 Test: generate copy: DIF generated, no REFTAG check flag set ...passed 00:08:53.861 Test: generate copy: iovecs-len validate ...[2024-07-15 20:22:46.115873] dif.c:1190:spdk_dif_generate_copy: *ERROR*: Size of bounce_iovs arrays are not valid or misaligned with block_size. 
00:08:53.861 passed 00:08:53.861 Test: generate copy: buffer alignment validate ...passed 00:08:53.861 00:08:53.861 Run Summary: Type Total Ran Passed Failed Inactive 00:08:53.861 suites 1 1 n/a 0 0 00:08:53.861 tests 26 26 26 0 0 00:08:53.861 asserts 115 115 115 0 n/a 00:08:53.861 00:08:53.861 Elapsed time = 0.005 seconds 00:08:54.121 00:08:54.121 real 0m0.589s 00:08:54.121 user 0m0.767s 00:08:54.121 sys 0m0.226s 00:08:54.121 20:22:46 accel.accel_dif_functional_tests -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:54.121 20:22:46 accel.accel_dif_functional_tests -- common/autotest_common.sh@10 -- # set +x 00:08:54.121 ************************************ 00:08:54.121 END TEST accel_dif_functional_tests 00:08:54.121 ************************************ 00:08:54.121 20:22:46 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:54.121 00:08:54.121 real 0m54.100s 00:08:54.121 user 1m2.008s 00:08:54.121 sys 0m12.251s 00:08:54.121 20:22:46 accel -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:54.121 20:22:46 accel -- common/autotest_common.sh@10 -- # set +x 00:08:54.121 ************************************ 00:08:54.121 END TEST accel 00:08:54.121 ************************************ 00:08:54.121 20:22:46 -- common/autotest_common.sh@1142 -- # return 0 00:08:54.121 20:22:46 -- spdk/autotest.sh@184 -- # run_test accel_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel_rpc.sh 00:08:54.121 20:22:46 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:54.121 20:22:46 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:54.121 20:22:46 -- common/autotest_common.sh@10 -- # set +x 00:08:54.121 ************************************ 00:08:54.121 START TEST accel_rpc 00:08:54.121 ************************************ 00:08:54.121 20:22:46 accel_rpc -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel_rpc.sh 00:08:54.380 * Looking for test storage... 
00:08:54.380 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel 00:08:54.380 20:22:46 accel_rpc -- accel/accel_rpc.sh@11 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:08:54.380 20:22:46 accel_rpc -- accel/accel_rpc.sh@14 -- # spdk_tgt_pid=1327676 00:08:54.380 20:22:46 accel_rpc -- accel/accel_rpc.sh@15 -- # waitforlisten 1327676 00:08:54.380 20:22:46 accel_rpc -- accel/accel_rpc.sh@13 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --wait-for-rpc 00:08:54.380 20:22:46 accel_rpc -- common/autotest_common.sh@829 -- # '[' -z 1327676 ']' 00:08:54.380 20:22:46 accel_rpc -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:54.380 20:22:46 accel_rpc -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:54.380 20:22:46 accel_rpc -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:54.380 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:54.380 20:22:46 accel_rpc -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:54.380 20:22:46 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:08:54.380 [2024-07-15 20:22:46.649272] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
00:08:54.380 [2024-07-15 20:22:46.649347] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1327676 ] 00:08:54.638 [2024-07-15 20:22:46.772998] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:54.638 [2024-07-15 20:22:46.870332] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:55.206 20:22:47 accel_rpc -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:55.206 20:22:47 accel_rpc -- common/autotest_common.sh@862 -- # return 0 00:08:55.206 20:22:47 accel_rpc -- accel/accel_rpc.sh@45 -- # [[ y == y ]] 00:08:55.206 20:22:47 accel_rpc -- accel/accel_rpc.sh@45 -- # [[ 0 -gt 0 ]] 00:08:55.206 20:22:47 accel_rpc -- accel/accel_rpc.sh@49 -- # [[ y == y ]] 00:08:55.206 20:22:47 accel_rpc -- accel/accel_rpc.sh@49 -- # [[ 0 -gt 0 ]] 00:08:55.206 20:22:47 accel_rpc -- accel/accel_rpc.sh@53 -- # run_test accel_assign_opcode accel_assign_opcode_test_suite 00:08:55.206 20:22:47 accel_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:55.206 20:22:47 accel_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:55.206 20:22:47 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:08:55.543 ************************************ 00:08:55.543 START TEST accel_assign_opcode 00:08:55.543 ************************************ 00:08:55.543 20:22:47 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@1123 -- # accel_assign_opcode_test_suite 00:08:55.543 20:22:47 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@38 -- # rpc_cmd accel_assign_opc -o copy -m incorrect 00:08:55.543 20:22:47 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:55.543 20:22:47 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:08:55.543 [2024-07-15 20:22:47.616708] accel_rpc.c: 
167:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module incorrect 00:08:55.543 20:22:47 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:55.543 20:22:47 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@40 -- # rpc_cmd accel_assign_opc -o copy -m software 00:08:55.543 20:22:47 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:55.543 20:22:47 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:08:55.543 [2024-07-15 20:22:47.628729] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module software 00:08:55.543 20:22:47 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:55.543 20:22:47 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@41 -- # rpc_cmd framework_start_init 00:08:55.543 20:22:47 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:55.543 20:22:47 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:08:55.543 20:22:47 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:55.543 20:22:47 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # rpc_cmd accel_get_opc_assignments 00:08:55.543 20:22:47 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:55.543 20:22:47 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # jq -r .copy 00:08:55.543 20:22:47 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:08:55.543 20:22:47 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # grep software 00:08:55.814 20:22:47 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:55.814 software 00:08:55.814 00:08:55.814 real 0m0.305s 00:08:55.814 user 0m0.048s 00:08:55.814 sys 0m0.015s 00:08:55.814 20:22:47 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@1124 -- # 
xtrace_disable 00:08:55.814 20:22:47 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:08:55.814 ************************************ 00:08:55.814 END TEST accel_assign_opcode 00:08:55.814 ************************************ 00:08:55.814 20:22:47 accel_rpc -- common/autotest_common.sh@1142 -- # return 0 00:08:55.814 20:22:47 accel_rpc -- accel/accel_rpc.sh@55 -- # killprocess 1327676 00:08:55.814 20:22:47 accel_rpc -- common/autotest_common.sh@948 -- # '[' -z 1327676 ']' 00:08:55.814 20:22:47 accel_rpc -- common/autotest_common.sh@952 -- # kill -0 1327676 00:08:55.814 20:22:47 accel_rpc -- common/autotest_common.sh@953 -- # uname 00:08:55.814 20:22:47 accel_rpc -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:08:55.814 20:22:47 accel_rpc -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1327676 00:08:55.814 20:22:48 accel_rpc -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:08:55.814 20:22:48 accel_rpc -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:08:55.814 20:22:48 accel_rpc -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1327676' 00:08:55.814 killing process with pid 1327676 00:08:55.814 20:22:48 accel_rpc -- common/autotest_common.sh@967 -- # kill 1327676 00:08:55.814 20:22:48 accel_rpc -- common/autotest_common.sh@972 -- # wait 1327676 00:08:56.073 00:08:56.073 real 0m1.930s 00:08:56.073 user 0m1.969s 00:08:56.073 sys 0m0.617s 00:08:56.073 20:22:48 accel_rpc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:56.073 20:22:48 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:08:56.073 ************************************ 00:08:56.073 END TEST accel_rpc 00:08:56.073 ************************************ 00:08:56.073 20:22:48 -- common/autotest_common.sh@1142 -- # return 0 00:08:56.073 20:22:48 -- spdk/autotest.sh@185 -- # run_test app_cmdline /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/cmdline.sh 00:08:56.073 20:22:48 -- 
common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:56.073 20:22:48 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:56.073 20:22:48 -- common/autotest_common.sh@10 -- # set +x 00:08:56.330 ************************************ 00:08:56.330 START TEST app_cmdline 00:08:56.330 ************************************ 00:08:56.330 20:22:48 app_cmdline -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/cmdline.sh 00:08:56.330 * Looking for test storage... 00:08:56.330 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:08:56.330 20:22:48 app_cmdline -- app/cmdline.sh@14 -- # trap 'killprocess $spdk_tgt_pid' EXIT 00:08:56.330 20:22:48 app_cmdline -- app/cmdline.sh@17 -- # spdk_tgt_pid=1328097 00:08:56.330 20:22:48 app_cmdline -- app/cmdline.sh@18 -- # waitforlisten 1328097 00:08:56.330 20:22:48 app_cmdline -- app/cmdline.sh@16 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods 00:08:56.330 20:22:48 app_cmdline -- common/autotest_common.sh@829 -- # '[' -z 1328097 ']' 00:08:56.330 20:22:48 app_cmdline -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:56.331 20:22:48 app_cmdline -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:56.331 20:22:48 app_cmdline -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:56.331 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:56.331 20:22:48 app_cmdline -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:56.331 20:22:48 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:08:56.331 [2024-07-15 20:22:48.670747] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
00:08:56.331 [2024-07-15 20:22:48.670823] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1328097 ] 00:08:56.588 [2024-07-15 20:22:48.797474] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:56.588 [2024-07-15 20:22:48.900792] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:57.522 20:22:49 app_cmdline -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:57.522 20:22:49 app_cmdline -- common/autotest_common.sh@862 -- # return 0 00:08:57.522 20:22:49 app_cmdline -- app/cmdline.sh@20 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py spdk_get_version 00:08:58.089 { 00:08:58.089 "version": "SPDK v24.09-pre git sha1 6c0846996", 00:08:58.089 "fields": { 00:08:58.089 "major": 24, 00:08:58.089 "minor": 9, 00:08:58.089 "patch": 0, 00:08:58.089 "suffix": "-pre", 00:08:58.089 "commit": "6c0846996" 00:08:58.089 } 00:08:58.089 } 00:08:58.089 20:22:50 app_cmdline -- app/cmdline.sh@22 -- # expected_methods=() 00:08:58.089 20:22:50 app_cmdline -- app/cmdline.sh@23 -- # expected_methods+=("rpc_get_methods") 00:08:58.089 20:22:50 app_cmdline -- app/cmdline.sh@24 -- # expected_methods+=("spdk_get_version") 00:08:58.089 20:22:50 app_cmdline -- app/cmdline.sh@26 -- # methods=($(rpc_cmd rpc_get_methods | jq -r ".[]" | sort)) 00:08:58.089 20:22:50 app_cmdline -- app/cmdline.sh@26 -- # rpc_cmd rpc_get_methods 00:08:58.089 20:22:50 app_cmdline -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:58.089 20:22:50 app_cmdline -- app/cmdline.sh@26 -- # jq -r '.[]' 00:08:58.089 20:22:50 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:08:58.089 20:22:50 app_cmdline -- app/cmdline.sh@26 -- # sort 00:08:58.089 20:22:50 app_cmdline -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:58.089 20:22:50 app_cmdline -- 
app/cmdline.sh@27 -- # (( 2 == 2 )) 00:08:58.089 20:22:50 app_cmdline -- app/cmdline.sh@28 -- # [[ rpc_get_methods spdk_get_version == \r\p\c\_\g\e\t\_\m\e\t\h\o\d\s\ \s\p\d\k\_\g\e\t\_\v\e\r\s\i\o\n ]] 00:08:58.089 20:22:50 app_cmdline -- app/cmdline.sh@30 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:08:58.089 20:22:50 app_cmdline -- common/autotest_common.sh@648 -- # local es=0 00:08:58.089 20:22:50 app_cmdline -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:08:58.089 20:22:50 app_cmdline -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:08:58.089 20:22:50 app_cmdline -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:08:58.089 20:22:50 app_cmdline -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:08:58.089 20:22:50 app_cmdline -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:08:58.089 20:22:50 app_cmdline -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:08:58.089 20:22:50 app_cmdline -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:08:58.089 20:22:50 app_cmdline -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:08:58.089 20:22:50 app_cmdline -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:08:58.089 20:22:50 app_cmdline -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:08:58.347 request: 00:08:58.347 { 00:08:58.347 "method": "env_dpdk_get_mem_stats", 00:08:58.347 "req_id": 1 00:08:58.347 } 00:08:58.347 Got JSON-RPC error response 00:08:58.347 response: 00:08:58.347 { 00:08:58.347 
"code": -32601, 00:08:58.347 "message": "Method not found" 00:08:58.347 } 00:08:58.347 20:22:50 app_cmdline -- common/autotest_common.sh@651 -- # es=1 00:08:58.347 20:22:50 app_cmdline -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:08:58.347 20:22:50 app_cmdline -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:08:58.347 20:22:50 app_cmdline -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:08:58.347 20:22:50 app_cmdline -- app/cmdline.sh@1 -- # killprocess 1328097 00:08:58.347 20:22:50 app_cmdline -- common/autotest_common.sh@948 -- # '[' -z 1328097 ']' 00:08:58.347 20:22:50 app_cmdline -- common/autotest_common.sh@952 -- # kill -0 1328097 00:08:58.347 20:22:50 app_cmdline -- common/autotest_common.sh@953 -- # uname 00:08:58.347 20:22:50 app_cmdline -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:08:58.347 20:22:50 app_cmdline -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1328097 00:08:58.347 20:22:50 app_cmdline -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:08:58.347 20:22:50 app_cmdline -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:08:58.347 20:22:50 app_cmdline -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1328097' 00:08:58.347 killing process with pid 1328097 00:08:58.347 20:22:50 app_cmdline -- common/autotest_common.sh@967 -- # kill 1328097 00:08:58.347 20:22:50 app_cmdline -- common/autotest_common.sh@972 -- # wait 1328097 00:08:58.914 00:08:58.914 real 0m2.550s 00:08:58.914 user 0m3.385s 00:08:58.914 sys 0m0.676s 00:08:58.914 20:22:51 app_cmdline -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:58.914 20:22:51 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:08:58.914 ************************************ 00:08:58.914 END TEST app_cmdline 00:08:58.914 ************************************ 00:08:58.914 20:22:51 -- common/autotest_common.sh@1142 -- # return 0 00:08:58.914 20:22:51 -- spdk/autotest.sh@186 -- # run_test version 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/version.sh 00:08:58.914 20:22:51 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:58.914 20:22:51 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:58.914 20:22:51 -- common/autotest_common.sh@10 -- # set +x 00:08:58.914 ************************************ 00:08:58.914 START TEST version 00:08:58.914 ************************************ 00:08:58.914 20:22:51 version -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/version.sh 00:08:58.914 * Looking for test storage... 00:08:58.914 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:08:58.914 20:22:51 version -- app/version.sh@17 -- # get_header_version major 00:08:58.914 20:22:51 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MAJOR[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:08:58.914 20:22:51 version -- app/version.sh@14 -- # cut -f2 00:08:58.914 20:22:51 version -- app/version.sh@14 -- # tr -d '"' 00:08:58.914 20:22:51 version -- app/version.sh@17 -- # major=24 00:08:58.914 20:22:51 version -- app/version.sh@18 -- # get_header_version minor 00:08:58.914 20:22:51 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MINOR[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:08:58.914 20:22:51 version -- app/version.sh@14 -- # cut -f2 00:08:58.914 20:22:51 version -- app/version.sh@14 -- # tr -d '"' 00:08:58.914 20:22:51 version -- app/version.sh@18 -- # minor=9 00:08:58.914 20:22:51 version -- app/version.sh@19 -- # get_header_version patch 00:08:58.914 20:22:51 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_PATCH[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:08:58.914 20:22:51 version -- app/version.sh@14 -- # cut -f2 00:08:58.914 20:22:51 version -- app/version.sh@14 -- # tr -d '"' 00:08:58.914 
20:22:51 version -- app/version.sh@19 -- # patch=0 00:08:58.914 20:22:51 version -- app/version.sh@20 -- # get_header_version suffix 00:08:58.914 20:22:51 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_SUFFIX[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:08:58.914 20:22:51 version -- app/version.sh@14 -- # tr -d '"' 00:08:58.914 20:22:51 version -- app/version.sh@14 -- # cut -f2 00:08:58.914 20:22:51 version -- app/version.sh@20 -- # suffix=-pre 00:08:58.914 20:22:51 version -- app/version.sh@22 -- # version=24.9 00:08:58.914 20:22:51 version -- app/version.sh@25 -- # (( patch != 0 )) 00:08:58.914 20:22:51 version -- app/version.sh@28 -- # version=24.9rc0 00:08:58.914 20:22:51 version -- app/version.sh@30 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:08:58.914 20:22:51 version -- app/version.sh@30 -- # python3 -c 'import spdk; print(spdk.__version__)' 00:08:59.173 20:22:51 version -- app/version.sh@30 -- # py_version=24.9rc0 00:08:59.173 20:22:51 version -- app/version.sh@31 -- # [[ 24.9rc0 == \2\4\.\9\r\c\0 ]] 00:08:59.173 00:08:59.173 real 0m0.192s 00:08:59.173 user 0m0.079s 00:08:59.173 sys 0m0.159s 00:08:59.173 20:22:51 version -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:59.173 20:22:51 version -- common/autotest_common.sh@10 -- # set +x 00:08:59.173 ************************************ 00:08:59.173 END TEST version 00:08:59.173 ************************************ 00:08:59.173 20:22:51 -- common/autotest_common.sh@1142 -- # return 0 00:08:59.173 20:22:51 -- spdk/autotest.sh@188 -- # '[' 1 -eq 1 ']' 00:08:59.173 20:22:51 -- spdk/autotest.sh@189 -- # run_test blockdev_general 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh 00:08:59.173 20:22:51 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:59.173 20:22:51 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:59.173 20:22:51 -- common/autotest_common.sh@10 -- # set +x 00:08:59.173 ************************************ 00:08:59.173 START TEST blockdev_general 00:08:59.173 ************************************ 00:08:59.173 20:22:51 blockdev_general -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh 00:08:59.173 * Looking for test storage... 00:08:59.173 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:08:59.173 20:22:51 blockdev_general -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:08:59.173 20:22:51 blockdev_general -- bdev/nbd_common.sh@6 -- # set -e 00:08:59.173 20:22:51 blockdev_general -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:08:59.173 20:22:51 blockdev_general -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:08:59.173 20:22:51 blockdev_general -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:08:59.173 20:22:51 blockdev_general -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:08:59.173 20:22:51 blockdev_general -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:08:59.173 20:22:51 blockdev_general -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:08:59.173 20:22:51 blockdev_general -- bdev/blockdev.sh@20 -- # : 00:08:59.173 20:22:51 blockdev_general -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:08:59.173 20:22:51 blockdev_general -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:08:59.173 20:22:51 blockdev_general -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 
00:08:59.173 20:22:51 blockdev_general -- bdev/blockdev.sh@674 -- # uname -s 00:08:59.173 20:22:51 blockdev_general -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:08:59.173 20:22:51 blockdev_general -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:08:59.173 20:22:51 blockdev_general -- bdev/blockdev.sh@682 -- # test_type=bdev 00:08:59.173 20:22:51 blockdev_general -- bdev/blockdev.sh@683 -- # crypto_device= 00:08:59.173 20:22:51 blockdev_general -- bdev/blockdev.sh@684 -- # dek= 00:08:59.173 20:22:51 blockdev_general -- bdev/blockdev.sh@685 -- # env_ctx= 00:08:59.173 20:22:51 blockdev_general -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:08:59.173 20:22:51 blockdev_general -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:08:59.173 20:22:51 blockdev_general -- bdev/blockdev.sh@690 -- # [[ bdev == bdev ]] 00:08:59.173 20:22:51 blockdev_general -- bdev/blockdev.sh@691 -- # wait_for_rpc=--wait-for-rpc 00:08:59.173 20:22:51 blockdev_general -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:08:59.173 20:22:51 blockdev_general -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=1328574 00:08:59.173 20:22:51 blockdev_general -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:08:59.173 20:22:51 blockdev_general -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:08:59.173 20:22:51 blockdev_general -- bdev/blockdev.sh@49 -- # waitforlisten 1328574 00:08:59.173 20:22:51 blockdev_general -- common/autotest_common.sh@829 -- # '[' -z 1328574 ']' 00:08:59.173 20:22:51 blockdev_general -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:59.173 20:22:51 blockdev_general -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:59.173 20:22:51 blockdev_general -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 
00:08:59.173 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:59.173 20:22:51 blockdev_general -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:59.173 20:22:51 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:59.432 [2024-07-15 20:22:51.597180] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 00:08:59.432 [2024-07-15 20:22:51.597254] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1328574 ] 00:08:59.432 [2024-07-15 20:22:51.725431] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:59.690 [2024-07-15 20:22:51.830210] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:00.626 20:22:52 blockdev_general -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:09:00.626 20:22:52 blockdev_general -- common/autotest_common.sh@862 -- # return 0 00:09:00.626 20:22:52 blockdev_general -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:09:00.626 20:22:52 blockdev_general -- bdev/blockdev.sh@696 -- # setup_bdev_conf 00:09:00.626 20:22:52 blockdev_general -- bdev/blockdev.sh@53 -- # rpc_cmd 00:09:00.626 20:22:52 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:00.626 20:22:52 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:00.885 [2024-07-15 20:22:53.039216] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:00.885 [2024-07-15 20:22:53.039268] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:00.885 00:09:00.885 [2024-07-15 20:22:53.047205] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:00.885 [2024-07-15 20:22:53.047230] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to 
find bdev with name: Malloc2 00:09:00.885 00:09:00.885 Malloc0 00:09:00.885 Malloc1 00:09:00.885 Malloc2 00:09:00.885 Malloc3 00:09:00.885 Malloc4 00:09:00.885 Malloc5 00:09:00.885 Malloc6 00:09:00.885 Malloc7 00:09:00.885 Malloc8 00:09:00.885 Malloc9 00:09:00.885 [2024-07-15 20:22:53.195910] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:00.885 [2024-07-15 20:22:53.195963] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:00.885 [2024-07-15 20:22:53.195983] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x24bd350 00:09:00.885 [2024-07-15 20:22:53.195996] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:00.885 [2024-07-15 20:22:53.197367] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:00.885 [2024-07-15 20:22:53.197394] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:09:00.885 TestPT 00:09:00.885 20:22:53 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:00.885 20:22:53 blockdev_general -- bdev/blockdev.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile bs=2048 count=5000 00:09:01.143 5000+0 records in 00:09:01.143 5000+0 records out 00:09:01.143 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0268744 s, 381 MB/s 00:09:01.143 20:22:53 blockdev_general -- bdev/blockdev.sh@77 -- # rpc_cmd bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile AIO0 2048 00:09:01.143 20:22:53 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:01.143 20:22:53 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:01.143 AIO0 00:09:01.143 20:22:53 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:01.143 20:22:53 blockdev_general -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:09:01.143 20:22:53 blockdev_general -- 
common/autotest_common.sh@559 -- # xtrace_disable 00:09:01.143 20:22:53 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:01.143 20:22:53 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:01.143 20:22:53 blockdev_general -- bdev/blockdev.sh@740 -- # cat 00:09:01.143 20:22:53 blockdev_general -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:09:01.143 20:22:53 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:01.143 20:22:53 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:01.143 20:22:53 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:01.143 20:22:53 blockdev_general -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n bdev 00:09:01.143 20:22:53 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:01.143 20:22:53 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:01.143 20:22:53 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:01.143 20:22:53 blockdev_general -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:09:01.143 20:22:53 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:01.143 20:22:53 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:01.143 20:22:53 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:01.143 20:22:53 blockdev_general -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:09:01.143 20:22:53 blockdev_general -- bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:09:01.143 20:22:53 blockdev_general -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:09:01.143 20:22:53 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:01.143 20:22:53 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:01.403 20:22:53 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:01.403 20:22:53 blockdev_general 
-- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:09:01.403 20:22:53 blockdev_general -- bdev/blockdev.sh@749 -- # jq -r .name 00:09:01.404 20:22:53 blockdev_general -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' "da7d2bbe-1dd6-4ece-8a52-d9553dc7ea3d"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "da7d2bbe-1dd6-4ece-8a52-d9553dc7ea3d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "db1897e5-39ac-5768-81dd-3d3a79e4b3a5"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "db1897e5-39ac-5768-81dd-3d3a79e4b3a5",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' 
"compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "8b4d0916-cd51-57c8-88c0-8cadbeebfc1a"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "8b4d0916-cd51-57c8-88c0-8cadbeebfc1a",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "578a9574-c863-5868-a858-59e89633b0cf"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "578a9574-c863-5868-a858-59e89633b0cf",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' 
"abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "bf012b4b-9dc4-5601-b7d6-217091c05cd0"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "bf012b4b-9dc4-5601-b7d6-217091c05cd0",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "6ce4ff41-a2c1-5caa-980b-d60983f65866"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "6ce4ff41-a2c1-5caa-980b-d60983f65866",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": 
false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "89cfa520-4957-5463-bd86-8aee62eaf39d"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "89cfa520-4957-5463-bd86-8aee62eaf39d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' "df986755-3d3f-5326-a5c8-ba05792696ce"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "df986755-3d3f-5326-a5c8-ba05792696ce",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' 
' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "cd81cdd8-e3cc-5926-8c13-a936ab352780"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "cd81cdd8-e3cc-5926-8c13-a936ab352780",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "cd45280c-d012-50d9-943f-4b6db4d90660"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "cd45280c-d012-50d9-943f-4b6db4d90660",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' 
"base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "b9275732-b326-5edb-a23e-489d11f0667f"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "b9275732-b326-5edb-a23e-489d11f0667f",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "69161744-fa9d-57b8-8931-dbce8dac36fa"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "69161744-fa9d-57b8-8931-dbce8dac36fa",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' 
' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "cf99e971-47e4-46cc-812f-e5ecb74c68aa"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "cf99e971-47e4-46cc-812f-e5ecb74c68aa",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "cf99e971-47e4-46cc-812f-e5ecb74c68aa",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "d0bf7a32-23de-4dbe-8d9d-2fd508199cb6",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "d9fe079f-538a-477f-ba47-fb0e4ede9b8a",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' 
'}' '{' ' "name": "concat0",' ' "aliases": [' ' "a4177c69-0ebc-4d38-aef4-e8e1d1d0917e"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "a4177c69-0ebc-4d38-aef4-e8e1d1d0917e",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "a4177c69-0ebc-4d38-aef4-e8e1d1d0917e",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "ee065e69-5e1e-485b-8c30-b5b993b14a6e",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "6c7095a7-e5a9-40ee-a31b-b3eefc30ab4c",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "56344453-fb80-49e8-beab-3b32e30d6861"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' 
"uuid": "56344453-fb80-49e8-beab-3b32e30d6861",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": false,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "56344453-fb80-49e8-beab-3b32e30d6861",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "a6308d26-b4f1-4935-a950-8637a2e3c001",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "5b76f1b7-582f-48a5-b79d-7f09c50c8f7c",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "c9ed82cf-cc2a-4a60-9bc8-f26591e6d574"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": "c9ed82cf-cc2a-4a60-9bc8-f26591e6d574",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' 
' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:09:01.404 20:22:53 blockdev_general -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:09:01.404 20:22:53 blockdev_general -- bdev/blockdev.sh@752 -- # hello_world_bdev=Malloc0 00:09:01.404 20:22:53 blockdev_general -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT 00:09:01.404 20:22:53 blockdev_general -- bdev/blockdev.sh@754 -- # killprocess 1328574 00:09:01.404 20:22:53 blockdev_general -- common/autotest_common.sh@948 -- # '[' -z 1328574 ']' 00:09:01.404 20:22:53 blockdev_general -- common/autotest_common.sh@952 -- # kill -0 1328574 00:09:01.404 20:22:53 blockdev_general -- common/autotest_common.sh@953 -- # uname 00:09:01.404 20:22:53 blockdev_general -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:09:01.404 20:22:53 blockdev_general -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1328574 00:09:01.404 20:22:53 blockdev_general -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:09:01.404 20:22:53 blockdev_general -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:09:01.404 20:22:53 blockdev_general -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1328574' 00:09:01.404 killing process with pid 1328574 00:09:01.404 20:22:53 blockdev_general -- 
common/autotest_common.sh@967 -- # kill 1328574 00:09:01.404 20:22:53 blockdev_general -- common/autotest_common.sh@972 -- # wait 1328574 00:09:01.972 20:22:54 blockdev_general -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:09:01.972 20:22:54 blockdev_general -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b Malloc0 '' 00:09:01.972 20:22:54 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:09:01.972 20:22:54 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:01.972 20:22:54 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:01.972 ************************************ 00:09:01.972 START TEST bdev_hello_world 00:09:01.972 ************************************ 00:09:01.972 20:22:54 blockdev_general.bdev_hello_world -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b Malloc0 '' 00:09:01.972 [2024-07-15 20:22:54.330568] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
00:09:01.972 [2024-07-15 20:22:54.330626] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1328949 ] 00:09:02.254 [2024-07-15 20:22:54.459226] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:02.254 [2024-07-15 20:22:54.563349] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:02.513 [2024-07-15 20:22:54.720389] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:02.513 [2024-07-15 20:22:54.720441] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:09:02.513 [2024-07-15 20:22:54.720457] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:09:02.513 [2024-07-15 20:22:54.728394] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:02.513 [2024-07-15 20:22:54.728421] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:02.513 [2024-07-15 20:22:54.736405] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:02.513 [2024-07-15 20:22:54.736429] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:02.513 [2024-07-15 20:22:54.813772] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:02.513 [2024-07-15 20:22:54.813823] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:02.513 [2024-07-15 20:22:54.813840] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x127cfb0 00:09:02.513 [2024-07-15 20:22:54.813853] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:02.513 [2024-07-15 20:22:54.815288] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: 
pt_bdev registered 00:09:02.513 [2024-07-15 20:22:54.815318] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:09:02.772 [2024-07-15 20:22:54.967365] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:09:02.772 [2024-07-15 20:22:54.967435] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev Malloc0 00:09:02.772 [2024-07-15 20:22:54.967492] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:09:02.772 [2024-07-15 20:22:54.967565] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:09:02.772 [2024-07-15 20:22:54.967642] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:09:02.772 [2024-07-15 20:22:54.967673] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:09:02.772 [2024-07-15 20:22:54.967735] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 00:09:02.772 00:09:02.772 [2024-07-15 20:22:54.967775] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:09:03.032 00:09:03.032 real 0m1.036s 00:09:03.032 user 0m0.691s 00:09:03.032 sys 0m0.300s 00:09:03.032 20:22:55 blockdev_general.bdev_hello_world -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:03.032 20:22:55 blockdev_general.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:09:03.032 ************************************ 00:09:03.032 END TEST bdev_hello_world 00:09:03.032 ************************************ 00:09:03.032 20:22:55 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:09:03.032 20:22:55 blockdev_general -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:09:03.032 20:22:55 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:09:03.032 20:22:55 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:03.032 20:22:55 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:03.032 ************************************ 00:09:03.032 START 
TEST bdev_bounds 00:09:03.032 ************************************ 00:09:03.032 20:22:55 blockdev_general.bdev_bounds -- common/autotest_common.sh@1123 -- # bdev_bounds '' 00:09:03.032 20:22:55 blockdev_general.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=1329138 00:09:03.032 20:22:55 blockdev_general.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:09:03.032 20:22:55 blockdev_general.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:09:03.032 20:22:55 blockdev_general.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 1329138' 00:09:03.032 Process bdevio pid: 1329138 00:09:03.032 20:22:55 blockdev_general.bdev_bounds -- bdev/blockdev.sh@293 -- # waitforlisten 1329138 00:09:03.032 20:22:55 blockdev_general.bdev_bounds -- common/autotest_common.sh@829 -- # '[' -z 1329138 ']' 00:09:03.032 20:22:55 blockdev_general.bdev_bounds -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:09:03.032 20:22:55 blockdev_general.bdev_bounds -- common/autotest_common.sh@834 -- # local max_retries=100 00:09:03.032 20:22:55 blockdev_general.bdev_bounds -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:09:03.032 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:09:03.032 20:22:55 blockdev_general.bdev_bounds -- common/autotest_common.sh@838 -- # xtrace_disable 00:09:03.032 20:22:55 blockdev_general.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:09:03.291 [2024-07-15 20:22:55.450936] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
00:09:03.291 [2024-07-15 20:22:55.451004] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1329138 ] 00:09:03.291 [2024-07-15 20:22:55.581815] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:09:03.549 [2024-07-15 20:22:55.692486] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:09:03.549 [2024-07-15 20:22:55.692520] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:03.549 [2024-07-15 20:22:55.692519] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:09:03.549 [2024-07-15 20:22:55.856374] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:03.549 [2024-07-15 20:22:55.856429] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:09:03.549 [2024-07-15 20:22:55.856443] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:09:03.549 [2024-07-15 20:22:55.864391] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:03.550 [2024-07-15 20:22:55.864419] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:03.550 [2024-07-15 20:22:55.872405] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:03.550 [2024-07-15 20:22:55.872432] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:03.807 [2024-07-15 20:22:55.946142] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:03.807 [2024-07-15 20:22:55.946191] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:03.807 [2024-07-15 20:22:55.946215] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x119b760 
00:09:03.807 [2024-07-15 20:22:55.946228] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:03.807 [2024-07-15 20:22:55.947677] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:03.807 [2024-07-15 20:22:55.947707] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:09:04.375 20:22:56 blockdev_general.bdev_bounds -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:09:04.375 20:22:56 blockdev_general.bdev_bounds -- common/autotest_common.sh@862 -- # return 0 00:09:04.375 20:22:56 blockdev_general.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:09:04.375 I/O targets: 00:09:04.375 Malloc0: 65536 blocks of 512 bytes (32 MiB) 00:09:04.375 Malloc1p0: 32768 blocks of 512 bytes (16 MiB) 00:09:04.375 Malloc1p1: 32768 blocks of 512 bytes (16 MiB) 00:09:04.375 Malloc2p0: 8192 blocks of 512 bytes (4 MiB) 00:09:04.375 Malloc2p1: 8192 blocks of 512 bytes (4 MiB) 00:09:04.375 Malloc2p2: 8192 blocks of 512 bytes (4 MiB) 00:09:04.375 Malloc2p3: 8192 blocks of 512 bytes (4 MiB) 00:09:04.375 Malloc2p4: 8192 blocks of 512 bytes (4 MiB) 00:09:04.375 Malloc2p5: 8192 blocks of 512 bytes (4 MiB) 00:09:04.375 Malloc2p6: 8192 blocks of 512 bytes (4 MiB) 00:09:04.375 Malloc2p7: 8192 blocks of 512 bytes (4 MiB) 00:09:04.375 TestPT: 65536 blocks of 512 bytes (32 MiB) 00:09:04.375 raid0: 131072 blocks of 512 bytes (64 MiB) 00:09:04.375 concat0: 131072 blocks of 512 bytes (64 MiB) 00:09:04.375 raid1: 65536 blocks of 512 bytes (32 MiB) 00:09:04.375 AIO0: 5000 blocks of 2048 bytes (10 MiB) 00:09:04.375 00:09:04.375 00:09:04.375 CUnit - A unit testing framework for C - Version 2.1-3 00:09:04.375 http://cunit.sourceforge.net/ 00:09:04.375 00:09:04.375 00:09:04.375 Suite: bdevio tests on: AIO0 00:09:04.375 Test: blockdev write read block ...passed 00:09:04.375 Test: blockdev write zeroes read block ...passed 00:09:04.375 
Test: blockdev write zeroes read no split ...passed 00:09:04.375 Test: blockdev write zeroes read split ...passed 00:09:04.375 Test: blockdev write zeroes read split partial ...passed 00:09:04.375 Test: blockdev reset ...passed 00:09:04.375 Test: blockdev write read 8 blocks ...passed 00:09:04.375 Test: blockdev write read size > 128k ...passed 00:09:04.375 Test: blockdev write read invalid size ...passed 00:09:04.375 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:09:04.375 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:04.375 Test: blockdev write read max offset ...passed 00:09:04.375 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:04.375 Test: blockdev writev readv 8 blocks ...passed 00:09:04.375 Test: blockdev writev readv 30 x 1block ...passed 00:09:04.375 Test: blockdev writev readv block ...passed 00:09:04.375 Test: blockdev writev readv size > 128k ...passed 00:09:04.375 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:04.375 Test: blockdev comparev and writev ...passed 00:09:04.375 Test: blockdev nvme passthru rw ...passed 00:09:04.375 Test: blockdev nvme passthru vendor specific ...passed 00:09:04.375 Test: blockdev nvme admin passthru ...passed 00:09:04.375 Test: blockdev copy ...passed 00:09:04.375 Suite: bdevio tests on: raid1 00:09:04.375 Test: blockdev write read block ...passed 00:09:04.375 Test: blockdev write zeroes read block ...passed 00:09:04.375 Test: blockdev write zeroes read no split ...passed 00:09:04.375 Test: blockdev write zeroes read split ...passed 00:09:04.375 Test: blockdev write zeroes read split partial ...passed 00:09:04.375 Test: blockdev reset ...passed 00:09:04.375 Test: blockdev write read 8 blocks ...passed 00:09:04.375 Test: blockdev write read size > 128k ...passed 00:09:04.375 Test: blockdev write read invalid size ...passed 00:09:04.375 Test: blockdev write read offset + nbytes == size of blockdev ...passed 
00:09:04.375 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:04.375 Test: blockdev write read max offset ...passed 00:09:04.375 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:04.375 Test: blockdev writev readv 8 blocks ...passed 00:09:04.375 Test: blockdev writev readv 30 x 1block ...passed 00:09:04.375 Test: blockdev writev readv block ...passed 00:09:04.375 Test: blockdev writev readv size > 128k ...passed 00:09:04.375 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:04.375 Test: blockdev comparev and writev ...passed 00:09:04.375 Test: blockdev nvme passthru rw ...passed 00:09:04.375 Test: blockdev nvme passthru vendor specific ...passed 00:09:04.375 Test: blockdev nvme admin passthru ...passed 00:09:04.375 Test: blockdev copy ...passed 00:09:04.375 Suite: bdevio tests on: concat0 00:09:04.375 Test: blockdev write read block ...passed 00:09:04.375 Test: blockdev write zeroes read block ...passed 00:09:04.375 Test: blockdev write zeroes read no split ...passed 00:09:04.375 Test: blockdev write zeroes read split ...passed 00:09:04.375 Test: blockdev write zeroes read split partial ...passed 00:09:04.375 Test: blockdev reset ...passed 00:09:04.375 Test: blockdev write read 8 blocks ...passed 00:09:04.375 Test: blockdev write read size > 128k ...passed 00:09:04.375 Test: blockdev write read invalid size ...passed 00:09:04.375 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:09:04.375 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:04.375 Test: blockdev write read max offset ...passed 00:09:04.375 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:04.375 Test: blockdev writev readv 8 blocks ...passed 00:09:04.375 Test: blockdev writev readv 30 x 1block ...passed 00:09:04.375 Test: blockdev writev readv block ...passed 00:09:04.375 Test: blockdev writev readv size > 128k ...passed 00:09:04.375 Test: 
blockdev writev readv size > 128k in two iovs ...passed 00:09:04.375 Test: blockdev comparev and writev ...passed 00:09:04.375 Test: blockdev nvme passthru rw ...passed 00:09:04.375 Test: blockdev nvme passthru vendor specific ...passed 00:09:04.375 Test: blockdev nvme admin passthru ...passed 00:09:04.375 Test: blockdev copy ...passed 00:09:04.375 Suite: bdevio tests on: raid0 00:09:04.375 Test: blockdev write read block ...passed 00:09:04.375 Test: blockdev write zeroes read block ...passed 00:09:04.375 Test: blockdev write zeroes read no split ...passed 00:09:04.375 Test: blockdev write zeroes read split ...passed 00:09:04.375 Test: blockdev write zeroes read split partial ...passed 00:09:04.375 Test: blockdev reset ...passed 00:09:04.375 Test: blockdev write read 8 blocks ...passed 00:09:04.375 Test: blockdev write read size > 128k ...passed 00:09:04.375 Test: blockdev write read invalid size ...passed 00:09:04.375 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:09:04.375 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:04.375 Test: blockdev write read max offset ...passed 00:09:04.375 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:04.375 Test: blockdev writev readv 8 blocks ...passed 00:09:04.375 Test: blockdev writev readv 30 x 1block ...passed 00:09:04.375 Test: blockdev writev readv block ...passed 00:09:04.375 Test: blockdev writev readv size > 128k ...passed 00:09:04.375 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:04.375 Test: blockdev comparev and writev ...passed 00:09:04.375 Test: blockdev nvme passthru rw ...passed 00:09:04.375 Test: blockdev nvme passthru vendor specific ...passed 00:09:04.375 Test: blockdev nvme admin passthru ...passed 00:09:04.375 Test: blockdev copy ...passed 00:09:04.375 Suite: bdevio tests on: TestPT 00:09:04.375 Test: blockdev write read block ...passed 00:09:04.375 Test: blockdev write zeroes read block ...passed 
00:09:04.375 Test: blockdev write zeroes read no split ...passed 00:09:04.375 Test: blockdev write zeroes read split ...passed 00:09:04.375 Test: blockdev write zeroes read split partial ...passed 00:09:04.375 Test: blockdev reset ...passed 00:09:04.375 Test: blockdev write read 8 blocks ...passed 00:09:04.375 Test: blockdev write read size > 128k ...passed 00:09:04.375 Test: blockdev write read invalid size ...passed 00:09:04.375 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:09:04.375 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:04.375 Test: blockdev write read max offset ...passed 00:09:04.375 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:04.375 Test: blockdev writev readv 8 blocks ...passed 00:09:04.375 Test: blockdev writev readv 30 x 1block ...passed 00:09:04.375 Test: blockdev writev readv block ...passed 00:09:04.375 Test: blockdev writev readv size > 128k ...passed 00:09:04.375 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:04.375 Test: blockdev comparev and writev ...passed 00:09:04.375 Test: blockdev nvme passthru rw ...passed 00:09:04.375 Test: blockdev nvme passthru vendor specific ...passed 00:09:04.375 Test: blockdev nvme admin passthru ...passed 00:09:04.375 Test: blockdev copy ...passed 00:09:04.375 Suite: bdevio tests on: Malloc2p7 00:09:04.375 Test: blockdev write read block ...passed 00:09:04.375 Test: blockdev write zeroes read block ...passed 00:09:04.375 Test: blockdev write zeroes read no split ...passed 00:09:04.375 Test: blockdev write zeroes read split ...passed 00:09:04.376 Test: blockdev write zeroes read split partial ...passed 00:09:04.376 Test: blockdev reset ...passed 00:09:04.376 Test: blockdev write read 8 blocks ...passed 00:09:04.376 Test: blockdev write read size > 128k ...passed 00:09:04.376 Test: blockdev write read invalid size ...passed 00:09:04.376 Test: blockdev write read offset + nbytes == size of blockdev 
...passed 00:09:04.376 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:04.376 Test: blockdev write read max offset ...passed 00:09:04.376 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:04.376 Test: blockdev writev readv 8 blocks ...passed 00:09:04.376 Test: blockdev writev readv 30 x 1block ...passed 00:09:04.376 Test: blockdev writev readv block ...passed 00:09:04.376 Test: blockdev writev readv size > 128k ...passed 00:09:04.376 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:04.376 Test: blockdev comparev and writev ...passed 00:09:04.376 Test: blockdev nvme passthru rw ...passed 00:09:04.376 Test: blockdev nvme passthru vendor specific ...passed 00:09:04.376 Test: blockdev nvme admin passthru ...passed 00:09:04.376 Test: blockdev copy ...passed 00:09:04.376 Suite: bdevio tests on: Malloc2p6 00:09:04.376 Test: blockdev write read block ...passed 00:09:04.376 Test: blockdev write zeroes read block ...passed 00:09:04.376 Test: blockdev write zeroes read no split ...passed 00:09:04.376 Test: blockdev write zeroes read split ...passed 00:09:04.376 Test: blockdev write zeroes read split partial ...passed 00:09:04.376 Test: blockdev reset ...passed 00:09:04.376 Test: blockdev write read 8 blocks ...passed 00:09:04.376 Test: blockdev write read size > 128k ...passed 00:09:04.376 Test: blockdev write read invalid size ...passed 00:09:04.376 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:09:04.376 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:04.376 Test: blockdev write read max offset ...passed 00:09:04.376 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:04.376 Test: blockdev writev readv 8 blocks ...passed 00:09:04.376 Test: blockdev writev readv 30 x 1block ...passed 00:09:04.376 Test: blockdev writev readv block ...passed 00:09:04.376 Test: blockdev writev readv size > 128k ...passed 00:09:04.376 
Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:04.376 Test: blockdev comparev and writev ...passed 00:09:04.376 Test: blockdev nvme passthru rw ...passed 00:09:04.376 Test: blockdev nvme passthru vendor specific ...passed 00:09:04.376 Test: blockdev nvme admin passthru ...passed 00:09:04.376 Test: blockdev copy ...passed 00:09:04.376 Suite: bdevio tests on: Malloc2p5 00:09:04.376 Test: blockdev write read block ...passed 00:09:04.376 Test: blockdev write zeroes read block ...passed 00:09:04.376 Test: blockdev write zeroes read no split ...passed 00:09:04.376 Test: blockdev write zeroes read split ...passed 00:09:04.376 Test: blockdev write zeroes read split partial ...passed 00:09:04.376 Test: blockdev reset ...passed 00:09:04.376 Test: blockdev write read 8 blocks ...passed 00:09:04.376 Test: blockdev write read size > 128k ...passed 00:09:04.376 Test: blockdev write read invalid size ...passed 00:09:04.376 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:09:04.376 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:04.376 Test: blockdev write read max offset ...passed 00:09:04.376 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:04.376 Test: blockdev writev readv 8 blocks ...passed 00:09:04.376 Test: blockdev writev readv 30 x 1block ...passed 00:09:04.376 Test: blockdev writev readv block ...passed 00:09:04.376 Test: blockdev writev readv size > 128k ...passed 00:09:04.376 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:04.376 Test: blockdev comparev and writev ...passed 00:09:04.376 Test: blockdev nvme passthru rw ...passed 00:09:04.376 Test: blockdev nvme passthru vendor specific ...passed 00:09:04.376 Test: blockdev nvme admin passthru ...passed 00:09:04.376 Test: blockdev copy ...passed 00:09:04.376 Suite: bdevio tests on: Malloc2p4 00:09:04.376 Test: blockdev write read block ...passed 00:09:04.376 Test: blockdev write zeroes read block 
...passed 00:09:04.376 Test: blockdev write zeroes read no split ...passed 00:09:04.376 Test: blockdev write zeroes read split ...passed 00:09:04.376 Test: blockdev write zeroes read split partial ...passed 00:09:04.376 Test: blockdev reset ...passed 00:09:04.376 Test: blockdev write read 8 blocks ...passed 00:09:04.376 Test: blockdev write read size > 128k ...passed 00:09:04.376 Test: blockdev write read invalid size ...passed 00:09:04.376 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:09:04.376 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:04.376 Test: blockdev write read max offset ...passed 00:09:04.376 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:04.376 Test: blockdev writev readv 8 blocks ...passed 00:09:04.376 Test: blockdev writev readv 30 x 1block ...passed 00:09:04.376 Test: blockdev writev readv block ...passed 00:09:04.376 Test: blockdev writev readv size > 128k ...passed 00:09:04.376 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:04.376 Test: blockdev comparev and writev ...passed 00:09:04.376 Test: blockdev nvme passthru rw ...passed 00:09:04.376 Test: blockdev nvme passthru vendor specific ...passed 00:09:04.376 Test: blockdev nvme admin passthru ...passed 00:09:04.376 Test: blockdev copy ...passed 00:09:04.376 Suite: bdevio tests on: Malloc2p3 00:09:04.376 Test: blockdev write read block ...passed 00:09:04.376 Test: blockdev write zeroes read block ...passed 00:09:04.376 Test: blockdev write zeroes read no split ...passed 00:09:04.376 Test: blockdev write zeroes read split ...passed 00:09:04.636 Test: blockdev write zeroes read split partial ...passed 00:09:04.636 Test: blockdev reset ...passed 00:09:04.636 Test: blockdev write read 8 blocks ...passed 00:09:04.636 Test: blockdev write read size > 128k ...passed 00:09:04.636 Test: blockdev write read invalid size ...passed 00:09:04.636 Test: blockdev write read offset + nbytes == size of 
blockdev ...passed 00:09:04.636 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:04.636 Test: blockdev write read max offset ...passed 00:09:04.636 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:04.636 Test: blockdev writev readv 8 blocks ...passed 00:09:04.636 Test: blockdev writev readv 30 x 1block ...passed 00:09:04.636 Test: blockdev writev readv block ...passed 00:09:04.636 Test: blockdev writev readv size > 128k ...passed 00:09:04.636 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:04.636 Test: blockdev comparev and writev ...passed 00:09:04.636 Test: blockdev nvme passthru rw ...passed 00:09:04.636 Test: blockdev nvme passthru vendor specific ...passed 00:09:04.636 Test: blockdev nvme admin passthru ...passed 00:09:04.636 Test: blockdev copy ...passed 00:09:04.636 Suite: bdevio tests on: Malloc2p2 00:09:04.636 Test: blockdev write read block ...passed 00:09:04.636 Test: blockdev write zeroes read block ...passed 00:09:04.636 Test: blockdev write zeroes read no split ...passed 00:09:04.636 Test: blockdev write zeroes read split ...passed 00:09:04.636 Test: blockdev write zeroes read split partial ...passed 00:09:04.636 Test: blockdev reset ...passed 00:09:04.636 Test: blockdev write read 8 blocks ...passed 00:09:04.636 Test: blockdev write read size > 128k ...passed 00:09:04.636 Test: blockdev write read invalid size ...passed 00:09:04.636 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:09:04.636 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:04.636 Test: blockdev write read max offset ...passed 00:09:04.636 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:04.636 Test: blockdev writev readv 8 blocks ...passed 00:09:04.636 Test: blockdev writev readv 30 x 1block ...passed 00:09:04.636 Test: blockdev writev readv block ...passed 00:09:04.636 Test: blockdev writev readv size > 128k ...passed 
00:09:04.636 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:04.636 Test: blockdev comparev and writev ...passed 00:09:04.636 Test: blockdev nvme passthru rw ...passed 00:09:04.636 Test: blockdev nvme passthru vendor specific ...passed 00:09:04.636 Test: blockdev nvme admin passthru ...passed 00:09:04.636 Test: blockdev copy ...passed 00:09:04.636 Suite: bdevio tests on: Malloc2p1 00:09:04.636 Test: blockdev write read block ...passed 00:09:04.636 Test: blockdev write zeroes read block ...passed 00:09:04.636 Test: blockdev write zeroes read no split ...passed 00:09:04.636 Test: blockdev write zeroes read split ...passed 00:09:04.636 Test: blockdev write zeroes read split partial ...passed 00:09:04.636 Test: blockdev reset ...passed 00:09:04.636 Test: blockdev write read 8 blocks ...passed 00:09:04.636 Test: blockdev write read size > 128k ...passed 00:09:04.636 Test: blockdev write read invalid size ...passed 00:09:04.636 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:09:04.636 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:04.636 Test: blockdev write read max offset ...passed 00:09:04.636 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:04.636 Test: blockdev writev readv 8 blocks ...passed 00:09:04.636 Test: blockdev writev readv 30 x 1block ...passed 00:09:04.636 Test: blockdev writev readv block ...passed 00:09:04.636 Test: blockdev writev readv size > 128k ...passed 00:09:04.636 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:04.636 Test: blockdev comparev and writev ...passed 00:09:04.636 Test: blockdev nvme passthru rw ...passed 00:09:04.636 Test: blockdev nvme passthru vendor specific ...passed 00:09:04.636 Test: blockdev nvme admin passthru ...passed 00:09:04.636 Test: blockdev copy ...passed 00:09:04.636 Suite: bdevio tests on: Malloc2p0 00:09:04.636 Test: blockdev write read block ...passed 00:09:04.636 Test: blockdev write 
zeroes read block ...passed 00:09:04.636 Test: blockdev write zeroes read no split ...passed 00:09:04.636 Test: blockdev write zeroes read split ...passed 00:09:04.636 Test: blockdev write zeroes read split partial ...passed 00:09:04.636 Test: blockdev reset ...passed 00:09:04.636 Test: blockdev write read 8 blocks ...passed 00:09:04.636 Test: blockdev write read size > 128k ...passed 00:09:04.636 Test: blockdev write read invalid size ...passed 00:09:04.636 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:09:04.636 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:04.636 Test: blockdev write read max offset ...passed 00:09:04.636 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:04.636 Test: blockdev writev readv 8 blocks ...passed 00:09:04.636 Test: blockdev writev readv 30 x 1block ...passed 00:09:04.636 Test: blockdev writev readv block ...passed 00:09:04.636 Test: blockdev writev readv size > 128k ...passed 00:09:04.636 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:04.636 Test: blockdev comparev and writev ...passed 00:09:04.636 Test: blockdev nvme passthru rw ...passed 00:09:04.636 Test: blockdev nvme passthru vendor specific ...passed 00:09:04.636 Test: blockdev nvme admin passthru ...passed 00:09:04.636 Test: blockdev copy ...passed 00:09:04.636 Suite: bdevio tests on: Malloc1p1 00:09:04.636 Test: blockdev write read block ...passed 00:09:04.636 Test: blockdev write zeroes read block ...passed 00:09:04.636 Test: blockdev write zeroes read no split ...passed 00:09:04.636 Test: blockdev write zeroes read split ...passed 00:09:04.636 Test: blockdev write zeroes read split partial ...passed 00:09:04.636 Test: blockdev reset ...passed 00:09:04.636 Test: blockdev write read 8 blocks ...passed 00:09:04.636 Test: blockdev write read size > 128k ...passed 00:09:04.636 Test: blockdev write read invalid size ...passed 00:09:04.636 Test: blockdev write read offset + 
nbytes == size of blockdev ...passed 00:09:04.636 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:04.636 Test: blockdev write read max offset ...passed 00:09:04.636 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:04.636 Test: blockdev writev readv 8 blocks ...passed 00:09:04.636 Test: blockdev writev readv 30 x 1block ...passed 00:09:04.636 Test: blockdev writev readv block ...passed 00:09:04.636 Test: blockdev writev readv size > 128k ...passed 00:09:04.636 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:04.636 Test: blockdev comparev and writev ...passed 00:09:04.636 Test: blockdev nvme passthru rw ...passed 00:09:04.636 Test: blockdev nvme passthru vendor specific ...passed 00:09:04.636 Test: blockdev nvme admin passthru ...passed 00:09:04.636 Test: blockdev copy ...passed 00:09:04.636 Suite: bdevio tests on: Malloc1p0 00:09:04.636 Test: blockdev write read block ...passed 00:09:04.636 Test: blockdev write zeroes read block ...passed 00:09:04.636 Test: blockdev write zeroes read no split ...passed 00:09:04.636 Test: blockdev write zeroes read split ...passed 00:09:04.636 Test: blockdev write zeroes read split partial ...passed 00:09:04.636 Test: blockdev reset ...passed 00:09:04.636 Test: blockdev write read 8 blocks ...passed 00:09:04.636 Test: blockdev write read size > 128k ...passed 00:09:04.636 Test: blockdev write read invalid size ...passed 00:09:04.636 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:09:04.636 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:04.636 Test: blockdev write read max offset ...passed 00:09:04.636 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:04.636 Test: blockdev writev readv 8 blocks ...passed 00:09:04.636 Test: blockdev writev readv 30 x 1block ...passed 00:09:04.636 Test: blockdev writev readv block ...passed 00:09:04.636 Test: blockdev writev readv size > 
128k ...passed 00:09:04.636 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:04.636 Test: blockdev comparev and writev ...passed 00:09:04.636 Test: blockdev nvme passthru rw ...passed 00:09:04.636 Test: blockdev nvme passthru vendor specific ...passed 00:09:04.636 Test: blockdev nvme admin passthru ...passed 00:09:04.636 Test: blockdev copy ...passed 00:09:04.636 Suite: bdevio tests on: Malloc0 00:09:04.636 Test: blockdev write read block ...passed 00:09:04.636 Test: blockdev write zeroes read block ...passed 00:09:04.636 Test: blockdev write zeroes read no split ...passed 00:09:04.636 Test: blockdev write zeroes read split ...passed 00:09:04.636 Test: blockdev write zeroes read split partial ...passed 00:09:04.636 Test: blockdev reset ...passed 00:09:04.636 Test: blockdev write read 8 blocks ...passed 00:09:04.636 Test: blockdev write read size > 128k ...passed 00:09:04.636 Test: blockdev write read invalid size ...passed 00:09:04.636 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:09:04.636 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:04.636 Test: blockdev write read max offset ...passed 00:09:04.636 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:04.636 Test: blockdev writev readv 8 blocks ...passed 00:09:04.636 Test: blockdev writev readv 30 x 1block ...passed 00:09:04.636 Test: blockdev writev readv block ...passed 00:09:04.636 Test: blockdev writev readv size > 128k ...passed 00:09:04.636 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:04.636 Test: blockdev comparev and writev ...passed 00:09:04.636 Test: blockdev nvme passthru rw ...passed 00:09:04.636 Test: blockdev nvme passthru vendor specific ...passed 00:09:04.636 Test: blockdev nvme admin passthru ...passed 00:09:04.637 Test: blockdev copy ...passed 00:09:04.637 00:09:04.637 Run Summary: Type Total Ran Passed Failed Inactive 00:09:04.637 suites 16 16 n/a 0 0 00:09:04.637 
tests 368 368 368 0 0 00:09:04.637 asserts 2224 2224 2224 0 n/a 00:09:04.637 00:09:04.637 Elapsed time = 0.679 seconds 00:09:04.637 0 00:09:04.637 20:22:56 blockdev_general.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 1329138 00:09:04.637 20:22:56 blockdev_general.bdev_bounds -- common/autotest_common.sh@948 -- # '[' -z 1329138 ']' 00:09:04.637 20:22:56 blockdev_general.bdev_bounds -- common/autotest_common.sh@952 -- # kill -0 1329138 00:09:04.637 20:22:56 blockdev_general.bdev_bounds -- common/autotest_common.sh@953 -- # uname 00:09:04.637 20:22:56 blockdev_general.bdev_bounds -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:09:04.637 20:22:56 blockdev_general.bdev_bounds -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1329138 00:09:04.637 20:22:56 blockdev_general.bdev_bounds -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:09:04.637 20:22:56 blockdev_general.bdev_bounds -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:09:04.637 20:22:56 blockdev_general.bdev_bounds -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1329138' 00:09:04.637 killing process with pid 1329138 00:09:04.637 20:22:56 blockdev_general.bdev_bounds -- common/autotest_common.sh@967 -- # kill 1329138 00:09:04.637 20:22:56 blockdev_general.bdev_bounds -- common/autotest_common.sh@972 -- # wait 1329138 00:09:04.896 20:22:57 blockdev_general.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:09:04.896 00:09:04.896 real 0m1.851s 00:09:04.896 user 0m4.618s 00:09:04.896 sys 0m0.505s 00:09:04.896 20:22:57 blockdev_general.bdev_bounds -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:04.896 20:22:57 blockdev_general.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:09:04.896 ************************************ 00:09:04.896 END TEST bdev_bounds 00:09:04.896 ************************************ 00:09:05.156 20:22:57 blockdev_general -- common/autotest_common.sh@1142 -- # return 
0 00:09:05.156 20:22:57 blockdev_general -- bdev/blockdev.sh@762 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '' 00:09:05.156 20:22:57 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:09:05.156 20:22:57 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:05.156 20:22:57 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:05.156 ************************************ 00:09:05.156 START TEST bdev_nbd 00:09:05.156 ************************************ 00:09:05.156 20:22:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@1123 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '' 00:09:05.156 20:22:57 blockdev_general.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:09:05.156 20:22:57 blockdev_general.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:09:05.156 20:22:57 blockdev_general.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:05.156 20:22:57 blockdev_general.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:09:05.156 20:22:57 blockdev_general.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:09:05.156 20:22:57 blockdev_general.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:09:05.156 20:22:57 blockdev_general.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=16 00:09:05.156 20:22:57 blockdev_general.bdev_nbd -- 
bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]] 00:09:05.156 20:22:57 blockdev_general.bdev_nbd -- bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:09:05.156 20:22:57 blockdev_general.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:09:05.156 20:22:57 blockdev_general.bdev_nbd -- bdev/blockdev.sh@312 -- # bdev_num=16 00:09:05.156 20:22:57 blockdev_general.bdev_nbd -- bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:09:05.156 20:22:57 blockdev_general.bdev_nbd -- bdev/blockdev.sh@314 -- # local nbd_list 00:09:05.156 20:22:57 blockdev_general.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:09:05.156 20:22:57 blockdev_general.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:09:05.156 20:22:57 blockdev_general.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=1329357 00:09:05.156 20:22:57 blockdev_general.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:09:05.156 20:22:57 blockdev_general.bdev_nbd -- bdev/blockdev.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:09:05.156 20:22:57 blockdev_general.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 1329357 /var/tmp/spdk-nbd.sock 00:09:05.156 20:22:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@829 -- # '[' -z 1329357 ']' 00:09:05.156 20:22:57 
blockdev_general.bdev_nbd -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:09:05.156 20:22:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@834 -- # local max_retries=100 00:09:05.156 20:22:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:09:05.156 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:09:05.156 20:22:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@838 -- # xtrace_disable 00:09:05.156 20:22:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:09:05.156 [2024-07-15 20:22:57.398976] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 00:09:05.156 [2024-07-15 20:22:57.399046] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:09:05.156 [2024-07-15 20:22:57.518749] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:05.415 [2024-07-15 20:22:57.623652] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:05.415 [2024-07-15 20:22:57.788577] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:05.415 [2024-07-15 20:22:57.788639] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:09:05.415 [2024-07-15 20:22:57.788654] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:09:05.675 [2024-07-15 20:22:57.796587] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:05.675 [2024-07-15 20:22:57.796615] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:05.675 [2024-07-15 20:22:57.804599] 
bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:05.675 [2024-07-15 20:22:57.804626] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:05.675 [2024-07-15 20:22:57.881861] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:05.675 [2024-07-15 20:22:57.881909] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:05.675 [2024-07-15 20:22:57.881936] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x244b4f0 00:09:05.675 [2024-07-15 20:22:57.881949] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:05.675 [2024-07-15 20:22:57.883372] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:05.675 [2024-07-15 20:22:57.883400] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:09:06.244 20:22:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:09:06.244 20:22:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@862 -- # return 0 00:09:06.244 20:22:58 blockdev_general.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' 00:09:06.244 20:22:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:06.244 20:22:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:09:06.244 20:22:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:09:06.244 20:22:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx 
/var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' 00:09:06.244 20:22:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:06.244 20:22:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:09:06.244 20:22:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:09:06.244 20:22:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:09:06.244 20:22:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:09:06.244 20:22:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:09:06.244 20:22:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:06.244 20:22:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 00:09:06.244 20:22:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:09:06.502 20:22:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:09:06.502 20:22:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:09:06.502 20:22:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:09:06.502 20:22:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:06.502 20:22:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:06.502 20:22:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:06.502 20:22:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 
00:09:06.502 20:22:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:06.502 20:22:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:06.503 20:22:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:06.503 20:22:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:06.503 1+0 records in 00:09:06.503 1+0 records out 00:09:06.503 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000271998 s, 15.1 MB/s 00:09:06.503 20:22:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:06.503 20:22:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:06.503 20:22:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:06.503 20:22:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:06.503 20:22:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:06.503 20:22:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:06.503 20:22:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:06.503 20:22:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p0 00:09:06.762 20:22:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:09:06.762 20:22:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:09:06.762 20:22:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:09:06.762 20:22:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local 
nbd_name=nbd1 00:09:06.762 20:22:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:06.762 20:22:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:06.762 20:22:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:06.762 20:22:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:09:06.762 20:22:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:06.762 20:22:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:06.762 20:22:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:06.762 20:22:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:06.762 1+0 records in 00:09:06.762 1+0 records out 00:09:06.762 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000220388 s, 18.6 MB/s 00:09:06.762 20:22:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:06.762 20:22:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:06.762 20:22:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:06.762 20:22:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:06.762 20:22:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:06.762 20:22:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:06.762 20:22:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:06.762 20:22:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p1 00:09:07.021 20:22:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:09:07.021 20:22:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd2 00:09:07.021 20:22:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:09:07.021 20:22:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd2 00:09:07.021 20:22:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:07.021 20:22:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:07.021 20:22:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:07.021 20:22:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd2 /proc/partitions 00:09:07.021 20:22:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:07.021 20:22:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:07.021 20:22:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:07.021 20:22:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:07.021 1+0 records in 00:09:07.021 1+0 records out 00:09:07.021 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000278816 s, 14.7 MB/s 00:09:07.021 20:22:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:07.021 20:22:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:07.021 20:22:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:07.021 20:22:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 
00:09:07.021 20:22:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:07.022 20:22:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:07.022 20:22:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:07.022 20:22:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p0 00:09:07.281 20:22:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:09:07.281 20:22:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:09:07.281 20:22:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:09:07.281 20:22:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd3 00:09:07.281 20:22:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:07.281 20:22:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:07.281 20:22:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:07.281 20:22:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd3 /proc/partitions 00:09:07.281 20:22:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:07.281 20:22:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:07.281 20:22:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:07.281 20:22:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:07.281 1+0 records in 00:09:07.281 1+0 records out 00:09:07.281 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000379938 s, 10.8 MB/s 00:09:07.281 20:22:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:07.281 20:22:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:07.281 20:22:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:07.281 20:22:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:07.281 20:22:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:07.281 20:22:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:07.281 20:22:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:07.281 20:22:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p1 00:09:07.849 20:22:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd4 00:09:07.850 20:22:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd4 00:09:07.850 20:22:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd4 00:09:07.850 20:22:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd4 00:09:07.850 20:22:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:07.850 20:22:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:07.850 20:22:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:07.850 20:22:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd4 /proc/partitions 00:09:07.850 20:23:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:07.850 20:23:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:07.850 20:23:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 
00:09:07.850 20:23:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd4 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:07.850 1+0 records in 00:09:07.850 1+0 records out 00:09:07.850 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000366175 s, 11.2 MB/s 00:09:07.850 20:23:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:07.850 20:23:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:07.850 20:23:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:07.850 20:23:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:07.850 20:23:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:07.850 20:23:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:07.850 20:23:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:07.850 20:23:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p2 00:09:08.109 20:23:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd5 00:09:08.109 20:23:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd5 00:09:08.109 20:23:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd5 00:09:08.109 20:23:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd5 00:09:08.109 20:23:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:08.109 20:23:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:08.109 20:23:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i 
<= 20 )) 00:09:08.109 20:23:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd5 /proc/partitions 00:09:08.109 20:23:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:08.109 20:23:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:08.109 20:23:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:08.109 20:23:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd5 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:08.109 1+0 records in 00:09:08.109 1+0 records out 00:09:08.109 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000419586 s, 9.8 MB/s 00:09:08.109 20:23:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:08.109 20:23:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:08.109 20:23:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:08.109 20:23:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:08.109 20:23:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:08.109 20:23:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:08.109 20:23:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:08.109 20:23:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p3 00:09:08.368 20:23:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd6 00:09:08.368 20:23:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd6 00:09:08.368 20:23:00 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@30 -- # waitfornbd nbd6 00:09:08.368 20:23:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd6 00:09:08.368 20:23:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:08.368 20:23:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:08.368 20:23:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:08.368 20:23:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd6 /proc/partitions 00:09:08.368 20:23:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:08.368 20:23:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:08.368 20:23:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:08.368 20:23:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd6 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:08.368 1+0 records in 00:09:08.368 1+0 records out 00:09:08.368 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000470625 s, 8.7 MB/s 00:09:08.368 20:23:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:08.368 20:23:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:08.368 20:23:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:08.368 20:23:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:08.368 20:23:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:08.368 20:23:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:08.368 20:23:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:08.368 20:23:00 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p4 00:09:08.627 20:23:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd7 00:09:08.627 20:23:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd7 00:09:08.627 20:23:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd7 00:09:08.627 20:23:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd7 00:09:08.627 20:23:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:08.627 20:23:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:08.627 20:23:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:08.627 20:23:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd7 /proc/partitions 00:09:08.627 20:23:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:08.627 20:23:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:08.627 20:23:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:08.627 20:23:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd7 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:08.627 1+0 records in 00:09:08.627 1+0 records out 00:09:08.627 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000517036 s, 7.9 MB/s 00:09:08.627 20:23:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:08.627 20:23:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:08.628 20:23:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:08.628 20:23:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:08.628 20:23:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:08.628 20:23:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:08.628 20:23:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:08.628 20:23:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p5 00:09:08.887 20:23:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd8 00:09:08.887 20:23:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd8 00:09:08.887 20:23:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd8 00:09:08.887 20:23:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd8 00:09:08.887 20:23:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:08.887 20:23:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:08.887 20:23:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:08.887 20:23:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd8 /proc/partitions 00:09:08.887 20:23:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:08.887 20:23:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:08.887 20:23:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:08.887 20:23:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd8 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:08.887 1+0 records in 00:09:08.887 1+0 records out 
00:09:08.887 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000598478 s, 6.8 MB/s 00:09:08.887 20:23:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:08.887 20:23:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:08.887 20:23:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:08.887 20:23:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:08.887 20:23:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:08.887 20:23:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:08.887 20:23:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:08.887 20:23:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p6 00:09:09.146 20:23:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd9 00:09:09.146 20:23:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd9 00:09:09.146 20:23:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd9 00:09:09.146 20:23:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd9 00:09:09.146 20:23:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:09.146 20:23:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:09.146 20:23:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:09.146 20:23:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd9 /proc/partitions 00:09:09.146 20:23:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:09.146 20:23:01 
blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:09.146 20:23:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:09.146 20:23:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd9 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:09.146 1+0 records in 00:09:09.146 1+0 records out 00:09:09.146 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000425182 s, 9.6 MB/s 00:09:09.146 20:23:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:09.146 20:23:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:09.146 20:23:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:09.146 20:23:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:09.146 20:23:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:09.146 20:23:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:09.146 20:23:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:09.146 20:23:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p7 00:09:09.716 20:23:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd10 00:09:09.716 20:23:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd10 00:09:09.716 20:23:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd10 00:09:09.716 20:23:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd10 00:09:09.716 20:23:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 
00:09:09.716 20:23:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:09.716 20:23:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:09.716 20:23:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd10 /proc/partitions 00:09:09.716 20:23:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:09.716 20:23:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:09.716 20:23:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:09.716 20:23:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:09.716 1+0 records in 00:09:09.716 1+0 records out 00:09:09.716 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000487815 s, 8.4 MB/s 00:09:09.716 20:23:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:09.716 20:23:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:09.716 20:23:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:09.716 20:23:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:09.716 20:23:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:09.716 20:23:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:09.716 20:23:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:09.716 20:23:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk TestPT 00:09:10.036 20:23:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 
-- # nbd_device=/dev/nbd11 00:09:10.036 20:23:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd11 00:09:10.036 20:23:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd11 00:09:10.036 20:23:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd11 00:09:10.036 20:23:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:10.036 20:23:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:10.036 20:23:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:10.036 20:23:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd11 /proc/partitions 00:09:10.036 20:23:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:10.036 20:23:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:10.036 20:23:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:10.036 20:23:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:10.036 1+0 records in 00:09:10.036 1+0 records out 00:09:10.036 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000540381 s, 7.6 MB/s 00:09:10.036 20:23:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:10.036 20:23:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:10.036 20:23:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:10.036 20:23:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:10.036 20:23:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:10.036 
20:23:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:10.036 20:23:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:10.036 20:23:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid0 00:09:10.295 20:23:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd12 00:09:10.295 20:23:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd12 00:09:10.295 20:23:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd12 00:09:10.295 20:23:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd12 00:09:10.295 20:23:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:10.295 20:23:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:10.295 20:23:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:10.295 20:23:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd12 /proc/partitions 00:09:10.295 20:23:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:10.295 20:23:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:10.295 20:23:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:10.295 20:23:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd12 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:10.554 1+0 records in 00:09:10.554 1+0 records out 00:09:10.554 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000668887 s, 6.1 MB/s 00:09:10.554 20:23:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:10.554 20:23:02 blockdev_general.bdev_nbd 
-- common/autotest_common.sh@884 -- # size=4096 00:09:10.554 20:23:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:10.554 20:23:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:10.554 20:23:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:10.554 20:23:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:10.554 20:23:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:10.554 20:23:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk concat0 00:09:10.813 20:23:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd13 00:09:10.813 20:23:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd13 00:09:10.813 20:23:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd13 00:09:10.813 20:23:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd13 00:09:10.813 20:23:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:10.813 20:23:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:10.813 20:23:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:10.813 20:23:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd13 /proc/partitions 00:09:10.813 20:23:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:10.813 20:23:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:10.813 20:23:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:10.813 20:23:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd13 
of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:10.813 1+0 records in 00:09:10.813 1+0 records out 00:09:10.813 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000535587 s, 7.6 MB/s 00:09:10.813 20:23:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:10.813 20:23:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:10.813 20:23:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:10.813 20:23:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:10.813 20:23:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:10.813 20:23:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:10.813 20:23:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:10.813 20:23:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid1 00:09:11.072 20:23:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd14 00:09:11.072 20:23:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd14 00:09:11.072 20:23:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd14 00:09:11.072 20:23:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd14 00:09:11.072 20:23:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:11.072 20:23:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:11.072 20:23:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:11.072 20:23:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q 
-w nbd14 /proc/partitions 00:09:11.072 20:23:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:11.072 20:23:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:11.072 20:23:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:11.072 20:23:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd14 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:11.072 1+0 records in 00:09:11.072 1+0 records out 00:09:11.072 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000723022 s, 5.7 MB/s 00:09:11.072 20:23:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:11.072 20:23:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:11.072 20:23:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:11.072 20:23:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:11.072 20:23:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:11.072 20:23:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:11.072 20:23:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:11.073 20:23:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk AIO0 00:09:11.640 20:23:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd15 00:09:11.640 20:23:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd15 00:09:11.640 20:23:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd15 00:09:11.640 20:23:03 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@866 -- # local nbd_name=nbd15 00:09:11.640 20:23:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:11.640 20:23:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:11.640 20:23:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:11.640 20:23:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd15 /proc/partitions 00:09:11.640 20:23:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:11.640 20:23:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:11.640 20:23:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:11.640 20:23:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd15 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:11.640 1+0 records in 00:09:11.640 1+0 records out 00:09:11.640 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000778257 s, 5.3 MB/s 00:09:11.640 20:23:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:11.640 20:23:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:11.640 20:23:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:11.640 20:23:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:11.640 20:23:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:11.640 20:23:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:11.640 20:23:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:11.640 20:23:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@118 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:09:12.208 20:23:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:09:12.208 { 00:09:12.208 "nbd_device": "/dev/nbd0", 00:09:12.208 "bdev_name": "Malloc0" 00:09:12.208 }, 00:09:12.208 { 00:09:12.208 "nbd_device": "/dev/nbd1", 00:09:12.208 "bdev_name": "Malloc1p0" 00:09:12.208 }, 00:09:12.208 { 00:09:12.208 "nbd_device": "/dev/nbd2", 00:09:12.208 "bdev_name": "Malloc1p1" 00:09:12.208 }, 00:09:12.208 { 00:09:12.208 "nbd_device": "/dev/nbd3", 00:09:12.208 "bdev_name": "Malloc2p0" 00:09:12.208 }, 00:09:12.208 { 00:09:12.208 "nbd_device": "/dev/nbd4", 00:09:12.208 "bdev_name": "Malloc2p1" 00:09:12.208 }, 00:09:12.208 { 00:09:12.208 "nbd_device": "/dev/nbd5", 00:09:12.208 "bdev_name": "Malloc2p2" 00:09:12.208 }, 00:09:12.208 { 00:09:12.208 "nbd_device": "/dev/nbd6", 00:09:12.208 "bdev_name": "Malloc2p3" 00:09:12.208 }, 00:09:12.208 { 00:09:12.208 "nbd_device": "/dev/nbd7", 00:09:12.208 "bdev_name": "Malloc2p4" 00:09:12.208 }, 00:09:12.208 { 00:09:12.208 "nbd_device": "/dev/nbd8", 00:09:12.208 "bdev_name": "Malloc2p5" 00:09:12.208 }, 00:09:12.208 { 00:09:12.208 "nbd_device": "/dev/nbd9", 00:09:12.208 "bdev_name": "Malloc2p6" 00:09:12.208 }, 00:09:12.208 { 00:09:12.208 "nbd_device": "/dev/nbd10", 00:09:12.208 "bdev_name": "Malloc2p7" 00:09:12.208 }, 00:09:12.208 { 00:09:12.208 "nbd_device": "/dev/nbd11", 00:09:12.208 "bdev_name": "TestPT" 00:09:12.208 }, 00:09:12.208 { 00:09:12.208 "nbd_device": "/dev/nbd12", 00:09:12.208 "bdev_name": "raid0" 00:09:12.208 }, 00:09:12.208 { 00:09:12.208 "nbd_device": "/dev/nbd13", 00:09:12.208 "bdev_name": "concat0" 00:09:12.208 }, 00:09:12.208 { 00:09:12.208 "nbd_device": "/dev/nbd14", 00:09:12.208 "bdev_name": "raid1" 00:09:12.208 }, 00:09:12.208 { 00:09:12.208 "nbd_device": "/dev/nbd15", 00:09:12.208 "bdev_name": "AIO0" 00:09:12.208 } 00:09:12.208 ]' 00:09:12.208 20:23:04 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:09:12.208 20:23:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:09:12.208 { 00:09:12.208 "nbd_device": "/dev/nbd0", 00:09:12.208 "bdev_name": "Malloc0" 00:09:12.208 }, 00:09:12.208 { 00:09:12.208 "nbd_device": "/dev/nbd1", 00:09:12.208 "bdev_name": "Malloc1p0" 00:09:12.208 }, 00:09:12.208 { 00:09:12.208 "nbd_device": "/dev/nbd2", 00:09:12.208 "bdev_name": "Malloc1p1" 00:09:12.208 }, 00:09:12.208 { 00:09:12.208 "nbd_device": "/dev/nbd3", 00:09:12.208 "bdev_name": "Malloc2p0" 00:09:12.208 }, 00:09:12.208 { 00:09:12.208 "nbd_device": "/dev/nbd4", 00:09:12.208 "bdev_name": "Malloc2p1" 00:09:12.208 }, 00:09:12.208 { 00:09:12.208 "nbd_device": "/dev/nbd5", 00:09:12.208 "bdev_name": "Malloc2p2" 00:09:12.208 }, 00:09:12.208 { 00:09:12.208 "nbd_device": "/dev/nbd6", 00:09:12.208 "bdev_name": "Malloc2p3" 00:09:12.208 }, 00:09:12.208 { 00:09:12.208 "nbd_device": "/dev/nbd7", 00:09:12.208 "bdev_name": "Malloc2p4" 00:09:12.208 }, 00:09:12.208 { 00:09:12.208 "nbd_device": "/dev/nbd8", 00:09:12.208 "bdev_name": "Malloc2p5" 00:09:12.208 }, 00:09:12.208 { 00:09:12.208 "nbd_device": "/dev/nbd9", 00:09:12.208 "bdev_name": "Malloc2p6" 00:09:12.208 }, 00:09:12.208 { 00:09:12.208 "nbd_device": "/dev/nbd10", 00:09:12.208 "bdev_name": "Malloc2p7" 00:09:12.209 }, 00:09:12.209 { 00:09:12.209 "nbd_device": "/dev/nbd11", 00:09:12.209 "bdev_name": "TestPT" 00:09:12.209 }, 00:09:12.209 { 00:09:12.209 "nbd_device": "/dev/nbd12", 00:09:12.209 "bdev_name": "raid0" 00:09:12.209 }, 00:09:12.209 { 00:09:12.209 "nbd_device": "/dev/nbd13", 00:09:12.209 "bdev_name": "concat0" 00:09:12.209 }, 00:09:12.209 { 00:09:12.209 "nbd_device": "/dev/nbd14", 00:09:12.209 "bdev_name": "raid1" 00:09:12.209 }, 00:09:12.209 { 00:09:12.209 "nbd_device": "/dev/nbd15", 00:09:12.209 "bdev_name": "AIO0" 00:09:12.209 } 00:09:12.209 ]' 00:09:12.209 20:23:04 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:09:12.209 20:23:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15' 00:09:12.209 20:23:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:12.209 20:23:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15') 00:09:12.209 20:23:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:09:12.209 20:23:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:09:12.209 20:23:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:12.209 20:23:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:09:12.468 20:23:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:09:12.468 20:23:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:09:12.468 20:23:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:09:12.468 20:23:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:12.468 20:23:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:12.468 20:23:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:09:12.468 20:23:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:12.468 20:23:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:12.468 20:23:04 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:12.468 20:23:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:09:12.727 20:23:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:09:12.727 20:23:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:09:12.727 20:23:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:09:12.727 20:23:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:12.727 20:23:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:12.727 20:23:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:09:12.727 20:23:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:12.727 20:23:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:12.727 20:23:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:12.727 20:23:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:09:12.985 20:23:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:09:12.985 20:23:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:09:12.985 20:23:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:09:12.985 20:23:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:12.985 20:23:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:12.985 20:23:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:09:12.985 20:23:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 
00:09:12.985 20:23:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:12.985 20:23:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:12.985 20:23:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:09:13.252 20:23:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:09:13.252 20:23:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:09:13.253 20:23:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:09:13.253 20:23:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:13.253 20:23:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:13.253 20:23:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:09:13.253 20:23:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:13.253 20:23:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:13.253 20:23:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:13.253 20:23:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd4 00:09:13.511 20:23:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd4 00:09:13.511 20:23:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd4 00:09:13.512 20:23:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd4 00:09:13.512 20:23:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:13.512 20:23:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:13.512 20:23:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w 
nbd4 /proc/partitions 00:09:13.512 20:23:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:13.512 20:23:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:13.512 20:23:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:13.512 20:23:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd5 00:09:13.770 20:23:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd5 00:09:13.770 20:23:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd5 00:09:13.770 20:23:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd5 00:09:13.770 20:23:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:13.770 20:23:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:13.770 20:23:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd5 /proc/partitions 00:09:13.770 20:23:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:13.770 20:23:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:13.770 20:23:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:13.770 20:23:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd6 00:09:14.029 20:23:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd6 00:09:14.029 20:23:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd6 00:09:14.029 20:23:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd6 00:09:14.029 20:23:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:14.029 20:23:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 
-- # (( i <= 20 )) 00:09:14.029 20:23:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd6 /proc/partitions 00:09:14.029 20:23:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:14.029 20:23:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:14.029 20:23:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:14.029 20:23:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd7 00:09:14.288 20:23:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd7 00:09:14.288 20:23:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd7 00:09:14.288 20:23:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd7 00:09:14.288 20:23:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:14.288 20:23:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:14.288 20:23:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd7 /proc/partitions 00:09:14.288 20:23:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:14.288 20:23:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:14.288 20:23:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:14.288 20:23:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd8 00:09:14.546 20:23:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd8 00:09:14.546 20:23:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd8 00:09:14.546 20:23:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd8 00:09:14.546 20:23:06 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:14.546 20:23:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:14.805 20:23:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd8 /proc/partitions 00:09:14.805 20:23:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:14.805 20:23:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:14.805 20:23:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:14.805 20:23:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd9 00:09:15.063 20:23:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd9 00:09:15.063 20:23:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd9 00:09:15.063 20:23:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd9 00:09:15.063 20:23:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:15.063 20:23:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:15.063 20:23:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd9 /proc/partitions 00:09:15.063 20:23:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:15.063 20:23:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:15.063 20:23:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:15.063 20:23:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:09:15.321 20:23:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:09:15.321 20:23:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:09:15.321 20:23:07 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:09:15.321 20:23:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:15.321 20:23:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:15.321 20:23:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:09:15.321 20:23:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:15.321 20:23:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:15.321 20:23:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:15.322 20:23:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:09:15.580 20:23:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:09:15.580 20:23:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:09:15.580 20:23:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:09:15.580 20:23:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:15.580 20:23:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:15.580 20:23:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:09:15.580 20:23:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:15.580 20:23:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:15.580 20:23:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:15.580 20:23:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd12 00:09:15.838 20:23:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd12 
00:09:15.838 20:23:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd12 00:09:15.838 20:23:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd12 00:09:15.838 20:23:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:15.838 20:23:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:15.838 20:23:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd12 /proc/partitions 00:09:15.838 20:23:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:15.838 20:23:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:15.838 20:23:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:15.838 20:23:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd13 00:09:16.096 20:23:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd13 00:09:16.096 20:23:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd13 00:09:16.096 20:23:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd13 00:09:16.096 20:23:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:16.096 20:23:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:16.096 20:23:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd13 /proc/partitions 00:09:16.096 20:23:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:16.096 20:23:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:16.096 20:23:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:16.096 20:23:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock 
nbd_stop_disk /dev/nbd14 00:09:16.355 20:23:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd14 00:09:16.355 20:23:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd14 00:09:16.355 20:23:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd14 00:09:16.355 20:23:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:16.355 20:23:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:16.355 20:23:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd14 /proc/partitions 00:09:16.355 20:23:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:16.355 20:23:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:16.355 20:23:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:16.355 20:23:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd15 00:09:16.613 20:23:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd15 00:09:16.613 20:23:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd15 00:09:16.613 20:23:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd15 00:09:16.613 20:23:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:16.613 20:23:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:16.613 20:23:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd15 /proc/partitions 00:09:16.613 20:23:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:16.613 20:23:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:16.614 20:23:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:09:16.614 20:23:08 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:16.614 20:23:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:09:16.872 20:23:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:09:16.872 20:23:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:09:16.872 20:23:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:09:16.872 20:23:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:09:16.872 20:23:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:09:16.872 20:23:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:09:16.872 20:23:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:09:16.872 20:23:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:09:16.872 20:23:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:09:16.872 20:23:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:09:16.872 20:23:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:09:16.872 20:23:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:09:16.872 20:23:09 blockdev_general.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:09:16.872 20:23:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:16.872 20:23:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@91 -- # 
bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:09:16.872 20:23:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:09:16.872 20:23:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:09:16.872 20:23:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:09:16.872 20:23:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:09:16.872 20:23:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:16.872 20:23:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:09:16.872 20:23:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:09:16.872 20:23:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:09:16.872 20:23:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:09:16.872 20:23:09 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@12 -- # local i 00:09:16.872 20:23:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:09:16.872 20:23:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:16.873 20:23:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:09:17.131 /dev/nbd0 00:09:17.131 20:23:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:09:17.131 20:23:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:09:17.131 20:23:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:09:17.131 20:23:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:17.131 20:23:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:17.131 20:23:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:17.131 20:23:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:09:17.131 20:23:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:17.131 20:23:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:17.131 20:23:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:17.131 20:23:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:17.131 1+0 records in 00:09:17.131 1+0 records out 00:09:17.131 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000281997 s, 14.5 MB/s 00:09:17.131 20:23:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:17.131 20:23:09 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@884 -- # size=4096 00:09:17.131 20:23:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:17.131 20:23:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:17.131 20:23:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:17.131 20:23:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:17.131 20:23:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:17.131 20:23:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p0 /dev/nbd1 00:09:17.390 /dev/nbd1 00:09:17.390 20:23:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:09:17.390 20:23:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:09:17.390 20:23:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:09:17.390 20:23:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:17.390 20:23:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:17.390 20:23:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:17.390 20:23:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:09:17.390 20:23:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:17.390 20:23:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:17.390 20:23:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:17.390 20:23:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 
iflag=direct 00:09:17.390 1+0 records in 00:09:17.390 1+0 records out 00:09:17.390 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000307403 s, 13.3 MB/s 00:09:17.390 20:23:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:17.390 20:23:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:17.390 20:23:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:17.390 20:23:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:17.390 20:23:09 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:17.390 20:23:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:17.390 20:23:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:17.390 20:23:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p1 /dev/nbd10 00:09:17.648 /dev/nbd10 00:09:17.648 20:23:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:09:17.906 20:23:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:09:17.906 20:23:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd10 00:09:17.906 20:23:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:17.906 20:23:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:17.906 20:23:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:17.906 20:23:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd10 /proc/partitions 00:09:17.906 20:23:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:17.906 20:23:10 
blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:17.906 20:23:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:17.906 20:23:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:17.906 1+0 records in 00:09:17.906 1+0 records out 00:09:17.906 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000306666 s, 13.4 MB/s 00:09:17.906 20:23:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:17.906 20:23:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:17.906 20:23:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:17.906 20:23:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:17.906 20:23:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:17.906 20:23:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:17.906 20:23:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:17.906 20:23:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p0 /dev/nbd11 00:09:18.163 /dev/nbd11 00:09:18.163 20:23:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:09:18.164 20:23:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11 00:09:18.164 20:23:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd11 00:09:18.164 20:23:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:18.164 20:23:10 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:18.164 20:23:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:18.164 20:23:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd11 /proc/partitions 00:09:18.164 20:23:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:18.164 20:23:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:18.164 20:23:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:18.164 20:23:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:18.164 1+0 records in 00:09:18.164 1+0 records out 00:09:18.164 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000395476 s, 10.4 MB/s 00:09:18.164 20:23:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:18.164 20:23:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:18.164 20:23:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:18.164 20:23:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:18.164 20:23:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:18.164 20:23:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:18.164 20:23:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:18.164 20:23:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p1 /dev/nbd12 00:09:18.421 /dev/nbd12 00:09:18.421 20:23:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # 
basename /dev/nbd12 00:09:18.421 20:23:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd12 00:09:18.421 20:23:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd12 00:09:18.421 20:23:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:18.421 20:23:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:18.421 20:23:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:18.421 20:23:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd12 /proc/partitions 00:09:18.421 20:23:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:18.421 20:23:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:18.421 20:23:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:18.421 20:23:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd12 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:18.421 1+0 records in 00:09:18.421 1+0 records out 00:09:18.421 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000402469 s, 10.2 MB/s 00:09:18.421 20:23:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:18.421 20:23:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:18.421 20:23:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:18.421 20:23:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:18.421 20:23:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:18.421 20:23:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:18.421 20:23:10 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:18.421 20:23:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p2 /dev/nbd13 00:09:18.679 /dev/nbd13 00:09:18.679 20:23:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd13 00:09:18.679 20:23:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd13 00:09:18.679 20:23:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd13 00:09:18.679 20:23:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:18.679 20:23:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:18.679 20:23:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:18.679 20:23:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd13 /proc/partitions 00:09:18.679 20:23:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:18.679 20:23:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:18.679 20:23:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:18.679 20:23:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd13 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:18.679 1+0 records in 00:09:18.679 1+0 records out 00:09:18.679 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000342832 s, 11.9 MB/s 00:09:18.679 20:23:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:18.679 20:23:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:18.679 20:23:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:18.679 20:23:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:18.679 20:23:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:18.679 20:23:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:18.679 20:23:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:18.679 20:23:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p3 /dev/nbd14 00:09:18.937 /dev/nbd14 00:09:18.937 20:23:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd14 00:09:18.937 20:23:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd14 00:09:18.937 20:23:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd14 00:09:18.937 20:23:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:18.937 20:23:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:18.937 20:23:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:18.937 20:23:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd14 /proc/partitions 00:09:18.937 20:23:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:18.937 20:23:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:18.937 20:23:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:18.937 20:23:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd14 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:18.937 1+0 records in 00:09:18.937 1+0 records out 00:09:18.937 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000470973 s, 
8.7 MB/s 00:09:18.937 20:23:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:18.937 20:23:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:18.937 20:23:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:18.937 20:23:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:18.937 20:23:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:18.937 20:23:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:18.937 20:23:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:18.937 20:23:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p4 /dev/nbd15 00:09:19.195 /dev/nbd15 00:09:19.195 20:23:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd15 00:09:19.195 20:23:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd15 00:09:19.195 20:23:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd15 00:09:19.195 20:23:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:19.195 20:23:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:19.195 20:23:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:19.195 20:23:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd15 /proc/partitions 00:09:19.195 20:23:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:19.195 20:23:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:19.195 20:23:11 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:19.195 20:23:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd15 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:19.195 1+0 records in 00:09:19.195 1+0 records out 00:09:19.195 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000536669 s, 7.6 MB/s 00:09:19.195 20:23:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:19.195 20:23:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:19.195 20:23:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:19.195 20:23:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:19.195 20:23:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:19.195 20:23:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:19.195 20:23:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:19.195 20:23:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p5 /dev/nbd2 00:09:19.452 /dev/nbd2 00:09:19.452 20:23:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd2 00:09:19.452 20:23:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd2 00:09:19.452 20:23:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd2 00:09:19.452 20:23:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:19.452 20:23:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:19.452 20:23:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 
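The repeated `(( i <= 20 ))` / `grep -q -w ... /proc/partitions` / `break` trace above is the `waitfornbd` helper polling until a freshly attached nbd device registers with the kernel. A minimal sketch of that pattern follows; the partitions path is made a parameter here purely so the sketch can run without real nbd devices (the autotest helper hardcodes `/proc/partitions`, and additionally verifies the device with a direct-I/O 4 KiB `dd` read, as the `1+0 records in/out` lines show).

```shell
#!/usr/bin/env bash
# Hedged sketch of the waitfornbd polling loop seen in the trace: retry up to
# 20 times until the device name appears (whole-word match) in a
# /proc/partitions-style file, then give up.
waitfornbd_sketch() {
    local nbd_name=$1
    local partitions=$2
    local i
    for (( i = 1; i <= 20; i++ )); do
        # grep -q -w: quiet, whole-word match, exactly as in the trace
        if grep -q -w "$nbd_name" "$partitions"; then
            return 0            # device is registered
        fi
        sleep 0.1               # short back-off between polls
    done
    return 1                    # device never showed up
}

# Demonstration against a fake partitions table.
fake=$(mktemp)
printf ' 43        0    1048576 nbd0\n' > "$fake"
waitfornbd_sketch nbd0 "$fake" && echo "nbd0 present"
rm -f "$fake"
```

The whole-word match matters: plain `grep nbd1` would also match `nbd10` through `nbd15`, which is why the trace consistently uses `-w`.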
00:09:19.452 20:23:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd2 /proc/partitions 00:09:19.452 20:23:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:19.452 20:23:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:19.452 20:23:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:19.452 20:23:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:19.452 1+0 records in 00:09:19.452 1+0 records out 00:09:19.452 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000486176 s, 8.4 MB/s 00:09:19.452 20:23:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:19.452 20:23:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:19.452 20:23:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:19.452 20:23:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:19.452 20:23:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:19.452 20:23:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:19.452 20:23:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:19.452 20:23:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p6 /dev/nbd3 00:09:19.709 /dev/nbd3 00:09:19.709 20:23:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd3 00:09:19.709 20:23:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd3 00:09:19.709 20:23:12 
blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd3 00:09:19.709 20:23:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:19.709 20:23:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:19.709 20:23:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:19.709 20:23:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd3 /proc/partitions 00:09:19.709 20:23:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:19.709 20:23:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:19.709 20:23:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:19.709 20:23:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:19.709 1+0 records in 00:09:19.709 1+0 records out 00:09:19.709 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000448922 s, 9.1 MB/s 00:09:19.709 20:23:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:19.709 20:23:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:19.709 20:23:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:19.709 20:23:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:19.709 20:23:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:19.709 20:23:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:19.709 20:23:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:19.709 20:23:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p7 /dev/nbd4 00:09:19.966 /dev/nbd4 00:09:19.966 20:23:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd4 00:09:19.966 20:23:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd4 00:09:19.966 20:23:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd4 00:09:19.966 20:23:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:19.966 20:23:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:19.966 20:23:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:19.966 20:23:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd4 /proc/partitions 00:09:20.223 20:23:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:20.223 20:23:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:20.223 20:23:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:20.223 20:23:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd4 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:20.223 1+0 records in 00:09:20.223 1+0 records out 00:09:20.223 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00051808 s, 7.9 MB/s 00:09:20.223 20:23:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:20.223 20:23:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:20.223 20:23:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:20.223 20:23:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 
00:09:20.223 20:23:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:20.223 20:23:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:20.223 20:23:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:20.223 20:23:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk TestPT /dev/nbd5 00:09:20.481 /dev/nbd5 00:09:20.481 20:23:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd5 00:09:20.481 20:23:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd5 00:09:20.481 20:23:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd5 00:09:20.481 20:23:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:20.481 20:23:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:20.481 20:23:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:20.481 20:23:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd5 /proc/partitions 00:09:20.481 20:23:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:20.481 20:23:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:20.481 20:23:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:20.481 20:23:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd5 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:20.481 1+0 records in 00:09:20.481 1+0 records out 00:09:20.481 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000769908 s, 5.3 MB/s 00:09:20.481 20:23:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 
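The `nbd_start_disks` calls being traced above walk two parallel arrays (bdev names and `/dev/nbdN` paths) with a single index and hand each pair to `rpc.py nbd_start_disk`. A minimal sketch of that pairing loop, with the RPC invocation echoed rather than executed (no SPDK target is assumed) and the lists shortened for illustration:

```shell
#!/usr/bin/env bash
# Sketch of the nbd_start_disks pairing loop: one index drives two parallel
# arrays, matching each bdev to its nbd device.
bdev_list=('Malloc0' 'Malloc1p0' 'raid0')
nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2')

start_cmds=()
for (( i = 0; i < ${#bdev_list[@]}; i++ )); do
    # real call: rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk <bdev> <dev>
    start_cmds+=("nbd_start_disk ${bdev_list[$i]} ${nbd_list[$i]}")
done
printf '%s\n' "${start_cmds[@]}"
```

This is why the trace's loop bound is `(( i < 16 ))`: the full run pairs sixteen bdevs (Malloc0 through AIO0) with sixteen nbd devices.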
00:09:20.481 20:23:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:20.481 20:23:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:20.481 20:23:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:20.481 20:23:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:20.481 20:23:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:20.481 20:23:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:20.481 20:23:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid0 /dev/nbd6 00:09:20.738 /dev/nbd6 00:09:20.738 20:23:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd6 00:09:20.738 20:23:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd6 00:09:20.738 20:23:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd6 00:09:20.738 20:23:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:20.738 20:23:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:20.738 20:23:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:20.738 20:23:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd6 /proc/partitions 00:09:20.738 20:23:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:20.738 20:23:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:20.738 20:23:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:20.738 20:23:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd6 
of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:20.738 1+0 records in 00:09:20.738 1+0 records out 00:09:20.738 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000872533 s, 4.7 MB/s 00:09:20.738 20:23:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:20.739 20:23:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:20.739 20:23:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:20.739 20:23:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:20.739 20:23:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:20.739 20:23:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:20.739 20:23:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:20.739 20:23:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk concat0 /dev/nbd7 00:09:20.996 /dev/nbd7 00:09:20.996 20:23:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd7 00:09:20.996 20:23:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd7 00:09:20.996 20:23:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd7 00:09:20.996 20:23:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:20.996 20:23:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:20.996 20:23:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:20.996 20:23:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd7 /proc/partitions 00:09:20.996 20:23:13 
blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:20.996 20:23:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:20.996 20:23:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:20.996 20:23:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd7 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:20.996 1+0 records in 00:09:20.996 1+0 records out 00:09:20.996 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000809056 s, 5.1 MB/s 00:09:20.996 20:23:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:20.996 20:23:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:20.996 20:23:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:20.996 20:23:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:20.996 20:23:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:20.996 20:23:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:20.996 20:23:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:20.996 20:23:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid1 /dev/nbd8 00:09:21.254 /dev/nbd8 00:09:21.254 20:23:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd8 00:09:21.254 20:23:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd8 00:09:21.254 20:23:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd8 00:09:21.254 20:23:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # 
local i 00:09:21.254 20:23:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:21.254 20:23:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:21.254 20:23:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd8 /proc/partitions 00:09:21.254 20:23:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:21.254 20:23:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:21.254 20:23:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:21.254 20:23:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd8 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:21.254 1+0 records in 00:09:21.254 1+0 records out 00:09:21.254 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000847382 s, 4.8 MB/s 00:09:21.254 20:23:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:21.254 20:23:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:21.254 20:23:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:21.254 20:23:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:21.254 20:23:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:21.254 20:23:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:21.254 20:23:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:21.255 20:23:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk AIO0 /dev/nbd9 00:09:21.513 /dev/nbd9 00:09:21.513 20:23:13 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd9 00:09:21.513 20:23:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd9 00:09:21.513 20:23:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd9 00:09:21.513 20:23:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:21.513 20:23:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:21.513 20:23:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:21.513 20:23:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd9 /proc/partitions 00:09:21.513 20:23:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:21.513 20:23:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:21.513 20:23:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:21.513 20:23:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd9 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:21.513 1+0 records in 00:09:21.513 1+0 records out 00:09:21.513 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000816537 s, 5.0 MB/s 00:09:21.513 20:23:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:21.513 20:23:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:21.513 20:23:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:21.513 20:23:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:21.513 20:23:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:21.513 20:23:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 
-- # (( i++ )) 00:09:21.513 20:23:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:21.513 20:23:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:09:21.513 20:23:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:21.513 20:23:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:09:21.772 20:23:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:09:21.772 { 00:09:21.772 "nbd_device": "/dev/nbd0", 00:09:21.772 "bdev_name": "Malloc0" 00:09:21.772 }, 00:09:21.772 { 00:09:21.772 "nbd_device": "/dev/nbd1", 00:09:21.772 "bdev_name": "Malloc1p0" 00:09:21.772 }, 00:09:21.772 { 00:09:21.772 "nbd_device": "/dev/nbd10", 00:09:21.772 "bdev_name": "Malloc1p1" 00:09:21.772 }, 00:09:21.772 { 00:09:21.772 "nbd_device": "/dev/nbd11", 00:09:21.772 "bdev_name": "Malloc2p0" 00:09:21.772 }, 00:09:21.772 { 00:09:21.772 "nbd_device": "/dev/nbd12", 00:09:21.772 "bdev_name": "Malloc2p1" 00:09:21.772 }, 00:09:21.772 { 00:09:21.772 "nbd_device": "/dev/nbd13", 00:09:21.772 "bdev_name": "Malloc2p2" 00:09:21.772 }, 00:09:21.772 { 00:09:21.772 "nbd_device": "/dev/nbd14", 00:09:21.772 "bdev_name": "Malloc2p3" 00:09:21.772 }, 00:09:21.773 { 00:09:21.773 "nbd_device": "/dev/nbd15", 00:09:21.773 "bdev_name": "Malloc2p4" 00:09:21.773 }, 00:09:21.773 { 00:09:21.773 "nbd_device": "/dev/nbd2", 00:09:21.773 "bdev_name": "Malloc2p5" 00:09:21.773 }, 00:09:21.773 { 00:09:21.773 "nbd_device": "/dev/nbd3", 00:09:21.773 "bdev_name": "Malloc2p6" 00:09:21.773 }, 00:09:21.773 { 00:09:21.773 "nbd_device": "/dev/nbd4", 00:09:21.773 "bdev_name": "Malloc2p7" 00:09:21.773 }, 00:09:21.773 { 00:09:21.773 "nbd_device": "/dev/nbd5", 00:09:21.773 "bdev_name": "TestPT" 00:09:21.773 }, 00:09:21.773 { 00:09:21.773 "nbd_device": "/dev/nbd6", 00:09:21.773 
"bdev_name": "raid0" 00:09:21.773 }, 00:09:21.773 { 00:09:21.773 "nbd_device": "/dev/nbd7", 00:09:21.773 "bdev_name": "concat0" 00:09:21.773 }, 00:09:21.773 { 00:09:21.773 "nbd_device": "/dev/nbd8", 00:09:21.773 "bdev_name": "raid1" 00:09:21.773 }, 00:09:21.773 { 00:09:21.773 "nbd_device": "/dev/nbd9", 00:09:21.773 "bdev_name": "AIO0" 00:09:21.773 } 00:09:21.773 ]' 00:09:21.773 20:23:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:09:21.773 { 00:09:21.773 "nbd_device": "/dev/nbd0", 00:09:21.773 "bdev_name": "Malloc0" 00:09:21.773 }, 00:09:21.773 { 00:09:21.773 "nbd_device": "/dev/nbd1", 00:09:21.773 "bdev_name": "Malloc1p0" 00:09:21.773 }, 00:09:21.773 { 00:09:21.773 "nbd_device": "/dev/nbd10", 00:09:21.773 "bdev_name": "Malloc1p1" 00:09:21.773 }, 00:09:21.773 { 00:09:21.773 "nbd_device": "/dev/nbd11", 00:09:21.773 "bdev_name": "Malloc2p0" 00:09:21.773 }, 00:09:21.773 { 00:09:21.773 "nbd_device": "/dev/nbd12", 00:09:21.773 "bdev_name": "Malloc2p1" 00:09:21.773 }, 00:09:21.773 { 00:09:21.773 "nbd_device": "/dev/nbd13", 00:09:21.773 "bdev_name": "Malloc2p2" 00:09:21.773 }, 00:09:21.773 { 00:09:21.773 "nbd_device": "/dev/nbd14", 00:09:21.773 "bdev_name": "Malloc2p3" 00:09:21.773 }, 00:09:21.773 { 00:09:21.773 "nbd_device": "/dev/nbd15", 00:09:21.773 "bdev_name": "Malloc2p4" 00:09:21.773 }, 00:09:21.773 { 00:09:21.773 "nbd_device": "/dev/nbd2", 00:09:21.773 "bdev_name": "Malloc2p5" 00:09:21.773 }, 00:09:21.773 { 00:09:21.773 "nbd_device": "/dev/nbd3", 00:09:21.773 "bdev_name": "Malloc2p6" 00:09:21.773 }, 00:09:21.773 { 00:09:21.773 "nbd_device": "/dev/nbd4", 00:09:21.773 "bdev_name": "Malloc2p7" 00:09:21.773 }, 00:09:21.773 { 00:09:21.773 "nbd_device": "/dev/nbd5", 00:09:21.773 "bdev_name": "TestPT" 00:09:21.773 }, 00:09:21.773 { 00:09:21.773 "nbd_device": "/dev/nbd6", 00:09:21.773 "bdev_name": "raid0" 00:09:21.773 }, 00:09:21.773 { 00:09:21.773 "nbd_device": "/dev/nbd7", 00:09:21.773 "bdev_name": "concat0" 00:09:21.773 }, 00:09:21.773 { 
00:09:21.773 "nbd_device": "/dev/nbd8", 00:09:21.773 "bdev_name": "raid1" 00:09:21.773 }, 00:09:21.773 { 00:09:21.773 "nbd_device": "/dev/nbd9", 00:09:21.773 "bdev_name": "AIO0" 00:09:21.773 } 00:09:21.773 ]' 00:09:21.773 20:23:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:09:21.773 20:23:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:09:21.773 /dev/nbd1 00:09:21.773 /dev/nbd10 00:09:21.773 /dev/nbd11 00:09:21.773 /dev/nbd12 00:09:21.773 /dev/nbd13 00:09:21.773 /dev/nbd14 00:09:21.773 /dev/nbd15 00:09:21.773 /dev/nbd2 00:09:21.773 /dev/nbd3 00:09:21.773 /dev/nbd4 00:09:21.773 /dev/nbd5 00:09:21.773 /dev/nbd6 00:09:21.773 /dev/nbd7 00:09:21.773 /dev/nbd8 00:09:21.773 /dev/nbd9' 00:09:21.773 20:23:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:09:21.773 /dev/nbd1 00:09:21.773 /dev/nbd10 00:09:21.773 /dev/nbd11 00:09:21.773 /dev/nbd12 00:09:21.773 /dev/nbd13 00:09:21.773 /dev/nbd14 00:09:21.773 /dev/nbd15 00:09:21.773 /dev/nbd2 00:09:21.773 /dev/nbd3 00:09:21.773 /dev/nbd4 00:09:21.773 /dev/nbd5 00:09:21.773 /dev/nbd6 00:09:21.773 /dev/nbd7 00:09:21.773 /dev/nbd8 00:09:21.773 /dev/nbd9' 00:09:21.773 20:23:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:09:21.773 20:23:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=16 00:09:21.773 20:23:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 16 00:09:21.773 20:23:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=16 00:09:21.773 20:23:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 16 -ne 16 ']' 00:09:21.773 20:23:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' write 00:09:21.773 20:23:14 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:09:21.773 20:23:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:09:21.773 20:23:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:09:21.773 20:23:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:09:21.773 20:23:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:09:21.773 20:23:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:09:21.773 256+0 records in 00:09:21.773 256+0 records out 00:09:21.773 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0102397 s, 102 MB/s 00:09:21.773 20:23:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:21.773 20:23:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:09:22.032 256+0 records in 00:09:22.032 256+0 records out 00:09:22.032 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.18217 s, 5.8 MB/s 00:09:22.032 20:23:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:22.032 20:23:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:09:22.291 256+0 records in 00:09:22.291 256+0 records out 00:09:22.291 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.184539 s, 5.7 MB/s 00:09:22.291 20:23:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 
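[annotation] The `nbd_dd_data_verify` passes traced here (`bdev/nbd_common.sh@76-83`) write one shared 1 MiB random pattern file to every attached device, then later `cmp` the first 1M of each device back against that file. A sketch of both passes, under the assumption that ordinary temp files stand in for `/dev/nbd*` (so the `oflag=direct` used on real devices is left as a comment):

```shell
# nbd_dd_data_verify sketch: write pass fans one 1 MiB urandom pattern
# out to every device; verify pass compares each device's first 1M back
# against the same pattern. Temp files stand in for /dev/nbd* here.
pattern=$(mktemp)
dd if=/dev/urandom of="$pattern" bs=4096 count=256 status=none
devices="/tmp/nbd_stub0 /tmp/nbd_stub1"
for dev in $devices; do
    dd if="$pattern" of="$dev" bs=4096 count=256 status=none   # + oflag=direct on real nbd devices
done
for dev in $devices; do
    cmp -b -n 1M "$pattern" "$dev"   # any differing byte fails the run
done
rm "$pattern"
```

On real nbd devices the write throughput in the trace (~5.7 MB/s per 1 MiB pass) reflects the direct-I/O round trip through the nbd kernel driver to the SPDK userspace target, not raw disk speed.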
00:09:22.291 20:23:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:09:22.550 256+0 records in 00:09:22.550 256+0 records out 00:09:22.550 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.18389 s, 5.7 MB/s 00:09:22.550 20:23:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:22.550 20:23:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:09:22.550 256+0 records in 00:09:22.550 256+0 records out 00:09:22.550 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.183097 s, 5.7 MB/s 00:09:22.551 20:23:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:22.551 20:23:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd12 bs=4096 count=256 oflag=direct 00:09:22.810 256+0 records in 00:09:22.810 256+0 records out 00:09:22.810 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.183954 s, 5.7 MB/s 00:09:22.810 20:23:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:22.810 20:23:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd13 bs=4096 count=256 oflag=direct 00:09:23.068 256+0 records in 00:09:23.068 256+0 records out 00:09:23.068 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.184363 s, 5.7 MB/s 00:09:23.068 20:23:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:23.068 20:23:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd14 bs=4096 count=256 oflag=direct 00:09:23.068 256+0 records in 00:09:23.068 256+0 
records out 00:09:23.068 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.180175 s, 5.8 MB/s 00:09:23.068 20:23:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:23.068 20:23:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd15 bs=4096 count=256 oflag=direct 00:09:23.327 256+0 records in 00:09:23.327 256+0 records out 00:09:23.327 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.183937 s, 5.7 MB/s 00:09:23.327 20:23:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:23.327 20:23:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd2 bs=4096 count=256 oflag=direct 00:09:23.587 256+0 records in 00:09:23.587 256+0 records out 00:09:23.587 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.184314 s, 5.7 MB/s 00:09:23.587 20:23:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:23.587 20:23:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd3 bs=4096 count=256 oflag=direct 00:09:23.910 256+0 records in 00:09:23.911 256+0 records out 00:09:23.911 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.184197 s, 5.7 MB/s 00:09:23.911 20:23:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:23.911 20:23:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd4 bs=4096 count=256 oflag=direct 00:09:23.911 256+0 records in 00:09:23.911 256+0 records out 00:09:23.911 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.178454 s, 5.9 MB/s 00:09:23.911 20:23:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:23.911 20:23:16 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd5 bs=4096 count=256 oflag=direct 00:09:24.204 256+0 records in 00:09:24.204 256+0 records out 00:09:24.204 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.183183 s, 5.7 MB/s 00:09:24.204 20:23:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:24.205 20:23:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd6 bs=4096 count=256 oflag=direct 00:09:24.205 256+0 records in 00:09:24.205 256+0 records out 00:09:24.205 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.184596 s, 5.7 MB/s 00:09:24.205 20:23:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:24.205 20:23:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd7 bs=4096 count=256 oflag=direct 00:09:24.463 256+0 records in 00:09:24.463 256+0 records out 00:09:24.463 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.185541 s, 5.7 MB/s 00:09:24.463 20:23:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:24.463 20:23:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd8 bs=4096 count=256 oflag=direct 00:09:24.722 256+0 records in 00:09:24.722 256+0 records out 00:09:24.722 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.188164 s, 5.6 MB/s 00:09:24.722 20:23:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:24.722 20:23:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd9 bs=4096 count=256 oflag=direct 00:09:24.980 256+0 records in 00:09:24.980 256+0 records out 00:09:24.980 1048576 bytes (1.0 MB, 1.0 
MiB) copied, 0.182441 s, 5.7 MB/s 00:09:24.980 20:23:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' verify 00:09:24.980 20:23:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:09:24.980 20:23:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:09:24.980 20:23:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:09:24.980 20:23:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:09:24.980 20:23:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:09:24.980 20:23:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:09:24.980 20:23:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:24.980 20:23:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:09:24.980 20:23:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:24.980 20:23:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:09:24.980 20:23:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:24.980 20:23:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:09:24.980 
20:23:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:24.980 20:23:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:09:24.980 20:23:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:24.980 20:23:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd12 00:09:24.980 20:23:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:24.980 20:23:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd13 00:09:24.980 20:23:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:24.980 20:23:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd14 00:09:24.980 20:23:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:24.980 20:23:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd15 00:09:24.980 20:23:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:24.980 20:23:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd2 00:09:24.980 20:23:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:24.980 20:23:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd3 00:09:24.980 20:23:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for 
i in "${nbd_list[@]}" 00:09:24.980 20:23:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd4 00:09:24.980 20:23:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:24.980 20:23:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd5 00:09:24.980 20:23:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:24.980 20:23:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd6 00:09:24.980 20:23:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:24.980 20:23:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd7 00:09:24.980 20:23:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:24.980 20:23:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd8 00:09:24.980 20:23:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:24.980 20:23:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd9 00:09:24.980 20:23:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:09:24.980 20:23:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 
/dev/nbd9' 00:09:24.980 20:23:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:24.980 20:23:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:09:24.980 20:23:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:09:24.980 20:23:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:09:24.980 20:23:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:24.980 20:23:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:09:25.239 20:23:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:09:25.239 20:23:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:09:25.239 20:23:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:09:25.239 20:23:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:25.239 20:23:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:25.239 20:23:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:09:25.240 20:23:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:25.240 20:23:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:25.240 20:23:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:25.240 20:23:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:09:25.498 20:23:17 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:09:25.498 20:23:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:09:25.498 20:23:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:09:25.498 20:23:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:25.498 20:23:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:25.498 20:23:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:09:25.498 20:23:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:25.498 20:23:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:25.498 20:23:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:25.498 20:23:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:09:26.066 20:23:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:09:26.066 20:23:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:09:26.066 20:23:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:09:26.066 20:23:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:26.066 20:23:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:26.066 20:23:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:09:26.066 20:23:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:26.066 20:23:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:26.066 20:23:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:26.067 20:23:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:09:26.067 20:23:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:09:26.067 20:23:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:09:26.067 20:23:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:09:26.067 20:23:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:26.067 20:23:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:26.067 20:23:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:09:26.067 20:23:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:26.067 20:23:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:26.067 20:23:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:26.067 20:23:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd12 00:09:26.635 20:23:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd12 00:09:26.635 20:23:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd12 00:09:26.635 20:23:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd12 00:09:26.635 20:23:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:26.635 20:23:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:26.635 20:23:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd12 /proc/partitions 00:09:26.635 20:23:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:26.635 20:23:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:26.635 20:23:18 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:26.635 20:23:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd13 00:09:26.635 20:23:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd13 00:09:26.635 20:23:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd13 00:09:26.635 20:23:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd13 00:09:26.635 20:23:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:26.635 20:23:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:26.635 20:23:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd13 /proc/partitions 00:09:26.635 20:23:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:26.635 20:23:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:26.635 20:23:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:26.635 20:23:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd14 00:09:26.894 20:23:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd14 00:09:26.894 20:23:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd14 00:09:26.894 20:23:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd14 00:09:26.894 20:23:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:26.894 20:23:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:26.894 20:23:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd14 /proc/partitions 00:09:26.894 20:23:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:26.894 20:23:19 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:26.894 20:23:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:26.894 20:23:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd15 00:09:27.462 20:23:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd15 00:09:27.462 20:23:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd15 00:09:27.462 20:23:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd15 00:09:27.462 20:23:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:27.462 20:23:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:27.462 20:23:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd15 /proc/partitions 00:09:27.462 20:23:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:27.462 20:23:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:27.462 20:23:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:27.462 20:23:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:09:27.462 20:23:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:09:27.462 20:23:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:09:27.462 20:23:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:09:27.462 20:23:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:27.462 20:23:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:27.462 20:23:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 
/proc/partitions 00:09:27.462 20:23:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:27.462 20:23:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:27.462 20:23:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:27.462 20:23:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:09:27.721 20:23:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:09:27.721 20:23:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:09:27.721 20:23:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:09:27.721 20:23:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:27.721 20:23:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:27.721 20:23:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:09:27.979 20:23:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:27.979 20:23:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:27.979 20:23:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:27.979 20:23:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd4 00:09:28.238 20:23:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd4 00:09:28.238 20:23:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd4 00:09:28.238 20:23:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd4 00:09:28.238 20:23:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:28.238 20:23:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # 
(( i <= 20 )) 00:09:28.238 20:23:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd4 /proc/partitions 00:09:28.238 20:23:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:28.238 20:23:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:28.238 20:23:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:28.238 20:23:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd5 00:09:28.496 20:23:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd5 00:09:28.497 20:23:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd5 00:09:28.497 20:23:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd5 00:09:28.497 20:23:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:28.497 20:23:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:28.497 20:23:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd5 /proc/partitions 00:09:28.497 20:23:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:28.497 20:23:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:28.497 20:23:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:28.497 20:23:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd6 00:09:28.756 20:23:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd6 00:09:28.756 20:23:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd6 00:09:28.756 20:23:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd6 00:09:28.756 20:23:20 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:28.756 20:23:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:28.756 20:23:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd6 /proc/partitions 00:09:28.756 20:23:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:28.756 20:23:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:28.756 20:23:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:28.756 20:23:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd7 00:09:29.014 20:23:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd7 00:09:29.014 20:23:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd7 00:09:29.014 20:23:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd7 00:09:29.014 20:23:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:29.014 20:23:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:29.014 20:23:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd7 /proc/partitions 00:09:29.014 20:23:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:29.014 20:23:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:29.014 20:23:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:29.014 20:23:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd8 00:09:29.273 20:23:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd8 00:09:29.273 20:23:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd8 00:09:29.273 20:23:21 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd8 00:09:29.273 20:23:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:29.273 20:23:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:29.273 20:23:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd8 /proc/partitions 00:09:29.273 20:23:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:29.273 20:23:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:29.273 20:23:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:29.273 20:23:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd9 00:09:29.532 20:23:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd9 00:09:29.532 20:23:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd9 00:09:29.532 20:23:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd9 00:09:29.532 20:23:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:29.532 20:23:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:29.532 20:23:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd9 /proc/partitions 00:09:29.532 20:23:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:29.532 20:23:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:29.532 20:23:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:09:29.532 20:23:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:29.532 20:23:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock 
nbd_get_disks 00:09:29.791 20:23:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:09:29.791 20:23:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:09:29.791 20:23:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:09:29.791 20:23:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:09:29.791 20:23:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:09:29.791 20:23:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:09:29.791 20:23:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:09:29.791 20:23:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:09:29.791 20:23:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:09:29.791 20:23:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:09:29.791 20:23:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:09:29.791 20:23:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:09:29.791 20:23:22 blockdev_general.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:09:29.791 20:23:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:29.791 20:23:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:09:29.791 20:23:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:09:29.791 20:23:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:09:29.791 
20:23:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:09:30.050 malloc_lvol_verify 00:09:30.050 20:23:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:09:30.618 bbbe82f0-a26d-4406-a4ab-c15b903a5888 00:09:30.618 20:23:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:09:30.877 5cf13e54-b3f5-48f1-9393-c0e71dc9d92f 00:09:30.877 20:23:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:09:31.135 /dev/nbd0 00:09:31.135 20:23:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:09:31.135 mke2fs 1.46.5 (30-Dec-2021) 00:09:31.135 Discarding device blocks: 0/4096 done 00:09:31.135 Creating filesystem with 4096 1k blocks and 1024 inodes 00:09:31.135 00:09:31.135 Allocating group tables: 0/1 done 00:09:31.135 Writing inode tables: 0/1 done 00:09:31.135 Creating journal (1024 blocks): done 00:09:31.135 Writing superblocks and filesystem accounting information: 0/1 done 00:09:31.135 00:09:31.135 20:23:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:09:31.135 20:23:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:09:31.135 20:23:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:31.135 20:23:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:09:31.135 20:23:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # 
local nbd_list 00:09:31.135 20:23:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:09:31.135 20:23:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:31.135 20:23:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:09:31.394 20:23:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:09:31.394 20:23:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:09:31.394 20:23:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:09:31.394 20:23:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:31.394 20:23:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:31.394 20:23:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:09:31.394 20:23:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:31.394 20:23:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:31.394 20:23:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:09:31.394 20:23:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:09:31.394 20:23:23 blockdev_general.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 1329357 00:09:31.394 20:23:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@948 -- # '[' -z 1329357 ']' 00:09:31.394 20:23:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@952 -- # kill -0 1329357 00:09:31.394 20:23:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@953 -- # uname 00:09:31.394 20:23:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:09:31.394 20:23:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1329357 00:09:31.394 20:23:23 
blockdev_general.bdev_nbd -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:09:31.394 20:23:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:09:31.394 20:23:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1329357' 00:09:31.394 killing process with pid 1329357 00:09:31.394 20:23:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@967 -- # kill 1329357 00:09:31.394 20:23:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@972 -- # wait 1329357 00:09:31.652 20:23:24 blockdev_general.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT 00:09:31.652 00:09:31.652 real 0m26.690s 00:09:31.652 user 0m33.270s 00:09:31.652 sys 0m15.440s 00:09:31.652 20:23:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:31.652 20:23:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:09:31.652 ************************************ 00:09:31.652 END TEST bdev_nbd 00:09:31.652 ************************************ 00:09:31.911 20:23:24 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:09:31.911 20:23:24 blockdev_general -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:09:31.911 20:23:24 blockdev_general -- bdev/blockdev.sh@764 -- # '[' bdev = nvme ']' 00:09:31.911 20:23:24 blockdev_general -- bdev/blockdev.sh@764 -- # '[' bdev = gpt ']' 00:09:31.911 20:23:24 blockdev_general -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:09:31.911 20:23:24 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:09:31.911 20:23:24 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:31.911 20:23:24 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:31.911 ************************************ 00:09:31.911 START TEST bdev_fio 00:09:31.911 ************************************ 00:09:31.911 20:23:24 blockdev_general.bdev_fio -- 
common/autotest_common.sh@1123 -- # fio_test_suite '' 00:09:31.911 20:23:24 blockdev_general.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:09:31.911 20:23:24 blockdev_general.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:09:31.911 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:09:31.911 20:23:24 blockdev_general.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:09:31.911 20:23:24 blockdev_general.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:09:31.911 20:23:24 blockdev_general.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:09:31.911 20:23:24 blockdev_general.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 00:09:31.911 20:23:24 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:09:31.911 20:23:24 blockdev_general.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:09:31.911 20:23:24 blockdev_general.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=verify 00:09:31.911 20:23:24 blockdev_general.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:09:31.911 20:23:24 blockdev_general.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:09:31.911 20:23:24 blockdev_general.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:09:31.911 20:23:24 blockdev_general.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:09:31.911 20:23:24 blockdev_general.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:09:31.911 20:23:24 blockdev_general.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:09:31.911 20:23:24 
blockdev_general.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:09:31.911 20:23:24 blockdev_general.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:09:31.911 20:23:24 blockdev_general.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:09:31.911 20:23:24 blockdev_general.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:09:31.911 20:23:24 blockdev_general.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:09:31.911 20:23:24 blockdev_general.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:09:31.911 20:23:24 blockdev_general.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:09:31.911 20:23:24 blockdev_general.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:09:31.911 20:23:24 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:31.911 20:23:24 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc0]' 00:09:31.911 20:23:24 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc0 00:09:31.911 20:23:24 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:31.911 20:23:24 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc1p0]' 00:09:31.911 20:23:24 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc1p0 00:09:31.911 20:23:24 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:31.911 20:23:24 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc1p1]' 00:09:31.911 20:23:24 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc1p1 00:09:31.911 20:23:24 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:31.911 20:23:24 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo 
'[job_Malloc2p0]' 00:09:31.911 20:23:24 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p0 00:09:31.911 20:23:24 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:31.911 20:23:24 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p1]' 00:09:31.911 20:23:24 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p1 00:09:31.911 20:23:24 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:31.911 20:23:24 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p2]' 00:09:31.911 20:23:24 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p2 00:09:31.911 20:23:24 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:31.911 20:23:24 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p3]' 00:09:31.911 20:23:24 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p3 00:09:31.911 20:23:24 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:31.911 20:23:24 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p4]' 00:09:31.911 20:23:24 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p4 00:09:31.911 20:23:24 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:31.911 20:23:24 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p5]' 00:09:31.912 20:23:24 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p5 00:09:31.912 20:23:24 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:31.912 20:23:24 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p6]' 00:09:31.912 20:23:24 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p6 00:09:31.912 20:23:24 
blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:31.912 20:23:24 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p7]' 00:09:31.912 20:23:24 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p7 00:09:31.912 20:23:24 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:31.912 20:23:24 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_TestPT]' 00:09:31.912 20:23:24 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=TestPT 00:09:31.912 20:23:24 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:31.912 20:23:24 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_raid0]' 00:09:31.912 20:23:24 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=raid0 00:09:31.912 20:23:24 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:31.912 20:23:24 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_concat0]' 00:09:31.912 20:23:24 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=concat0 00:09:31.912 20:23:24 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:31.912 20:23:24 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_raid1]' 00:09:31.912 20:23:24 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=raid1 00:09:31.912 20:23:24 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:31.912 20:23:24 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_AIO0]' 00:09:31.912 20:23:24 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=AIO0 00:09:31.912 20:23:24 blockdev_general.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 
--verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:09:31.912 20:23:24 blockdev_general.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:09:31.912 20:23:24 blockdev_general.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:09:31.912 20:23:24 blockdev_general.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:31.912 20:23:24 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:09:31.912 ************************************ 00:09:31.912 START TEST bdev_fio_rw_verify 00:09:31.912 ************************************ 00:09:31.912 20:23:24 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:09:31.912 20:23:24 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:09:31.912 20:23:24 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # 
local fio_dir=/usr/src/fio 00:09:31.912 20:23:24 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:09:31.912 20:23:24 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:09:31.912 20:23:24 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:09:31.912 20:23:24 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:09:31.912 20:23:24 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 00:09:31.912 20:23:24 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:09:31.912 20:23:24 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:09:31.912 20:23:24 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 00:09:31.912 20:23:24 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:09:31.912 20:23:24 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:09:31.912 20:23:24 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:09:31.912 20:23:24 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:09:31.912 20:23:24 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:09:31.912 20:23:24 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:09:31.912 20:23:24 
blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:09:32.177 20:23:24 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:09:32.177 20:23:24 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:09:32.177 20:23:24 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:09:32.177 20:23:24 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:09:32.438 job_Malloc0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:32.438 job_Malloc1p0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:32.438 job_Malloc1p1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:32.438 job_Malloc2p0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:32.438 job_Malloc2p1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:32.438 job_Malloc2p2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:32.438 job_Malloc2p3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:32.438 job_Malloc2p4: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, 
ioengine=spdk_bdev, iodepth=8 00:09:32.438 job_Malloc2p5: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:32.438 job_Malloc2p6: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:32.438 job_Malloc2p7: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:32.438 job_TestPT: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:32.438 job_raid0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:32.438 job_concat0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:32.438 job_raid1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:32.438 job_AIO0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:32.438 fio-3.35 00:09:32.438 Starting 16 threads 00:09:44.650 00:09:44.650 job_Malloc0: (groupid=0, jobs=16): err= 0: pid=1333696: Mon Jul 15 20:23:35 2024 00:09:44.650 read: IOPS=84.7k, BW=331MiB/s (347MB/s)(3309MiB/10001msec) 00:09:44.650 slat (usec): min=5, max=725, avg=37.50, stdev=13.48 00:09:44.650 clat (usec): min=12, max=1337, avg=305.70, stdev=131.47 00:09:44.650 lat (usec): min=20, max=1390, avg=343.20, stdev=139.40 00:09:44.650 clat percentiles (usec): 00:09:44.650 | 50.000th=[ 297], 99.000th=[ 619], 99.900th=[ 693], 99.990th=[ 922], 00:09:44.650 | 99.999th=[ 1123] 00:09:44.650 write: IOPS=136k, BW=533MiB/s (559MB/s)(5252MiB/9860msec); 0 zone resets 00:09:44.650 slat (usec): min=8, max=4439, avg=50.68, stdev=15.74 00:09:44.650 clat (usec): min=10, max=4841, avg=356.16, stdev=159.87 00:09:44.650 lat (usec): min=39, max=4888, avg=406.84, stdev=168.27 00:09:44.650 clat percentiles (usec): 
00:09:44.650 | 50.000th=[ 338], 99.000th=[ 791], 99.900th=[ 1057], 99.990th=[ 1156], 00:09:44.650 | 99.999th=[ 1467] 00:09:44.650 bw ( KiB/s): min=426744, max=692094, per=98.87%, avg=539295.21, stdev=4839.99, samples=304 00:09:44.650 iops : min=106686, max=173022, avg=134823.68, stdev=1209.98, samples=304 00:09:44.650 lat (usec) : 20=0.01%, 50=0.31%, 100=3.00%, 250=28.94%, 500=53.55% 00:09:44.650 lat (usec) : 750=13.27%, 1000=0.81% 00:09:44.650 lat (msec) : 2=0.12%, 4=0.01%, 10=0.01% 00:09:44.650 cpu : usr=99.22%, sys=0.34%, ctx=692, majf=0, minf=1881 00:09:44.650 IO depths : 1=12.6%, 2=25.1%, 4=49.9%, 8=12.4%, 16=0.0%, 32=0.0%, >=64=0.0% 00:09:44.650 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:09:44.650 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:09:44.650 issued rwts: total=847082,1344598,0,0 short=0,0,0,0 dropped=0,0,0,0 00:09:44.650 latency : target=0, window=0, percentile=100.00%, depth=8 00:09:44.650 00:09:44.650 Run status group 0 (all jobs): 00:09:44.650 READ: bw=331MiB/s (347MB/s), 331MiB/s-331MiB/s (347MB/s-347MB/s), io=3309MiB (3470MB), run=10001-10001msec 00:09:44.650 WRITE: bw=533MiB/s (559MB/s), 533MiB/s-533MiB/s (559MB/s-559MB/s), io=5252MiB (5507MB), run=9860-9860msec 00:09:44.650 00:09:44.650 real 0m11.825s 00:09:44.650 user 2m45.622s 00:09:44.650 sys 0m1.384s 00:09:44.650 20:23:36 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:44.650 20:23:36 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:09:44.650 ************************************ 00:09:44.650 END TEST bdev_fio_rw_verify 00:09:44.650 ************************************ 00:09:44.650 20:23:36 blockdev_general.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:09:44.650 20:23:36 blockdev_general.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:09:44.650 20:23:36 blockdev_general.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:09:44.650 20:23:36 blockdev_general.bdev_fio -- bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:09:44.650 20:23:36 blockdev_general.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:09:44.650 20:23:36 blockdev_general.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:09:44.650 20:23:36 blockdev_general.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:09:44.650 20:23:36 blockdev_general.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:09:44.650 20:23:36 blockdev_general.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:09:44.650 20:23:36 blockdev_general.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:09:44.650 20:23:36 blockdev_general.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:09:44.650 20:23:36 blockdev_general.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:09:44.650 20:23:36 blockdev_general.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:09:44.650 20:23:36 blockdev_general.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:09:44.650 20:23:36 blockdev_general.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:09:44.650 20:23:36 blockdev_general.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:09:44.650 20:23:36 blockdev_general.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:09:44.650 20:23:36 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:09:44.652 20:23:36 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # 
printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' "da7d2bbe-1dd6-4ece-8a52-d9553dc7ea3d"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "da7d2bbe-1dd6-4ece-8a52-d9553dc7ea3d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "db1897e5-39ac-5768-81dd-3d3a79e4b3a5"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "db1897e5-39ac-5768-81dd-3d3a79e4b3a5",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' 
' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "8b4d0916-cd51-57c8-88c0-8cadbeebfc1a"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "8b4d0916-cd51-57c8-88c0-8cadbeebfc1a",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "578a9574-c863-5868-a858-59e89633b0cf"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "578a9574-c863-5868-a858-59e89633b0cf",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' 
}' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "bf012b4b-9dc4-5601-b7d6-217091c05cd0"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "bf012b4b-9dc4-5601-b7d6-217091c05cd0",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "6ce4ff41-a2c1-5caa-980b-d60983f65866"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "6ce4ff41-a2c1-5caa-980b-d60983f65866",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' 
"aliases": [' ' "89cfa520-4957-5463-bd86-8aee62eaf39d"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "89cfa520-4957-5463-bd86-8aee62eaf39d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' "df986755-3d3f-5326-a5c8-ba05792696ce"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "df986755-3d3f-5326-a5c8-ba05792696ce",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' 
"cd81cdd8-e3cc-5926-8c13-a936ab352780"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "cd81cdd8-e3cc-5926-8c13-a936ab352780",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "cd45280c-d012-50d9-943f-4b6db4d90660"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "cd45280c-d012-50d9-943f-4b6db4d90660",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "b9275732-b326-5edb-a23e-489d11f0667f"' ' ],' ' 
"product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "b9275732-b326-5edb-a23e-489d11f0667f",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "69161744-fa9d-57b8-8931-dbce8dac36fa"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "69161744-fa9d-57b8-8931-dbce8dac36fa",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' 
"base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "cf99e971-47e4-46cc-812f-e5ecb74c68aa"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "cf99e971-47e4-46cc-812f-e5ecb74c68aa",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "cf99e971-47e4-46cc-812f-e5ecb74c68aa",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "d0bf7a32-23de-4dbe-8d9d-2fd508199cb6",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "d9fe079f-538a-477f-ba47-fb0e4ede9b8a",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": [' ' "a4177c69-0ebc-4d38-aef4-e8e1d1d0917e"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 
512,' ' "num_blocks": 131072,' ' "uuid": "a4177c69-0ebc-4d38-aef4-e8e1d1d0917e",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "a4177c69-0ebc-4d38-aef4-e8e1d1d0917e",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "ee065e69-5e1e-485b-8c30-b5b993b14a6e",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "6c7095a7-e5a9-40ee-a31b-b3eefc30ab4c",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "56344453-fb80-49e8-beab-3b32e30d6861"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "56344453-fb80-49e8-beab-3b32e30d6861",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' 
"r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": false,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "56344453-fb80-49e8-beab-3b32e30d6861",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "a6308d26-b4f1-4935-a950-8637a2e3c001",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "5b76f1b7-582f-48a5-b79d-7f09c50c8f7c",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "c9ed82cf-cc2a-4a60-9bc8-f26591e6d574"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": "c9ed82cf-cc2a-4a60-9bc8-f26591e6d574",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": 
true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:09:44.652 20:23:36 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n Malloc0 00:09:44.652 Malloc1p0 00:09:44.652 Malloc1p1 00:09:44.652 Malloc2p0 00:09:44.652 Malloc2p1 00:09:44.652 Malloc2p2 00:09:44.652 Malloc2p3 00:09:44.652 Malloc2p4 00:09:44.652 Malloc2p5 00:09:44.652 Malloc2p6 00:09:44.652 Malloc2p7 00:09:44.652 TestPT 00:09:44.652 raid0 00:09:44.652 concat0 ]] 00:09:44.652 20:23:36 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:09:44.653 20:23:36 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' "da7d2bbe-1dd6-4ece-8a52-d9553dc7ea3d"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "da7d2bbe-1dd6-4ece-8a52-d9553dc7ea3d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": 
false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "db1897e5-39ac-5768-81dd-3d3a79e4b3a5"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "db1897e5-39ac-5768-81dd-3d3a79e4b3a5",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "8b4d0916-cd51-57c8-88c0-8cadbeebfc1a"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "8b4d0916-cd51-57c8-88c0-8cadbeebfc1a",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' 
"compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "578a9574-c863-5868-a858-59e89633b0cf"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "578a9574-c863-5868-a858-59e89633b0cf",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "bf012b4b-9dc4-5601-b7d6-217091c05cd0"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "bf012b4b-9dc4-5601-b7d6-217091c05cd0",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' 
"seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "6ce4ff41-a2c1-5caa-980b-d60983f65866"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "6ce4ff41-a2c1-5caa-980b-d60983f65866",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "89cfa520-4957-5463-bd86-8aee62eaf39d"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "89cfa520-4957-5463-bd86-8aee62eaf39d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' 
"copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' "df986755-3d3f-5326-a5c8-ba05792696ce"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "df986755-3d3f-5326-a5c8-ba05792696ce",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "cd81cdd8-e3cc-5926-8c13-a936ab352780"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "cd81cdd8-e3cc-5926-8c13-a936ab352780",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' 
"driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "cd45280c-d012-50d9-943f-4b6db4d90660"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "cd45280c-d012-50d9-943f-4b6db4d90660",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "b9275732-b326-5edb-a23e-489d11f0667f"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "b9275732-b326-5edb-a23e-489d11f0667f",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' 
"base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "69161744-fa9d-57b8-8931-dbce8dac36fa"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "69161744-fa9d-57b8-8931-dbce8dac36fa",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "cf99e971-47e4-46cc-812f-e5ecb74c68aa"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "cf99e971-47e4-46cc-812f-e5ecb74c68aa",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' 
"seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "cf99e971-47e4-46cc-812f-e5ecb74c68aa",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "d0bf7a32-23de-4dbe-8d9d-2fd508199cb6",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "d9fe079f-538a-477f-ba47-fb0e4ede9b8a",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": [' ' "a4177c69-0ebc-4d38-aef4-e8e1d1d0917e"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "a4177c69-0ebc-4d38-aef4-e8e1d1d0917e",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' 
' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "a4177c69-0ebc-4d38-aef4-e8e1d1d0917e",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "ee065e69-5e1e-485b-8c30-b5b993b14a6e",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "6c7095a7-e5a9-40ee-a31b-b3eefc30ab4c",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "56344453-fb80-49e8-beab-3b32e30d6861"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "56344453-fb80-49e8-beab-3b32e30d6861",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": false,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "56344453-fb80-49e8-beab-3b32e30d6861",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "a6308d26-b4f1-4935-a950-8637a2e3c001",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "5b76f1b7-582f-48a5-b79d-7f09c50c8f7c",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "c9ed82cf-cc2a-4a60-9bc8-f26591e6d574"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": "c9ed82cf-cc2a-4a60-9bc8-f26591e6d574",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:09:44.653 20:23:36 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:44.653 20:23:36 blockdev_general.bdev_fio -- 
bdev/blockdev.sh@357 -- # echo '[job_Malloc0]' 00:09:44.653 20:23:36 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc0 00:09:44.653 20:23:36 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:44.653 20:23:36 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc1p0]' 00:09:44.653 20:23:36 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc1p0 00:09:44.653 20:23:36 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:44.653 20:23:36 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc1p1]' 00:09:44.653 20:23:36 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc1p1 00:09:44.653 20:23:36 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:44.653 20:23:36 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p0]' 00:09:44.653 20:23:36 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p0 00:09:44.653 20:23:36 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:44.653 20:23:36 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p1]' 00:09:44.653 20:23:36 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p1 00:09:44.653 20:23:36 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:44.653 20:23:36 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p2]' 00:09:44.653 20:23:36 blockdev_general.bdev_fio -- 
bdev/blockdev.sh@358 -- # echo filename=Malloc2p2 00:09:44.653 20:23:36 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:44.653 20:23:36 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p3]' 00:09:44.653 20:23:36 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p3 00:09:44.653 20:23:36 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:44.653 20:23:36 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p4]' 00:09:44.653 20:23:36 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p4 00:09:44.653 20:23:36 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:44.653 20:23:36 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p5]' 00:09:44.653 20:23:36 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p5 00:09:44.653 20:23:36 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:44.653 20:23:36 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p6]' 00:09:44.653 20:23:36 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p6 00:09:44.653 20:23:36 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:44.653 20:23:36 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p7]' 00:09:44.653 20:23:36 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p7 00:09:44.653 20:23:36 blockdev_general.bdev_fio -- 
bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:44.653 20:23:36 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_TestPT]' 00:09:44.653 20:23:36 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=TestPT 00:09:44.653 20:23:36 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:44.653 20:23:36 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_raid0]' 00:09:44.653 20:23:36 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=raid0 00:09:44.653 20:23:36 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:44.653 20:23:36 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_concat0]' 00:09:44.653 20:23:36 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=concat0 00:09:44.654 20:23:36 blockdev_general.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:09:44.654 20:23:36 blockdev_general.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:09:44.654 20:23:36 blockdev_general.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:44.654 20:23:36 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:09:44.654 ************************************ 00:09:44.654 START TEST bdev_fio_trim 00:09:44.654 ************************************ 00:09:44.654 20:23:36 blockdev_general.bdev_fio.bdev_fio_trim -- 
common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:09:44.654 20:23:36 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:09:44.654 20:23:36 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:09:44.654 20:23:36 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:09:44.654 20:23:36 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:09:44.654 20:23:36 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:09:44.654 20:23:36 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:09:44.654 20:23:36 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:09:44.654 20:23:36 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:09:44.654 20:23:36 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:09:44.654 20:23:36 
blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan 00:09:44.654 20:23:36 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:09:44.654 20:23:36 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:09:44.654 20:23:36 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:09:44.654 20:23:36 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:09:44.654 20:23:36 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:09:44.654 20:23:36 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:09:44.654 20:23:36 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:09:44.654 20:23:36 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:09:44.654 20:23:36 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:09:44.654 20:23:36 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:09:44.654 20:23:36 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:09:44.654 job_Malloc0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:44.654 job_Malloc1p0: (g=0): 
rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:44.654 job_Malloc1p1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:44.654 job_Malloc2p0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:44.654 job_Malloc2p1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:44.654 job_Malloc2p2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:44.654 job_Malloc2p3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:44.654 job_Malloc2p4: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:44.654 job_Malloc2p5: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:44.654 job_Malloc2p6: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:44.654 job_Malloc2p7: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:44.654 job_TestPT: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:44.654 job_raid0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:44.654 job_concat0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:44.654 fio-3.35 00:09:44.654 Starting 14 threads 00:09:56.889 00:09:56.889 job_Malloc0: (groupid=0, jobs=14): err= 0: pid=1335399: Mon Jul 15 20:23:47 2024 00:09:56.889 write: IOPS=124k, BW=484MiB/s (507MB/s)(4839MiB/10001msec); 0 zone resets 00:09:56.889 slat (usec): min=5, max=3496, avg=40.08, stdev=11.20 
00:09:56.889 clat (usec): min=15, max=3842, avg=282.46, stdev=96.46 00:09:56.889 lat (usec): min=35, max=3869, avg=322.54, stdev=100.42 00:09:56.889 clat percentiles (usec): 00:09:56.889 | 50.000th=[ 273], 99.000th=[ 506], 99.900th=[ 562], 99.990th=[ 603], 00:09:56.889 | 99.999th=[ 1020] 00:09:56.889 bw ( KiB/s): min=422872, max=616855, per=100.00%, avg=496242.79, stdev=3250.08, samples=266 00:09:56.889 iops : min=105718, max=154213, avg=124060.63, stdev=812.52, samples=266 00:09:56.889 trim: IOPS=124k, BW=484MiB/s (507MB/s)(4839MiB/10001msec); 0 zone resets 00:09:56.889 slat (usec): min=5, max=701, avg=26.84, stdev= 6.78 00:09:56.889 clat (usec): min=9, max=3869, avg=322.70, stdev=100.44 00:09:56.889 lat (usec): min=24, max=3892, avg=349.53, stdev=103.01 00:09:56.889 clat percentiles (usec): 00:09:56.889 | 50.000th=[ 314], 99.000th=[ 553], 99.900th=[ 619], 99.990th=[ 668], 00:09:56.889 | 99.999th=[ 1156] 00:09:56.889 bw ( KiB/s): min=422872, max=616855, per=100.00%, avg=496242.79, stdev=3250.08, samples=266 00:09:56.889 iops : min=105718, max=154213, avg=124060.63, stdev=812.52, samples=266 00:09:56.889 lat (usec) : 10=0.01%, 20=0.01%, 50=0.01%, 100=0.49%, 250=33.76% 00:09:56.889 lat (usec) : 500=62.92%, 750=2.83%, 1000=0.01% 00:09:56.889 lat (msec) : 2=0.01%, 4=0.01% 00:09:56.889 cpu : usr=99.61%, sys=0.01%, ctx=547, majf=0, minf=1078 00:09:56.889 IO depths : 1=12.5%, 2=25.0%, 4=50.0%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:09:56.889 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:09:56.889 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:09:56.889 issued rwts: total=0,1238731,1238735,0 short=0,0,0,0 dropped=0,0,0,0 00:09:56.889 latency : target=0, window=0, percentile=100.00%, depth=8 00:09:56.889 00:09:56.889 Run status group 0 (all jobs): 00:09:56.889 WRITE: bw=484MiB/s (507MB/s), 484MiB/s-484MiB/s (507MB/s-507MB/s), io=4839MiB (5074MB), run=10001-10001msec 00:09:56.889 TRIM: bw=484MiB/s (507MB/s), 
484MiB/s-484MiB/s (507MB/s-507MB/s), io=4839MiB (5074MB), run=10001-10001msec 00:09:56.889 00:09:56.889 real 0m11.682s 00:09:56.889 user 2m26.301s 00:09:56.889 sys 0m0.928s 00:09:56.889 20:23:47 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:56.889 20:23:47 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:09:56.889 ************************************ 00:09:56.889 END TEST bdev_fio_trim 00:09:56.889 ************************************ 00:09:56.889 20:23:48 blockdev_general.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:09:56.889 20:23:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f 00:09:56.889 20:23:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:09:56.889 20:23:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@370 -- # popd 00:09:56.889 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:09:56.889 20:23:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT 00:09:56.889 00:09:56.889 real 0m23.927s 00:09:56.889 user 5m12.158s 00:09:56.889 sys 0m2.528s 00:09:56.889 20:23:48 blockdev_general.bdev_fio -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:56.889 20:23:48 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:09:56.889 ************************************ 00:09:56.889 END TEST bdev_fio 00:09:56.889 ************************************ 00:09:56.889 20:23:48 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:09:56.889 20:23:48 blockdev_general -- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT 00:09:56.889 20:23:48 blockdev_general -- bdev/blockdev.sh@777 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 
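The [job_*]/filename= pairs echoed by bdev/blockdev.sh@356-358 in the trim test above come from filtering the bdev JSON dump on supported_io_types.unmap with jq; a minimal Python sketch of the same selection follows. The bdev list is abbreviated to three entries taken from the dump above (concat0 has unmap true; raid1 and AIO0 have unmap false, which is why no trim jobs appear for them), while the real script iterates over the full bdev list.

```python
# Sketch of what bdev/blockdev.sh lines 356-358 do via jq: keep only bdevs
# whose supported_io_types.unmap is true, then emit one fio job section per
# surviving bdev. The bdev list below is abbreviated from the JSON dump above.
bdevs = [
    {"name": "concat0", "supported_io_types": {"unmap": True}},
    {"name": "raid1",   "supported_io_types": {"unmap": False}},  # no trim job emitted
    {"name": "AIO0",    "supported_io_types": {"unmap": False}},  # no trim job emitted
]

def fio_job_sections(bdevs):
    """Equivalent of: jq -r 'select(.supported_io_types.unmap == true) | .name'
    followed by echo '[job_$b]' and echo filename=$b for each selected name."""
    lines = []
    for b in bdevs:
        if b["supported_io_types"].get("unmap"):
            lines.append(f"[job_{b['name']}]")
            lines.append(f"filename={b['name']}")
    return "\n".join(lines)

print(fio_job_sections(bdevs))  # only concat0 survives the unmap filter here
```

With the full dump, this selection yields exactly the fourteen job sections listed in the fio banner (Malloc0 through concat0).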
00:09:56.889 20:23:48 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:09:56.889 20:23:48 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:56.889 20:23:48 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:56.889 ************************************ 00:09:56.889 START TEST bdev_verify 00:09:56.889 ************************************ 00:09:56.889 20:23:48 blockdev_general.bdev_verify -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:09:56.889 [2024-07-15 20:23:48.187196] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 00:09:56.889 [2024-07-15 20:23:48.187260] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1336851 ] 00:09:56.889 [2024-07-15 20:23:48.320651] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:09:56.889 [2024-07-15 20:23:48.426686] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:09:56.889 [2024-07-15 20:23:48.426690] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:56.889 [2024-07-15 20:23:48.583234] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:56.889 [2024-07-15 20:23:48.583301] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:09:56.889 [2024-07-15 20:23:48.583316] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:09:56.889 [2024-07-15 20:23:48.591238] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:56.889 [2024-07-15 20:23:48.591264] 
bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:56.889 [2024-07-15 20:23:48.599250] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:56.889 [2024-07-15 20:23:48.599274] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:56.889 [2024-07-15 20:23:48.676475] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:56.889 [2024-07-15 20:23:48.676538] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:56.889 [2024-07-15 20:23:48.676555] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b97770 00:09:56.889 [2024-07-15 20:23:48.676568] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:56.889 [2024-07-15 20:23:48.678220] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:56.889 [2024-07-15 20:23:48.678250] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:09:56.889 Running I/O for 5 seconds... 
00:10:02.189 
00:10:02.189 Latency(us)
00:10:02.189 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:10:02.189 Job: Malloc0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:10:02.189 Verification LBA range: start 0x0 length 0x1000
00:10:02.189 Malloc0 : 5.18 1162.28 4.54 0.00 0.00 109913.78 762.21 351956.81
00:10:02.189 Job: Malloc0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:10:02.189 Verification LBA range: start 0x1000 length 0x1000
00:10:02.189 Malloc0 : 5.17 940.73 3.67 0.00 0.00 135757.42 673.17 408488.74
00:10:02.189 Job: Malloc1p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:10:02.189 Verification LBA range: start 0x0 length 0x800
00:10:02.189 Malloc1p0 : 5.22 613.21 2.40 0.00 0.00 207814.27 2521.71 178713.82
00:10:02.189 Job: Malloc1p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:10:02.189 Verification LBA range: start 0x800 length 0x800
00:10:02.189 Malloc1p0 : 5.17 494.86 1.93 0.00 0.00 257328.66 3177.07 222480.47
00:10:02.189 Job: Malloc1p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:10:02.189 Verification LBA range: start 0x0 length 0x800
00:10:02.189 Malloc1p1 : 5.22 612.96 2.39 0.00 0.00 207471.22 2478.97 177802.02
00:10:02.189 Job: Malloc1p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:10:02.189 Verification LBA range: start 0x800 length 0x800
00:10:02.189 Malloc1p1 : 5.18 494.57 1.93 0.00 0.00 256836.30 3162.82 222480.47
00:10:02.189 Job: Malloc2p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:10:02.189 Verification LBA range: start 0x0 length 0x200
00:10:02.189 Malloc2p0 : 5.22 612.71 2.39 0.00 0.00 207115.80 2478.97 177802.02
00:10:02.189 Job: Malloc2p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:10:02.189 Verification LBA range: start 0x200 length 0x200
00:10:02.189 Malloc2p0 : 5.18 494.06 1.93 0.00 0.00 256466.54 3818.18 221568.67
00:10:02.189 Job: Malloc2p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:10:02.189 Verification LBA range: start 0x0 length 0x200
00:10:02.189 Malloc2p1 : 5.22 612.46 2.39 0.00 0.00 206776.35 3262.55 175066.60
00:10:02.189 Job: Malloc2p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:10:02.189 Verification LBA range: start 0x200 length 0x200
00:10:02.189 Malloc2p1 : 5.18 493.79 1.93 0.00 0.00 255840.89 4074.63 218833.25
00:10:02.189 Job: Malloc2p2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:10:02.189 Verification LBA range: start 0x0 length 0x200
00:10:02.189 Malloc2p2 : 5.23 612.21 2.39 0.00 0.00 206350.17 3533.25 170507.58
00:10:02.189 Job: Malloc2p2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:10:02.189 Verification LBA range: start 0x200 length 0x200
00:10:02.189 Malloc2p2 : 5.19 493.51 1.93 0.00 0.00 255153.01 3376.53 216097.84
00:10:02.189 Job: Malloc2p3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:10:02.189 Verification LBA range: start 0x0 length 0x200
00:10:02.189 Malloc2p3 : 5.23 611.96 2.39 0.00 0.00 205898.23 2721.17 168683.97
00:10:02.189 Job: Malloc2p3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:10:02.189 Verification LBA range: start 0x200 length 0x200
00:10:02.189 Malloc2p3 : 5.19 493.23 1.93 0.00 0.00 254619.34 3162.82 215186.03
00:10:02.189 Job: Malloc2p4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:10:02.189 Verification LBA range: start 0x0 length 0x200
00:10:02.189 Malloc2p4 : 5.23 611.71 2.39 0.00 0.00 205536.82 2450.48 169595.77
00:10:02.189 Job: Malloc2p4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:10:02.189 Verification LBA range: start 0x200 length 0x200
00:10:02.189 Malloc2p4 : 5.19 492.96 1.93 0.00 0.00 254130.18 3632.97 215186.03
00:10:02.189 Job: Malloc2p5 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:10:02.189 Verification LBA range: start 0x0 length 0x200
00:10:02.189 Malloc2p5 : 5.23 611.47 2.39 0.00 0.00 205190.93 2450.48 170507.58
00:10:02.189 Job: Malloc2p5 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:10:02.189 Verification LBA range: start 0x200 length 0x200
00:10:02.189 Malloc2p5 : 5.20 492.68 1.92 0.00 0.00 253537.12 4188.61 213362.42
00:10:02.189 Job: Malloc2p6 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:10:02.189 Verification LBA range: start 0x0 length 0x200
00:10:02.189 Malloc2p6 : 5.24 611.22 2.39 0.00 0.00 204843.15 3048.85 170507.58
00:10:02.189 Job: Malloc2p6 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:10:02.189 Verification LBA range: start 0x200 length 0x200
00:10:02.189 Malloc2p6 : 5.20 492.41 1.92 0.00 0.00 252847.33 3006.11 212450.62
00:10:02.189 Job: Malloc2p7 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:10:02.189 Verification LBA range: start 0x0 length 0x200
00:10:02.189 Malloc2p7 : 5.24 610.98 2.39 0.00 0.00 204435.14 3732.70 166860.35
00:10:02.189 Job: Malloc2p7 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:10:02.189 Verification LBA range: start 0x200 length 0x200
00:10:02.189 Malloc2p7 : 5.26 510.82 2.00 0.00 0.00 243123.83 3632.97 206067.98
00:10:02.189 Job: TestPT (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:10:02.189 Verification LBA range: start 0x0 length 0x1000
00:10:02.189 TestPT : 5.24 591.37 2.31 0.00 0.00 209533.44 12195.39 165948.55
00:10:02.189 Job: TestPT (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:10:02.189 Verification LBA range: start 0x1000 length 0x1000
00:10:02.189 TestPT : 5.25 488.01 1.91 0.00 0.00 253635.94 33508.84 206979.78
00:10:02.189 Job: raid0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:10:02.189 Verification LBA range: start 0x0 length 0x2000
00:10:02.189 raid0 : 5.24 610.59 2.39 0.00 0.00 203435.21 3120.08 155006.89
00:10:02.189 Job: raid0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:10:02.189 Verification LBA range: start 0x2000 length 0x2000
00:10:02.189 raid0 : 5.26 510.57 1.99 0.00 0.00 241860.39 3291.05 190567.29
00:10:02.189 Job: concat0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:10:02.189 Verification LBA range: start 0x0 length 0x2000
00:10:02.189 concat0 : 5.24 610.33 2.38 0.00 0.00 203014.85 2350.75 155918.69
00:10:02.189 Job: concat0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:10:02.189 Verification LBA range: start 0x2000 length 0x2000
00:10:02.189 concat0 : 5.27 510.33 1.99 0.00 0.00 241220.92 3390.78 186920.07
00:10:02.189 Job: raid1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:10:02.189 Verification LBA range: start 0x0 length 0x1000
00:10:02.189 raid1 : 5.25 610.00 2.38 0.00 0.00 202668.50 3091.59 164124.94
00:10:02.189 Job: raid1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:10:02.189 Verification LBA range: start 0x1000 length 0x1000
00:10:02.189 raid1 : 5.27 510.08 1.99 0.00 0.00 240493.78 4103.12 192390.90
00:10:02.189 Job: AIO0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:10:02.189 Verification LBA range: start 0x0 length 0x4e2
00:10:02.189 AIO0 : 5.25 609.45 2.38 0.00 0.00 202421.01 1503.05 170507.58
00:10:02.189 Job: AIO0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:10:02.189 Verification LBA range: start 0x4e2 length 0x4e2
00:10:02.189 AIO0 : 5.27 509.89 1.99 0.00 0.00 239817.99 1659.77 196949.93
00:10:02.189 ===================================================================================================================
00:10:02.189 Total : 18737.42 73.19 0.00 0.00 214033.60 673.17 408488.74
00:10:02.448 
00:10:02.448 real 0m6.510s
00:10:02.448 user 0m12.015s
00:10:02.448 sys 0m0.424s
00:10:02.448 20:23:54 blockdev_general.bdev_verify -- common/autotest_common.sh@1124 -- # xtrace_disable
00:10:02.448 20:23:54 blockdev_general.bdev_verify -- common/autotest_common.sh@10 -- # set +x
00:10:02.448 ************************************
00:10:02.448 END TEST bdev_verify
00:10:02.448 ************************************
00:10:02.448 20:23:54 blockdev_general -- common/autotest_common.sh@1142 -- # return 0
00:10:02.448 20:23:54 blockdev_general -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 ''
00:10:02.448 20:23:54 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']'
00:10:02.448 20:23:54 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable
00:10:02.448 20:23:54 blockdev_general -- common/autotest_common.sh@10 -- # set +x
00:10:02.448 ************************************
00:10:02.448 START TEST bdev_verify_big_io
00:10:02.448 ************************************
00:10:02.448 20:23:54 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 ''
00:10:02.448 [2024-07-15 20:23:54.778940] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization...
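Aside (not part of the test output): the per-bdev summary rows above follow the header `runtime(s) IOPS MiB/s Fail/s TO/s Average min max`. A minimal sketch for pulling one such row into named fields; the function and field names here are illustrative, not SPDK tooling:

```python
# Sketch: parse one bdevperf per-bdev summary line, e.g.
#   "Malloc0 : 5.18 1162.28 4.54 0.00 0.00 109913.78 762.21 351956.81"
# using the column order from the table header above.

FIELDS = ("runtime_s", "iops", "mib_s", "fail_s", "to_s",
          "avg_us", "min_us", "max_us")

def parse_summary_line(line):
    # Split "name : v1 v2 ..." on the first " : " separator.
    name, _, rest = line.partition(" : ")
    values = [float(v) for v in rest.split()]
    return name.strip(), dict(zip(FIELDS, values))

name, stats = parse_summary_line(
    "Malloc0 : 5.18 1162.28 4.54 0.00 0.00 109913.78 762.21 351956.81")
print(name, stats["iops"], stats["max_us"])
```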
00:10:02.448 [2024-07-15 20:23:54.779005] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1337736 ]
00:10:02.707 [2024-07-15 20:23:54.908735] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2
00:10:02.707 [2024-07-15 20:23:55.014337] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1
00:10:02.707 [2024-07-15 20:23:55.014341] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:10:02.966 [2024-07-15 20:23:55.168570] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3
00:10:02.966 [2024-07-15 20:23:55.168637] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:10:02.966 [2024-07-15 20:23:55.168651] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:10:02.966 [2024-07-15 20:23:55.176575] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:10:02.966 [2024-07-15 20:23:55.176602] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:10:02.966 [2024-07-15 20:23:55.184589] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:10:02.966 [2024-07-15 20:23:55.184613] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:10:02.966 [2024-07-15 20:23:55.261726] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3
00:10:02.966 [2024-07-15 20:23:55.261783] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:10:02.966 [2024-07-15 20:23:55.261800] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1286770
00:10:02.966 [2024-07-15 20:23:55.261812] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:10:02.966 [2024-07-15 20:23:55.263530] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:10:02.966 [2024-07-15 20:23:55.263560] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT
00:10:03.225 [2024-07-15 20:23:55.433889] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p0 simultaneously (32). Queue depth is limited to 32
00:10:03.225 [2024-07-15 20:23:55.435228] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p0 simultaneously (32). Queue depth is limited to 32
00:10:03.225 [2024-07-15 20:23:55.437178] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p1 simultaneously (32). Queue depth is limited to 32
00:10:03.225 [2024-07-15 20:23:55.438492] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p1 simultaneously (32). Queue depth is limited to 32
00:10:03.225 [2024-07-15 20:23:55.440493] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p2 simultaneously (32). Queue depth is limited to 32
00:10:03.225 [2024-07-15 20:23:55.441795] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p2 simultaneously (32). Queue depth is limited to 32
00:10:03.225 [2024-07-15 20:23:55.443716] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p3 simultaneously (32). Queue depth is limited to 32
00:10:03.225 [2024-07-15 20:23:55.445372] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p3 simultaneously (32). Queue depth is limited to 32
00:10:03.225 [2024-07-15 20:23:55.446366] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p4 simultaneously (32). Queue depth is limited to 32
00:10:03.225 [2024-07-15 20:23:55.447892] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p4 simultaneously (32). Queue depth is limited to 32
00:10:03.225 [2024-07-15 20:23:55.448868] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p5 simultaneously (32). Queue depth is limited to 32
00:10:03.225 [2024-07-15 20:23:55.450401] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p5 simultaneously (32). Queue depth is limited to 32
00:10:03.225 [2024-07-15 20:23:55.451421] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p6 simultaneously (32). Queue depth is limited to 32
00:10:03.225 [2024-07-15 20:23:55.452985] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p6 simultaneously (32). Queue depth is limited to 32
00:10:03.225 [2024-07-15 20:23:55.453971] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p7 simultaneously (32). Queue depth is limited to 32
00:10:03.225 [2024-07-15 20:23:55.455434] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p7 simultaneously (32). Queue depth is limited to 32
00:10:03.225 [2024-07-15 20:23:55.479376] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev AIO0 simultaneously (78). Queue depth is limited to 78
00:10:03.225 [2024-07-15 20:23:55.481306] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev AIO0 simultaneously (78). Queue depth is limited to 78
00:10:03.225 Running I/O for 5 seconds...
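Aside (not part of the test output): the warnings above show bdevperf clamping the requested queue depth (`-q 128`) to the number of IO requests each bdev can accept at once (32 for the Malloc2p* bdevs, 78 for AIO0 in this run). A minimal sketch of that clamping rule; the function name is illustrative, not an SPDK API:

```python
# Sketch of the queue-depth clamp described by the bdevperf warnings above:
# the effective depth of a verify job is the smaller of the requested depth
# and the bdev's maximum simultaneously-submittable IO count.

def effective_queue_depth(requested, max_outstanding_ios):
    return min(requested, max_outstanding_ios)

print(effective_queue_depth(128, 32))  # Malloc2p* case in this log
print(effective_queue_depth(128, 78))  # AIO0 case in this log
```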
00:10:11.346 
00:10:11.346 Latency(us)
00:10:11.346 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:10:11.346 Job: Malloc0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:10:11.346 Verification LBA range: start 0x0 length 0x100
00:10:11.346 Malloc0 : 5.63 159.12 9.94 0.00 0.00 786985.94 865.50 1998677.04
00:10:11.346 Job: Malloc0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:10:11.346 Verification LBA range: start 0x100 length 0x100
00:10:11.346 Malloc0 : 6.76 132.57 8.29 0.00 0.00 815771.87 1104.14 1685016.04
00:10:11.346 Job: Malloc1p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:10:11.346 Verification LBA range: start 0x0 length 0x80
00:10:11.346 Malloc1p0 : 6.73 33.28 2.08 0.00 0.00 3448842.29 1495.93 5572953.49
00:10:11.346 Job: Malloc1p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:10:11.346 Verification LBA range: start 0x80 length 0x80
00:10:11.346 Malloc1p0 : 7.23 30.99 1.94 0.00 0.00 3310188.01 1795.12 5631309.02
00:10:11.346 Job: Malloc1p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:10:11.346 Verification LBA range: start 0x0 length 0x80
00:10:11.346 Malloc1p1 : 6.78 35.37 2.21 0.00 0.00 3185514.51 1538.67 5397886.89
00:10:11.346 Job: Malloc1p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:10:11.346 Verification LBA range: start 0x80 length 0x80
00:10:11.346 Malloc1p1 : 7.23 30.98 1.94 0.00 0.00 3167910.50 1816.49 5397886.89
00:10:11.346 Job: Malloc2p0 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536)
00:10:11.346 Verification LBA range: start 0x0 length 0x20
00:10:11.346 Malloc2p0 : 6.30 22.84 1.43 0.00 0.00 1219482.33 637.55 2246688.06
00:10:11.346 Job: Malloc2p0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536)
00:10:11.346 Verification LBA range: start 0x20 length 0x20
00:10:11.346 Malloc2p0 : 7.12 20.23 1.26 0.00 0.00 1194457.38 755.09 1969499.27
00:10:11.346 Job: Malloc2p1 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536)
00:10:11.346 Verification LBA range: start 0x0 length 0x20
00:10:11.346 Malloc2p1 : 6.42 24.91 1.56 0.00 0.00 1128196.20 648.24 2217510.29
00:10:11.346 Job: Malloc2p1 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536)
00:10:11.346 Verification LBA range: start 0x20 length 0x20
00:10:11.346 Malloc2p1 : 7.12 20.23 1.26 0.00 0.00 1181812.19 765.77 1940321.50
00:10:11.346 Job: Malloc2p2 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536)
00:10:11.346 Verification LBA range: start 0x0 length 0x20
00:10:11.346 Malloc2p2 : 6.42 24.91 1.56 0.00 0.00 1118445.62 626.87 2202921.41
00:10:11.346 Job: Malloc2p2 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536)
00:10:11.346 Verification LBA range: start 0x20 length 0x20
00:10:11.346 Malloc2p2 : 7.12 20.22 1.26 0.00 0.00 1168165.11 783.58 1911143.74
00:10:11.346 Job: Malloc2p3 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536)
00:10:11.346 Verification LBA range: start 0x0 length 0x20
00:10:11.346 Malloc2p3 : 6.42 24.90 1.56 0.00 0.00 1108740.64 633.99 2173743.64
00:10:11.346 Job: Malloc2p3 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536)
00:10:11.346 Verification LBA range: start 0x20 length 0x20
00:10:11.346 Malloc2p3 : 7.18 22.29 1.39 0.00 0.00 1052271.20 787.14 1867377.09
00:10:11.346 Job: Malloc2p4 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536)
00:10:11.346 Verification LBA range: start 0x0 length 0x20
00:10:11.346 Malloc2p4 : 6.43 24.90 1.56 0.00 0.00 1097267.50 637.55 2144565.87
00:10:11.346 Job: Malloc2p4 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536)
00:10:11.346 Verification LBA range: start 0x20 length 0x20
00:10:11.346 Malloc2p4 : 7.18 22.29 1.39 0.00 0.00 1040583.33 769.34 1845493.76
00:10:11.346 Job: Malloc2p5 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536)
00:10:11.346 Verification LBA range: start 0x0 length 0x20
00:10:11.346 Malloc2p5 : 6.43 24.89 1.56 0.00 0.00 1086764.80 633.99 2115388.10
00:10:11.346 Job: Malloc2p5 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536)
00:10:11.346 Verification LBA range: start 0x20 length 0x20
00:10:11.346 Malloc2p5 : 7.18 22.28 1.39 0.00 0.00 1029117.34 762.21 1809021.55
00:10:11.346 Job: Malloc2p6 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536)
00:10:11.346 Verification LBA range: start 0x0 length 0x20
00:10:11.346 Malloc2p6 : 6.43 24.89 1.56 0.00 0.00 1075909.98 623.30 2086210.34
00:10:11.346 Job: Malloc2p6 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536)
00:10:11.346 Verification LBA range: start 0x20 length 0x20
00:10:11.346 Malloc2p6 : 7.18 22.28 1.39 0.00 0.00 1017063.99 765.77 1787138.23
00:10:11.346 Job: Malloc2p7 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536)
00:10:11.346 Verification LBA range: start 0x0 length 0x20
00:10:11.346 Malloc2p7 : 6.43 24.88 1.56 0.00 0.00 1064603.17 641.11 2057032.57
00:10:11.346 Job: Malloc2p7 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536)
00:10:11.346 Verification LBA range: start 0x20 length 0x20
00:10:11.346 Malloc2p7 : 7.23 24.34 1.52 0.00 0.00 924438.64 776.46 1757960.46
00:10:11.346 Job: TestPT (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:10:11.346 Verification LBA range: start 0x0 length 0x100
00:10:11.346 TestPT : 6.84 35.36 2.21 0.00 0.00 2875367.39 94371.84 3763931.94
00:10:11.346 Job: TestPT (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:10:11.346 Verification LBA range: start 0x100 length 0x100
00:10:11.346 TestPT : 7.24 57.46 3.59 0.00 0.00 1527023.34 3276.80 3968176.31
00:10:11.346 Job: raid0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:10:11.346 Verification LBA range: start 0x0 length 0x200
00:10:11.347 raid0 : 6.74 40.38 2.52 0.00 0.00 2448128.72 1574.29 4755976.01
00:10:11.347 Job: raid0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:10:11.347 Verification LBA range: start 0x200 length 0x200
00:10:11.347 raid0 : 7.04 60.83 3.80 0.00 0.00 1915461.75 3120.08 3690987.52
00:10:11.347 Job: concat0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:10:11.347 Verification LBA range: start 0x0 length 0x200
00:10:11.347 concat0 : 6.79 47.14 2.95 0.00 0.00 2036431.48 1545.79 4551731.65
00:10:11.347 Job: concat0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:10:11.347 Verification LBA range: start 0x200 length 0x200
00:10:11.347 concat0 : 7.24 28.72 1.80 0.00 0.00 4072985.82 5043.42 6535819.80
00:10:11.347 Job: raid1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:10:11.347 Verification LBA range: start 0x0 length 0x100
00:10:11.347 raid1 : 6.85 53.75 3.36 0.00 0.00 1737961.58 2023.07 4347487.28
00:10:11.347 Job: raid1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:10:11.347 Verification LBA range: start 0x100 length 0x100
00:10:11.347 raid1 : 7.24 28.72 1.79 0.00 0.00 3918001.03 2535.96 6302397.66
00:10:11.347 Job: AIO0 (Core Mask 0x1, workload: verify, depth: 78, IO size: 65536)
00:10:11.347 Verification LBA range: start 0x0 length 0x4e
00:10:11.347 AIO0 : 7.04 65.04 4.07 0.00 0.00 856961.11 815.64 2961543.35
00:10:11.347 Job: AIO0 (Core Mask 0x2, workload: verify, depth: 78, IO size: 65536)
00:10:11.347 Verification LBA range: start 0x4e length 0x4e
00:10:11.347 AIO0 : 7.11 24.46 1.53 0.00 0.00 2764442.90 968.79 4785153.78
00:10:11.347 ===================================================================================================================
00:10:11.347 Total : 1235.48 77.22 0.00 0.00 1661565.87 623.30 6535819.80
00:10:11.347 
00:10:11.347 real 0m8.541s
00:10:11.347 user 0m16.038s
00:10:11.347 sys 0m0.447s
00:10:11.347 20:24:03 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # xtrace_disable
00:10:11.347 20:24:03 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x
00:10:11.347 ************************************
00:10:11.347 END TEST bdev_verify_big_io
00:10:11.347 ************************************
00:10:11.347 20:24:03 blockdev_general -- common/autotest_common.sh@1142 -- # return 0
00:10:11.347 20:24:03 blockdev_general -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:10:11.347 20:24:03 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:10:11.347 20:24:03 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable
00:10:11.347 20:24:03 blockdev_general -- common/autotest_common.sh@10 -- # set +x
00:10:11.347 ************************************
00:10:11.347 START TEST bdev_write_zeroes
00:10:11.347 ************************************
00:10:11.347 20:24:03 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:10:11.347 [2024-07-15 20:24:03.407751] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization...
00:10:11.347 [2024-07-15 20:24:03.407818] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1338940 ]
00:10:11.347 [2024-07-15 20:24:03.543073] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:10:11.347 [2024-07-15 20:24:03.647572] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:10:11.607 [2024-07-15 20:24:03.813484] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3
00:10:11.607 [2024-07-15 20:24:03.813554] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:10:11.607 [2024-07-15 20:24:03.813570] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:10:11.607 [2024-07-15 20:24:03.821487] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:10:11.607 [2024-07-15 20:24:03.821515] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:10:11.607 [2024-07-15 20:24:03.829495] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:10:11.607 [2024-07-15 20:24:03.829520] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:10:11.607 [2024-07-15 20:24:03.906920] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3
00:10:11.607 [2024-07-15 20:24:03.906988] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:10:11.607 [2024-07-15 20:24:03.907005] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x283ee50
00:10:11.607 [2024-07-15 20:24:03.907018] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:10:11.607 [2024-07-15 20:24:03.908539] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:10:11.607 [2024-07-15 20:24:03.908571] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT
00:10:11.865 Running I/O for 1 seconds...
00:10:13.243 
00:10:13.243 Latency(us)
00:10:13.243 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:10:13.243 Job: Malloc0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:10:13.243 Malloc0 : 1.03 4981.68 19.46 0.00 0.00 25692.27 658.92 43082.80
00:10:13.243 Job: Malloc1p0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:10:13.243 Malloc1p0 : 1.03 4974.46 19.43 0.00 0.00 25683.09 911.81 42170.99
00:10:13.243 Job: Malloc1p1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:10:13.243 Malloc1p1 : 1.05 4977.53 19.44 0.00 0.00 25611.64 911.81 41259.19
00:10:13.243 Job: Malloc2p0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:10:13.243 Malloc2p0 : 1.06 4970.45 19.42 0.00 0.00 25591.08 911.81 40347.38
00:10:13.243 Job: Malloc2p1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:10:13.243 Malloc2p1 : 1.06 4963.46 19.39 0.00 0.00 25568.94 908.24 39435.58
00:10:13.243 Job: Malloc2p2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:10:13.243 Malloc2p2 : 1.06 4956.53 19.36 0.00 0.00 25548.86 908.24 38523.77
00:10:13.243 Job: Malloc2p3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:10:13.243 Malloc2p3 : 1.06 4949.50 19.33 0.00 0.00 25526.97 911.81 37611.97
00:10:13.243 Job: Malloc2p4 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:10:13.243 Malloc2p4 : 1.06 4942.60 19.31 0.00 0.00 25503.19 908.24 36700.16
00:10:13.243 Job: Malloc2p5 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:10:13.243 Malloc2p5 : 1.06 4935.66 19.28 0.00 0.00 25480.59 911.81 35788.35
00:10:13.243 Job: Malloc2p6 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:10:13.243 Malloc2p6 : 1.06 4928.72 19.25 0.00 0.00 25457.23 911.81 34876.55
00:10:13.243 Job: Malloc2p7 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:10:13.243 Malloc2p7 : 1.07 4921.83 19.23 0.00 0.00 25436.89 908.24 33964.74
00:10:13.243 Job: TestPT (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:10:13.243 TestPT : 1.07 4915.02 19.20 0.00 0.00 25419.62 940.30 33052.94
00:10:13.243 Job: raid0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:10:13.243 raid0 : 1.07 4907.08 19.17 0.00 0.00 25394.16 1617.03 31457.28
00:10:13.243 Job: concat0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:10:13.243 concat0 : 1.07 4899.36 19.14 0.00 0.00 25343.47 1609.91 29861.62
00:10:13.243 Job: raid1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:10:13.243 raid1 : 1.07 4889.65 19.10 0.00 0.00 25280.68 2564.45 27240.18
00:10:13.243 Job: AIO0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:10:13.243 AIO0 : 1.07 4883.74 19.08 0.00 0.00 25194.39 1068.52 26442.35
00:10:13.243 ===================================================================================================================
00:10:13.243 Total : 78997.27 308.58 0.00 0.00 25482.69 658.92 43082.80
00:10:13.503 
00:10:13.503 real 0m2.298s
00:10:13.503 user 0m1.834s
00:10:13.503 sys 0m0.366s
00:10:13.503 20:24:05 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # xtrace_disable
00:10:13.503 20:24:05 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x
00:10:13.503 ************************************
00:10:13.503 END TEST bdev_write_zeroes
00:10:13.503 ************************************
00:10:13.503 20:24:05 blockdev_general -- common/autotest_common.sh@1142 -- # return 0
00:10:13.503 20:24:05 blockdev_general -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:10:13.503 20:24:05 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:10:13.503 20:24:05 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable
00:10:13.503 20:24:05 blockdev_general -- common/autotest_common.sh@10 -- # set +x
00:10:13.503 ************************************
00:10:13.503 START TEST bdev_json_nonenclosed
00:10:13.503 ************************************
00:10:13.503 20:24:05 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:10:13.503 [2024-07-15 20:24:05.837691] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization...
00:10:13.503 [2024-07-15 20:24:05.837819] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1339578 ]
00:10:13.762 [2024-07-15 20:24:06.033628] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:10:13.762 [2024-07-15 20:24:06.133935] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:10:13.762 [2024-07-15 20:24:06.134009] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}.
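Aside (not part of the test output): the `json_config.c` error above fires because the `--json` config's top level must be a JSON object enclosed in `{}`. A minimal sketch of the same failure mode using Python's standard `json` parser rather than SPDK's:

```python
# Sketch: a config whose top level is not enclosed in {} is not a valid
# JSON document, so any conforming parser rejects it. Python's json module
# stands in for SPDK's json_config parser here.
import json

bad_config = '"subsystems": []'    # top level not enclosed in {}
good_config = '{"subsystems": []}'

try:
    json.loads(bad_config)
    top_level_ok = True
except json.JSONDecodeError:
    top_level_ok = False

print(top_level_ok)  # parsing the bad config fails
print(isinstance(json.loads(good_config), dict))
```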
00:10:13.762 [2024-07-15 20:24:06.134030] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address:
00:10:13.762 [2024-07-15 20:24:06.134042] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero
00:10:14.020 
00:10:14.020 real 0m0.511s
00:10:14.020 user 0m0.279s
00:10:14.020 sys 0m0.227s
00:10:14.020 20:24:06 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # es=234
00:10:14.020 20:24:06 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # xtrace_disable
00:10:14.020 20:24:06 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x
00:10:14.020 ************************************
00:10:14.020 END TEST bdev_json_nonenclosed
00:10:14.020 ************************************
00:10:14.020 20:24:06 blockdev_general -- common/autotest_common.sh@1142 -- # return 234
00:10:14.020 20:24:06 blockdev_general -- bdev/blockdev.sh@782 -- # true
00:10:14.020 20:24:06 blockdev_general -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:10:14.020 20:24:06 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:10:14.020 20:24:06 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable
00:10:14.020 20:24:06 blockdev_general -- common/autotest_common.sh@10 -- # set +x
00:10:14.020 ************************************
00:10:14.020 START TEST bdev_json_nonarray
00:10:14.020 ************************************
00:10:14.020 20:24:06 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:10:14.020 [2024-07-15 20:24:06.392142] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization...
00:10:14.021 [2024-07-15 20:24:06.392206] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1339698 ]
00:10:14.279 [2024-07-15 20:24:06.519768] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:10:14.279 [2024-07-15 20:24:06.617675] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:10:14.279 [2024-07-15 20:24:06.617750] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array.
00:10:14.279 [2024-07-15 20:24:06.617770] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address:
00:10:14.279 [2024-07-15 20:24:06.617783] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero
00:10:14.538 
00:10:14.538 real 0m0.387s
00:10:14.538 user 0m0.231s
00:10:14.538 sys 0m0.153s
00:10:14.538 20:24:06 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # es=234
00:10:14.538 20:24:06 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # xtrace_disable
00:10:14.538 20:24:06 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x
00:10:14.538 ************************************
00:10:14.538 END TEST bdev_json_nonarray
00:10:14.538 ************************************
00:10:14.538 20:24:06 blockdev_general -- common/autotest_common.sh@1142 -- # return 234
00:10:14.538 20:24:06 blockdev_general -- bdev/blockdev.sh@785 -- # true
00:10:14.538 20:24:06 blockdev_general -- bdev/blockdev.sh@787 -- # [[ bdev == bdev ]]
00:10:14.538 20:24:06 blockdev_general -- bdev/blockdev.sh@788 -- # run_test bdev_qos qos_test_suite ''
00:10:14.538 20:24:06 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']'
00:10:14.538 20:24:06 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable
00:10:14.538 20:24:06 blockdev_general -- common/autotest_common.sh@10 -- # set +x
00:10:14.538 ************************************
00:10:14.538 START TEST bdev_qos
00:10:14.538 ************************************
00:10:14.538 20:24:06 blockdev_general.bdev_qos -- common/autotest_common.sh@1123 -- # qos_test_suite ''
00:10:14.538 20:24:06 blockdev_general.bdev_qos -- bdev/blockdev.sh@446 -- # QOS_PID=1339894
00:10:14.538 20:24:06 blockdev_general.bdev_qos -- bdev/blockdev.sh@447 -- # echo 'Process qos testing pid: 1339894'
00:10:14.538 Process qos testing pid: 1339894
00:10:14.538 20:24:06 blockdev_general.bdev_qos -- bdev/blockdev.sh@445 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 256 -o 4096 -w randread -t 60 ''
00:10:14.538 20:24:06 blockdev_general.bdev_qos -- bdev/blockdev.sh@448 -- # trap 'cleanup; killprocess $QOS_PID; exit 1' SIGINT SIGTERM EXIT
00:10:14.538 20:24:06 blockdev_general.bdev_qos -- bdev/blockdev.sh@449 -- # waitforlisten 1339894
00:10:14.538 20:24:06 blockdev_general.bdev_qos -- common/autotest_common.sh@829 -- # '[' -z 1339894 ']'
00:10:14.538 20:24:06 blockdev_general.bdev_qos -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock
00:10:14.538 20:24:06 blockdev_general.bdev_qos -- common/autotest_common.sh@834 -- # local max_retries=100
00:10:14.538 20:24:06 blockdev_general.bdev_qos -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...'
00:10:14.538 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...
00:10:14.538 20:24:06 blockdev_general.bdev_qos -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:14.538 20:24:06 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:14.538 [2024-07-15 20:24:06.868164] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 00:10:14.538 [2024-07-15 20:24:06.868233] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1339894 ] 00:10:14.797 [2024-07-15 20:24:07.003200] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:14.797 [2024-07-15 20:24:07.119828] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:10:15.732 20:24:08 blockdev_general.bdev_qos -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:15.732 20:24:08 blockdev_general.bdev_qos -- common/autotest_common.sh@862 -- # return 0 00:10:15.732 20:24:08 blockdev_general.bdev_qos -- bdev/blockdev.sh@451 -- # rpc_cmd bdev_malloc_create -b Malloc_0 128 512 00:10:15.732 20:24:08 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:15.732 20:24:08 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:15.990 Malloc_0 00:10:15.990 20:24:08 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:15.990 20:24:08 blockdev_general.bdev_qos -- bdev/blockdev.sh@452 -- # waitforbdev Malloc_0 00:10:15.990 20:24:08 blockdev_general.bdev_qos -- common/autotest_common.sh@897 -- # local bdev_name=Malloc_0 00:10:15.990 20:24:08 blockdev_general.bdev_qos -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:15.990 20:24:08 blockdev_general.bdev_qos -- common/autotest_common.sh@899 -- # local i 00:10:15.990 20:24:08 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:15.990 20:24:08 
blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:15.990 20:24:08 blockdev_general.bdev_qos -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:10:15.991 20:24:08 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:15.991 20:24:08 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:15.991 20:24:08 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:15.991 20:24:08 blockdev_general.bdev_qos -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Malloc_0 -t 2000 00:10:15.991 20:24:08 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:15.991 20:24:08 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:15.991 [ 00:10:15.991 { 00:10:15.991 "name": "Malloc_0", 00:10:15.991 "aliases": [ 00:10:15.991 "6d217842-2677-4b41-8ce6-45385cb1bfdd" 00:10:15.991 ], 00:10:15.991 "product_name": "Malloc disk", 00:10:15.991 "block_size": 512, 00:10:15.991 "num_blocks": 262144, 00:10:15.991 "uuid": "6d217842-2677-4b41-8ce6-45385cb1bfdd", 00:10:15.991 "assigned_rate_limits": { 00:10:15.991 "rw_ios_per_sec": 0, 00:10:15.991 "rw_mbytes_per_sec": 0, 00:10:15.991 "r_mbytes_per_sec": 0, 00:10:15.991 "w_mbytes_per_sec": 0 00:10:15.991 }, 00:10:15.991 "claimed": false, 00:10:15.991 "zoned": false, 00:10:15.991 "supported_io_types": { 00:10:15.991 "read": true, 00:10:15.991 "write": true, 00:10:15.991 "unmap": true, 00:10:15.991 "flush": true, 00:10:15.991 "reset": true, 00:10:15.991 "nvme_admin": false, 00:10:15.991 "nvme_io": false, 00:10:15.991 "nvme_io_md": false, 00:10:15.991 "write_zeroes": true, 00:10:15.991 "zcopy": true, 00:10:15.991 "get_zone_info": false, 00:10:15.991 "zone_management": false, 00:10:15.991 "zone_append": false, 00:10:15.991 "compare": false, 00:10:15.991 "compare_and_write": false, 00:10:15.991 "abort": true, 00:10:15.991 "seek_hole": false, 00:10:15.991 
"seek_data": false, 00:10:15.991 "copy": true, 00:10:15.991 "nvme_iov_md": false 00:10:15.991 }, 00:10:15.991 "memory_domains": [ 00:10:15.991 { 00:10:15.991 "dma_device_id": "system", 00:10:15.991 "dma_device_type": 1 00:10:15.991 }, 00:10:15.991 { 00:10:15.991 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:15.991 "dma_device_type": 2 00:10:15.991 } 00:10:15.991 ], 00:10:15.991 "driver_specific": {} 00:10:15.991 } 00:10:15.991 ] 00:10:15.991 20:24:08 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:15.991 20:24:08 blockdev_general.bdev_qos -- common/autotest_common.sh@905 -- # return 0 00:10:15.991 20:24:08 blockdev_general.bdev_qos -- bdev/blockdev.sh@453 -- # rpc_cmd bdev_null_create Null_1 128 512 00:10:15.991 20:24:08 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:15.991 20:24:08 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:15.991 Null_1 00:10:15.991 20:24:08 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:15.991 20:24:08 blockdev_general.bdev_qos -- bdev/blockdev.sh@454 -- # waitforbdev Null_1 00:10:15.991 20:24:08 blockdev_general.bdev_qos -- common/autotest_common.sh@897 -- # local bdev_name=Null_1 00:10:15.991 20:24:08 blockdev_general.bdev_qos -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:15.991 20:24:08 blockdev_general.bdev_qos -- common/autotest_common.sh@899 -- # local i 00:10:15.991 20:24:08 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:15.991 20:24:08 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:15.991 20:24:08 blockdev_general.bdev_qos -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:10:15.991 20:24:08 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:15.991 20:24:08 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:15.991 20:24:08 
blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:15.991 20:24:08 blockdev_general.bdev_qos -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Null_1 -t 2000 00:10:15.991 20:24:08 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:15.991 20:24:08 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:15.991 [ 00:10:15.991 { 00:10:15.991 "name": "Null_1", 00:10:15.991 "aliases": [ 00:10:15.991 "d1f9b6c6-53fd-451b-bf28-1c30e0ef8bd1" 00:10:15.991 ], 00:10:15.991 "product_name": "Null disk", 00:10:15.991 "block_size": 512, 00:10:15.991 "num_blocks": 262144, 00:10:15.991 "uuid": "d1f9b6c6-53fd-451b-bf28-1c30e0ef8bd1", 00:10:15.991 "assigned_rate_limits": { 00:10:15.991 "rw_ios_per_sec": 0, 00:10:15.991 "rw_mbytes_per_sec": 0, 00:10:15.991 "r_mbytes_per_sec": 0, 00:10:15.991 "w_mbytes_per_sec": 0 00:10:15.991 }, 00:10:15.991 "claimed": false, 00:10:15.991 "zoned": false, 00:10:15.991 "supported_io_types": { 00:10:15.991 "read": true, 00:10:15.991 "write": true, 00:10:15.991 "unmap": false, 00:10:15.991 "flush": false, 00:10:15.991 "reset": true, 00:10:15.991 "nvme_admin": false, 00:10:15.991 "nvme_io": false, 00:10:15.991 "nvme_io_md": false, 00:10:15.991 "write_zeroes": true, 00:10:15.991 "zcopy": false, 00:10:15.991 "get_zone_info": false, 00:10:15.991 "zone_management": false, 00:10:15.991 "zone_append": false, 00:10:15.991 "compare": false, 00:10:15.991 "compare_and_write": false, 00:10:15.991 "abort": true, 00:10:15.991 "seek_hole": false, 00:10:15.991 "seek_data": false, 00:10:15.991 "copy": false, 00:10:15.991 "nvme_iov_md": false 00:10:15.991 }, 00:10:15.991 "driver_specific": {} 00:10:15.991 } 00:10:15.991 ] 00:10:15.991 20:24:08 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:15.991 20:24:08 blockdev_general.bdev_qos -- common/autotest_common.sh@905 -- # return 0 00:10:15.991 20:24:08 blockdev_general.bdev_qos -- 
bdev/blockdev.sh@457 -- # qos_function_test 00:10:15.991 20:24:08 blockdev_general.bdev_qos -- bdev/blockdev.sh@410 -- # local qos_lower_iops_limit=1000 00:10:15.991 20:24:08 blockdev_general.bdev_qos -- bdev/blockdev.sh@456 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:10:15.991 20:24:08 blockdev_general.bdev_qos -- bdev/blockdev.sh@411 -- # local qos_lower_bw_limit=2 00:10:15.991 20:24:08 blockdev_general.bdev_qos -- bdev/blockdev.sh@412 -- # local io_result=0 00:10:15.991 20:24:08 blockdev_general.bdev_qos -- bdev/blockdev.sh@413 -- # local iops_limit=0 00:10:15.991 20:24:08 blockdev_general.bdev_qos -- bdev/blockdev.sh@414 -- # local bw_limit=0 00:10:15.991 20:24:08 blockdev_general.bdev_qos -- bdev/blockdev.sh@416 -- # get_io_result IOPS Malloc_0 00:10:15.991 20:24:08 blockdev_general.bdev_qos -- bdev/blockdev.sh@375 -- # local limit_type=IOPS 00:10:15.991 20:24:08 blockdev_general.bdev_qos -- bdev/blockdev.sh@376 -- # local qos_dev=Malloc_0 00:10:15.991 20:24:08 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # local iostat_result 00:10:15.991 20:24:08 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:10:15.991 20:24:08 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # grep Malloc_0 00:10:15.991 20:24:08 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # tail -1 00:10:15.991 Running I/O for 60 seconds... 
00:10:21.285 20:24:13 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # iostat_result='Malloc_0 48843.11 195372.44 0.00 0.00 196608.00 0.00 0.00 ' 00:10:21.285 20:24:13 blockdev_general.bdev_qos -- bdev/blockdev.sh@379 -- # '[' IOPS = IOPS ']' 00:10:21.285 20:24:13 blockdev_general.bdev_qos -- bdev/blockdev.sh@380 -- # awk '{print $2}' 00:10:21.285 20:24:13 blockdev_general.bdev_qos -- bdev/blockdev.sh@380 -- # iostat_result=48843.11 00:10:21.285 20:24:13 blockdev_general.bdev_qos -- bdev/blockdev.sh@385 -- # echo 48843 00:10:21.285 20:24:13 blockdev_general.bdev_qos -- bdev/blockdev.sh@416 -- # io_result=48843 00:10:21.285 20:24:13 blockdev_general.bdev_qos -- bdev/blockdev.sh@418 -- # iops_limit=12000 00:10:21.285 20:24:13 blockdev_general.bdev_qos -- bdev/blockdev.sh@419 -- # '[' 12000 -gt 1000 ']' 00:10:21.285 20:24:13 blockdev_general.bdev_qos -- bdev/blockdev.sh@422 -- # rpc_cmd bdev_set_qos_limit --rw_ios_per_sec 12000 Malloc_0 00:10:21.285 20:24:13 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:21.285 20:24:13 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:21.285 20:24:13 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:21.285 20:24:13 blockdev_general.bdev_qos -- bdev/blockdev.sh@423 -- # run_test bdev_qos_iops run_qos_test 12000 IOPS Malloc_0 00:10:21.285 20:24:13 blockdev_general.bdev_qos -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:10:21.285 20:24:13 blockdev_general.bdev_qos -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:21.285 20:24:13 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:21.285 ************************************ 00:10:21.285 START TEST bdev_qos_iops 00:10:21.285 ************************************ 00:10:21.285 20:24:13 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@1123 -- # run_qos_test 12000 IOPS Malloc_0 00:10:21.285 20:24:13 
blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@389 -- # local qos_limit=12000 00:10:21.285 20:24:13 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@390 -- # local qos_result=0 00:10:21.285 20:24:13 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@392 -- # get_io_result IOPS Malloc_0 00:10:21.285 20:24:13 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@375 -- # local limit_type=IOPS 00:10:21.285 20:24:13 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@376 -- # local qos_dev=Malloc_0 00:10:21.285 20:24:13 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@377 -- # local iostat_result 00:10:21.285 20:24:13 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:10:21.285 20:24:13 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # grep Malloc_0 00:10:21.285 20:24:13 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # tail -1 00:10:26.596 20:24:18 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # iostat_result='Malloc_0 11995.43 47981.71 0.00 0.00 49440.00 0.00 0.00 ' 00:10:26.596 20:24:18 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@379 -- # '[' IOPS = IOPS ']' 00:10:26.596 20:24:18 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@380 -- # awk '{print $2}' 00:10:26.596 20:24:18 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@380 -- # iostat_result=11995.43 00:10:26.596 20:24:18 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@385 -- # echo 11995 00:10:26.596 20:24:18 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@392 -- # qos_result=11995 00:10:26.596 20:24:18 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@393 -- # '[' IOPS = BANDWIDTH ']' 00:10:26.596 20:24:18 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@396 -- # lower_limit=10800 00:10:26.596 20:24:18 
blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@397 -- # upper_limit=13200 00:10:26.596 20:24:18 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@400 -- # '[' 11995 -lt 10800 ']' 00:10:26.596 20:24:18 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@400 -- # '[' 11995 -gt 13200 ']' 00:10:26.596 00:10:26.596 real 0m5.312s 00:10:26.596 user 0m0.111s 00:10:26.596 sys 0m0.055s 00:10:26.596 20:24:18 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:26.596 20:24:18 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@10 -- # set +x 00:10:26.596 ************************************ 00:10:26.596 END TEST bdev_qos_iops 00:10:26.596 ************************************ 00:10:26.596 20:24:18 blockdev_general.bdev_qos -- common/autotest_common.sh@1142 -- # return 0 00:10:26.596 20:24:18 blockdev_general.bdev_qos -- bdev/blockdev.sh@427 -- # get_io_result BANDWIDTH Null_1 00:10:26.596 20:24:18 blockdev_general.bdev_qos -- bdev/blockdev.sh@375 -- # local limit_type=BANDWIDTH 00:10:26.596 20:24:18 blockdev_general.bdev_qos -- bdev/blockdev.sh@376 -- # local qos_dev=Null_1 00:10:26.596 20:24:18 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # local iostat_result 00:10:26.596 20:24:18 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:10:26.596 20:24:18 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # grep Null_1 00:10:26.596 20:24:18 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # tail -1 00:10:31.868 20:24:24 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # iostat_result='Null_1 15396.96 61587.83 0.00 0.00 63488.00 0.00 0.00 ' 00:10:31.868 20:24:24 blockdev_general.bdev_qos -- bdev/blockdev.sh@379 -- # '[' BANDWIDTH = IOPS ']' 00:10:31.868 20:24:24 blockdev_general.bdev_qos -- bdev/blockdev.sh@381 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:10:31.868 20:24:24 
blockdev_general.bdev_qos -- bdev/blockdev.sh@382 -- # awk '{print $6}' 00:10:31.868 20:24:24 blockdev_general.bdev_qos -- bdev/blockdev.sh@382 -- # iostat_result=63488.00 00:10:31.868 20:24:24 blockdev_general.bdev_qos -- bdev/blockdev.sh@385 -- # echo 63488 00:10:31.868 20:24:24 blockdev_general.bdev_qos -- bdev/blockdev.sh@427 -- # bw_limit=63488 00:10:31.868 20:24:24 blockdev_general.bdev_qos -- bdev/blockdev.sh@428 -- # bw_limit=6 00:10:31.868 20:24:24 blockdev_general.bdev_qos -- bdev/blockdev.sh@429 -- # '[' 6 -lt 2 ']' 00:10:31.868 20:24:24 blockdev_general.bdev_qos -- bdev/blockdev.sh@432 -- # rpc_cmd bdev_set_qos_limit --rw_mbytes_per_sec 6 Null_1 00:10:31.868 20:24:24 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:31.868 20:24:24 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:31.868 20:24:24 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:31.868 20:24:24 blockdev_general.bdev_qos -- bdev/blockdev.sh@433 -- # run_test bdev_qos_bw run_qos_test 6 BANDWIDTH Null_1 00:10:31.868 20:24:24 blockdev_general.bdev_qos -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:10:31.868 20:24:24 blockdev_general.bdev_qos -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:31.868 20:24:24 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:31.868 ************************************ 00:10:31.868 START TEST bdev_qos_bw 00:10:31.868 ************************************ 00:10:31.868 20:24:24 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@1123 -- # run_qos_test 6 BANDWIDTH Null_1 00:10:31.868 20:24:24 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@389 -- # local qos_limit=6 00:10:31.868 20:24:24 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@390 -- # local qos_result=0 00:10:31.868 20:24:24 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@392 -- # get_io_result BANDWIDTH Null_1 00:10:31.868 
20:24:24 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@375 -- # local limit_type=BANDWIDTH 00:10:31.868 20:24:24 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@376 -- # local qos_dev=Null_1 00:10:31.868 20:24:24 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@377 -- # local iostat_result 00:10:31.868 20:24:24 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:10:31.868 20:24:24 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # grep Null_1 00:10:31.868 20:24:24 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # tail -1 00:10:38.434 20:24:29 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # iostat_result='Null_1 1534.94 6139.75 0.00 0.00 6336.00 0.00 0.00 ' 00:10:38.434 20:24:29 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@379 -- # '[' BANDWIDTH = IOPS ']' 00:10:38.434 20:24:29 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@381 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:10:38.434 20:24:29 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@382 -- # awk '{print $6}' 00:10:38.434 20:24:29 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@382 -- # iostat_result=6336.00 00:10:38.434 20:24:29 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@385 -- # echo 6336 00:10:38.434 20:24:29 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@392 -- # qos_result=6336 00:10:38.434 20:24:29 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@393 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:10:38.434 20:24:29 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@394 -- # qos_limit=6144 00:10:38.434 20:24:29 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@396 -- # lower_limit=5529 00:10:38.434 20:24:29 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@397 -- # upper_limit=6758 00:10:38.434 20:24:29 blockdev_general.bdev_qos.bdev_qos_bw -- 
bdev/blockdev.sh@400 -- # '[' 6336 -lt 5529 ']' 00:10:38.434 20:24:29 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@400 -- # '[' 6336 -gt 6758 ']' 00:10:38.434 00:10:38.434 real 0m5.324s 00:10:38.434 user 0m0.124s 00:10:38.434 sys 0m0.044s 00:10:38.434 20:24:29 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:38.434 20:24:29 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@10 -- # set +x 00:10:38.434 ************************************ 00:10:38.434 END TEST bdev_qos_bw 00:10:38.435 ************************************ 00:10:38.435 20:24:29 blockdev_general.bdev_qos -- common/autotest_common.sh@1142 -- # return 0 00:10:38.435 20:24:29 blockdev_general.bdev_qos -- bdev/blockdev.sh@436 -- # rpc_cmd bdev_set_qos_limit --r_mbytes_per_sec 2 Malloc_0 00:10:38.435 20:24:29 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:38.435 20:24:29 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:38.435 20:24:29 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:38.435 20:24:29 blockdev_general.bdev_qos -- bdev/blockdev.sh@437 -- # run_test bdev_qos_ro_bw run_qos_test 2 BANDWIDTH Malloc_0 00:10:38.435 20:24:29 blockdev_general.bdev_qos -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:10:38.435 20:24:29 blockdev_general.bdev_qos -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:38.435 20:24:29 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:38.435 ************************************ 00:10:38.435 START TEST bdev_qos_ro_bw 00:10:38.435 ************************************ 00:10:38.435 20:24:29 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@1123 -- # run_qos_test 2 BANDWIDTH Malloc_0 00:10:38.435 20:24:29 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@389 -- # local qos_limit=2 00:10:38.435 20:24:29 
blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@390 -- # local qos_result=0 00:10:38.435 20:24:29 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@392 -- # get_io_result BANDWIDTH Malloc_0 00:10:38.435 20:24:29 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@375 -- # local limit_type=BANDWIDTH 00:10:38.435 20:24:29 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@376 -- # local qos_dev=Malloc_0 00:10:38.435 20:24:29 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@377 -- # local iostat_result 00:10:38.435 20:24:29 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:10:38.435 20:24:29 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # grep Malloc_0 00:10:38.435 20:24:29 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # tail -1 00:10:42.623 20:24:34 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # iostat_result='Malloc_0 511.66 2046.64 0.00 0.00 2060.00 0.00 0.00 ' 00:10:42.623 20:24:34 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@379 -- # '[' BANDWIDTH = IOPS ']' 00:10:42.623 20:24:34 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@381 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:10:42.623 20:24:34 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@382 -- # awk '{print $6}' 00:10:42.623 20:24:34 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@382 -- # iostat_result=2060.00 00:10:42.623 20:24:34 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@385 -- # echo 2060 00:10:42.623 20:24:34 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@392 -- # qos_result=2060 00:10:42.623 20:24:34 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@393 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:10:42.623 20:24:34 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@394 -- # qos_limit=2048 
00:10:42.623 20:24:34 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@396 -- # lower_limit=1843 00:10:42.623 20:24:34 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@397 -- # upper_limit=2252 00:10:42.623 20:24:34 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@400 -- # '[' 2060 -lt 1843 ']' 00:10:42.623 20:24:34 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@400 -- # '[' 2060 -gt 2252 ']' 00:10:42.623 00:10:42.623 real 0m5.194s 00:10:42.623 user 0m0.124s 00:10:42.623 sys 0m0.042s 00:10:42.623 20:24:34 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:42.623 20:24:34 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@10 -- # set +x 00:10:42.623 ************************************ 00:10:42.623 END TEST bdev_qos_ro_bw 00:10:42.623 ************************************ 00:10:42.623 20:24:34 blockdev_general.bdev_qos -- common/autotest_common.sh@1142 -- # return 0 00:10:42.623 20:24:34 blockdev_general.bdev_qos -- bdev/blockdev.sh@459 -- # rpc_cmd bdev_malloc_delete Malloc_0 00:10:42.623 20:24:34 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:42.623 20:24:34 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:43.191 20:24:35 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:43.191 20:24:35 blockdev_general.bdev_qos -- bdev/blockdev.sh@460 -- # rpc_cmd bdev_null_delete Null_1 00:10:43.191 20:24:35 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:43.191 20:24:35 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:43.450 00:10:43.450 Latency(us) 00:10:43.450 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:43.450 Job: Malloc_0 (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:10:43.450 Malloc_0 : 27.03 16459.27 64.29 0.00 0.00 15410.28 2493.22 503316.48 
00:10:43.450 Job: Null_1 (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:10:43.450 Null_1 : 27.23 15818.14 61.79 0.00 0.00 16124.61 1004.41 198773.54 00:10:43.450 =================================================================================================================== 00:10:43.450 Total : 32277.41 126.08 0.00 0.00 15761.66 1004.41 503316.48 00:10:43.450 0 00:10:43.450 20:24:35 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:43.450 20:24:35 blockdev_general.bdev_qos -- bdev/blockdev.sh@461 -- # killprocess 1339894 00:10:43.450 20:24:35 blockdev_general.bdev_qos -- common/autotest_common.sh@948 -- # '[' -z 1339894 ']' 00:10:43.450 20:24:35 blockdev_general.bdev_qos -- common/autotest_common.sh@952 -- # kill -0 1339894 00:10:43.450 20:24:35 blockdev_general.bdev_qos -- common/autotest_common.sh@953 -- # uname 00:10:43.450 20:24:35 blockdev_general.bdev_qos -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:43.450 20:24:35 blockdev_general.bdev_qos -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1339894 00:10:43.450 20:24:35 blockdev_general.bdev_qos -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:10:43.450 20:24:35 blockdev_general.bdev_qos -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:10:43.450 20:24:35 blockdev_general.bdev_qos -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1339894' 00:10:43.450 killing process with pid 1339894 00:10:43.450 20:24:35 blockdev_general.bdev_qos -- common/autotest_common.sh@967 -- # kill 1339894 00:10:43.450 Received shutdown signal, test time was about 27.297352 seconds 00:10:43.450 00:10:43.450 Latency(us) 00:10:43.450 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:43.450 =================================================================================================================== 00:10:43.450 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:10:43.450 
20:24:35 blockdev_general.bdev_qos -- common/autotest_common.sh@972 -- # wait 1339894 00:10:43.710 20:24:35 blockdev_general.bdev_qos -- bdev/blockdev.sh@462 -- # trap - SIGINT SIGTERM EXIT 00:10:43.710 00:10:43.710 real 0m29.174s 00:10:43.710 user 0m30.134s 00:10:43.710 sys 0m0.958s 00:10:43.710 20:24:35 blockdev_general.bdev_qos -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:43.710 20:24:35 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:43.710 ************************************ 00:10:43.710 END TEST bdev_qos 00:10:43.710 ************************************ 00:10:43.710 20:24:36 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:10:43.710 20:24:36 blockdev_general -- bdev/blockdev.sh@789 -- # run_test bdev_qd_sampling qd_sampling_test_suite '' 00:10:43.710 20:24:36 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:10:43.710 20:24:36 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:43.710 20:24:36 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:43.710 ************************************ 00:10:43.710 START TEST bdev_qd_sampling 00:10:43.710 ************************************ 00:10:43.710 20:24:36 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@1123 -- # qd_sampling_test_suite '' 00:10:43.710 20:24:36 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@538 -- # QD_DEV=Malloc_QD 00:10:43.710 20:24:36 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@541 -- # QD_PID=1343692 00:10:43.710 20:24:36 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@542 -- # echo 'Process bdev QD sampling period testing pid: 1343692' 00:10:43.710 Process bdev QD sampling period testing pid: 1343692 00:10:43.710 20:24:36 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@540 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x3 -q 256 -o 4096 -w randread -t 5 -C '' 00:10:43.710 20:24:36 
blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@543 -- # trap 'cleanup; killprocess $QD_PID; exit 1' SIGINT SIGTERM EXIT 00:10:43.710 20:24:36 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@544 -- # waitforlisten 1343692 00:10:43.710 20:24:36 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@829 -- # '[' -z 1343692 ']' 00:10:43.710 20:24:36 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:10:43.710 20:24:36 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:43.710 20:24:36 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:10:43.710 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:10:43.710 20:24:36 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:43.710 20:24:36 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:10:43.968 [2024-07-15 20:24:36.131303] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
00:10:43.968 [2024-07-15 20:24:36.131370] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1343692 ] 00:10:43.968 [2024-07-15 20:24:36.260397] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:10:44.226 [2024-07-15 20:24:36.364322] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:10:44.226 [2024-07-15 20:24:36.364326] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:44.793 20:24:37 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:44.793 20:24:37 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@862 -- # return 0 00:10:44.793 20:24:37 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@546 -- # rpc_cmd bdev_malloc_create -b Malloc_QD 128 512 00:10:44.793 20:24:37 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:44.793 20:24:37 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:10:44.793 Malloc_QD 00:10:44.793 20:24:37 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:44.793 20:24:37 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@547 -- # waitforbdev Malloc_QD 00:10:44.793 20:24:37 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@897 -- # local bdev_name=Malloc_QD 00:10:44.793 20:24:37 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:44.793 20:24:37 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@899 -- # local i 00:10:44.793 20:24:37 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:44.793 20:24:37 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:44.793 20:24:37 
blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:10:44.793 20:24:37 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:44.793 20:24:37 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:10:44.793 20:24:37 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:44.793 20:24:37 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Malloc_QD -t 2000 00:10:44.793 20:24:37 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:44.793 20:24:37 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:10:44.793 [ 00:10:44.793 { 00:10:44.793 "name": "Malloc_QD", 00:10:44.793 "aliases": [ 00:10:44.793 "4a1919d0-8a4a-410b-b242-1b0a054394d9" 00:10:44.793 ], 00:10:44.793 "product_name": "Malloc disk", 00:10:44.793 "block_size": 512, 00:10:44.793 "num_blocks": 262144, 00:10:44.793 "uuid": "4a1919d0-8a4a-410b-b242-1b0a054394d9", 00:10:44.793 "assigned_rate_limits": { 00:10:44.793 "rw_ios_per_sec": 0, 00:10:44.793 "rw_mbytes_per_sec": 0, 00:10:44.793 "r_mbytes_per_sec": 0, 00:10:44.793 "w_mbytes_per_sec": 0 00:10:44.793 }, 00:10:44.793 "claimed": false, 00:10:44.793 "zoned": false, 00:10:44.793 "supported_io_types": { 00:10:44.793 "read": true, 00:10:44.793 "write": true, 00:10:44.793 "unmap": true, 00:10:44.793 "flush": true, 00:10:44.793 "reset": true, 00:10:44.793 "nvme_admin": false, 00:10:44.793 "nvme_io": false, 00:10:44.793 "nvme_io_md": false, 00:10:44.793 "write_zeroes": true, 00:10:44.793 "zcopy": true, 00:10:44.793 "get_zone_info": false, 00:10:44.793 "zone_management": false, 00:10:44.793 "zone_append": false, 00:10:44.793 "compare": false, 00:10:44.793 "compare_and_write": false, 00:10:44.793 "abort": true, 00:10:44.793 "seek_hole": false, 00:10:44.793 "seek_data": false, 00:10:44.793 "copy": true, 
00:10:44.793 "nvme_iov_md": false 00:10:44.793 }, 00:10:44.793 "memory_domains": [ 00:10:44.793 { 00:10:44.793 "dma_device_id": "system", 00:10:44.793 "dma_device_type": 1 00:10:44.793 }, 00:10:44.793 { 00:10:44.793 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:44.793 "dma_device_type": 2 00:10:44.793 } 00:10:44.793 ], 00:10:44.793 "driver_specific": {} 00:10:44.793 } 00:10:44.793 ] 00:10:44.793 20:24:37 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:44.793 20:24:37 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@905 -- # return 0 00:10:44.793 20:24:37 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@550 -- # sleep 2 00:10:44.793 20:24:37 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@549 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:10:45.052 Running I/O for 5 seconds... 00:10:46.956 20:24:39 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@551 -- # qd_sampling_function_test Malloc_QD 00:10:46.956 20:24:39 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@519 -- # local bdev_name=Malloc_QD 00:10:46.956 20:24:39 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@520 -- # local sampling_period=10 00:10:46.956 20:24:39 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@521 -- # local iostats 00:10:46.956 20:24:39 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@523 -- # rpc_cmd bdev_set_qd_sampling_period Malloc_QD 10 00:10:46.956 20:24:39 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:46.956 20:24:39 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:10:46.956 20:24:39 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:46.956 20:24:39 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@525 -- # rpc_cmd bdev_get_iostat -b Malloc_QD 00:10:46.956 20:24:39 blockdev_general.bdev_qd_sampling -- 
common/autotest_common.sh@559 -- # xtrace_disable 00:10:46.956 20:24:39 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:10:46.956 20:24:39 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:46.956 20:24:39 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@525 -- # iostats='{ 00:10:46.956 "tick_rate": 2300000000, 00:10:46.956 "ticks": 5416869228942242, 00:10:46.956 "bdevs": [ 00:10:46.956 { 00:10:46.956 "name": "Malloc_QD", 00:10:46.956 "bytes_read": 708882944, 00:10:46.957 "num_read_ops": 173060, 00:10:46.957 "bytes_written": 0, 00:10:46.957 "num_write_ops": 0, 00:10:46.957 "bytes_unmapped": 0, 00:10:46.957 "num_unmap_ops": 0, 00:10:46.957 "bytes_copied": 0, 00:10:46.957 "num_copy_ops": 0, 00:10:46.957 "read_latency_ticks": 2237284984492, 00:10:46.957 "max_read_latency_ticks": 17298502, 00:10:46.957 "min_read_latency_ticks": 244990, 00:10:46.957 "write_latency_ticks": 0, 00:10:46.957 "max_write_latency_ticks": 0, 00:10:46.957 "min_write_latency_ticks": 0, 00:10:46.957 "unmap_latency_ticks": 0, 00:10:46.957 "max_unmap_latency_ticks": 0, 00:10:46.957 "min_unmap_latency_ticks": 0, 00:10:46.957 "copy_latency_ticks": 0, 00:10:46.957 "max_copy_latency_ticks": 0, 00:10:46.957 "min_copy_latency_ticks": 0, 00:10:46.957 "io_error": {}, 00:10:46.957 "queue_depth_polling_period": 10, 00:10:46.957 "queue_depth": 768, 00:10:46.957 "io_time": 30, 00:10:46.957 "weighted_io_time": 20480 00:10:46.957 } 00:10:46.957 ] 00:10:46.957 }' 00:10:46.957 20:24:39 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@527 -- # jq -r '.bdevs[0].queue_depth_polling_period' 00:10:46.957 20:24:39 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@527 -- # qd_sampling_period=10 00:10:46.957 20:24:39 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@529 -- # '[' 10 == null ']' 00:10:46.957 20:24:39 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@529 -- # '[' 10 -ne 10 ']' 00:10:46.957 20:24:39 
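As an aside, the headline numbers in the latency summary below can be reproduced from the `bdev_get_iostat` snapshot above. The sketch below is illustrative only (it is not part of the test suite); all constants are copied verbatim from the JSON the test printed, and the result lands close to the 5614.63 us average reported in the table (the table weights the two per-core jobs by IOPS, so the figures differ slightly):

```python
# Reproduce the aggregate read stats from the bdev_get_iostat JSON above.
# All constants are copied verbatim from the log; this helper is illustrative.
tick_rate = 2_300_000_000            # "tick_rate": ticks per second
bytes_read = 708_882_944             # "bytes_read"
num_read_ops = 173_060               # "num_read_ops"
read_latency_ticks = 2_237_284_984_492  # cumulative "read_latency_ticks"

# Total data read, in MiB.
mib_read = bytes_read / (1024 * 1024)

# Mean per-op read latency: cumulative ticks per op, converted to microseconds.
avg_latency_us = read_latency_ticks / num_read_ops / tick_rate * 1e6

print(f"read: {mib_read:.1f} MiB in {num_read_ops} ops")
print(f"average read latency: {avg_latency_us:.1f} us")
```

At queue depth 256 a mean latency in the milliseconds is expected: each op's latency includes its time queued behind the other in-flight requests.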
blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@553 -- # rpc_cmd bdev_malloc_delete Malloc_QD 00:10:46.957 20:24:39 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:46.957 20:24:39 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:10:46.957 00:10:46.957 Latency(us) 00:10:46.957 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:46.957 Job: Malloc_QD (Core Mask 0x1, workload: randread, depth: 256, IO size: 4096) 00:10:46.957 Malloc_QD : 1.99 50403.45 196.89 0.00 0.00 5066.51 1396.20 5556.31 00:10:46.957 Job: Malloc_QD (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:10:46.957 Malloc_QD : 1.99 40561.10 158.44 0.00 0.00 6295.00 1232.36 7522.39 00:10:46.957 =================================================================================================================== 00:10:46.957 Total : 90964.54 355.33 0.00 0.00 5614.63 1232.36 7522.39 00:10:46.957 0 00:10:46.957 20:24:39 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:46.957 20:24:39 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@554 -- # killprocess 1343692 00:10:46.957 20:24:39 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@948 -- # '[' -z 1343692 ']' 00:10:46.957 20:24:39 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@952 -- # kill -0 1343692 00:10:46.957 20:24:39 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@953 -- # uname 00:10:46.957 20:24:39 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:46.957 20:24:39 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1343692 00:10:47.216 20:24:39 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:47.216 20:24:39 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo 
']' 00:10:47.216 20:24:39 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1343692' 00:10:47.216 killing process with pid 1343692 00:10:47.216 20:24:39 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@967 -- # kill 1343692 00:10:47.216 Received shutdown signal, test time was about 2.069328 seconds 00:10:47.216 00:10:47.216 Latency(us) 00:10:47.216 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:47.216 =================================================================================================================== 00:10:47.216 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:10:47.216 20:24:39 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@972 -- # wait 1343692 00:10:47.216 20:24:39 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@555 -- # trap - SIGINT SIGTERM EXIT 00:10:47.216 00:10:47.216 real 0m3.498s 00:10:47.216 user 0m6.873s 00:10:47.216 sys 0m0.450s 00:10:47.216 20:24:39 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:47.216 20:24:39 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:10:47.216 ************************************ 00:10:47.216 END TEST bdev_qd_sampling 00:10:47.216 ************************************ 00:10:47.476 20:24:39 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:10:47.476 20:24:39 blockdev_general -- bdev/blockdev.sh@790 -- # run_test bdev_error error_test_suite '' 00:10:47.476 20:24:39 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:10:47.476 20:24:39 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:47.476 20:24:39 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:47.476 ************************************ 00:10:47.476 START TEST bdev_error 00:10:47.476 ************************************ 00:10:47.476 20:24:39 blockdev_general.bdev_error -- 
common/autotest_common.sh@1123 -- # error_test_suite '' 00:10:47.476 20:24:39 blockdev_general.bdev_error -- bdev/blockdev.sh@466 -- # DEV_1=Dev_1 00:10:47.476 20:24:39 blockdev_general.bdev_error -- bdev/blockdev.sh@467 -- # DEV_2=Dev_2 00:10:47.476 20:24:39 blockdev_general.bdev_error -- bdev/blockdev.sh@468 -- # ERR_DEV=EE_Dev_1 00:10:47.476 20:24:39 blockdev_general.bdev_error -- bdev/blockdev.sh@472 -- # ERR_PID=1344237 00:10:47.476 20:24:39 blockdev_general.bdev_error -- bdev/blockdev.sh@473 -- # echo 'Process error testing pid: 1344237' 00:10:47.476 Process error testing pid: 1344237 00:10:47.476 20:24:39 blockdev_general.bdev_error -- bdev/blockdev.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 16 -o 4096 -w randread -t 5 -f '' 00:10:47.476 20:24:39 blockdev_general.bdev_error -- bdev/blockdev.sh@474 -- # waitforlisten 1344237 00:10:47.476 20:24:39 blockdev_general.bdev_error -- common/autotest_common.sh@829 -- # '[' -z 1344237 ']' 00:10:47.476 20:24:39 blockdev_general.bdev_error -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:10:47.476 20:24:39 blockdev_general.bdev_error -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:47.476 20:24:39 blockdev_general.bdev_error -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:10:47.476 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:10:47.476 20:24:39 blockdev_general.bdev_error -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:47.476 20:24:39 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:47.476 [2024-07-15 20:24:39.717174] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
00:10:47.476 [2024-07-15 20:24:39.717245] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1344237 ] 00:10:47.476 [2024-07-15 20:24:39.854858] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:47.735 [2024-07-15 20:24:39.988241] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:10:48.330 20:24:40 blockdev_general.bdev_error -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:48.330 20:24:40 blockdev_general.bdev_error -- common/autotest_common.sh@862 -- # return 0 00:10:48.330 20:24:40 blockdev_general.bdev_error -- bdev/blockdev.sh@476 -- # rpc_cmd bdev_malloc_create -b Dev_1 128 512 00:10:48.330 20:24:40 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:48.330 20:24:40 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:48.330 Dev_1 00:10:48.330 20:24:40 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:48.330 20:24:40 blockdev_general.bdev_error -- bdev/blockdev.sh@477 -- # waitforbdev Dev_1 00:10:48.330 20:24:40 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local bdev_name=Dev_1 00:10:48.330 20:24:40 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:48.330 20:24:40 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local i 00:10:48.330 20:24:40 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:48.330 20:24:40 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:48.330 20:24:40 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:10:48.330 20:24:40 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:48.330 
20:24:40 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:48.589 20:24:40 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:48.589 20:24:40 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Dev_1 -t 2000 00:10:48.589 20:24:40 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:48.589 20:24:40 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:48.589 [ 00:10:48.589 { 00:10:48.589 "name": "Dev_1", 00:10:48.589 "aliases": [ 00:10:48.589 "c1ef3288-6415-474e-8777-be74e8162545" 00:10:48.589 ], 00:10:48.589 "product_name": "Malloc disk", 00:10:48.589 "block_size": 512, 00:10:48.589 "num_blocks": 262144, 00:10:48.589 "uuid": "c1ef3288-6415-474e-8777-be74e8162545", 00:10:48.589 "assigned_rate_limits": { 00:10:48.589 "rw_ios_per_sec": 0, 00:10:48.589 "rw_mbytes_per_sec": 0, 00:10:48.589 "r_mbytes_per_sec": 0, 00:10:48.589 "w_mbytes_per_sec": 0 00:10:48.589 }, 00:10:48.589 "claimed": false, 00:10:48.589 "zoned": false, 00:10:48.589 "supported_io_types": { 00:10:48.589 "read": true, 00:10:48.589 "write": true, 00:10:48.589 "unmap": true, 00:10:48.589 "flush": true, 00:10:48.589 "reset": true, 00:10:48.589 "nvme_admin": false, 00:10:48.589 "nvme_io": false, 00:10:48.589 "nvme_io_md": false, 00:10:48.589 "write_zeroes": true, 00:10:48.589 "zcopy": true, 00:10:48.589 "get_zone_info": false, 00:10:48.589 "zone_management": false, 00:10:48.589 "zone_append": false, 00:10:48.589 "compare": false, 00:10:48.589 "compare_and_write": false, 00:10:48.589 "abort": true, 00:10:48.589 "seek_hole": false, 00:10:48.589 "seek_data": false, 00:10:48.589 "copy": true, 00:10:48.589 "nvme_iov_md": false 00:10:48.589 }, 00:10:48.589 "memory_domains": [ 00:10:48.589 { 00:10:48.589 "dma_device_id": "system", 00:10:48.589 "dma_device_type": 1 00:10:48.589 }, 00:10:48.589 { 00:10:48.589 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 
00:10:48.589 "dma_device_type": 2 00:10:48.589 } 00:10:48.589 ], 00:10:48.589 "driver_specific": {} 00:10:48.589 } 00:10:48.589 ] 00:10:48.589 20:24:40 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:48.589 20:24:40 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # return 0 00:10:48.589 20:24:40 blockdev_general.bdev_error -- bdev/blockdev.sh@478 -- # rpc_cmd bdev_error_create Dev_1 00:10:48.589 20:24:40 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:48.589 20:24:40 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:48.589 true 00:10:48.589 20:24:40 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:48.589 20:24:40 blockdev_general.bdev_error -- bdev/blockdev.sh@479 -- # rpc_cmd bdev_malloc_create -b Dev_2 128 512 00:10:48.589 20:24:40 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:48.589 20:24:40 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:48.589 Dev_2 00:10:48.589 20:24:40 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:48.589 20:24:40 blockdev_general.bdev_error -- bdev/blockdev.sh@480 -- # waitforbdev Dev_2 00:10:48.589 20:24:40 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local bdev_name=Dev_2 00:10:48.589 20:24:40 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:48.589 20:24:40 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local i 00:10:48.589 20:24:40 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:48.589 20:24:40 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:48.589 20:24:40 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:10:48.589 20:24:40 blockdev_general.bdev_error -- 
common/autotest_common.sh@559 -- # xtrace_disable 00:10:48.589 20:24:40 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:48.589 20:24:40 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:48.589 20:24:40 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Dev_2 -t 2000 00:10:48.589 20:24:40 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:48.589 20:24:40 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:48.589 [ 00:10:48.589 { 00:10:48.589 "name": "Dev_2", 00:10:48.589 "aliases": [ 00:10:48.589 "de206459-65a8-41e2-b059-151d0a9b0774" 00:10:48.589 ], 00:10:48.589 "product_name": "Malloc disk", 00:10:48.589 "block_size": 512, 00:10:48.589 "num_blocks": 262144, 00:10:48.589 "uuid": "de206459-65a8-41e2-b059-151d0a9b0774", 00:10:48.589 "assigned_rate_limits": { 00:10:48.589 "rw_ios_per_sec": 0, 00:10:48.589 "rw_mbytes_per_sec": 0, 00:10:48.589 "r_mbytes_per_sec": 0, 00:10:48.589 "w_mbytes_per_sec": 0 00:10:48.589 }, 00:10:48.589 "claimed": false, 00:10:48.589 "zoned": false, 00:10:48.589 "supported_io_types": { 00:10:48.589 "read": true, 00:10:48.589 "write": true, 00:10:48.589 "unmap": true, 00:10:48.589 "flush": true, 00:10:48.589 "reset": true, 00:10:48.589 "nvme_admin": false, 00:10:48.589 "nvme_io": false, 00:10:48.589 "nvme_io_md": false, 00:10:48.589 "write_zeroes": true, 00:10:48.589 "zcopy": true, 00:10:48.589 "get_zone_info": false, 00:10:48.589 "zone_management": false, 00:10:48.589 "zone_append": false, 00:10:48.589 "compare": false, 00:10:48.589 "compare_and_write": false, 00:10:48.589 "abort": true, 00:10:48.589 "seek_hole": false, 00:10:48.589 "seek_data": false, 00:10:48.589 "copy": true, 00:10:48.589 "nvme_iov_md": false 00:10:48.589 }, 00:10:48.589 "memory_domains": [ 00:10:48.589 { 00:10:48.589 "dma_device_id": "system", 00:10:48.589 "dma_device_type": 1 00:10:48.589 }, 00:10:48.589 { 
00:10:48.589 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:48.589 "dma_device_type": 2 00:10:48.590 } 00:10:48.590 ], 00:10:48.590 "driver_specific": {} 00:10:48.590 } 00:10:48.590 ] 00:10:48.590 20:24:40 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:48.590 20:24:40 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # return 0 00:10:48.590 20:24:40 blockdev_general.bdev_error -- bdev/blockdev.sh@481 -- # rpc_cmd bdev_error_inject_error EE_Dev_1 all failure -n 5 00:10:48.590 20:24:40 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:48.590 20:24:40 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:48.590 20:24:40 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:48.590 20:24:40 blockdev_general.bdev_error -- bdev/blockdev.sh@484 -- # sleep 1 00:10:48.590 20:24:40 blockdev_general.bdev_error -- bdev/blockdev.sh@483 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -t 1 perform_tests 00:10:48.590 Running I/O for 5 seconds... 00:10:49.527 20:24:41 blockdev_general.bdev_error -- bdev/blockdev.sh@487 -- # kill -0 1344237 00:10:49.527 20:24:41 blockdev_general.bdev_error -- bdev/blockdev.sh@488 -- # echo 'Process is existed as continue on error is set. Pid: 1344237' 00:10:49.527 Process is existed as continue on error is set. 
Pid: 1344237 00:10:49.527 20:24:41 blockdev_general.bdev_error -- bdev/blockdev.sh@495 -- # rpc_cmd bdev_error_delete EE_Dev_1 00:10:49.527 20:24:41 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:49.527 20:24:41 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:49.527 20:24:41 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:49.527 20:24:41 blockdev_general.bdev_error -- bdev/blockdev.sh@496 -- # rpc_cmd bdev_malloc_delete Dev_1 00:10:49.528 20:24:41 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:49.528 20:24:41 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:49.528 20:24:41 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:49.528 20:24:41 blockdev_general.bdev_error -- bdev/blockdev.sh@497 -- # sleep 5 00:10:49.787 Timeout while waiting for response: 00:10:49.787 00:10:49.787 00:10:53.980 00:10:53.980 Latency(us) 00:10:53.980 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:53.980 Job: EE_Dev_1 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:10:53.980 EE_Dev_1 : 0.92 29337.98 114.60 5.43 0.00 540.97 170.96 865.50 00:10:53.980 Job: Dev_2 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:10:53.980 Dev_2 : 5.00 63231.65 247.00 0.00 0.00 248.58 98.84 31229.33 00:10:53.980 =================================================================================================================== 00:10:53.980 Total : 92569.63 361.60 5.43 0.00 271.59 98.84 31229.33 00:10:54.598 20:24:46 blockdev_general.bdev_error -- bdev/blockdev.sh@499 -- # killprocess 1344237 00:10:54.598 20:24:46 blockdev_general.bdev_error -- common/autotest_common.sh@948 -- # '[' -z 1344237 ']' 00:10:54.598 20:24:46 blockdev_general.bdev_error -- common/autotest_common.sh@952 -- # kill -0 1344237 00:10:54.598 20:24:46 
blockdev_general.bdev_error -- common/autotest_common.sh@953 -- # uname 00:10:54.598 20:24:46 blockdev_general.bdev_error -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:54.598 20:24:46 blockdev_general.bdev_error -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1344237 00:10:54.598 20:24:46 blockdev_general.bdev_error -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:10:54.598 20:24:46 blockdev_general.bdev_error -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:10:54.598 20:24:46 blockdev_general.bdev_error -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1344237' 00:10:54.598 killing process with pid 1344237 00:10:54.598 20:24:46 blockdev_general.bdev_error -- common/autotest_common.sh@967 -- # kill 1344237 00:10:54.598 Received shutdown signal, test time was about 5.000000 seconds 00:10:54.598 00:10:54.598 Latency(us) 00:10:54.598 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:54.598 =================================================================================================================== 00:10:54.598 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:10:54.598 20:24:46 blockdev_general.bdev_error -- common/autotest_common.sh@972 -- # wait 1344237 00:10:55.164 20:24:47 blockdev_general.bdev_error -- bdev/blockdev.sh@503 -- # ERR_PID=1345136 00:10:55.164 20:24:47 blockdev_general.bdev_error -- bdev/blockdev.sh@502 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 16 -o 4096 -w randread -t 5 '' 00:10:55.164 20:24:47 blockdev_general.bdev_error -- bdev/blockdev.sh@504 -- # echo 'Process error testing pid: 1345136' 00:10:55.164 Process error testing pid: 1345136 00:10:55.164 20:24:47 blockdev_general.bdev_error -- bdev/blockdev.sh@505 -- # waitforlisten 1345136 00:10:55.164 20:24:47 blockdev_general.bdev_error -- common/autotest_common.sh@829 -- # '[' -z 1345136 ']' 00:10:55.164 20:24:47 
blockdev_general.bdev_error -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:10:55.164 20:24:47 blockdev_general.bdev_error -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:55.164 20:24:47 blockdev_general.bdev_error -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:10:55.164 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:10:55.164 20:24:47 blockdev_general.bdev_error -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:55.164 20:24:47 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:55.164 [2024-07-15 20:24:47.349439] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 00:10:55.164 [2024-07-15 20:24:47.349516] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1345136 ] 00:10:55.164 [2024-07-15 20:24:47.485603] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:55.421 [2024-07-15 20:24:47.605788] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:10:55.987 20:24:48 blockdev_general.bdev_error -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:55.987 20:24:48 blockdev_general.bdev_error -- common/autotest_common.sh@862 -- # return 0 00:10:55.987 20:24:48 blockdev_general.bdev_error -- bdev/blockdev.sh@507 -- # rpc_cmd bdev_malloc_create -b Dev_1 128 512 00:10:55.987 20:24:48 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:55.987 20:24:48 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:55.987 Dev_1 00:10:55.987 20:24:48 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:55.987 20:24:48 
blockdev_general.bdev_error -- bdev/blockdev.sh@508 -- # waitforbdev Dev_1 00:10:55.987 20:24:48 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local bdev_name=Dev_1 00:10:55.987 20:24:48 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:55.987 20:24:48 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local i 00:10:55.987 20:24:48 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:55.987 20:24:48 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:55.987 20:24:48 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:10:55.987 20:24:48 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:55.987 20:24:48 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:55.987 20:24:48 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:55.987 20:24:48 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Dev_1 -t 2000 00:10:55.987 20:24:48 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:55.987 20:24:48 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:55.987 [ 00:10:55.987 { 00:10:55.987 "name": "Dev_1", 00:10:55.987 "aliases": [ 00:10:55.987 "0121d88a-6a81-4fe1-94ee-cfacafaee0e0" 00:10:55.987 ], 00:10:55.987 "product_name": "Malloc disk", 00:10:55.987 "block_size": 512, 00:10:55.987 "num_blocks": 262144, 00:10:55.987 "uuid": "0121d88a-6a81-4fe1-94ee-cfacafaee0e0", 00:10:55.987 "assigned_rate_limits": { 00:10:55.987 "rw_ios_per_sec": 0, 00:10:55.987 "rw_mbytes_per_sec": 0, 00:10:55.987 "r_mbytes_per_sec": 0, 00:10:55.987 "w_mbytes_per_sec": 0 00:10:55.987 }, 00:10:55.987 "claimed": false, 00:10:55.987 "zoned": false, 00:10:55.987 "supported_io_types": { 00:10:55.987 "read": true, 00:10:55.987 
"write": true, 00:10:56.246 "unmap": true, 00:10:56.246 "flush": true, 00:10:56.246 "reset": true, 00:10:56.246 "nvme_admin": false, 00:10:56.246 "nvme_io": false, 00:10:56.246 "nvme_io_md": false, 00:10:56.246 "write_zeroes": true, 00:10:56.246 "zcopy": true, 00:10:56.246 "get_zone_info": false, 00:10:56.246 "zone_management": false, 00:10:56.246 "zone_append": false, 00:10:56.246 "compare": false, 00:10:56.246 "compare_and_write": false, 00:10:56.246 "abort": true, 00:10:56.246 "seek_hole": false, 00:10:56.246 "seek_data": false, 00:10:56.246 "copy": true, 00:10:56.246 "nvme_iov_md": false 00:10:56.246 }, 00:10:56.246 "memory_domains": [ 00:10:56.246 { 00:10:56.246 "dma_device_id": "system", 00:10:56.246 "dma_device_type": 1 00:10:56.246 }, 00:10:56.246 { 00:10:56.246 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:56.246 "dma_device_type": 2 00:10:56.246 } 00:10:56.246 ], 00:10:56.246 "driver_specific": {} 00:10:56.246 } 00:10:56.246 ] 00:10:56.246 20:24:48 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:56.246 20:24:48 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # return 0 00:10:56.246 20:24:48 blockdev_general.bdev_error -- bdev/blockdev.sh@509 -- # rpc_cmd bdev_error_create Dev_1 00:10:56.246 20:24:48 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:56.246 20:24:48 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:56.246 true 00:10:56.246 20:24:48 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:56.246 20:24:48 blockdev_general.bdev_error -- bdev/blockdev.sh@510 -- # rpc_cmd bdev_malloc_create -b Dev_2 128 512 00:10:56.246 20:24:48 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:56.246 20:24:48 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:56.246 Dev_2 00:10:56.246 20:24:48 blockdev_general.bdev_error -- common/autotest_common.sh@587 
-- # [[ 0 == 0 ]] 00:10:56.246 20:24:48 blockdev_general.bdev_error -- bdev/blockdev.sh@511 -- # waitforbdev Dev_2 00:10:56.246 20:24:48 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local bdev_name=Dev_2 00:10:56.246 20:24:48 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:56.246 20:24:48 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local i 00:10:56.246 20:24:48 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:56.246 20:24:48 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:56.246 20:24:48 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:10:56.246 20:24:48 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:56.246 20:24:48 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:56.246 20:24:48 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:56.246 20:24:48 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Dev_2 -t 2000 00:10:56.246 20:24:48 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:56.246 20:24:48 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:56.246 [ 00:10:56.246 { 00:10:56.246 "name": "Dev_2", 00:10:56.246 "aliases": [ 00:10:56.246 "94d80eee-c2d0-4162-8bf5-fe2b4c3d91cf" 00:10:56.246 ], 00:10:56.246 "product_name": "Malloc disk", 00:10:56.246 "block_size": 512, 00:10:56.246 "num_blocks": 262144, 00:10:56.246 "uuid": "94d80eee-c2d0-4162-8bf5-fe2b4c3d91cf", 00:10:56.246 "assigned_rate_limits": { 00:10:56.246 "rw_ios_per_sec": 0, 00:10:56.246 "rw_mbytes_per_sec": 0, 00:10:56.246 "r_mbytes_per_sec": 0, 00:10:56.246 "w_mbytes_per_sec": 0 00:10:56.246 }, 00:10:56.246 "claimed": false, 00:10:56.246 "zoned": false, 00:10:56.246 "supported_io_types": { 
00:10:56.246 "read": true, 00:10:56.246 "write": true, 00:10:56.246 "unmap": true, 00:10:56.246 "flush": true, 00:10:56.246 "reset": true, 00:10:56.246 "nvme_admin": false, 00:10:56.246 "nvme_io": false, 00:10:56.246 "nvme_io_md": false, 00:10:56.246 "write_zeroes": true, 00:10:56.246 "zcopy": true, 00:10:56.246 "get_zone_info": false, 00:10:56.246 "zone_management": false, 00:10:56.246 "zone_append": false, 00:10:56.246 "compare": false, 00:10:56.246 "compare_and_write": false, 00:10:56.246 "abort": true, 00:10:56.246 "seek_hole": false, 00:10:56.246 "seek_data": false, 00:10:56.246 "copy": true, 00:10:56.247 "nvme_iov_md": false 00:10:56.247 }, 00:10:56.247 "memory_domains": [ 00:10:56.247 { 00:10:56.247 "dma_device_id": "system", 00:10:56.247 "dma_device_type": 1 00:10:56.247 }, 00:10:56.247 { 00:10:56.247 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:56.247 "dma_device_type": 2 00:10:56.247 } 00:10:56.247 ], 00:10:56.247 "driver_specific": {} 00:10:56.247 } 00:10:56.247 ] 00:10:56.247 20:24:48 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:56.247 20:24:48 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # return 0 00:10:56.247 20:24:48 blockdev_general.bdev_error -- bdev/blockdev.sh@512 -- # rpc_cmd bdev_error_inject_error EE_Dev_1 all failure -n 5 00:10:56.247 20:24:48 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:56.247 20:24:48 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:56.247 20:24:48 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:56.247 20:24:48 blockdev_general.bdev_error -- bdev/blockdev.sh@515 -- # NOT wait 1345136 00:10:56.247 20:24:48 blockdev_general.bdev_error -- common/autotest_common.sh@648 -- # local es=0 00:10:56.247 20:24:48 blockdev_general.bdev_error -- bdev/blockdev.sh@514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -t 1 
perform_tests 00:10:56.247 20:24:48 blockdev_general.bdev_error -- common/autotest_common.sh@650 -- # valid_exec_arg wait 1345136 00:10:56.247 20:24:48 blockdev_general.bdev_error -- common/autotest_common.sh@636 -- # local arg=wait 00:10:56.247 20:24:48 blockdev_general.bdev_error -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:10:56.247 20:24:48 blockdev_general.bdev_error -- common/autotest_common.sh@640 -- # type -t wait 00:10:56.247 20:24:48 blockdev_general.bdev_error -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:10:56.247 20:24:48 blockdev_general.bdev_error -- common/autotest_common.sh@651 -- # wait 1345136 00:10:56.247 Running I/O for 5 seconds... 00:10:56.247 task offset: 59152 on job bdev=EE_Dev_1 fails 00:10:56.247 00:10:56.247 Latency(us) 00:10:56.247 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:56.247 Job: EE_Dev_1 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:10:56.247 Job: EE_Dev_1 ended in about 0.00 seconds with error 00:10:56.247 EE_Dev_1 : 0.00 23605.15 92.21 5364.81 0.00 461.06 168.29 819.20 00:10:56.247 Job: Dev_2 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:10:56.247 Dev_2 : 0.00 14420.91 56.33 0.00 0.00 828.61 170.96 1538.67 00:10:56.247 =================================================================================================================== 00:10:56.247 Total : 38026.06 148.54 5364.81 0.00 660.41 168.29 1538.67 00:10:56.247 [2024-07-15 20:24:48.594011] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:10:56.247 request: 00:10:56.247 { 00:10:56.247 "method": "perform_tests", 00:10:56.247 "req_id": 1 00:10:56.247 } 00:10:56.247 Got JSON-RPC error response 00:10:56.247 response: 00:10:56.247 { 00:10:56.247 "code": -32603, 00:10:56.247 "message": "bdevperf failed with error Operation not permitted" 00:10:56.247 } 00:10:56.815 20:24:48 blockdev_general.bdev_error -- common/autotest_common.sh@651 -- # 
es=255 00:10:56.815 20:24:48 blockdev_general.bdev_error -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:10:56.815 20:24:48 blockdev_general.bdev_error -- common/autotest_common.sh@660 -- # es=127 00:10:56.815 20:24:48 blockdev_general.bdev_error -- common/autotest_common.sh@661 -- # case "$es" in 00:10:56.815 20:24:48 blockdev_general.bdev_error -- common/autotest_common.sh@668 -- # es=1 00:10:56.815 20:24:48 blockdev_general.bdev_error -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:10:56.815 00:10:56.815 real 0m9.326s 00:10:56.815 user 0m9.648s 00:10:56.815 sys 0m0.983s 00:10:56.815 20:24:48 blockdev_general.bdev_error -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:56.815 20:24:48 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:56.815 ************************************ 00:10:56.815 END TEST bdev_error 00:10:56.815 ************************************ 00:10:56.815 20:24:49 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:10:56.815 20:24:49 blockdev_general -- bdev/blockdev.sh@791 -- # run_test bdev_stat stat_test_suite '' 00:10:56.815 20:24:49 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:10:56.815 20:24:49 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:56.815 20:24:49 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:56.815 ************************************ 00:10:56.815 START TEST bdev_stat 00:10:56.815 ************************************ 00:10:56.815 20:24:49 blockdev_general.bdev_stat -- common/autotest_common.sh@1123 -- # stat_test_suite '' 00:10:56.815 20:24:49 blockdev_general.bdev_stat -- bdev/blockdev.sh@592 -- # STAT_DEV=Malloc_STAT 00:10:56.815 20:24:49 blockdev_general.bdev_stat -- bdev/blockdev.sh@596 -- # STAT_PID=1345493 00:10:56.815 20:24:49 blockdev_general.bdev_stat -- bdev/blockdev.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x3 -q 256 -o 4096 
-w randread -t 10 -C '' 00:10:56.815 20:24:49 blockdev_general.bdev_stat -- bdev/blockdev.sh@597 -- # echo 'Process Bdev IO statistics testing pid: 1345493' 00:10:56.815 Process Bdev IO statistics testing pid: 1345493 00:10:56.815 20:24:49 blockdev_general.bdev_stat -- bdev/blockdev.sh@598 -- # trap 'cleanup; killprocess $STAT_PID; exit 1' SIGINT SIGTERM EXIT 00:10:56.815 20:24:49 blockdev_general.bdev_stat -- bdev/blockdev.sh@599 -- # waitforlisten 1345493 00:10:56.815 20:24:49 blockdev_general.bdev_stat -- common/autotest_common.sh@829 -- # '[' -z 1345493 ']' 00:10:56.815 20:24:49 blockdev_general.bdev_stat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:10:56.815 20:24:49 blockdev_general.bdev_stat -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:56.815 20:24:49 blockdev_general.bdev_stat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:10:56.815 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:10:56.815 20:24:49 blockdev_general.bdev_stat -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:56.815 20:24:49 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:56.815 [2024-07-15 20:24:49.129701] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
00:10:56.815 [2024-07-15 20:24:49.129782] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1345493 ] 00:10:57.074 [2024-07-15 20:24:49.273833] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:10:57.074 [2024-07-15 20:24:49.381570] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:10:57.074 [2024-07-15 20:24:49.381574] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:58.010 20:24:50 blockdev_general.bdev_stat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:58.010 20:24:50 blockdev_general.bdev_stat -- common/autotest_common.sh@862 -- # return 0 00:10:58.010 20:24:50 blockdev_general.bdev_stat -- bdev/blockdev.sh@601 -- # rpc_cmd bdev_malloc_create -b Malloc_STAT 128 512 00:10:58.010 20:24:50 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:58.010 20:24:50 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:58.010 Malloc_STAT 00:10:58.010 20:24:50 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:58.010 20:24:50 blockdev_general.bdev_stat -- bdev/blockdev.sh@602 -- # waitforbdev Malloc_STAT 00:10:58.010 20:24:50 blockdev_general.bdev_stat -- common/autotest_common.sh@897 -- # local bdev_name=Malloc_STAT 00:10:58.010 20:24:50 blockdev_general.bdev_stat -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:58.010 20:24:50 blockdev_general.bdev_stat -- common/autotest_common.sh@899 -- # local i 00:10:58.010 20:24:50 blockdev_general.bdev_stat -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:58.010 20:24:50 blockdev_general.bdev_stat -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:58.010 20:24:50 blockdev_general.bdev_stat -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 
00:10:58.010 20:24:50 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:58.010 20:24:50 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:58.010 20:24:50 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:58.011 20:24:50 blockdev_general.bdev_stat -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Malloc_STAT -t 2000 00:10:58.011 20:24:50 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:58.011 20:24:50 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:58.011 [ 00:10:58.011 { 00:10:58.011 "name": "Malloc_STAT", 00:10:58.011 "aliases": [ 00:10:58.011 "5fdaffc8-967d-437c-9566-99894f3ed17c" 00:10:58.011 ], 00:10:58.011 "product_name": "Malloc disk", 00:10:58.011 "block_size": 512, 00:10:58.011 "num_blocks": 262144, 00:10:58.011 "uuid": "5fdaffc8-967d-437c-9566-99894f3ed17c", 00:10:58.011 "assigned_rate_limits": { 00:10:58.011 "rw_ios_per_sec": 0, 00:10:58.011 "rw_mbytes_per_sec": 0, 00:10:58.011 "r_mbytes_per_sec": 0, 00:10:58.011 "w_mbytes_per_sec": 0 00:10:58.011 }, 00:10:58.011 "claimed": false, 00:10:58.011 "zoned": false, 00:10:58.011 "supported_io_types": { 00:10:58.011 "read": true, 00:10:58.011 "write": true, 00:10:58.011 "unmap": true, 00:10:58.011 "flush": true, 00:10:58.011 "reset": true, 00:10:58.011 "nvme_admin": false, 00:10:58.011 "nvme_io": false, 00:10:58.011 "nvme_io_md": false, 00:10:58.011 "write_zeroes": true, 00:10:58.011 "zcopy": true, 00:10:58.011 "get_zone_info": false, 00:10:58.011 "zone_management": false, 00:10:58.011 "zone_append": false, 00:10:58.011 "compare": false, 00:10:58.011 "compare_and_write": false, 00:10:58.011 "abort": true, 00:10:58.011 "seek_hole": false, 00:10:58.011 "seek_data": false, 00:10:58.011 "copy": true, 00:10:58.011 "nvme_iov_md": false 00:10:58.011 }, 00:10:58.011 "memory_domains": [ 00:10:58.011 { 00:10:58.011 "dma_device_id": "system", 
00:10:58.011 "dma_device_type": 1 00:10:58.011 }, 00:10:58.011 { 00:10:58.011 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:58.011 "dma_device_type": 2 00:10:58.011 } 00:10:58.011 ], 00:10:58.011 "driver_specific": {} 00:10:58.011 } 00:10:58.011 ] 00:10:58.011 20:24:50 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:58.011 20:24:50 blockdev_general.bdev_stat -- common/autotest_common.sh@905 -- # return 0 00:10:58.011 20:24:50 blockdev_general.bdev_stat -- bdev/blockdev.sh@605 -- # sleep 2 00:10:58.011 20:24:50 blockdev_general.bdev_stat -- bdev/blockdev.sh@604 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:10:58.011 Running I/O for 10 seconds... 00:10:59.918 20:24:52 blockdev_general.bdev_stat -- bdev/blockdev.sh@606 -- # stat_function_test Malloc_STAT 00:10:59.918 20:24:52 blockdev_general.bdev_stat -- bdev/blockdev.sh@559 -- # local bdev_name=Malloc_STAT 00:10:59.918 20:24:52 blockdev_general.bdev_stat -- bdev/blockdev.sh@560 -- # local iostats 00:10:59.918 20:24:52 blockdev_general.bdev_stat -- bdev/blockdev.sh@561 -- # local io_count1 00:10:59.918 20:24:52 blockdev_general.bdev_stat -- bdev/blockdev.sh@562 -- # local io_count2 00:10:59.918 20:24:52 blockdev_general.bdev_stat -- bdev/blockdev.sh@563 -- # local iostats_per_channel 00:10:59.918 20:24:52 blockdev_general.bdev_stat -- bdev/blockdev.sh@564 -- # local io_count_per_channel1 00:10:59.918 20:24:52 blockdev_general.bdev_stat -- bdev/blockdev.sh@565 -- # local io_count_per_channel2 00:10:59.918 20:24:52 blockdev_general.bdev_stat -- bdev/blockdev.sh@566 -- # local io_count_per_channel_all=0 00:10:59.918 20:24:52 blockdev_general.bdev_stat -- bdev/blockdev.sh@568 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT 00:10:59.918 20:24:52 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:59.918 20:24:52 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:59.918 
20:24:52 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:59.918 20:24:52 blockdev_general.bdev_stat -- bdev/blockdev.sh@568 -- # iostats='{ 00:10:59.918 "tick_rate": 2300000000, 00:10:59.918 "ticks": 5416899032801824, 00:10:59.918 "bdevs": [ 00:10:59.918 { 00:10:59.918 "name": "Malloc_STAT", 00:10:59.918 "bytes_read": 710980096, 00:10:59.918 "num_read_ops": 173572, 00:10:59.918 "bytes_written": 0, 00:10:59.918 "num_write_ops": 0, 00:10:59.918 "bytes_unmapped": 0, 00:10:59.918 "num_unmap_ops": 0, 00:10:59.918 "bytes_copied": 0, 00:10:59.918 "num_copy_ops": 0, 00:10:59.918 "read_latency_ticks": 2229788269416, 00:10:59.918 "max_read_latency_ticks": 17664914, 00:10:59.918 "min_read_latency_ticks": 260990, 00:10:59.918 "write_latency_ticks": 0, 00:10:59.918 "max_write_latency_ticks": 0, 00:10:59.918 "min_write_latency_ticks": 0, 00:10:59.918 "unmap_latency_ticks": 0, 00:10:59.918 "max_unmap_latency_ticks": 0, 00:10:59.918 "min_unmap_latency_ticks": 0, 00:10:59.918 "copy_latency_ticks": 0, 00:10:59.918 "max_copy_latency_ticks": 0, 00:10:59.918 "min_copy_latency_ticks": 0, 00:10:59.918 "io_error": {} 00:10:59.918 } 00:10:59.918 ] 00:10:59.918 }' 00:10:59.918 20:24:52 blockdev_general.bdev_stat -- bdev/blockdev.sh@569 -- # jq -r '.bdevs[0].num_read_ops' 00:10:59.918 20:24:52 blockdev_general.bdev_stat -- bdev/blockdev.sh@569 -- # io_count1=173572 00:10:59.918 20:24:52 blockdev_general.bdev_stat -- bdev/blockdev.sh@571 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT -c 00:10:59.918 20:24:52 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:59.918 20:24:52 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:59.918 20:24:52 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:59.918 20:24:52 blockdev_general.bdev_stat -- bdev/blockdev.sh@571 -- # iostats_per_channel='{ 00:10:59.918 "tick_rate": 2300000000, 00:10:59.918 "ticks": 5416899195455350, 
00:10:59.918 "name": "Malloc_STAT", 00:10:59.918 "channels": [ 00:10:59.918 { 00:10:59.918 "thread_id": 2, 00:10:59.918 "bytes_read": 408944640, 00:10:59.918 "num_read_ops": 99840, 00:10:59.918 "bytes_written": 0, 00:10:59.918 "num_write_ops": 0, 00:10:59.918 "bytes_unmapped": 0, 00:10:59.918 "num_unmap_ops": 0, 00:10:59.918 "bytes_copied": 0, 00:10:59.918 "num_copy_ops": 0, 00:10:59.918 "read_latency_ticks": 1155542184304, 00:10:59.918 "max_read_latency_ticks": 12499040, 00:10:59.918 "min_read_latency_ticks": 8224408, 00:10:59.918 "write_latency_ticks": 0, 00:10:59.918 "max_write_latency_ticks": 0, 00:10:59.918 "min_write_latency_ticks": 0, 00:10:59.918 "unmap_latency_ticks": 0, 00:10:59.918 "max_unmap_latency_ticks": 0, 00:10:59.918 "min_unmap_latency_ticks": 0, 00:10:59.918 "copy_latency_ticks": 0, 00:10:59.918 "max_copy_latency_ticks": 0, 00:10:59.918 "min_copy_latency_ticks": 0 00:10:59.918 }, 00:10:59.918 { 00:10:59.918 "thread_id": 3, 00:10:59.918 "bytes_read": 328204288, 00:10:59.918 "num_read_ops": 80128, 00:10:59.918 "bytes_written": 0, 00:10:59.918 "num_write_ops": 0, 00:10:59.918 "bytes_unmapped": 0, 00:10:59.918 "num_unmap_ops": 0, 00:10:59.918 "bytes_copied": 0, 00:10:59.918 "num_copy_ops": 0, 00:10:59.918 "read_latency_ticks": 1156592733248, 00:10:59.918 "max_read_latency_ticks": 17664914, 00:10:59.918 "min_read_latency_ticks": 9336232, 00:10:59.918 "write_latency_ticks": 0, 00:10:59.918 "max_write_latency_ticks": 0, 00:10:59.918 "min_write_latency_ticks": 0, 00:10:59.918 "unmap_latency_ticks": 0, 00:10:59.918 "max_unmap_latency_ticks": 0, 00:10:59.918 "min_unmap_latency_ticks": 0, 00:10:59.918 "copy_latency_ticks": 0, 00:10:59.918 "max_copy_latency_ticks": 0, 00:10:59.918 "min_copy_latency_ticks": 0 00:10:59.918 } 00:10:59.918 ] 00:10:59.918 }' 00:10:59.918 20:24:52 blockdev_general.bdev_stat -- bdev/blockdev.sh@572 -- # jq -r '.channels[0].num_read_ops' 00:11:00.179 20:24:52 blockdev_general.bdev_stat -- bdev/blockdev.sh@572 -- # 
io_count_per_channel1=99840 00:11:00.179 20:24:52 blockdev_general.bdev_stat -- bdev/blockdev.sh@573 -- # io_count_per_channel_all=99840 00:11:00.179 20:24:52 blockdev_general.bdev_stat -- bdev/blockdev.sh@574 -- # jq -r '.channels[1].num_read_ops' 00:11:00.179 20:24:52 blockdev_general.bdev_stat -- bdev/blockdev.sh@574 -- # io_count_per_channel2=80128 00:11:00.179 20:24:52 blockdev_general.bdev_stat -- bdev/blockdev.sh@575 -- # io_count_per_channel_all=179968 00:11:00.179 20:24:52 blockdev_general.bdev_stat -- bdev/blockdev.sh@577 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT 00:11:00.179 20:24:52 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:00.179 20:24:52 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:11:00.179 20:24:52 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:00.179 20:24:52 blockdev_general.bdev_stat -- bdev/blockdev.sh@577 -- # iostats='{ 00:11:00.179 "tick_rate": 2300000000, 00:11:00.179 "ticks": 5416899461956438, 00:11:00.179 "bdevs": [ 00:11:00.179 { 00:11:00.179 "name": "Malloc_STAT", 00:11:00.179 "bytes_read": 780186112, 00:11:00.179 "num_read_ops": 190468, 00:11:00.179 "bytes_written": 0, 00:11:00.179 "num_write_ops": 0, 00:11:00.179 "bytes_unmapped": 0, 00:11:00.179 "num_unmap_ops": 0, 00:11:00.179 "bytes_copied": 0, 00:11:00.179 "num_copy_ops": 0, 00:11:00.179 "read_latency_ticks": 2447002628702, 00:11:00.179 "max_read_latency_ticks": 17664914, 00:11:00.179 "min_read_latency_ticks": 260990, 00:11:00.179 "write_latency_ticks": 0, 00:11:00.179 "max_write_latency_ticks": 0, 00:11:00.179 "min_write_latency_ticks": 0, 00:11:00.179 "unmap_latency_ticks": 0, 00:11:00.179 "max_unmap_latency_ticks": 0, 00:11:00.179 "min_unmap_latency_ticks": 0, 00:11:00.179 "copy_latency_ticks": 0, 00:11:00.179 "max_copy_latency_ticks": 0, 00:11:00.179 "min_copy_latency_ticks": 0, 00:11:00.179 "io_error": {} 00:11:00.179 } 00:11:00.179 ] 00:11:00.179 }' 00:11:00.179 
20:24:52 blockdev_general.bdev_stat -- bdev/blockdev.sh@578 -- # jq -r '.bdevs[0].num_read_ops' 00:11:00.179 20:24:52 blockdev_general.bdev_stat -- bdev/blockdev.sh@578 -- # io_count2=190468 00:11:00.179 20:24:52 blockdev_general.bdev_stat -- bdev/blockdev.sh@583 -- # '[' 179968 -lt 173572 ']' 00:11:00.179 20:24:52 blockdev_general.bdev_stat -- bdev/blockdev.sh@583 -- # '[' 179968 -gt 190468 ']' 00:11:00.179 20:24:52 blockdev_general.bdev_stat -- bdev/blockdev.sh@608 -- # rpc_cmd bdev_malloc_delete Malloc_STAT 00:11:00.179 20:24:52 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:00.179 20:24:52 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:11:00.179 00:11:00.179 Latency(us) 00:11:00.179 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:11:00.179 Job: Malloc_STAT (Core Mask 0x1, workload: randread, depth: 256, IO size: 4096) 00:11:00.179 Malloc_STAT : 2.16 50796.36 198.42 0.00 0.00 5027.88 1389.08 5442.34 00:11:00.179 Job: Malloc_STAT (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:11:00.179 Malloc_STAT : 2.16 40752.90 159.19 0.00 0.00 6264.76 1510.18 7693.36 00:11:00.179 =================================================================================================================== 00:11:00.179 Total : 91549.26 357.61 0.00 0.00 5579.03 1389.08 7693.36 00:11:00.179 0 00:11:00.179 20:24:52 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:00.179 20:24:52 blockdev_general.bdev_stat -- bdev/blockdev.sh@609 -- # killprocess 1345493 00:11:00.179 20:24:52 blockdev_general.bdev_stat -- common/autotest_common.sh@948 -- # '[' -z 1345493 ']' 00:11:00.179 20:24:52 blockdev_general.bdev_stat -- common/autotest_common.sh@952 -- # kill -0 1345493 00:11:00.179 20:24:52 blockdev_general.bdev_stat -- common/autotest_common.sh@953 -- # uname 00:11:00.179 20:24:52 blockdev_general.bdev_stat -- common/autotest_common.sh@953 -- # '[' Linux = 
Linux ']' 00:11:00.179 20:24:52 blockdev_general.bdev_stat -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1345493 00:11:00.179 20:24:52 blockdev_general.bdev_stat -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:00.179 20:24:52 blockdev_general.bdev_stat -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:00.179 20:24:52 blockdev_general.bdev_stat -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1345493' 00:11:00.179 killing process with pid 1345493 00:11:00.179 20:24:52 blockdev_general.bdev_stat -- common/autotest_common.sh@967 -- # kill 1345493 00:11:00.179 Received shutdown signal, test time was about 2.238196 seconds 00:11:00.179 00:11:00.180 Latency(us) 00:11:00.180 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:11:00.180 =================================================================================================================== 00:11:00.180 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:11:00.180 20:24:52 blockdev_general.bdev_stat -- common/autotest_common.sh@972 -- # wait 1345493 00:11:00.439 20:24:52 blockdev_general.bdev_stat -- bdev/blockdev.sh@610 -- # trap - SIGINT SIGTERM EXIT 00:11:00.439 00:11:00.439 real 0m3.664s 00:11:00.439 user 0m7.314s 00:11:00.439 sys 0m0.471s 00:11:00.439 20:24:52 blockdev_general.bdev_stat -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:00.439 20:24:52 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:11:00.439 ************************************ 00:11:00.439 END TEST bdev_stat 00:11:00.439 ************************************ 00:11:00.439 20:24:52 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:11:00.439 20:24:52 blockdev_general -- bdev/blockdev.sh@794 -- # [[ bdev == gpt ]] 00:11:00.439 20:24:52 blockdev_general -- bdev/blockdev.sh@798 -- # [[ bdev == crypto_sw ]] 00:11:00.439 20:24:52 blockdev_general -- bdev/blockdev.sh@810 -- # trap - SIGINT SIGTERM EXIT 
00:11:00.439 20:24:52 blockdev_general -- bdev/blockdev.sh@811 -- # cleanup 00:11:00.439 20:24:52 blockdev_general -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:11:00.439 20:24:52 blockdev_general -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:11:00.439 20:24:52 blockdev_general -- bdev/blockdev.sh@26 -- # [[ bdev == rbd ]] 00:11:00.439 20:24:52 blockdev_general -- bdev/blockdev.sh@30 -- # [[ bdev == daos ]] 00:11:00.439 20:24:52 blockdev_general -- bdev/blockdev.sh@34 -- # [[ bdev = \g\p\t ]] 00:11:00.439 20:24:52 blockdev_general -- bdev/blockdev.sh@40 -- # [[ bdev == xnvme ]] 00:11:00.439 00:11:00.439 real 2m1.391s 00:11:00.439 user 7m18.639s 00:11:00.439 sys 0m24.977s 00:11:00.439 20:24:52 blockdev_general -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:00.439 20:24:52 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:11:00.439 ************************************ 00:11:00.439 END TEST blockdev_general 00:11:00.439 ************************************ 00:11:00.699 20:24:52 -- common/autotest_common.sh@1142 -- # return 0 00:11:00.699 20:24:52 -- spdk/autotest.sh@190 -- # run_test bdev_raid /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh 00:11:00.699 20:24:52 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:11:00.699 20:24:52 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:00.699 20:24:52 -- common/autotest_common.sh@10 -- # set +x 00:11:00.699 ************************************ 00:11:00.699 START TEST bdev_raid 00:11:00.699 ************************************ 00:11:00.699 20:24:52 bdev_raid -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh 00:11:00.699 * Looking for test storage... 
00:11:00.699 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:11:00.699 20:24:52 bdev_raid -- bdev/bdev_raid.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:11:00.699 20:24:52 bdev_raid -- bdev/nbd_common.sh@6 -- # set -e 00:11:00.699 20:24:52 bdev_raid -- bdev/bdev_raid.sh@15 -- # rpc_py='/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock' 00:11:00.699 20:24:52 bdev_raid -- bdev/bdev_raid.sh@851 -- # mkdir -p /raidtest 00:11:00.699 20:24:52 bdev_raid -- bdev/bdev_raid.sh@852 -- # trap 'cleanup; exit 1' EXIT 00:11:00.699 20:24:52 bdev_raid -- bdev/bdev_raid.sh@854 -- # base_blocklen=512 00:11:00.699 20:24:52 bdev_raid -- bdev/bdev_raid.sh@856 -- # uname -s 00:11:00.699 20:24:53 bdev_raid -- bdev/bdev_raid.sh@856 -- # '[' Linux = Linux ']' 00:11:00.699 20:24:53 bdev_raid -- bdev/bdev_raid.sh@856 -- # modprobe -n nbd 00:11:00.699 20:24:53 bdev_raid -- bdev/bdev_raid.sh@857 -- # has_nbd=true 00:11:00.699 20:24:53 bdev_raid -- bdev/bdev_raid.sh@858 -- # modprobe nbd 00:11:00.699 20:24:53 bdev_raid -- bdev/bdev_raid.sh@859 -- # run_test raid_function_test_raid0 raid_function_test raid0 00:11:00.699 20:24:53 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:11:00.699 20:24:53 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:00.699 20:24:53 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:00.699 ************************************ 00:11:00.699 START TEST raid_function_test_raid0 00:11:00.699 ************************************ 00:11:00.699 20:24:53 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@1123 -- # raid_function_test raid0 00:11:00.699 20:24:53 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@80 -- # local raid_level=raid0 00:11:00.699 20:24:53 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@81 -- # local nbd=/dev/nbd0 00:11:00.699 20:24:53 
bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@82 -- # local raid_bdev 00:11:00.699 20:24:53 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@85 -- # raid_pid=1346084 00:11:00.699 20:24:53 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@86 -- # echo 'Process raid pid: 1346084' 00:11:00.699 Process raid pid: 1346084 00:11:00.699 20:24:53 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@84 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:11:00.699 20:24:53 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@87 -- # waitforlisten 1346084 /var/tmp/spdk-raid.sock 00:11:00.699 20:24:53 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@829 -- # '[' -z 1346084 ']' 00:11:00.699 20:24:53 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:00.699 20:24:53 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:00.699 20:24:53 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:00.699 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:00.699 20:24:53 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:00.699 20:24:53 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@10 -- # set +x 00:11:00.958 [2024-07-15 20:24:53.119540] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
00:11:00.958 [2024-07-15 20:24:53.119609] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:11:00.958 [2024-07-15 20:24:53.251216] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:01.216 [2024-07-15 20:24:53.358285] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:01.216 [2024-07-15 20:24:53.421557] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:01.216 [2024-07-15 20:24:53.421586] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:01.781 20:24:54 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:01.781 20:24:54 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@862 -- # return 0 00:11:01.781 20:24:54 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@89 -- # configure_raid_bdev raid0 00:11:01.781 20:24:54 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@66 -- # local raid_level=raid0 00:11:01.781 20:24:54 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@67 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:11:01.781 20:24:54 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@69 -- # cat 00:11:01.781 20:24:54 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@74 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 00:11:02.038 [2024-07-15 20:24:54.354953] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:11:02.038 [2024-07-15 20:24:54.356412] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:11:02.038 [2024-07-15 20:24:54.356470] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x16aebd0 00:11:02.038 [2024-07-15 20:24:54.356480] 
bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:11:02.038 [2024-07-15 20:24:54.356669] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16aeb10 00:11:02.038 [2024-07-15 20:24:54.356791] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x16aebd0 00:11:02.039 [2024-07-15 20:24:54.356801] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid, raid_bdev 0x16aebd0 00:11:02.039 [2024-07-15 20:24:54.356902] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:02.039 Base_1 00:11:02.039 Base_2 00:11:02.039 20:24:54 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@76 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:11:02.039 20:24:54 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:11:02.039 20:24:54 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # jq -r '.[0]["name"] | select(.)' 00:11:02.297 20:24:54 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # raid_bdev=raid 00:11:02.297 20:24:54 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@91 -- # '[' raid = '' ']' 00:11:02.297 20:24:54 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@96 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid /dev/nbd0 00:11:02.297 20:24:54 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:11:02.297 20:24:54 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@10 -- # bdev_list=('raid') 00:11:02.297 20:24:54 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:11:02.297 20:24:54 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:11:02.297 20:24:54 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@11 -- # local 
nbd_list 00:11:02.297 20:24:54 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@12 -- # local i 00:11:02.297 20:24:54 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:11:02.297 20:24:54 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:11:02.297 20:24:54 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid /dev/nbd0 00:11:02.556 [2024-07-15 20:24:54.860320] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18628e0 00:11:02.556 /dev/nbd0 00:11:02.556 20:24:54 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:11:02.556 20:24:54 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:11:02.556 20:24:54 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:11:02.556 20:24:54 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@867 -- # local i 00:11:02.556 20:24:54 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:11:02.556 20:24:54 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:11:02.556 20:24:54 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:11:02.556 20:24:54 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@871 -- # break 00:11:02.556 20:24:54 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:11:02.556 20:24:54 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:11:02.556 20:24:54 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:11:02.556 1+0 records in 00:11:02.556 1+0 
records out 00:11:02.556 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00026052 s, 15.7 MB/s 00:11:02.556 20:24:54 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:02.556 20:24:54 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@884 -- # size=4096 00:11:02.556 20:24:54 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:02.556 20:24:54 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:11:02.556 20:24:54 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@887 -- # return 0 00:11:02.556 20:24:54 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:11:02.556 20:24:54 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:11:02.556 20:24:54 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@97 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:11:02.556 20:24:54 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:11:02.556 20:24:54 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:11:02.813 20:24:55 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:11:02.814 { 00:11:02.814 "nbd_device": "/dev/nbd0", 00:11:02.814 "bdev_name": "raid" 00:11:02.814 } 00:11:02.814 ]' 00:11:02.814 20:24:55 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # echo '[ 00:11:02.814 { 00:11:02.814 "nbd_device": "/dev/nbd0", 00:11:02.814 "bdev_name": "raid" 00:11:02.814 } 00:11:02.814 ]' 00:11:02.814 20:24:55 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:11:03.071 20:24:55 bdev_raid.raid_function_test_raid0 -- 
bdev/nbd_common.sh@64 -- # nbd_disks_name=/dev/nbd0 00:11:03.071 20:24:55 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # echo /dev/nbd0 00:11:03.071 20:24:55 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:11:03.071 20:24:55 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # count=1 00:11:03.071 20:24:55 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@66 -- # echo 1 00:11:03.071 20:24:55 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@97 -- # count=1 00:11:03.071 20:24:55 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@98 -- # '[' 1 -ne 1 ']' 00:11:03.071 20:24:55 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@102 -- # raid_unmap_data_verify /dev/nbd0 /var/tmp/spdk-raid.sock 00:11:03.071 20:24:55 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@18 -- # hash blkdiscard 00:11:03.071 20:24:55 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@19 -- # local nbd=/dev/nbd0 00:11:03.071 20:24:55 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@20 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:11:03.071 20:24:55 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@21 -- # local blksize 00:11:03.071 20:24:55 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # lsblk -o LOG-SEC /dev/nbd0 00:11:03.072 20:24:55 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # grep -v LOG-SEC 00:11:03.072 20:24:55 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # cut -d ' ' -f 5 00:11:03.072 20:24:55 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # blksize=512 00:11:03.072 20:24:55 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@23 -- # local rw_blk_num=4096 00:11:03.072 20:24:55 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@24 -- # local rw_len=2097152 00:11:03.072 20:24:55 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@25 -- # unmap_blk_offs=('0' '1028' '321') 00:11:03.072 20:24:55 
bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@25 -- # local unmap_blk_offs 00:11:03.072 20:24:55 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@26 -- # unmap_blk_nums=('128' '2035' '456') 00:11:03.072 20:24:55 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@26 -- # local unmap_blk_nums 00:11:03.072 20:24:55 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@27 -- # local unmap_off 00:11:03.072 20:24:55 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@28 -- # local unmap_len 00:11:03.072 20:24:55 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@31 -- # dd if=/dev/urandom of=/raidtest/raidrandtest bs=512 count=4096 00:11:03.072 4096+0 records in 00:11:03.072 4096+0 records out 00:11:03.072 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.030175 s, 69.5 MB/s 00:11:03.072 20:24:55 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@32 -- # dd if=/raidtest/raidrandtest of=/dev/nbd0 bs=512 count=4096 oflag=direct 00:11:03.330 4096+0 records in 00:11:03.330 4096+0 records out 00:11:03.330 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.214811 s, 9.8 MB/s 00:11:03.330 20:24:55 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@33 -- # blockdev --flushbufs /dev/nbd0 00:11:03.330 20:24:55 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@36 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:11:03.330 20:24:55 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i = 0 )) 00:11:03.331 20:24:55 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:11:03.331 20:24:55 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=0 00:11:03.331 20:24:55 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@40 -- # unmap_len=65536 00:11:03.331 20:24:55 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=0 count=128 conv=notrunc 00:11:03.331 128+0 records in 00:11:03.331 128+0 records out 00:11:03.331 65536 
bytes (66 kB, 64 KiB) copied, 0.000831983 s, 78.8 MB/s 00:11:03.331 20:24:55 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 0 -l 65536 /dev/nbd0 00:11:03.331 20:24:55 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:11:03.331 20:24:55 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:11:03.331 20:24:55 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:11:03.331 20:24:55 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:11:03.331 20:24:55 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=526336 00:11:03.331 20:24:55 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@40 -- # unmap_len=1041920 00:11:03.331 20:24:55 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=1028 count=2035 conv=notrunc 00:11:03.331 2035+0 records in 00:11:03.331 2035+0 records out 00:11:03.331 1041920 bytes (1.0 MB, 1018 KiB) copied, 0.00632769 s, 165 MB/s 00:11:03.331 20:24:55 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 526336 -l 1041920 /dev/nbd0 00:11:03.331 20:24:55 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:11:03.331 20:24:55 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:11:03.331 20:24:55 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:11:03.331 20:24:55 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:11:03.331 20:24:55 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=164352 00:11:03.331 20:24:55 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@40 -- # unmap_len=233472 00:11:03.331 20:24:55 bdev_raid.raid_function_test_raid0 -- 
bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=321 count=456 conv=notrunc 00:11:03.331 456+0 records in 00:11:03.331 456+0 records out 00:11:03.331 233472 bytes (233 kB, 228 KiB) copied, 0.00153059 s, 153 MB/s 00:11:03.331 20:24:55 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 164352 -l 233472 /dev/nbd0 00:11:03.331 20:24:55 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:11:03.331 20:24:55 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:11:03.331 20:24:55 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:11:03.331 20:24:55 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:11:03.331 20:24:55 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@54 -- # return 0 00:11:03.331 20:24:55 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@104 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:11:03.331 20:24:55 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:11:03.331 20:24:55 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:11:03.331 20:24:55 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:11:03.331 20:24:55 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@51 -- # local i 00:11:03.331 20:24:55 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:11:03.331 20:24:55 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:11:03.590 [2024-07-15 20:24:55.845680] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:03.590 20:24:55 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@55 -- # 
basename /dev/nbd0 00:11:03.590 20:24:55 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:11:03.590 20:24:55 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:11:03.590 20:24:55 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:11:03.590 20:24:55 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:11:03.590 20:24:55 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:11:03.590 20:24:55 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@41 -- # break 00:11:03.590 20:24:55 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@45 -- # return 0 00:11:03.590 20:24:55 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@105 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:11:03.590 20:24:55 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:11:03.590 20:24:55 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:11:03.849 20:24:56 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:11:03.849 20:24:56 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:11:03.849 20:24:56 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:11:03.849 20:24:56 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:11:03.849 20:24:56 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # echo '' 00:11:03.849 20:24:56 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:11:03.849 20:24:56 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # true 00:11:03.849 20:24:56 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # count=0 
00:11:03.849 20:24:56 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@66 -- # echo 0 00:11:03.849 20:24:56 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@105 -- # count=0 00:11:03.849 20:24:56 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@106 -- # '[' 0 -ne 0 ']' 00:11:03.849 20:24:56 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@110 -- # killprocess 1346084 00:11:03.849 20:24:56 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@948 -- # '[' -z 1346084 ']' 00:11:03.850 20:24:56 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@952 -- # kill -0 1346084 00:11:03.850 20:24:56 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@953 -- # uname 00:11:03.850 20:24:56 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:03.850 20:24:56 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1346084 00:11:04.109 20:24:56 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:04.109 20:24:56 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:04.109 20:24:56 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1346084' 00:11:04.109 killing process with pid 1346084 00:11:04.109 20:24:56 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@967 -- # kill 1346084 00:11:04.109 [2024-07-15 20:24:56.231051] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:04.109 [2024-07-15 20:24:56.231119] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:04.109 [2024-07-15 20:24:56.231163] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:04.109 [2024-07-15 20:24:56.231179] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16aebd0 name 
raid, state offline 00:11:04.109 20:24:56 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@972 -- # wait 1346084 00:11:04.109 [2024-07-15 20:24:56.248212] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:04.109 20:24:56 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@112 -- # return 0 00:11:04.109 00:11:04.109 real 0m3.403s 00:11:04.109 user 0m4.557s 00:11:04.109 sys 0m1.278s 00:11:04.109 20:24:56 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:04.109 20:24:56 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@10 -- # set +x 00:11:04.109 ************************************ 00:11:04.109 END TEST raid_function_test_raid0 00:11:04.109 ************************************ 00:11:04.368 20:24:56 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:11:04.368 20:24:56 bdev_raid -- bdev/bdev_raid.sh@860 -- # run_test raid_function_test_concat raid_function_test concat 00:11:04.368 20:24:56 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:11:04.368 20:24:56 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:04.368 20:24:56 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:04.368 ************************************ 00:11:04.368 START TEST raid_function_test_concat 00:11:04.368 ************************************ 00:11:04.368 20:24:56 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@1123 -- # raid_function_test concat 00:11:04.368 20:24:56 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@80 -- # local raid_level=concat 00:11:04.368 20:24:56 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@81 -- # local nbd=/dev/nbd0 00:11:04.368 20:24:56 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@82 -- # local raid_bdev 00:11:04.368 20:24:56 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@85 -- # raid_pid=1346565 00:11:04.368 20:24:56 bdev_raid.raid_function_test_concat -- 
bdev/bdev_raid.sh@86 -- # echo 'Process raid pid: 1346565' 00:11:04.368 Process raid pid: 1346565 00:11:04.368 20:24:56 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@84 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:11:04.368 20:24:56 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@87 -- # waitforlisten 1346565 /var/tmp/spdk-raid.sock 00:11:04.368 20:24:56 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@829 -- # '[' -z 1346565 ']' 00:11:04.368 20:24:56 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:04.368 20:24:56 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:04.368 20:24:56 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:04.368 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:04.368 20:24:56 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:04.368 20:24:56 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@10 -- # set +x 00:11:04.368 [2024-07-15 20:24:56.604726] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
00:11:04.368 [2024-07-15 20:24:56.604802] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:11:04.368 [2024-07-15 20:24:56.736117] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:04.627 [2024-07-15 20:24:56.835559] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:04.627 [2024-07-15 20:24:56.902556] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:04.627 [2024-07-15 20:24:56.902595] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:05.250 20:24:57 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:05.250 20:24:57 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@862 -- # return 0 00:11:05.250 20:24:57 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@89 -- # configure_raid_bdev concat 00:11:05.250 20:24:57 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@66 -- # local raid_level=concat 00:11:05.250 20:24:57 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@67 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:11:05.250 20:24:57 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@69 -- # cat 00:11:05.250 20:24:57 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@74 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 00:11:05.509 [2024-07-15 20:24:57.793030] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:11:05.509 [2024-07-15 20:24:57.794488] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:11:05.509 [2024-07-15 20:24:57.794546] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x238abd0 00:11:05.509 [2024-07-15 20:24:57.794556] 
bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:11:05.509 [2024-07-15 20:24:57.794744] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x238ab10 00:11:05.509 [2024-07-15 20:24:57.794863] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x238abd0 00:11:05.509 [2024-07-15 20:24:57.794873] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid, raid_bdev 0x238abd0 00:11:05.509 [2024-07-15 20:24:57.795003] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:05.509 Base_1 00:11:05.509 Base_2 00:11:05.509 20:24:57 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@76 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:11:05.509 20:24:57 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:11:05.509 20:24:57 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # jq -r '.[0]["name"] | select(.)' 00:11:05.768 20:24:58 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # raid_bdev=raid 00:11:05.768 20:24:58 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@91 -- # '[' raid = '' ']' 00:11:05.768 20:24:58 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@96 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid /dev/nbd0 00:11:05.768 20:24:58 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:11:05.768 20:24:58 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@10 -- # bdev_list=('raid') 00:11:05.768 20:24:58 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:11:05.768 20:24:58 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:11:05.768 20:24:58 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@11 -- # 
local nbd_list 00:11:05.768 20:24:58 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@12 -- # local i 00:11:05.768 20:24:58 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:11:05.768 20:24:58 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:11:05.768 20:24:58 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid /dev/nbd0 00:11:06.028 [2024-07-15 20:24:58.322394] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x253e8e0 00:11:06.028 /dev/nbd0 00:11:06.028 20:24:58 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:11:06.028 20:24:58 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:11:06.028 20:24:58 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:11:06.028 20:24:58 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@867 -- # local i 00:11:06.028 20:24:58 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:11:06.028 20:24:58 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:11:06.028 20:24:58 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:11:06.028 20:24:58 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@871 -- # break 00:11:06.028 20:24:58 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:11:06.028 20:24:58 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:11:06.028 20:24:58 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:11:06.028 1+0 records in 
00:11:06.028 1+0 records out 00:11:06.028 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000256989 s, 15.9 MB/s 00:11:06.028 20:24:58 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:06.028 20:24:58 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@884 -- # size=4096 00:11:06.028 20:24:58 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:06.028 20:24:58 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:11:06.028 20:24:58 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@887 -- # return 0 00:11:06.028 20:24:58 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:11:06.028 20:24:58 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:11:06.028 20:24:58 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@97 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:11:06.028 20:24:58 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:11:06.028 20:24:58 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:11:06.287 20:24:58 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:11:06.287 { 00:11:06.287 "nbd_device": "/dev/nbd0", 00:11:06.287 "bdev_name": "raid" 00:11:06.287 } 00:11:06.287 ]' 00:11:06.287 20:24:58 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # echo '[ 00:11:06.287 { 00:11:06.287 "nbd_device": "/dev/nbd0", 00:11:06.287 "bdev_name": "raid" 00:11:06.287 } 00:11:06.287 ]' 00:11:06.287 20:24:58 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:11:06.546 20:24:58 
bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # nbd_disks_name=/dev/nbd0 00:11:06.546 20:24:58 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # echo /dev/nbd0 00:11:06.546 20:24:58 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:11:06.546 20:24:58 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # count=1 00:11:06.546 20:24:58 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@66 -- # echo 1 00:11:06.546 20:24:58 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@97 -- # count=1 00:11:06.546 20:24:58 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@98 -- # '[' 1 -ne 1 ']' 00:11:06.546 20:24:58 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@102 -- # raid_unmap_data_verify /dev/nbd0 /var/tmp/spdk-raid.sock 00:11:06.546 20:24:58 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@18 -- # hash blkdiscard 00:11:06.546 20:24:58 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@19 -- # local nbd=/dev/nbd0 00:11:06.546 20:24:58 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@20 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:11:06.546 20:24:58 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@21 -- # local blksize 00:11:06.546 20:24:58 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # lsblk -o LOG-SEC /dev/nbd0 00:11:06.546 20:24:58 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # grep -v LOG-SEC 00:11:06.546 20:24:58 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # cut -d ' ' -f 5 00:11:06.546 20:24:58 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # blksize=512 00:11:06.546 20:24:58 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@23 -- # local rw_blk_num=4096 00:11:06.546 20:24:58 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@24 -- # local rw_len=2097152 00:11:06.546 20:24:58 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@25 -- # 
unmap_blk_offs=('0' '1028' '321') 00:11:06.546 20:24:58 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@25 -- # local unmap_blk_offs 00:11:06.546 20:24:58 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@26 -- # unmap_blk_nums=('128' '2035' '456') 00:11:06.546 20:24:58 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@26 -- # local unmap_blk_nums 00:11:06.546 20:24:58 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@27 -- # local unmap_off 00:11:06.546 20:24:58 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@28 -- # local unmap_len 00:11:06.546 20:24:58 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@31 -- # dd if=/dev/urandom of=/raidtest/raidrandtest bs=512 count=4096 00:11:06.546 4096+0 records in 00:11:06.546 4096+0 records out 00:11:06.546 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.0309861 s, 67.7 MB/s 00:11:06.546 20:24:58 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@32 -- # dd if=/raidtest/raidrandtest of=/dev/nbd0 bs=512 count=4096 oflag=direct 00:11:06.806 4096+0 records in 00:11:06.806 4096+0 records out 00:11:06.806 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.321167 s, 6.5 MB/s 00:11:06.806 20:24:59 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@33 -- # blockdev --flushbufs /dev/nbd0 00:11:06.806 20:24:59 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@36 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:11:06.806 20:24:59 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i = 0 )) 00:11:06.806 20:24:59 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:11:06.806 20:24:59 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=0 00:11:06.806 20:24:59 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # unmap_len=65536 00:11:06.806 20:24:59 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=0 count=128 conv=notrunc 00:11:06.806 
128+0 records in 00:11:06.806 128+0 records out 00:11:06.806 65536 bytes (66 kB, 64 KiB) copied, 0.000828824 s, 79.1 MB/s 00:11:06.806 20:24:59 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 0 -l 65536 /dev/nbd0 00:11:06.806 20:24:59 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:11:06.806 20:24:59 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:11:06.806 20:24:59 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:11:06.806 20:24:59 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:11:06.806 20:24:59 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=526336 00:11:06.806 20:24:59 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # unmap_len=1041920 00:11:06.806 20:24:59 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=1028 count=2035 conv=notrunc 00:11:06.806 2035+0 records in 00:11:06.806 2035+0 records out 00:11:06.806 1041920 bytes (1.0 MB, 1018 KiB) copied, 0.0111024 s, 93.8 MB/s 00:11:06.806 20:24:59 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 526336 -l 1041920 /dev/nbd0 00:11:06.806 20:24:59 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:11:06.806 20:24:59 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:11:06.806 20:24:59 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:11:06.806 20:24:59 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:11:06.806 20:24:59 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=164352 00:11:06.806 20:24:59 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # 
unmap_len=233472 00:11:06.806 20:24:59 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=321 count=456 conv=notrunc 00:11:06.806 456+0 records in 00:11:06.806 456+0 records out 00:11:06.806 233472 bytes (233 kB, 228 KiB) copied, 0.00270791 s, 86.2 MB/s 00:11:06.806 20:24:59 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 164352 -l 233472 /dev/nbd0 00:11:06.806 20:24:59 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:11:06.806 20:24:59 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:11:06.806 20:24:59 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:11:06.806 20:24:59 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:11:06.806 20:24:59 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@54 -- # return 0 00:11:06.806 20:24:59 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@104 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:11:06.806 20:24:59 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:11:06.806 20:24:59 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:11:06.806 20:24:59 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:11:06.806 20:24:59 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@51 -- # local i 00:11:06.806 20:24:59 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:11:06.806 20:24:59 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:11:07.375 [2024-07-15 20:24:59.447811] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: 
raid_bdev_destroy_cb 00:11:07.375 20:24:59 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:11:07.375 20:24:59 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:11:07.375 20:24:59 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:11:07.375 20:24:59 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:11:07.375 20:24:59 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:11:07.375 20:24:59 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:11:07.375 20:24:59 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@41 -- # break 00:11:07.375 20:24:59 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@45 -- # return 0 00:11:07.375 20:24:59 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@105 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:11:07.375 20:24:59 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:11:07.375 20:24:59 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:11:07.375 20:24:59 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:11:07.375 20:24:59 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:11:07.375 20:24:59 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:11:07.375 20:24:59 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:11:07.375 20:24:59 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # echo '' 00:11:07.375 20:24:59 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:11:07.375 20:24:59 bdev_raid.raid_function_test_concat -- 
bdev/nbd_common.sh@65 -- # true 00:11:07.375 20:24:59 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # count=0 00:11:07.375 20:24:59 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@66 -- # echo 0 00:11:07.375 20:24:59 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@105 -- # count=0 00:11:07.375 20:24:59 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@106 -- # '[' 0 -ne 0 ']' 00:11:07.375 20:24:59 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@110 -- # killprocess 1346565 00:11:07.375 20:24:59 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@948 -- # '[' -z 1346565 ']' 00:11:07.375 20:24:59 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@952 -- # kill -0 1346565 00:11:07.375 20:24:59 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@953 -- # uname 00:11:07.375 20:24:59 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:07.375 20:24:59 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1346565 00:11:07.634 20:24:59 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:07.634 20:24:59 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:07.634 20:24:59 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1346565' 00:11:07.634 killing process with pid 1346565 00:11:07.634 20:24:59 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@967 -- # kill 1346565 00:11:07.634 [2024-07-15 20:24:59.781410] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:07.634 [2024-07-15 20:24:59.781480] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:07.634 [2024-07-15 20:24:59.781523] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to 
free all in destruct 00:11:07.634 [2024-07-15 20:24:59.781538] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x238abd0 name raid, state offline 00:11:07.634 20:24:59 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@972 -- # wait 1346565 00:11:07.634 [2024-07-15 20:24:59.798898] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:07.893 20:25:00 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@112 -- # return 0 00:11:07.893 00:11:07.893 real 0m3.481s 00:11:07.893 user 0m4.566s 00:11:07.893 sys 0m1.291s 00:11:07.893 20:25:00 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:07.893 20:25:00 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@10 -- # set +x 00:11:07.893 ************************************ 00:11:07.893 END TEST raid_function_test_concat 00:11:07.893 ************************************ 00:11:07.893 20:25:00 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:11:07.893 20:25:00 bdev_raid -- bdev/bdev_raid.sh@863 -- # run_test raid0_resize_test raid0_resize_test 00:11:07.893 20:25:00 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:11:07.893 20:25:00 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:07.893 20:25:00 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:07.893 ************************************ 00:11:07.893 START TEST raid0_resize_test 00:11:07.893 ************************************ 00:11:07.893 20:25:00 bdev_raid.raid0_resize_test -- common/autotest_common.sh@1123 -- # raid0_resize_test 00:11:07.893 20:25:00 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@347 -- # local blksize=512 00:11:07.893 20:25:00 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@348 -- # local bdev_size_mb=32 00:11:07.893 20:25:00 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@349 -- # local new_bdev_size_mb=64 00:11:07.893 20:25:00 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@350 -- # local 
blkcnt 00:11:07.893 20:25:00 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@351 -- # local raid_size_mb 00:11:07.893 20:25:00 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@352 -- # local new_raid_size_mb 00:11:07.893 20:25:00 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@355 -- # raid_pid=1347166 00:11:07.893 20:25:00 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@356 -- # echo 'Process raid pid: 1347166' 00:11:07.893 Process raid pid: 1347166 00:11:07.893 20:25:00 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@354 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:11:07.893 20:25:00 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@357 -- # waitforlisten 1347166 /var/tmp/spdk-raid.sock 00:11:07.893 20:25:00 bdev_raid.raid0_resize_test -- common/autotest_common.sh@829 -- # '[' -z 1347166 ']' 00:11:07.893 20:25:00 bdev_raid.raid0_resize_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:07.893 20:25:00 bdev_raid.raid0_resize_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:07.893 20:25:00 bdev_raid.raid0_resize_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:07.893 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:07.893 20:25:00 bdev_raid.raid0_resize_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:07.893 20:25:00 bdev_raid.raid0_resize_test -- common/autotest_common.sh@10 -- # set +x 00:11:07.893 [2024-07-15 20:25:00.174711] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
00:11:07.893 [2024-07-15 20:25:00.174775] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:11:08.152 [2024-07-15 20:25:00.304395] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:08.152 [2024-07-15 20:25:00.409995] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:08.152 [2024-07-15 20:25:00.475091] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:08.152 [2024-07-15 20:25:00.475129] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:09.088 20:25:01 bdev_raid.raid0_resize_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:09.088 20:25:01 bdev_raid.raid0_resize_test -- common/autotest_common.sh@862 -- # return 0 00:11:09.088 20:25:01 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@359 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_create Base_1 32 512 00:11:09.346 Base_1 00:11:09.346 20:25:01 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@360 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_create Base_2 32 512 00:11:09.605 Base_2 00:11:09.605 20:25:01 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@362 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r 0 -b 'Base_1 Base_2' -n Raid 00:11:10.173 [2024-07-15 20:25:02.365135] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:11:10.173 [2024-07-15 20:25:02.366547] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:11:10.173 [2024-07-15 20:25:02.366598] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xcfe780 00:11:10.173 [2024-07-15 20:25:02.366608] 
bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:11:10.173 [2024-07-15 20:25:02.366829] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x84a020 00:11:10.173 [2024-07-15 20:25:02.366937] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xcfe780 00:11:10.173 [2024-07-15 20:25:02.366947] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Raid, raid_bdev 0xcfe780 00:11:10.173 [2024-07-15 20:25:02.367061] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:10.173 20:25:02 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@365 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_resize Base_1 64 00:11:10.739 [2024-07-15 20:25:02.878471] bdev_raid.c:2262:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev 00:11:10.739 [2024-07-15 20:25:02.878494] bdev_raid.c:2275:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev 'Base_1' was resized: old size 65536, new size 131072 00:11:10.739 true 00:11:10.739 20:25:02 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@368 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:11:10.739 20:25:02 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@368 -- # jq '.[].num_blocks' 00:11:10.997 [2024-07-15 20:25:03.135295] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:10.997 20:25:03 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@368 -- # blkcnt=131072 00:11:10.997 20:25:03 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@369 -- # raid_size_mb=64 00:11:10.997 20:25:03 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@370 -- # '[' 64 '!=' 64 ']' 00:11:10.997 20:25:03 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@376 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_resize Base_2 64 00:11:11.255 
[2024-07-15 20:25:03.632606] bdev_raid.c:2262:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev 00:11:11.255 [2024-07-15 20:25:03.632638] bdev_raid.c:2275:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev 'Base_2' was resized: old size 65536, new size 131072 00:11:11.255 [2024-07-15 20:25:03.632666] bdev_raid.c:2289:raid_bdev_resize_base_bdev: *NOTICE*: raid bdev 'Raid': block count was changed from 131072 to 262144 00:11:11.514 true 00:11:11.514 20:25:03 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@379 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:11:11.514 20:25:03 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@379 -- # jq '.[].num_blocks' 00:11:11.514 [2024-07-15 20:25:03.893446] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:11.773 20:25:03 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@379 -- # blkcnt=262144 00:11:11.773 20:25:03 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@380 -- # raid_size_mb=128 00:11:11.773 20:25:03 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@381 -- # '[' 128 '!=' 128 ']' 00:11:11.773 20:25:03 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@386 -- # killprocess 1347166 00:11:11.773 20:25:03 bdev_raid.raid0_resize_test -- common/autotest_common.sh@948 -- # '[' -z 1347166 ']' 00:11:11.773 20:25:03 bdev_raid.raid0_resize_test -- common/autotest_common.sh@952 -- # kill -0 1347166 00:11:11.773 20:25:03 bdev_raid.raid0_resize_test -- common/autotest_common.sh@953 -- # uname 00:11:11.773 20:25:03 bdev_raid.raid0_resize_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:11.773 20:25:03 bdev_raid.raid0_resize_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1347166 00:11:11.773 20:25:03 bdev_raid.raid0_resize_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:11.773 20:25:03 bdev_raid.raid0_resize_test -- common/autotest_common.sh@958 -- # '[' 
reactor_0 = sudo ']' 00:11:11.773 20:25:03 bdev_raid.raid0_resize_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1347166' 00:11:11.773 killing process with pid 1347166 00:11:11.773 20:25:03 bdev_raid.raid0_resize_test -- common/autotest_common.sh@967 -- # kill 1347166 00:11:11.773 [2024-07-15 20:25:03.963020] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:11.773 [2024-07-15 20:25:03.963076] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:11.773 [2024-07-15 20:25:03.963116] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:11.773 [2024-07-15 20:25:03.963127] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xcfe780 name Raid, state offline 00:11:11.773 20:25:03 bdev_raid.raid0_resize_test -- common/autotest_common.sh@972 -- # wait 1347166 00:11:11.773 [2024-07-15 20:25:03.964518] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:12.033 20:25:04 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@388 -- # return 0 00:11:12.033 00:11:12.033 real 0m4.057s 00:11:12.033 user 0m6.600s 00:11:12.033 sys 0m0.785s 00:11:12.033 20:25:04 bdev_raid.raid0_resize_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:12.033 20:25:04 bdev_raid.raid0_resize_test -- common/autotest_common.sh@10 -- # set +x 00:11:12.033 ************************************ 00:11:12.033 END TEST raid0_resize_test 00:11:12.033 ************************************ 00:11:12.033 20:25:04 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:11:12.033 20:25:04 bdev_raid -- bdev/bdev_raid.sh@865 -- # for n in {2..4} 00:11:12.033 20:25:04 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:11:12.033 20:25:04 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid0 2 false 00:11:12.033 20:25:04 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 
-le 1 ']' 00:11:12.033 20:25:04 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:12.033 20:25:04 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:12.033 ************************************ 00:11:12.033 START TEST raid_state_function_test 00:11:12.033 ************************************ 00:11:12.033 20:25:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 2 false 00:11:12.033 20:25:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:11:12.033 20:25:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:11:12.033 20:25:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:11:12.033 20:25:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:11:12.033 20:25:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:11:12.033 20:25:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:12.033 20:25:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:11:12.033 20:25:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:12.033 20:25:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:12.033 20:25:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:11:12.033 20:25:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:12.033 20:25:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:12.033 20:25:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:11:12.033 20:25:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:11:12.033 20:25:04 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:11:12.033 20:25:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:11:12.033 20:25:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:11:12.033 20:25:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:11:12.033 20:25:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:11:12.033 20:25:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:11:12.033 20:25:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:11:12.033 20:25:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:11:12.033 20:25:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:11:12.033 20:25:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=1347730 00:11:12.033 20:25:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1347730' 00:11:12.033 Process raid pid: 1347730 00:11:12.033 20:25:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:11:12.033 20:25:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 1347730 /var/tmp/spdk-raid.sock 00:11:12.033 20:25:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 1347730 ']' 00:11:12.033 20:25:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:12.033 20:25:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:12.033 20:25:04 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:12.033 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:12.033 20:25:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:12.033 20:25:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:12.033 [2024-07-15 20:25:04.316082] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 00:11:12.033 [2024-07-15 20:25:04.316147] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:11:12.292 [2024-07-15 20:25:04.434571] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:12.292 [2024-07-15 20:25:04.534659] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:12.292 [2024-07-15 20:25:04.600728] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:12.292 [2024-07-15 20:25:04.600766] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:13.226 20:25:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:13.226 20:25:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:11:13.226 20:25:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:13.790 [2024-07-15 20:25:06.045420] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:13.790 [2024-07-15 20:25:06.045464] bdev_raid_rpc.c: 
311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:13.790 [2024-07-15 20:25:06.045475] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:13.790 [2024-07-15 20:25:06.045487] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:13.790 20:25:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:11:13.790 20:25:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:13.790 20:25:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:13.790 20:25:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:13.790 20:25:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:13.790 20:25:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:13.790 20:25:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:13.790 20:25:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:13.790 20:25:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:13.790 20:25:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:13.790 20:25:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:13.790 20:25:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:14.047 20:25:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:14.047 "name": "Existed_Raid", 00:11:14.047 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:11:14.047 "strip_size_kb": 64, 00:11:14.047 "state": "configuring", 00:11:14.047 "raid_level": "raid0", 00:11:14.047 "superblock": false, 00:11:14.047 "num_base_bdevs": 2, 00:11:14.047 "num_base_bdevs_discovered": 0, 00:11:14.047 "num_base_bdevs_operational": 2, 00:11:14.047 "base_bdevs_list": [ 00:11:14.047 { 00:11:14.047 "name": "BaseBdev1", 00:11:14.047 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:14.047 "is_configured": false, 00:11:14.047 "data_offset": 0, 00:11:14.047 "data_size": 0 00:11:14.047 }, 00:11:14.047 { 00:11:14.047 "name": "BaseBdev2", 00:11:14.047 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:14.047 "is_configured": false, 00:11:14.047 "data_offset": 0, 00:11:14.047 "data_size": 0 00:11:14.047 } 00:11:14.047 ] 00:11:14.047 }' 00:11:14.047 20:25:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:14.047 20:25:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:14.612 20:25:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:14.870 [2024-07-15 20:25:07.140190] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:14.870 [2024-07-15 20:25:07.140221] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x7b2a80 name Existed_Raid, state configuring 00:11:14.870 20:25:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:15.128 [2024-07-15 20:25:07.400883] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:15.128 [2024-07-15 20:25:07.400918] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't 
exist now 00:11:15.128 [2024-07-15 20:25:07.400933] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:15.128 [2024-07-15 20:25:07.400949] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:15.128 20:25:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:11:15.386 [2024-07-15 20:25:07.663506] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:15.386 BaseBdev1 00:11:15.386 20:25:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:11:15.386 20:25:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:11:15.386 20:25:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:15.386 20:25:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:11:15.386 20:25:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:15.386 20:25:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:15.386 20:25:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:15.644 20:25:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:11:15.902 [ 00:11:15.902 { 00:11:15.902 "name": "BaseBdev1", 00:11:15.902 "aliases": [ 00:11:15.902 "6e97e19f-464e-492c-ac2b-c631ebfbb2db" 00:11:15.902 ], 00:11:15.902 "product_name": "Malloc disk", 00:11:15.902 "block_size": 512, 00:11:15.902 "num_blocks": 65536, 
00:11:15.902 "uuid": "6e97e19f-464e-492c-ac2b-c631ebfbb2db", 00:11:15.902 "assigned_rate_limits": { 00:11:15.902 "rw_ios_per_sec": 0, 00:11:15.902 "rw_mbytes_per_sec": 0, 00:11:15.902 "r_mbytes_per_sec": 0, 00:11:15.902 "w_mbytes_per_sec": 0 00:11:15.902 }, 00:11:15.902 "claimed": true, 00:11:15.902 "claim_type": "exclusive_write", 00:11:15.902 "zoned": false, 00:11:15.902 "supported_io_types": { 00:11:15.902 "read": true, 00:11:15.902 "write": true, 00:11:15.902 "unmap": true, 00:11:15.902 "flush": true, 00:11:15.902 "reset": true, 00:11:15.902 "nvme_admin": false, 00:11:15.902 "nvme_io": false, 00:11:15.902 "nvme_io_md": false, 00:11:15.902 "write_zeroes": true, 00:11:15.902 "zcopy": true, 00:11:15.902 "get_zone_info": false, 00:11:15.902 "zone_management": false, 00:11:15.902 "zone_append": false, 00:11:15.902 "compare": false, 00:11:15.902 "compare_and_write": false, 00:11:15.902 "abort": true, 00:11:15.902 "seek_hole": false, 00:11:15.902 "seek_data": false, 00:11:15.902 "copy": true, 00:11:15.902 "nvme_iov_md": false 00:11:15.902 }, 00:11:15.902 "memory_domains": [ 00:11:15.902 { 00:11:15.902 "dma_device_id": "system", 00:11:15.902 "dma_device_type": 1 00:11:15.902 }, 00:11:15.902 { 00:11:15.902 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:15.902 "dma_device_type": 2 00:11:15.902 } 00:11:15.902 ], 00:11:15.902 "driver_specific": {} 00:11:15.902 } 00:11:15.902 ] 00:11:15.902 20:25:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:11:15.902 20:25:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:11:15.902 20:25:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:15.902 20:25:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:15.902 20:25:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 
00:11:15.902 20:25:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:11:15.902 20:25:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2
00:11:15.902 20:25:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:11:15.902 20:25:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:11:15.902 20:25:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:11:15.902 20:25:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:11:15.902 20:25:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:11:15.902 20:25:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:11:16.161 20:25:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:11:16.161 "name": "Existed_Raid",
00:11:16.161 "uuid": "00000000-0000-0000-0000-000000000000",
00:11:16.161 "strip_size_kb": 64,
00:11:16.161 "state": "configuring",
00:11:16.161 "raid_level": "raid0",
00:11:16.161 "superblock": false,
00:11:16.161 "num_base_bdevs": 2,
00:11:16.161 "num_base_bdevs_discovered": 1,
00:11:16.161 "num_base_bdevs_operational": 2,
00:11:16.161 "base_bdevs_list": [
00:11:16.161 {
00:11:16.161 "name": "BaseBdev1",
00:11:16.161 "uuid": "6e97e19f-464e-492c-ac2b-c631ebfbb2db",
00:11:16.161 "is_configured": true,
00:11:16.161 "data_offset": 0,
00:11:16.161 "data_size": 65536
00:11:16.161 },
00:11:16.161 {
00:11:16.161 "name": "BaseBdev2",
00:11:16.161 "uuid": "00000000-0000-0000-0000-000000000000",
00:11:16.161 "is_configured": false,
00:11:16.161 "data_offset": 0,
00:11:16.161 "data_size": 0
00:11:16.161 }
00:11:16.161 ]
00:11:16.161 }'
00:11:16.161 20:25:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:11:16.161 20:25:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:11:16.728 20:25:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid
00:11:16.987 [2024-07-15 20:25:09.255710] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid
00:11:16.987 [2024-07-15 20:25:09.255751] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x7b2350 name Existed_Raid, state configuring
00:11:16.987 20:25:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid
00:11:17.246 [2024-07-15 20:25:09.488364] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed
00:11:17.246 [2024-07-15 20:25:09.489914] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2
00:11:17.246 [2024-07-15 20:25:09.489951] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now
00:11:17.246 20:25:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 ))
00:11:17.246 20:25:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs ))
00:11:17.246 20:25:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2
00:11:17.246 20:25:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:11:17.247 20:25:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:11:17.247 20:25:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0
00:11:17.247 20:25:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:11:17.247 20:25:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2
00:11:17.247 20:25:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:11:17.247 20:25:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:11:17.247 20:25:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:11:17.247 20:25:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:11:17.247 20:25:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:11:17.247 20:25:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:11:17.505 20:25:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:11:17.505 "name": "Existed_Raid",
00:11:17.505 "uuid": "00000000-0000-0000-0000-000000000000",
00:11:17.505 "strip_size_kb": 64,
00:11:17.505 "state": "configuring",
00:11:17.505 "raid_level": "raid0",
00:11:17.505 "superblock": false,
00:11:17.505 "num_base_bdevs": 2,
00:11:17.505 "num_base_bdevs_discovered": 1,
00:11:17.505 "num_base_bdevs_operational": 2,
00:11:17.505 "base_bdevs_list": [
00:11:17.505 {
00:11:17.505 "name": "BaseBdev1",
00:11:17.505 "uuid": "6e97e19f-464e-492c-ac2b-c631ebfbb2db",
00:11:17.505 "is_configured": true,
00:11:17.505 "data_offset": 0,
00:11:17.505 "data_size": 65536
00:11:17.505 },
00:11:17.505 {
00:11:17.505 "name": "BaseBdev2",
00:11:17.505 "uuid": "00000000-0000-0000-0000-000000000000",
00:11:17.505 "is_configured": false,
00:11:17.505 "data_offset": 0,
00:11:17.505 "data_size": 0
00:11:17.505 }
00:11:17.506 ]
00:11:17.506 }'
00:11:17.506 20:25:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:11:17.506 20:25:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:11:18.073 20:25:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2
00:11:18.332 [2024-07-15 20:25:10.614794] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed
00:11:18.332 [2024-07-15 20:25:10.614830] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x7b3000
00:11:18.332 [2024-07-15 20:25:10.614839] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512
00:11:18.332 [2024-07-15 20:25:10.615031] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x6cd0c0
00:11:18.332 [2024-07-15 20:25:10.615155] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x7b3000
00:11:18.332 [2024-07-15 20:25:10.615165] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x7b3000
00:11:18.332 [2024-07-15 20:25:10.615326] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:11:18.332 BaseBdev2
00:11:18.332 20:25:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2
00:11:18.332 20:25:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2
00:11:18.332 20:25:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout=
00:11:18.332 20:25:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i
00:11:18.332 20:25:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]]
00:11:18.332 20:25:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000
00:11:18.332 20:25:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
00:11:18.590 20:25:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000
00:11:18.849 [
00:11:18.849 {
00:11:18.849 "name": "BaseBdev2",
00:11:18.849 "aliases": [
00:11:18.849 "88b81ecd-6e59-451c-9f18-a28473097391"
00:11:18.849 ],
00:11:18.849 "product_name": "Malloc disk",
00:11:18.849 "block_size": 512,
00:11:18.849 "num_blocks": 65536,
00:11:18.849 "uuid": "88b81ecd-6e59-451c-9f18-a28473097391",
00:11:18.849 "assigned_rate_limits": {
00:11:18.849 "rw_ios_per_sec": 0,
00:11:18.849 "rw_mbytes_per_sec": 0,
00:11:18.849 "r_mbytes_per_sec": 0,
00:11:18.849 "w_mbytes_per_sec": 0
00:11:18.849 },
00:11:18.849 "claimed": true,
00:11:18.849 "claim_type": "exclusive_write",
00:11:18.849 "zoned": false,
00:11:18.849 "supported_io_types": {
00:11:18.849 "read": true,
00:11:18.849 "write": true,
00:11:18.849 "unmap": true,
00:11:18.849 "flush": true,
00:11:18.849 "reset": true,
00:11:18.849 "nvme_admin": false,
00:11:18.849 "nvme_io": false,
00:11:18.849 "nvme_io_md": false,
00:11:18.849 "write_zeroes": true,
00:11:18.849 "zcopy": true,
00:11:18.849 "get_zone_info": false,
00:11:18.849 "zone_management": false,
00:11:18.849 "zone_append": false,
00:11:18.849 "compare": false,
00:11:18.849 "compare_and_write": false,
00:11:18.849 "abort": true,
00:11:18.849 "seek_hole": false,
00:11:18.849 "seek_data": false,
00:11:18.849 "copy": true,
00:11:18.849 "nvme_iov_md": false
00:11:18.849 },
00:11:18.849 "memory_domains": [
00:11:18.849 {
00:11:18.849 "dma_device_id": "system",
00:11:18.849 "dma_device_type": 1
00:11:18.849 },
00:11:18.849 {
00:11:18.849 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:11:18.849 "dma_device_type": 2
00:11:18.849 }
00:11:18.849 ],
00:11:18.849 "driver_specific": {}
00:11:18.849 }
00:11:18.849 ]
00:11:18.849 20:25:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0
00:11:18.849 20:25:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ ))
00:11:18.849 20:25:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs ))
00:11:18.849 20:25:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 2
00:11:18.849 20:25:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:11:18.849 20:25:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:11:18.849 20:25:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0
00:11:18.849 20:25:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:11:18.849 20:25:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2
00:11:18.849 20:25:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:11:18.849 20:25:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:11:18.849 20:25:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:11:18.849 20:25:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:11:18.849 20:25:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:11:18.849 20:25:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:11:19.108 20:25:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:11:19.108 "name": "Existed_Raid",
00:11:19.108 "uuid": "5d3c6ec8-21e5-48a4-96ae-cb091c9e1966",
00:11:19.108 "strip_size_kb": 64,
00:11:19.108 "state": "online",
00:11:19.108 "raid_level": "raid0",
00:11:19.108 "superblock": false,
00:11:19.108 "num_base_bdevs": 2,
00:11:19.108 "num_base_bdevs_discovered": 2,
00:11:19.108 "num_base_bdevs_operational": 2,
00:11:19.108 "base_bdevs_list": [
00:11:19.108 {
00:11:19.108 "name": "BaseBdev1",
00:11:19.108 "uuid": "6e97e19f-464e-492c-ac2b-c631ebfbb2db",
00:11:19.108 "is_configured": true,
00:11:19.108 "data_offset": 0,
00:11:19.108 "data_size": 65536
00:11:19.108 },
00:11:19.108 {
00:11:19.108 "name": "BaseBdev2",
00:11:19.108 "uuid": "88b81ecd-6e59-451c-9f18-a28473097391",
00:11:19.108 "is_configured": true,
00:11:19.108 "data_offset": 0,
00:11:19.108 "data_size": 65536
00:11:19.108 }
00:11:19.108 ]
00:11:19.108 }'
00:11:19.108 20:25:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:11:19.108 20:25:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:11:19.698 20:25:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid
00:11:19.698 20:25:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid
00:11:19.698 20:25:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info
00:11:19.698 20:25:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info
00:11:19.698 20:25:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names
00:11:19.698 20:25:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name
00:11:19.698 20:25:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid
00:11:19.698 20:25:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]'
00:11:19.957 [2024-07-15 20:25:12.203281] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json
00:11:19.957 20:25:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{
00:11:19.957 "name": "Existed_Raid",
00:11:19.957 "aliases": [
00:11:19.957 "5d3c6ec8-21e5-48a4-96ae-cb091c9e1966"
00:11:19.957 ],
00:11:19.957 "product_name": "Raid Volume",
00:11:19.957 "block_size": 512,
00:11:19.957 "num_blocks": 131072,
00:11:19.957 "uuid": "5d3c6ec8-21e5-48a4-96ae-cb091c9e1966",
00:11:19.957 "assigned_rate_limits": {
00:11:19.957 "rw_ios_per_sec": 0,
00:11:19.957 "rw_mbytes_per_sec": 0,
00:11:19.957 "r_mbytes_per_sec": 0,
00:11:19.957 "w_mbytes_per_sec": 0
00:11:19.957 },
00:11:19.957 "claimed": false,
00:11:19.957 "zoned": false,
00:11:19.957 "supported_io_types": {
00:11:19.957 "read": true,
00:11:19.957 "write": true,
00:11:19.957 "unmap": true,
00:11:19.957 "flush": true,
00:11:19.957 "reset": true,
00:11:19.957 "nvme_admin": false,
00:11:19.957 "nvme_io": false,
00:11:19.957 "nvme_io_md": false,
00:11:19.957 "write_zeroes": true,
00:11:19.957 "zcopy": false,
00:11:19.957 "get_zone_info": false,
00:11:19.957 "zone_management": false,
00:11:19.957 "zone_append": false,
00:11:19.957 "compare": false,
00:11:19.957 "compare_and_write": false,
00:11:19.957 "abort": false,
00:11:19.957 "seek_hole": false,
00:11:19.957 "seek_data": false,
00:11:19.957 "copy": false,
00:11:19.957 "nvme_iov_md": false
00:11:19.957 },
00:11:19.957 "memory_domains": [
00:11:19.957 {
00:11:19.957 "dma_device_id": "system",
00:11:19.957 "dma_device_type": 1
00:11:19.957 },
00:11:19.957 {
00:11:19.957 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:11:19.957 "dma_device_type": 2
00:11:19.957 },
00:11:19.957 {
00:11:19.957 "dma_device_id": "system",
00:11:19.957 "dma_device_type": 1
00:11:19.957 },
00:11:19.957 {
00:11:19.957 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:11:19.957 "dma_device_type": 2
00:11:19.957 }
00:11:19.957 ],
00:11:19.957 "driver_specific": {
00:11:19.957 "raid": {
00:11:19.957 "uuid": "5d3c6ec8-21e5-48a4-96ae-cb091c9e1966",
00:11:19.957 "strip_size_kb": 64,
00:11:19.957 "state": "online",
00:11:19.957 "raid_level": "raid0",
00:11:19.957 "superblock": false,
00:11:19.957 "num_base_bdevs": 2,
00:11:19.957 "num_base_bdevs_discovered": 2,
00:11:19.957 "num_base_bdevs_operational": 2,
00:11:19.957 "base_bdevs_list": [
00:11:19.957 {
00:11:19.957 "name": "BaseBdev1",
00:11:19.957 "uuid": "6e97e19f-464e-492c-ac2b-c631ebfbb2db",
00:11:19.957 "is_configured": true,
00:11:19.957 "data_offset": 0,
00:11:19.957 "data_size": 65536
00:11:19.957 },
00:11:19.957 {
00:11:19.957 "name": "BaseBdev2",
00:11:19.957 "uuid": "88b81ecd-6e59-451c-9f18-a28473097391",
00:11:19.957 "is_configured": true,
00:11:19.957 "data_offset": 0,
00:11:19.957 "data_size": 65536
00:11:19.957 }
00:11:19.957 ]
00:11:19.957 }
00:11:19.957 }
00:11:19.957 }'
00:11:19.957 20:25:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name'
00:11:19.957 20:25:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1
00:11:19.957 BaseBdev2'
00:11:19.957 20:25:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names
00:11:19.957 20:25:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1
00:11:19.957 20:25:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]'
00:11:20.215 20:25:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{
00:11:20.215 "name": "BaseBdev1",
00:11:20.215 "aliases": [
00:11:20.215 "6e97e19f-464e-492c-ac2b-c631ebfbb2db"
00:11:20.215 ],
00:11:20.215 "product_name": "Malloc disk",
00:11:20.215 "block_size": 512,
00:11:20.215 "num_blocks": 65536,
00:11:20.215 "uuid": "6e97e19f-464e-492c-ac2b-c631ebfbb2db",
00:11:20.215 "assigned_rate_limits": {
00:11:20.215 "rw_ios_per_sec": 0,
00:11:20.215 "rw_mbytes_per_sec": 0,
00:11:20.215 "r_mbytes_per_sec": 0,
00:11:20.215 "w_mbytes_per_sec": 0
00:11:20.215 },
00:11:20.215 "claimed": true,
00:11:20.215 "claim_type": "exclusive_write",
00:11:20.215 "zoned": false,
00:11:20.215 "supported_io_types": {
00:11:20.215 "read": true,
00:11:20.215 "write": true,
00:11:20.215 "unmap": true,
00:11:20.215 "flush": true,
00:11:20.215 "reset": true,
00:11:20.215 "nvme_admin": false,
00:11:20.215 "nvme_io": false,
00:11:20.215 "nvme_io_md": false,
00:11:20.215 "write_zeroes": true,
00:11:20.215 "zcopy": true,
00:11:20.215 "get_zone_info": false,
00:11:20.215 "zone_management": false,
00:11:20.215 "zone_append": false,
00:11:20.215 "compare": false,
00:11:20.215 "compare_and_write": false,
00:11:20.215 "abort": true,
00:11:20.215 "seek_hole": false,
00:11:20.215 "seek_data": false,
00:11:20.215 "copy": true,
00:11:20.215 "nvme_iov_md": false
00:11:20.215 },
00:11:20.215 "memory_domains": [
00:11:20.215 {
00:11:20.215 "dma_device_id": "system",
00:11:20.215 "dma_device_type": 1
00:11:20.215 },
00:11:20.215 {
00:11:20.215 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:11:20.215 "dma_device_type": 2
00:11:20.215 }
00:11:20.215 ],
00:11:20.215 "driver_specific": {}
00:11:20.215 }'
00:11:20.215 20:25:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size
00:11:20.215 20:25:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size
00:11:20.474 20:25:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]]
00:11:20.474 20:25:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size
00:11:20.474 20:25:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size
00:11:20.474 20:25:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]]
00:11:20.474 20:25:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave
00:11:20.474 20:25:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave
00:11:20.474 20:25:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]]
00:11:20.474 20:25:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type
00:11:20.474 20:25:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type
00:11:20.732 20:25:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]]
00:11:20.732 20:25:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names
00:11:20.732 20:25:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2
00:11:20.732 20:25:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]'
00:11:20.991 20:25:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{
00:11:20.991 "name": "BaseBdev2",
00:11:20.991 "aliases": [
00:11:20.991 "88b81ecd-6e59-451c-9f18-a28473097391"
00:11:20.991 ],
00:11:20.991 "product_name": "Malloc disk",
00:11:20.991 "block_size": 512,
00:11:20.991 "num_blocks": 65536,
00:11:20.991 "uuid": "88b81ecd-6e59-451c-9f18-a28473097391",
00:11:20.991 "assigned_rate_limits": {
00:11:20.991 "rw_ios_per_sec": 0,
00:11:20.991 "rw_mbytes_per_sec": 0,
00:11:20.991 "r_mbytes_per_sec": 0,
00:11:20.991 "w_mbytes_per_sec": 0
00:11:20.991 },
00:11:20.991 "claimed": true,
00:11:20.991 "claim_type": "exclusive_write",
00:11:20.991 "zoned": false,
00:11:20.991 "supported_io_types": {
00:11:20.991 "read": true,
00:11:20.991 "write": true,
00:11:20.991 "unmap": true,
00:11:20.991 "flush": true,
00:11:20.991 "reset": true,
00:11:20.991 "nvme_admin": false,
00:11:20.991 "nvme_io": false,
00:11:20.991 "nvme_io_md": false,
00:11:20.991 "write_zeroes": true,
00:11:20.991 "zcopy": true,
00:11:20.991 "get_zone_info": false,
00:11:20.991 "zone_management": false,
00:11:20.991 "zone_append": false,
00:11:20.991 "compare": false,
00:11:20.991 "compare_and_write": false,
00:11:20.991 "abort": true,
00:11:20.991 "seek_hole": false,
00:11:20.991 "seek_data": false,
00:11:20.991 "copy": true,
00:11:20.991 "nvme_iov_md": false
00:11:20.991 },
00:11:20.991 "memory_domains": [
00:11:20.991 {
00:11:20.991 "dma_device_id": "system",
00:11:20.991 "dma_device_type": 1
00:11:20.991 },
00:11:20.991 {
00:11:20.991 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:11:20.991 "dma_device_type": 2
00:11:20.991 }
00:11:20.991 ],
00:11:20.991 "driver_specific": {}
00:11:20.991 }'
00:11:20.991 20:25:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size
00:11:20.991 20:25:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size
00:11:20.991 20:25:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]]
00:11:20.991 20:25:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size
00:11:20.991 20:25:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size
00:11:20.991 20:25:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]]
00:11:20.991 20:25:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave
00:11:20.991 20:25:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave
00:11:21.248 20:25:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]]
00:11:21.248 20:25:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type
00:11:21.248 20:25:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type
00:11:21.248 20:25:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]]
00:11:21.248 20:25:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1
00:11:21.507 [2024-07-15 20:25:13.707222] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1
00:11:21.507 [2024-07-15 20:25:13.707247] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline
00:11:21.507 [2024-07-15 20:25:13.707288] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct
00:11:21.507 20:25:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state
00:11:21.507 20:25:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0
00:11:21.507 20:25:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in
00:11:21.507 20:25:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1
00:11:21.507 20:25:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline
00:11:21.507 20:25:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 1
00:11:21.507 20:25:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:11:21.507 20:25:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline
00:11:21.507 20:25:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0
00:11:21.507 20:25:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:11:21.507 20:25:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1
00:11:21.507 20:25:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:11:21.507 20:25:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:11:21.507 20:25:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:11:21.507 20:25:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:11:21.507 20:25:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:11:21.507 20:25:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:11:21.766 20:25:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:11:21.766 "name": "Existed_Raid",
00:11:21.766 "uuid": "5d3c6ec8-21e5-48a4-96ae-cb091c9e1966",
00:11:21.766 "strip_size_kb": 64,
00:11:21.766 "state": "offline",
00:11:21.766 "raid_level": "raid0",
00:11:21.766 "superblock": false,
00:11:21.766 "num_base_bdevs": 2,
00:11:21.766 "num_base_bdevs_discovered": 1,
00:11:21.766 "num_base_bdevs_operational": 1,
00:11:21.766 "base_bdevs_list": [
00:11:21.766 {
00:11:21.766 "name": null,
00:11:21.766 "uuid": "00000000-0000-0000-0000-000000000000",
00:11:21.766 "is_configured": false,
00:11:21.766 "data_offset": 0,
00:11:21.766 "data_size": 65536
00:11:21.766 },
00:11:21.766 {
00:11:21.766 "name": "BaseBdev2",
00:11:21.766 "uuid": "88b81ecd-6e59-451c-9f18-a28473097391",
00:11:21.766 "is_configured": true,
00:11:21.766 "data_offset": 0,
00:11:21.766 "data_size": 65536
00:11:21.766 }
00:11:21.766 ]
00:11:21.766 }'
00:11:21.766 20:25:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:11:21.766 20:25:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:11:22.333 20:25:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 ))
00:11:22.333 20:25:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs ))
00:11:22.333 20:25:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:11:22.333 20:25:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]'
00:11:22.592 20:25:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid
00:11:22.592 20:25:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']'
00:11:22.592 20:25:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2
00:11:22.851 [2024-07-15 20:25:15.059824] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2
00:11:22.851 [2024-07-15 20:25:15.059873] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x7b3000 name Existed_Raid, state offline
00:11:22.851 20:25:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ ))
00:11:22.851 20:25:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs ))
00:11:22.851 20:25:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:11:22.851 20:25:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)'
00:11:23.111 20:25:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev=
00:11:23.111 20:25:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']'
00:11:23.111 20:25:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']'
00:11:23.111 20:25:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 1347730
00:11:23.111 20:25:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 1347730 ']'
00:11:23.111 20:25:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 1347730
00:11:23.111 20:25:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname
00:11:23.111 20:25:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']'
00:11:23.111 20:25:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1347730
00:11:23.111 20:25:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0
00:11:23.111 20:25:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']'
00:11:23.111 20:25:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1347730'
killing process with pid 1347730
00:11:23.111 20:25:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 1347730
00:11:23.111 [2024-07-15 20:25:15.387317] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start
00:11:23.111 20:25:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 1347730
00:11:23.111 [2024-07-15 20:25:15.388184] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit
00:11:23.371 20:25:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0
00:11:23.371
00:11:23.371 real 0m11.348s
00:11:23.371 user 0m20.255s
00:11:23.371 sys 0m2.061s
00:11:23.371 20:25:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable
00:11:23.371 20:25:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:11:23.371 ************************************
00:11:23.371 END TEST raid_state_function_test
00:11:23.371 ************************************
00:11:23.371 20:25:15 bdev_raid -- common/autotest_common.sh@1142 -- # return 0
00:11:23.371 20:25:15 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 2 true
00:11:23.371 20:25:15 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']'
00:11:23.371 20:25:15 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable
00:11:23.371 20:25:15 bdev_raid -- common/autotest_common.sh@10 -- # set +x
00:11:23.371 ************************************
00:11:23.371 START TEST raid_state_function_test_sb
00:11:23.371 ************************************
00:11:23.371 20:25:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 2 true
00:11:23.371 20:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0
00:11:23.371 20:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2
00:11:23.371 20:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true
00:11:23.371 20:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev
00:11:23.371 20:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 ))
00:11:23.371 20:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs ))
00:11:23.371 20:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1
00:11:23.371 20:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ ))
00:11:23.371 20:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs ))
00:11:23.371 20:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2
00:11:23.371 20:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ ))
00:11:23.371 20:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs ))
00:11:23.371 20:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2')
00:11:23.371 20:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs
00:11:23.371 20:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid
00:11:23.371 20:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size
00:11:23.371 20:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg
00:11:23.371 20:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg
00:11:23.371 20:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']'
00:11:23.371 20:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64
00:11:23.371 20:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64'
00:11:23.371 20:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']'
00:11:23.371 20:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s
00:11:23.371 20:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=1349372
00:11:23.371 20:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid
00:11:23.371 20:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1349372'
Process raid pid: 1349372
00:11:23.371 20:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 1349372 /var/tmp/spdk-raid.sock
00:11:23.371 20:25:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 1349372 ']'
00:11:23.371 20:25:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock
00:11:23.371 20:25:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100
00:11:23.371 20:25:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...'
Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...
00:11:23.371 20:25:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable
00:11:23.371 20:25:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x
00:11:23.630 [2024-07-15 20:25:15.747392] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization...
00:11:23.630 [2024-07-15 20:25:15.747471] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ]
00:11:23.889 [2024-07-15 20:25:15.890976] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:11:24.011 [2024-07-15 20:25:15.999975] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:11:24.321 [2024-07-15 20:25:16.067910] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size
00:11:24.321 [2024-07-15 20:25:16.067972] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size
00:11:24.457 20:25:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 ))
00:11:24.457 20:25:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0
00:11:24.457 20:25:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- #
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:24.716 [2024-07-15 20:25:16.903216] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:24.716 [2024-07-15 20:25:16.903257] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:24.716 [2024-07-15 20:25:16.903268] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:24.716 [2024-07-15 20:25:16.903280] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:24.716 20:25:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:11:24.716 20:25:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:24.716 20:25:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:24.716 20:25:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:24.716 20:25:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:24.716 20:25:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:24.716 20:25:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:24.716 20:25:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:24.716 20:25:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:24.716 20:25:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:24.716 20:25:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:24.716 20:25:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:24.975 20:25:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:24.975 "name": "Existed_Raid", 00:11:24.975 "uuid": "0c2f6589-94ef-4267-b29c-4db481e4c565", 00:11:24.975 "strip_size_kb": 64, 00:11:24.975 "state": "configuring", 00:11:24.975 "raid_level": "raid0", 00:11:24.975 "superblock": true, 00:11:24.975 "num_base_bdevs": 2, 00:11:24.975 "num_base_bdevs_discovered": 0, 00:11:24.975 "num_base_bdevs_operational": 2, 00:11:24.975 "base_bdevs_list": [ 00:11:24.975 { 00:11:24.975 "name": "BaseBdev1", 00:11:24.975 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:24.975 "is_configured": false, 00:11:24.975 "data_offset": 0, 00:11:24.975 "data_size": 0 00:11:24.975 }, 00:11:24.975 { 00:11:24.975 "name": "BaseBdev2", 00:11:24.975 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:24.975 "is_configured": false, 00:11:24.975 "data_offset": 0, 00:11:24.975 "data_size": 0 00:11:24.975 } 00:11:24.975 ] 00:11:24.975 }' 00:11:24.975 20:25:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:24.975 20:25:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:25.543 20:25:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:25.803 [2024-07-15 20:25:17.961876] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:25.803 [2024-07-15 20:25:17.961905] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xfaaa80 name Existed_Raid, state configuring 00:11:25.803 20:25:17 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:26.062 [2024-07-15 20:25:18.214566] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:26.062 [2024-07-15 20:25:18.214595] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:26.062 [2024-07-15 20:25:18.214605] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:26.062 [2024-07-15 20:25:18.214616] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:26.062 20:25:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:11:26.321 [2024-07-15 20:25:18.485089] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:26.321 BaseBdev1 00:11:26.322 20:25:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:11:26.322 20:25:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:11:26.322 20:25:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:26.322 20:25:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:11:26.322 20:25:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:26.322 20:25:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:26.322 20:25:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:26.581 
20:25:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:11:26.840 [ 00:11:26.840 { 00:11:26.840 "name": "BaseBdev1", 00:11:26.840 "aliases": [ 00:11:26.840 "8875bccd-4c36-4ce1-b493-7828f4e6e0a9" 00:11:26.840 ], 00:11:26.840 "product_name": "Malloc disk", 00:11:26.840 "block_size": 512, 00:11:26.840 "num_blocks": 65536, 00:11:26.840 "uuid": "8875bccd-4c36-4ce1-b493-7828f4e6e0a9", 00:11:26.840 "assigned_rate_limits": { 00:11:26.840 "rw_ios_per_sec": 0, 00:11:26.840 "rw_mbytes_per_sec": 0, 00:11:26.840 "r_mbytes_per_sec": 0, 00:11:26.840 "w_mbytes_per_sec": 0 00:11:26.840 }, 00:11:26.840 "claimed": true, 00:11:26.840 "claim_type": "exclusive_write", 00:11:26.840 "zoned": false, 00:11:26.840 "supported_io_types": { 00:11:26.840 "read": true, 00:11:26.840 "write": true, 00:11:26.840 "unmap": true, 00:11:26.840 "flush": true, 00:11:26.840 "reset": true, 00:11:26.840 "nvme_admin": false, 00:11:26.840 "nvme_io": false, 00:11:26.840 "nvme_io_md": false, 00:11:26.840 "write_zeroes": true, 00:11:26.840 "zcopy": true, 00:11:26.840 "get_zone_info": false, 00:11:26.840 "zone_management": false, 00:11:26.840 "zone_append": false, 00:11:26.840 "compare": false, 00:11:26.840 "compare_and_write": false, 00:11:26.840 "abort": true, 00:11:26.840 "seek_hole": false, 00:11:26.840 "seek_data": false, 00:11:26.840 "copy": true, 00:11:26.840 "nvme_iov_md": false 00:11:26.840 }, 00:11:26.840 "memory_domains": [ 00:11:26.840 { 00:11:26.840 "dma_device_id": "system", 00:11:26.840 "dma_device_type": 1 00:11:26.840 }, 00:11:26.840 { 00:11:26.840 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:26.840 "dma_device_type": 2 00:11:26.840 } 00:11:26.840 ], 00:11:26.840 "driver_specific": {} 00:11:26.840 } 00:11:26.840 ] 00:11:26.840 20:25:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:11:26.840 
20:25:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:11:26.840 20:25:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:26.840 20:25:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:26.840 20:25:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:26.840 20:25:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:26.840 20:25:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:26.840 20:25:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:26.840 20:25:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:26.840 20:25:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:26.840 20:25:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:26.840 20:25:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:26.840 20:25:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:27.099 20:25:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:27.099 "name": "Existed_Raid", 00:11:27.099 "uuid": "7c828b38-6874-4fed-a20b-d8317efcc20c", 00:11:27.099 "strip_size_kb": 64, 00:11:27.099 "state": "configuring", 00:11:27.099 "raid_level": "raid0", 00:11:27.099 "superblock": true, 00:11:27.099 "num_base_bdevs": 2, 00:11:27.099 "num_base_bdevs_discovered": 1, 00:11:27.099 "num_base_bdevs_operational": 2, 00:11:27.099 
"base_bdevs_list": [ 00:11:27.099 { 00:11:27.099 "name": "BaseBdev1", 00:11:27.099 "uuid": "8875bccd-4c36-4ce1-b493-7828f4e6e0a9", 00:11:27.099 "is_configured": true, 00:11:27.099 "data_offset": 2048, 00:11:27.099 "data_size": 63488 00:11:27.099 }, 00:11:27.099 { 00:11:27.099 "name": "BaseBdev2", 00:11:27.099 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:27.099 "is_configured": false, 00:11:27.099 "data_offset": 0, 00:11:27.099 "data_size": 0 00:11:27.099 } 00:11:27.099 ] 00:11:27.099 }' 00:11:27.099 20:25:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:27.099 20:25:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:27.667 20:25:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:27.926 [2024-07-15 20:25:20.053276] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:27.926 [2024-07-15 20:25:20.053317] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xfaa350 name Existed_Raid, state configuring 00:11:27.926 20:25:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:28.185 [2024-07-15 20:25:20.314012] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:28.185 [2024-07-15 20:25:20.315502] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:28.185 [2024-07-15 20:25:20.315533] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:28.185 20:25:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:11:28.185 20:25:20 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:28.185 20:25:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:11:28.185 20:25:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:28.185 20:25:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:28.185 20:25:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:28.185 20:25:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:28.185 20:25:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:28.185 20:25:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:28.185 20:25:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:28.185 20:25:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:28.185 20:25:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:28.185 20:25:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:28.185 20:25:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:28.444 20:25:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:28.444 "name": "Existed_Raid", 00:11:28.444 "uuid": "fd3ea74f-3f77-4fed-8417-e5bb4befa803", 00:11:28.444 "strip_size_kb": 64, 00:11:28.444 "state": "configuring", 00:11:28.444 "raid_level": "raid0", 00:11:28.444 "superblock": true, 00:11:28.444 "num_base_bdevs": 2, 00:11:28.444 
"num_base_bdevs_discovered": 1, 00:11:28.444 "num_base_bdevs_operational": 2, 00:11:28.444 "base_bdevs_list": [ 00:11:28.444 { 00:11:28.444 "name": "BaseBdev1", 00:11:28.444 "uuid": "8875bccd-4c36-4ce1-b493-7828f4e6e0a9", 00:11:28.444 "is_configured": true, 00:11:28.444 "data_offset": 2048, 00:11:28.444 "data_size": 63488 00:11:28.444 }, 00:11:28.444 { 00:11:28.444 "name": "BaseBdev2", 00:11:28.444 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:28.444 "is_configured": false, 00:11:28.444 "data_offset": 0, 00:11:28.444 "data_size": 0 00:11:28.444 } 00:11:28.444 ] 00:11:28.444 }' 00:11:28.444 20:25:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:28.444 20:25:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:29.012 20:25:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:11:29.271 [2024-07-15 20:25:21.420366] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:29.271 [2024-07-15 20:25:21.420514] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xfab000 00:11:29.271 [2024-07-15 20:25:21.420528] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:11:29.271 [2024-07-15 20:25:21.420699] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xec50c0 00:11:29.271 [2024-07-15 20:25:21.420813] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xfab000 00:11:29.271 [2024-07-15 20:25:21.420823] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xfab000 00:11:29.271 [2024-07-15 20:25:21.420912] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:29.271 BaseBdev2 00:11:29.271 20:25:21 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:11:29.271 20:25:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:11:29.271 20:25:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:29.271 20:25:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:11:29.271 20:25:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:29.271 20:25:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:29.271 20:25:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:29.530 20:25:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:11:29.789 [ 00:11:29.789 { 00:11:29.789 "name": "BaseBdev2", 00:11:29.789 "aliases": [ 00:11:29.789 "596b562f-a8e9-483b-9f1c-03bff876b2c5" 00:11:29.789 ], 00:11:29.789 "product_name": "Malloc disk", 00:11:29.789 "block_size": 512, 00:11:29.789 "num_blocks": 65536, 00:11:29.789 "uuid": "596b562f-a8e9-483b-9f1c-03bff876b2c5", 00:11:29.789 "assigned_rate_limits": { 00:11:29.789 "rw_ios_per_sec": 0, 00:11:29.789 "rw_mbytes_per_sec": 0, 00:11:29.789 "r_mbytes_per_sec": 0, 00:11:29.789 "w_mbytes_per_sec": 0 00:11:29.789 }, 00:11:29.789 "claimed": true, 00:11:29.789 "claim_type": "exclusive_write", 00:11:29.789 "zoned": false, 00:11:29.789 "supported_io_types": { 00:11:29.789 "read": true, 00:11:29.789 "write": true, 00:11:29.789 "unmap": true, 00:11:29.789 "flush": true, 00:11:29.789 "reset": true, 00:11:29.789 "nvme_admin": false, 00:11:29.789 "nvme_io": false, 00:11:29.789 "nvme_io_md": false, 00:11:29.789 "write_zeroes": true, 
00:11:29.789 "zcopy": true, 00:11:29.789 "get_zone_info": false, 00:11:29.789 "zone_management": false, 00:11:29.789 "zone_append": false, 00:11:29.789 "compare": false, 00:11:29.789 "compare_and_write": false, 00:11:29.789 "abort": true, 00:11:29.789 "seek_hole": false, 00:11:29.789 "seek_data": false, 00:11:29.789 "copy": true, 00:11:29.789 "nvme_iov_md": false 00:11:29.789 }, 00:11:29.789 "memory_domains": [ 00:11:29.789 { 00:11:29.789 "dma_device_id": "system", 00:11:29.789 "dma_device_type": 1 00:11:29.789 }, 00:11:29.789 { 00:11:29.789 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:29.789 "dma_device_type": 2 00:11:29.789 } 00:11:29.789 ], 00:11:29.789 "driver_specific": {} 00:11:29.789 } 00:11:29.789 ] 00:11:29.789 20:25:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:11:29.789 20:25:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:11:29.789 20:25:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:29.789 20:25:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 2 00:11:29.789 20:25:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:29.790 20:25:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:29.790 20:25:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:29.790 20:25:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:29.790 20:25:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:29.790 20:25:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:29.790 20:25:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
00:11:29.790 20:25:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:29.790 20:25:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:29.790 20:25:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:29.790 20:25:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:30.049 20:25:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:30.049 "name": "Existed_Raid", 00:11:30.049 "uuid": "fd3ea74f-3f77-4fed-8417-e5bb4befa803", 00:11:30.049 "strip_size_kb": 64, 00:11:30.049 "state": "online", 00:11:30.049 "raid_level": "raid0", 00:11:30.049 "superblock": true, 00:11:30.049 "num_base_bdevs": 2, 00:11:30.049 "num_base_bdevs_discovered": 2, 00:11:30.049 "num_base_bdevs_operational": 2, 00:11:30.049 "base_bdevs_list": [ 00:11:30.049 { 00:11:30.049 "name": "BaseBdev1", 00:11:30.049 "uuid": "8875bccd-4c36-4ce1-b493-7828f4e6e0a9", 00:11:30.049 "is_configured": true, 00:11:30.049 "data_offset": 2048, 00:11:30.049 "data_size": 63488 00:11:30.049 }, 00:11:30.049 { 00:11:30.049 "name": "BaseBdev2", 00:11:30.049 "uuid": "596b562f-a8e9-483b-9f1c-03bff876b2c5", 00:11:30.049 "is_configured": true, 00:11:30.049 "data_offset": 2048, 00:11:30.049 "data_size": 63488 00:11:30.049 } 00:11:30.049 ] 00:11:30.049 }' 00:11:30.049 20:25:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:30.049 20:25:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:30.617 20:25:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:11:30.617 20:25:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local 
raid_bdev_name=Existed_Raid 00:11:30.617 20:25:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:30.617 20:25:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:30.617 20:25:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:30.617 20:25:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:11:30.617 20:25:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:30.617 20:25:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:11:30.875 [2024-07-15 20:25:23.028921] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:30.875 20:25:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:30.875 "name": "Existed_Raid", 00:11:30.875 "aliases": [ 00:11:30.875 "fd3ea74f-3f77-4fed-8417-e5bb4befa803" 00:11:30.875 ], 00:11:30.875 "product_name": "Raid Volume", 00:11:30.875 "block_size": 512, 00:11:30.875 "num_blocks": 126976, 00:11:30.875 "uuid": "fd3ea74f-3f77-4fed-8417-e5bb4befa803", 00:11:30.875 "assigned_rate_limits": { 00:11:30.875 "rw_ios_per_sec": 0, 00:11:30.875 "rw_mbytes_per_sec": 0, 00:11:30.875 "r_mbytes_per_sec": 0, 00:11:30.875 "w_mbytes_per_sec": 0 00:11:30.875 }, 00:11:30.875 "claimed": false, 00:11:30.875 "zoned": false, 00:11:30.875 "supported_io_types": { 00:11:30.875 "read": true, 00:11:30.876 "write": true, 00:11:30.876 "unmap": true, 00:11:30.876 "flush": true, 00:11:30.876 "reset": true, 00:11:30.876 "nvme_admin": false, 00:11:30.876 "nvme_io": false, 00:11:30.876 "nvme_io_md": false, 00:11:30.876 "write_zeroes": true, 00:11:30.876 "zcopy": false, 00:11:30.876 "get_zone_info": false, 00:11:30.876 "zone_management": false, 00:11:30.876 
"zone_append": false, 00:11:30.876 "compare": false, 00:11:30.876 "compare_and_write": false, 00:11:30.876 "abort": false, 00:11:30.876 "seek_hole": false, 00:11:30.876 "seek_data": false, 00:11:30.876 "copy": false, 00:11:30.876 "nvme_iov_md": false 00:11:30.876 }, 00:11:30.876 "memory_domains": [ 00:11:30.876 { 00:11:30.876 "dma_device_id": "system", 00:11:30.876 "dma_device_type": 1 00:11:30.876 }, 00:11:30.876 { 00:11:30.876 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:30.876 "dma_device_type": 2 00:11:30.876 }, 00:11:30.876 { 00:11:30.876 "dma_device_id": "system", 00:11:30.876 "dma_device_type": 1 00:11:30.876 }, 00:11:30.876 { 00:11:30.876 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:30.876 "dma_device_type": 2 00:11:30.876 } 00:11:30.876 ], 00:11:30.876 "driver_specific": { 00:11:30.876 "raid": { 00:11:30.876 "uuid": "fd3ea74f-3f77-4fed-8417-e5bb4befa803", 00:11:30.876 "strip_size_kb": 64, 00:11:30.876 "state": "online", 00:11:30.876 "raid_level": "raid0", 00:11:30.876 "superblock": true, 00:11:30.876 "num_base_bdevs": 2, 00:11:30.876 "num_base_bdevs_discovered": 2, 00:11:30.876 "num_base_bdevs_operational": 2, 00:11:30.876 "base_bdevs_list": [ 00:11:30.876 { 00:11:30.876 "name": "BaseBdev1", 00:11:30.876 "uuid": "8875bccd-4c36-4ce1-b493-7828f4e6e0a9", 00:11:30.876 "is_configured": true, 00:11:30.876 "data_offset": 2048, 00:11:30.876 "data_size": 63488 00:11:30.876 }, 00:11:30.876 { 00:11:30.876 "name": "BaseBdev2", 00:11:30.876 "uuid": "596b562f-a8e9-483b-9f1c-03bff876b2c5", 00:11:30.876 "is_configured": true, 00:11:30.876 "data_offset": 2048, 00:11:30.876 "data_size": 63488 00:11:30.876 } 00:11:30.876 ] 00:11:30.876 } 00:11:30.876 } 00:11:30.876 }' 00:11:30.876 20:25:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:30.876 20:25:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:11:30.876 
BaseBdev2' 00:11:30.876 20:25:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:30.876 20:25:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:11:30.876 20:25:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:31.134 20:25:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:31.134 "name": "BaseBdev1", 00:11:31.134 "aliases": [ 00:11:31.134 "8875bccd-4c36-4ce1-b493-7828f4e6e0a9" 00:11:31.134 ], 00:11:31.134 "product_name": "Malloc disk", 00:11:31.134 "block_size": 512, 00:11:31.134 "num_blocks": 65536, 00:11:31.134 "uuid": "8875bccd-4c36-4ce1-b493-7828f4e6e0a9", 00:11:31.134 "assigned_rate_limits": { 00:11:31.134 "rw_ios_per_sec": 0, 00:11:31.134 "rw_mbytes_per_sec": 0, 00:11:31.134 "r_mbytes_per_sec": 0, 00:11:31.134 "w_mbytes_per_sec": 0 00:11:31.134 }, 00:11:31.134 "claimed": true, 00:11:31.134 "claim_type": "exclusive_write", 00:11:31.134 "zoned": false, 00:11:31.134 "supported_io_types": { 00:11:31.134 "read": true, 00:11:31.134 "write": true, 00:11:31.134 "unmap": true, 00:11:31.134 "flush": true, 00:11:31.134 "reset": true, 00:11:31.134 "nvme_admin": false, 00:11:31.134 "nvme_io": false, 00:11:31.134 "nvme_io_md": false, 00:11:31.134 "write_zeroes": true, 00:11:31.134 "zcopy": true, 00:11:31.134 "get_zone_info": false, 00:11:31.134 "zone_management": false, 00:11:31.134 "zone_append": false, 00:11:31.134 "compare": false, 00:11:31.134 "compare_and_write": false, 00:11:31.134 "abort": true, 00:11:31.134 "seek_hole": false, 00:11:31.134 "seek_data": false, 00:11:31.134 "copy": true, 00:11:31.134 "nvme_iov_md": false 00:11:31.134 }, 00:11:31.134 "memory_domains": [ 00:11:31.134 { 00:11:31.134 "dma_device_id": "system", 00:11:31.134 "dma_device_type": 1 00:11:31.134 }, 00:11:31.134 { 
00:11:31.134 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:31.134 "dma_device_type": 2 00:11:31.134 } 00:11:31.134 ], 00:11:31.134 "driver_specific": {} 00:11:31.134 }' 00:11:31.134 20:25:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:31.134 20:25:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:31.134 20:25:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:31.134 20:25:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:31.134 20:25:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:31.392 20:25:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:31.392 20:25:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:31.392 20:25:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:31.392 20:25:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:31.392 20:25:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:31.392 20:25:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:31.392 20:25:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:31.392 20:25:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:31.392 20:25:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:11:31.392 20:25:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:31.650 20:25:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:31.650 "name": 
"BaseBdev2", 00:11:31.650 "aliases": [ 00:11:31.650 "596b562f-a8e9-483b-9f1c-03bff876b2c5" 00:11:31.650 ], 00:11:31.650 "product_name": "Malloc disk", 00:11:31.650 "block_size": 512, 00:11:31.650 "num_blocks": 65536, 00:11:31.650 "uuid": "596b562f-a8e9-483b-9f1c-03bff876b2c5", 00:11:31.650 "assigned_rate_limits": { 00:11:31.650 "rw_ios_per_sec": 0, 00:11:31.650 "rw_mbytes_per_sec": 0, 00:11:31.650 "r_mbytes_per_sec": 0, 00:11:31.650 "w_mbytes_per_sec": 0 00:11:31.650 }, 00:11:31.650 "claimed": true, 00:11:31.650 "claim_type": "exclusive_write", 00:11:31.650 "zoned": false, 00:11:31.650 "supported_io_types": { 00:11:31.650 "read": true, 00:11:31.650 "write": true, 00:11:31.650 "unmap": true, 00:11:31.650 "flush": true, 00:11:31.650 "reset": true, 00:11:31.650 "nvme_admin": false, 00:11:31.650 "nvme_io": false, 00:11:31.650 "nvme_io_md": false, 00:11:31.650 "write_zeroes": true, 00:11:31.650 "zcopy": true, 00:11:31.650 "get_zone_info": false, 00:11:31.650 "zone_management": false, 00:11:31.650 "zone_append": false, 00:11:31.650 "compare": false, 00:11:31.650 "compare_and_write": false, 00:11:31.650 "abort": true, 00:11:31.650 "seek_hole": false, 00:11:31.650 "seek_data": false, 00:11:31.650 "copy": true, 00:11:31.650 "nvme_iov_md": false 00:11:31.650 }, 00:11:31.650 "memory_domains": [ 00:11:31.650 { 00:11:31.650 "dma_device_id": "system", 00:11:31.650 "dma_device_type": 1 00:11:31.650 }, 00:11:31.650 { 00:11:31.650 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:31.650 "dma_device_type": 2 00:11:31.650 } 00:11:31.650 ], 00:11:31.650 "driver_specific": {} 00:11:31.650 }' 00:11:31.650 20:25:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:31.650 20:25:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:31.909 20:25:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:31.909 20:25:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 
-- # jq .md_size 00:11:31.909 20:25:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:31.909 20:25:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:31.909 20:25:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:31.909 20:25:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:31.909 20:25:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:31.909 20:25:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:31.909 20:25:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:32.167 20:25:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:32.167 20:25:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:11:32.167 [2024-07-15 20:25:24.540714] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:11:32.168 [2024-07-15 20:25:24.540741] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:32.168 [2024-07-15 20:25:24.540784] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:32.426 20:25:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:11:32.426 20:25:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:11:32.426 20:25:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:32.426 20:25:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:11:32.426 20:25:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:11:32.426 20:25:24 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 1 00:11:32.426 20:25:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:32.426 20:25:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:11:32.426 20:25:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:32.426 20:25:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:32.426 20:25:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:11:32.426 20:25:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:32.426 20:25:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:32.426 20:25:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:32.426 20:25:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:32.426 20:25:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:32.426 20:25:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:32.697 20:25:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:32.697 "name": "Existed_Raid", 00:11:32.697 "uuid": "fd3ea74f-3f77-4fed-8417-e5bb4befa803", 00:11:32.697 "strip_size_kb": 64, 00:11:32.697 "state": "offline", 00:11:32.697 "raid_level": "raid0", 00:11:32.697 "superblock": true, 00:11:32.697 "num_base_bdevs": 2, 00:11:32.697 "num_base_bdevs_discovered": 1, 00:11:32.697 "num_base_bdevs_operational": 1, 00:11:32.697 "base_bdevs_list": [ 
00:11:32.697 { 00:11:32.697 "name": null, 00:11:32.697 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:32.697 "is_configured": false, 00:11:32.697 "data_offset": 2048, 00:11:32.697 "data_size": 63488 00:11:32.697 }, 00:11:32.697 { 00:11:32.697 "name": "BaseBdev2", 00:11:32.697 "uuid": "596b562f-a8e9-483b-9f1c-03bff876b2c5", 00:11:32.697 "is_configured": true, 00:11:32.697 "data_offset": 2048, 00:11:32.697 "data_size": 63488 00:11:32.697 } 00:11:32.697 ] 00:11:32.697 }' 00:11:32.697 20:25:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:32.697 20:25:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:33.262 20:25:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:11:33.262 20:25:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:33.262 20:25:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:33.262 20:25:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:11:33.862 20:25:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:11:33.862 20:25:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:11:33.862 20:25:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:11:33.862 [2024-07-15 20:25:26.174614] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:11:33.862 [2024-07-15 20:25:26.174662] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xfab000 name Existed_Raid, state offline 00:11:33.862 20:25:26 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@285 -- # (( i++ )) 00:11:33.862 20:25:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:33.862 20:25:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:33.862 20:25:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:11:34.121 20:25:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:11:34.121 20:25:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:11:34.121 20:25:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:11:34.121 20:25:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 1349372 00:11:34.121 20:25:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 1349372 ']' 00:11:34.121 20:25:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 1349372 00:11:34.121 20:25:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:11:34.121 20:25:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:34.121 20:25:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1349372 00:11:34.380 20:25:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:34.380 20:25:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:34.380 20:25:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1349372' 00:11:34.380 killing process with pid 1349372 00:11:34.380 20:25:26 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@967 -- # kill 1349372 00:11:34.380 [2024-07-15 20:25:26.532327] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:34.380 20:25:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 1349372 00:11:34.380 [2024-07-15 20:25:26.533233] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:34.380 20:25:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:11:34.380 00:11:34.380 real 0m11.073s 00:11:34.380 user 0m19.691s 00:11:34.380 sys 0m2.063s 00:11:34.380 20:25:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:34.380 20:25:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:34.380 ************************************ 00:11:34.380 END TEST raid_state_function_test_sb 00:11:34.380 ************************************ 00:11:34.638 20:25:26 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:11:34.639 20:25:26 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid0 2 00:11:34.639 20:25:26 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:11:34.639 20:25:26 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:34.639 20:25:26 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:34.639 ************************************ 00:11:34.639 START TEST raid_superblock_test 00:11:34.639 ************************************ 00:11:34.639 20:25:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid0 2 00:11:34.639 20:25:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid0 00:11:34.639 20:25:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:11:34.639 20:25:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:11:34.639 20:25:26 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:11:34.639 20:25:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:11:34.639 20:25:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:11:34.639 20:25:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:11:34.639 20:25:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:11:34.639 20:25:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:11:34.639 20:25:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:11:34.639 20:25:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:11:34.639 20:25:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:11:34.639 20:25:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:11:34.639 20:25:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid0 '!=' raid1 ']' 00:11:34.639 20:25:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:11:34.639 20:25:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:11:34.639 20:25:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=1351068 00:11:34.639 20:25:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:11:34.639 20:25:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 1351068 /var/tmp/spdk-raid.sock 00:11:34.639 20:25:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 1351068 ']' 00:11:34.639 20:25:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local 
rpc_addr=/var/tmp/spdk-raid.sock 00:11:34.639 20:25:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:34.639 20:25:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:34.639 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:34.639 20:25:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:34.639 20:25:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:34.639 [2024-07-15 20:25:26.896475] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 00:11:34.639 [2024-07-15 20:25:26.896558] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1351068 ] 00:11:34.898 [2024-07-15 20:25:27.042225] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:34.898 [2024-07-15 20:25:27.148375] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:34.898 [2024-07-15 20:25:27.218966] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:34.898 [2024-07-15 20:25:27.219005] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:35.466 20:25:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:35.466 20:25:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:11:35.466 20:25:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:11:35.466 20:25:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:11:35.466 20:25:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local 
bdev_malloc=malloc1 00:11:35.466 20:25:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:11:35.466 20:25:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:11:35.466 20:25:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:11:35.466 20:25:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:11:35.466 20:25:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:11:35.466 20:25:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:11:35.725 malloc1 00:11:35.725 20:25:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:11:35.984 [2024-07-15 20:25:28.293743] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:11:35.984 [2024-07-15 20:25:28.293793] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:35.984 [2024-07-15 20:25:28.293815] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c42570 00:11:35.984 [2024-07-15 20:25:28.293827] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:35.984 [2024-07-15 20:25:28.295570] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:35.984 [2024-07-15 20:25:28.295603] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:11:35.984 pt1 00:11:35.984 20:25:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:11:35.984 20:25:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # 
(( i <= num_base_bdevs )) 00:11:35.984 20:25:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:11:35.984 20:25:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:11:35.984 20:25:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:11:35.984 20:25:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:11:35.984 20:25:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:11:35.984 20:25:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:11:35.984 20:25:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:11:36.244 malloc2 00:11:36.244 20:25:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:11:36.502 [2024-07-15 20:25:28.781027] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:11:36.502 [2024-07-15 20:25:28.781072] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:36.502 [2024-07-15 20:25:28.781090] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c43970 00:11:36.502 [2024-07-15 20:25:28.781102] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:36.502 [2024-07-15 20:25:28.782707] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:36.502 [2024-07-15 20:25:28.782737] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:11:36.502 pt2 00:11:36.502 20:25:28 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@415 -- # (( i++ )) 00:11:36.502 20:25:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:11:36.502 20:25:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2' -n raid_bdev1 -s 00:11:36.760 [2024-07-15 20:25:29.025697] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:11:36.760 [2024-07-15 20:25:29.027062] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:11:36.760 [2024-07-15 20:25:29.027208] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1de6270 00:11:36.760 [2024-07-15 20:25:29.027222] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:11:36.761 [2024-07-15 20:25:29.027423] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1ddbc10 00:11:36.761 [2024-07-15 20:25:29.027570] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1de6270 00:11:36.761 [2024-07-15 20:25:29.027580] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1de6270 00:11:36.761 [2024-07-15 20:25:29.027682] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:36.761 20:25:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:11:36.761 20:25:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:36.761 20:25:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:36.761 20:25:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:36.761 20:25:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:36.761 20:25:29 bdev_raid.raid_superblock_test 
-- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:36.761 20:25:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:36.761 20:25:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:36.761 20:25:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:36.761 20:25:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:36.761 20:25:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:36.761 20:25:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:37.019 20:25:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:37.019 "name": "raid_bdev1", 00:11:37.019 "uuid": "703edfff-31e2-48a9-a785-0b1106565922", 00:11:37.019 "strip_size_kb": 64, 00:11:37.019 "state": "online", 00:11:37.019 "raid_level": "raid0", 00:11:37.019 "superblock": true, 00:11:37.019 "num_base_bdevs": 2, 00:11:37.019 "num_base_bdevs_discovered": 2, 00:11:37.019 "num_base_bdevs_operational": 2, 00:11:37.019 "base_bdevs_list": [ 00:11:37.019 { 00:11:37.019 "name": "pt1", 00:11:37.019 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:37.019 "is_configured": true, 00:11:37.019 "data_offset": 2048, 00:11:37.019 "data_size": 63488 00:11:37.019 }, 00:11:37.019 { 00:11:37.019 "name": "pt2", 00:11:37.019 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:37.019 "is_configured": true, 00:11:37.019 "data_offset": 2048, 00:11:37.019 "data_size": 63488 00:11:37.019 } 00:11:37.019 ] 00:11:37.019 }' 00:11:37.019 20:25:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:37.019 20:25:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:37.587 20:25:29 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:11:37.588 20:25:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:11:37.588 20:25:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:37.588 20:25:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:37.588 20:25:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:37.588 20:25:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:11:37.588 20:25:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:37.588 20:25:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:37.847 [2024-07-15 20:25:30.140873] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:37.847 20:25:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:37.847 "name": "raid_bdev1", 00:11:37.847 "aliases": [ 00:11:37.847 "703edfff-31e2-48a9-a785-0b1106565922" 00:11:37.847 ], 00:11:37.847 "product_name": "Raid Volume", 00:11:37.847 "block_size": 512, 00:11:37.847 "num_blocks": 126976, 00:11:37.847 "uuid": "703edfff-31e2-48a9-a785-0b1106565922", 00:11:37.847 "assigned_rate_limits": { 00:11:37.847 "rw_ios_per_sec": 0, 00:11:37.847 "rw_mbytes_per_sec": 0, 00:11:37.847 "r_mbytes_per_sec": 0, 00:11:37.847 "w_mbytes_per_sec": 0 00:11:37.847 }, 00:11:37.847 "claimed": false, 00:11:37.847 "zoned": false, 00:11:37.847 "supported_io_types": { 00:11:37.847 "read": true, 00:11:37.847 "write": true, 00:11:37.847 "unmap": true, 00:11:37.847 "flush": true, 00:11:37.847 "reset": true, 00:11:37.847 "nvme_admin": false, 00:11:37.847 "nvme_io": false, 00:11:37.847 "nvme_io_md": false, 00:11:37.847 "write_zeroes": 
true, 00:11:37.847 "zcopy": false, 00:11:37.847 "get_zone_info": false, 00:11:37.847 "zone_management": false, 00:11:37.847 "zone_append": false, 00:11:37.847 "compare": false, 00:11:37.847 "compare_and_write": false, 00:11:37.847 "abort": false, 00:11:37.847 "seek_hole": false, 00:11:37.847 "seek_data": false, 00:11:37.847 "copy": false, 00:11:37.847 "nvme_iov_md": false 00:11:37.847 }, 00:11:37.847 "memory_domains": [ 00:11:37.847 { 00:11:37.847 "dma_device_id": "system", 00:11:37.847 "dma_device_type": 1 00:11:37.847 }, 00:11:37.847 { 00:11:37.847 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:37.847 "dma_device_type": 2 00:11:37.847 }, 00:11:37.847 { 00:11:37.847 "dma_device_id": "system", 00:11:37.847 "dma_device_type": 1 00:11:37.847 }, 00:11:37.847 { 00:11:37.847 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:37.847 "dma_device_type": 2 00:11:37.847 } 00:11:37.847 ], 00:11:37.847 "driver_specific": { 00:11:37.847 "raid": { 00:11:37.847 "uuid": "703edfff-31e2-48a9-a785-0b1106565922", 00:11:37.847 "strip_size_kb": 64, 00:11:37.847 "state": "online", 00:11:37.847 "raid_level": "raid0", 00:11:37.847 "superblock": true, 00:11:37.847 "num_base_bdevs": 2, 00:11:37.847 "num_base_bdevs_discovered": 2, 00:11:37.847 "num_base_bdevs_operational": 2, 00:11:37.847 "base_bdevs_list": [ 00:11:37.847 { 00:11:37.847 "name": "pt1", 00:11:37.847 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:37.847 "is_configured": true, 00:11:37.847 "data_offset": 2048, 00:11:37.847 "data_size": 63488 00:11:37.847 }, 00:11:37.847 { 00:11:37.847 "name": "pt2", 00:11:37.847 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:37.847 "is_configured": true, 00:11:37.847 "data_offset": 2048, 00:11:37.847 "data_size": 63488 00:11:37.847 } 00:11:37.847 ] 00:11:37.847 } 00:11:37.847 } 00:11:37.847 }' 00:11:37.847 20:25:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:38.106 20:25:30 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:11:38.106 pt2' 00:11:38.106 20:25:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:38.106 20:25:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:11:38.106 20:25:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:38.106 20:25:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:38.106 "name": "pt1", 00:11:38.106 "aliases": [ 00:11:38.106 "00000000-0000-0000-0000-000000000001" 00:11:38.106 ], 00:11:38.106 "product_name": "passthru", 00:11:38.106 "block_size": 512, 00:11:38.106 "num_blocks": 65536, 00:11:38.106 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:38.106 "assigned_rate_limits": { 00:11:38.106 "rw_ios_per_sec": 0, 00:11:38.106 "rw_mbytes_per_sec": 0, 00:11:38.106 "r_mbytes_per_sec": 0, 00:11:38.106 "w_mbytes_per_sec": 0 00:11:38.106 }, 00:11:38.106 "claimed": true, 00:11:38.106 "claim_type": "exclusive_write", 00:11:38.106 "zoned": false, 00:11:38.106 "supported_io_types": { 00:11:38.106 "read": true, 00:11:38.106 "write": true, 00:11:38.106 "unmap": true, 00:11:38.106 "flush": true, 00:11:38.106 "reset": true, 00:11:38.106 "nvme_admin": false, 00:11:38.106 "nvme_io": false, 00:11:38.106 "nvme_io_md": false, 00:11:38.106 "write_zeroes": true, 00:11:38.106 "zcopy": true, 00:11:38.106 "get_zone_info": false, 00:11:38.106 "zone_management": false, 00:11:38.106 "zone_append": false, 00:11:38.106 "compare": false, 00:11:38.106 "compare_and_write": false, 00:11:38.106 "abort": true, 00:11:38.106 "seek_hole": false, 00:11:38.106 "seek_data": false, 00:11:38.106 "copy": true, 00:11:38.106 "nvme_iov_md": false 00:11:38.106 }, 00:11:38.106 "memory_domains": [ 00:11:38.107 { 00:11:38.107 "dma_device_id": "system", 00:11:38.107 
"dma_device_type": 1 00:11:38.107 }, 00:11:38.107 { 00:11:38.107 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:38.107 "dma_device_type": 2 00:11:38.107 } 00:11:38.107 ], 00:11:38.107 "driver_specific": { 00:11:38.107 "passthru": { 00:11:38.107 "name": "pt1", 00:11:38.107 "base_bdev_name": "malloc1" 00:11:38.107 } 00:11:38.107 } 00:11:38.107 }' 00:11:38.107 20:25:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:38.366 20:25:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:38.366 20:25:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:38.366 20:25:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:38.366 20:25:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:38.366 20:25:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:38.366 20:25:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:38.625 20:25:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:38.625 20:25:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:38.625 20:25:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:38.625 20:25:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:38.625 20:25:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:38.625 20:25:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:38.625 20:25:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:11:38.625 20:25:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:39.193 20:25:31 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:39.193 "name": "pt2", 00:11:39.193 "aliases": [ 00:11:39.193 "00000000-0000-0000-0000-000000000002" 00:11:39.193 ], 00:11:39.193 "product_name": "passthru", 00:11:39.193 "block_size": 512, 00:11:39.193 "num_blocks": 65536, 00:11:39.193 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:39.193 "assigned_rate_limits": { 00:11:39.193 "rw_ios_per_sec": 0, 00:11:39.193 "rw_mbytes_per_sec": 0, 00:11:39.193 "r_mbytes_per_sec": 0, 00:11:39.193 "w_mbytes_per_sec": 0 00:11:39.193 }, 00:11:39.193 "claimed": true, 00:11:39.193 "claim_type": "exclusive_write", 00:11:39.193 "zoned": false, 00:11:39.193 "supported_io_types": { 00:11:39.193 "read": true, 00:11:39.193 "write": true, 00:11:39.193 "unmap": true, 00:11:39.193 "flush": true, 00:11:39.193 "reset": true, 00:11:39.193 "nvme_admin": false, 00:11:39.193 "nvme_io": false, 00:11:39.193 "nvme_io_md": false, 00:11:39.193 "write_zeroes": true, 00:11:39.193 "zcopy": true, 00:11:39.193 "get_zone_info": false, 00:11:39.193 "zone_management": false, 00:11:39.193 "zone_append": false, 00:11:39.193 "compare": false, 00:11:39.193 "compare_and_write": false, 00:11:39.193 "abort": true, 00:11:39.193 "seek_hole": false, 00:11:39.193 "seek_data": false, 00:11:39.193 "copy": true, 00:11:39.193 "nvme_iov_md": false 00:11:39.193 }, 00:11:39.194 "memory_domains": [ 00:11:39.194 { 00:11:39.194 "dma_device_id": "system", 00:11:39.194 "dma_device_type": 1 00:11:39.194 }, 00:11:39.194 { 00:11:39.194 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:39.194 "dma_device_type": 2 00:11:39.194 } 00:11:39.194 ], 00:11:39.194 "driver_specific": { 00:11:39.194 "passthru": { 00:11:39.194 "name": "pt2", 00:11:39.194 "base_bdev_name": "malloc2" 00:11:39.194 } 00:11:39.194 } 00:11:39.194 }' 00:11:39.194 20:25:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:39.194 20:25:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:39.194 20:25:31 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:39.194 20:25:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:39.453 20:25:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:39.453 20:25:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:39.453 20:25:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:39.453 20:25:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:39.453 20:25:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:39.453 20:25:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:39.453 20:25:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:39.712 20:25:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:39.712 20:25:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:39.712 20:25:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:11:39.712 [2024-07-15 20:25:32.057951] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:39.712 20:25:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=703edfff-31e2-48a9-a785-0b1106565922 00:11:39.712 20:25:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 703edfff-31e2-48a9-a785-0b1106565922 ']' 00:11:39.712 20:25:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:11:40.276 [2024-07-15 20:25:32.567050] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:40.276 
[2024-07-15 20:25:32.567072] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:40.276 [2024-07-15 20:25:32.567124] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:40.276 [2024-07-15 20:25:32.567168] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:40.276 [2024-07-15 20:25:32.567179] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1de6270 name raid_bdev1, state offline 00:11:40.276 20:25:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:40.276 20:25:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:11:40.842 20:25:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:11:40.842 20:25:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:11:40.842 20:25:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:11:40.843 20:25:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:11:41.409 20:25:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:11:41.409 20:25:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:11:41.975 20:25:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:11:41.975 20:25:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:11:42.539 20:25:34 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:11:42.539 20:25:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:11:42.539 20:25:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:11:42.539 20:25:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:11:42.539 20:25:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:42.539 20:25:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:11:42.539 20:25:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:42.539 20:25:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:11:42.539 20:25:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:42.539 20:25:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:11:42.539 20:25:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:42.539 20:25:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:11:42.539 20:25:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:11:42.796 [2024-07-15 20:25:35.145750] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:11:42.796 [2024-07-15 20:25:35.147098] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:11:42.796 [2024-07-15 20:25:35.147151] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:11:42.796 [2024-07-15 20:25:35.147191] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:11:42.796 [2024-07-15 20:25:35.147210] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:42.796 [2024-07-15 20:25:35.147221] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1de5ff0 name raid_bdev1, state configuring 00:11:42.796 request: 00:11:42.796 { 00:11:42.796 "name": "raid_bdev1", 00:11:42.796 "raid_level": "raid0", 00:11:42.796 "base_bdevs": [ 00:11:42.796 "malloc1", 00:11:42.796 "malloc2" 00:11:42.796 ], 00:11:42.796 "strip_size_kb": 64, 00:11:42.796 "superblock": false, 00:11:42.796 "method": "bdev_raid_create", 00:11:42.796 "req_id": 1 00:11:42.796 } 00:11:42.796 Got JSON-RPC error response 00:11:42.796 response: 00:11:42.796 { 00:11:42.796 "code": -17, 00:11:42.796 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:11:42.796 } 00:11:43.054 20:25:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:11:43.054 20:25:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:11:43.054 20:25:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:11:43.054 20:25:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:11:43.054 20:25:35 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:43.054 20:25:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:11:43.054 20:25:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:11:43.054 20:25:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:11:43.054 20:25:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:11:43.621 [2024-07-15 20:25:35.903677] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:11:43.621 [2024-07-15 20:25:35.903724] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:43.621 [2024-07-15 20:25:35.903746] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c427a0 00:11:43.621 [2024-07-15 20:25:35.903758] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:43.621 [2024-07-15 20:25:35.905385] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:43.621 [2024-07-15 20:25:35.905415] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:11:43.621 [2024-07-15 20:25:35.905478] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:11:43.621 [2024-07-15 20:25:35.905503] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:11:43.621 pt1 00:11:43.621 20:25:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 2 00:11:43.621 20:25:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:43.621 20:25:35 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:43.621 20:25:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:43.621 20:25:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:43.621 20:25:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:43.621 20:25:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:43.621 20:25:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:43.621 20:25:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:43.621 20:25:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:43.621 20:25:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:43.621 20:25:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:44.187 20:25:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:44.187 "name": "raid_bdev1", 00:11:44.187 "uuid": "703edfff-31e2-48a9-a785-0b1106565922", 00:11:44.187 "strip_size_kb": 64, 00:11:44.187 "state": "configuring", 00:11:44.187 "raid_level": "raid0", 00:11:44.187 "superblock": true, 00:11:44.187 "num_base_bdevs": 2, 00:11:44.187 "num_base_bdevs_discovered": 1, 00:11:44.187 "num_base_bdevs_operational": 2, 00:11:44.187 "base_bdevs_list": [ 00:11:44.187 { 00:11:44.187 "name": "pt1", 00:11:44.187 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:44.187 "is_configured": true, 00:11:44.187 "data_offset": 2048, 00:11:44.187 "data_size": 63488 00:11:44.187 }, 00:11:44.187 { 00:11:44.187 "name": null, 00:11:44.187 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:44.187 
"is_configured": false, 00:11:44.187 "data_offset": 2048, 00:11:44.187 "data_size": 63488 00:11:44.187 } 00:11:44.187 ] 00:11:44.188 }' 00:11:44.188 20:25:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:44.188 20:25:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:45.126 20:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:11:45.126 20:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:11:45.126 20:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:11:45.126 20:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:11:45.384 [2024-07-15 20:25:37.640337] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:11:45.384 [2024-07-15 20:25:37.640406] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:45.384 [2024-07-15 20:25:37.640434] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ddc820 00:11:45.384 [2024-07-15 20:25:37.640451] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:45.384 [2024-07-15 20:25:37.640972] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:45.384 [2024-07-15 20:25:37.641003] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:11:45.384 [2024-07-15 20:25:37.641109] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:11:45.384 [2024-07-15 20:25:37.641140] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:11:45.384 [2024-07-15 20:25:37.641276] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1c38ec0 00:11:45.384 [2024-07-15 
20:25:37.641288] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:11:45.384 [2024-07-15 20:25:37.641475] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c3b530 00:11:45.384 [2024-07-15 20:25:37.641609] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1c38ec0 00:11:45.384 [2024-07-15 20:25:37.641619] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1c38ec0 00:11:45.384 [2024-07-15 20:25:37.641731] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:45.384 pt2 00:11:45.384 20:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:11:45.384 20:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:11:45.384 20:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:11:45.384 20:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:45.384 20:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:45.384 20:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:45.384 20:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:45.384 20:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:45.384 20:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:45.384 20:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:45.384 20:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:45.384 20:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:45.384 20:25:37 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:45.384 20:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:45.642 20:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:45.642 "name": "raid_bdev1", 00:11:45.642 "uuid": "703edfff-31e2-48a9-a785-0b1106565922", 00:11:45.642 "strip_size_kb": 64, 00:11:45.642 "state": "online", 00:11:45.642 "raid_level": "raid0", 00:11:45.642 "superblock": true, 00:11:45.642 "num_base_bdevs": 2, 00:11:45.642 "num_base_bdevs_discovered": 2, 00:11:45.642 "num_base_bdevs_operational": 2, 00:11:45.642 "base_bdevs_list": [ 00:11:45.642 { 00:11:45.642 "name": "pt1", 00:11:45.642 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:45.642 "is_configured": true, 00:11:45.642 "data_offset": 2048, 00:11:45.642 "data_size": 63488 00:11:45.642 }, 00:11:45.642 { 00:11:45.642 "name": "pt2", 00:11:45.642 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:45.642 "is_configured": true, 00:11:45.642 "data_offset": 2048, 00:11:45.642 "data_size": 63488 00:11:45.642 } 00:11:45.642 ] 00:11:45.642 }' 00:11:45.642 20:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:45.642 20:25:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:46.208 20:25:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:11:46.208 20:25:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:11:46.208 20:25:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:46.208 20:25:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:46.208 20:25:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:46.208 20:25:38 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:11:46.208 20:25:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:46.208 20:25:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:46.469 [2024-07-15 20:25:38.695384] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:46.469 20:25:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:46.469 "name": "raid_bdev1", 00:11:46.469 "aliases": [ 00:11:46.469 "703edfff-31e2-48a9-a785-0b1106565922" 00:11:46.469 ], 00:11:46.469 "product_name": "Raid Volume", 00:11:46.469 "block_size": 512, 00:11:46.469 "num_blocks": 126976, 00:11:46.469 "uuid": "703edfff-31e2-48a9-a785-0b1106565922", 00:11:46.469 "assigned_rate_limits": { 00:11:46.469 "rw_ios_per_sec": 0, 00:11:46.469 "rw_mbytes_per_sec": 0, 00:11:46.469 "r_mbytes_per_sec": 0, 00:11:46.469 "w_mbytes_per_sec": 0 00:11:46.469 }, 00:11:46.469 "claimed": false, 00:11:46.469 "zoned": false, 00:11:46.469 "supported_io_types": { 00:11:46.469 "read": true, 00:11:46.469 "write": true, 00:11:46.469 "unmap": true, 00:11:46.469 "flush": true, 00:11:46.469 "reset": true, 00:11:46.469 "nvme_admin": false, 00:11:46.469 "nvme_io": false, 00:11:46.469 "nvme_io_md": false, 00:11:46.469 "write_zeroes": true, 00:11:46.469 "zcopy": false, 00:11:46.469 "get_zone_info": false, 00:11:46.469 "zone_management": false, 00:11:46.469 "zone_append": false, 00:11:46.469 "compare": false, 00:11:46.469 "compare_and_write": false, 00:11:46.469 "abort": false, 00:11:46.469 "seek_hole": false, 00:11:46.469 "seek_data": false, 00:11:46.469 "copy": false, 00:11:46.469 "nvme_iov_md": false 00:11:46.469 }, 00:11:46.469 "memory_domains": [ 00:11:46.469 { 00:11:46.469 "dma_device_id": "system", 00:11:46.469 "dma_device_type": 1 00:11:46.469 }, 00:11:46.469 { 
00:11:46.469 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:46.469 "dma_device_type": 2 00:11:46.469 }, 00:11:46.469 { 00:11:46.469 "dma_device_id": "system", 00:11:46.469 "dma_device_type": 1 00:11:46.469 }, 00:11:46.469 { 00:11:46.469 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:46.469 "dma_device_type": 2 00:11:46.469 } 00:11:46.469 ], 00:11:46.469 "driver_specific": { 00:11:46.469 "raid": { 00:11:46.469 "uuid": "703edfff-31e2-48a9-a785-0b1106565922", 00:11:46.469 "strip_size_kb": 64, 00:11:46.469 "state": "online", 00:11:46.469 "raid_level": "raid0", 00:11:46.469 "superblock": true, 00:11:46.469 "num_base_bdevs": 2, 00:11:46.469 "num_base_bdevs_discovered": 2, 00:11:46.469 "num_base_bdevs_operational": 2, 00:11:46.469 "base_bdevs_list": [ 00:11:46.469 { 00:11:46.469 "name": "pt1", 00:11:46.469 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:46.469 "is_configured": true, 00:11:46.469 "data_offset": 2048, 00:11:46.469 "data_size": 63488 00:11:46.469 }, 00:11:46.469 { 00:11:46.469 "name": "pt2", 00:11:46.469 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:46.469 "is_configured": true, 00:11:46.469 "data_offset": 2048, 00:11:46.469 "data_size": 63488 00:11:46.469 } 00:11:46.469 ] 00:11:46.469 } 00:11:46.469 } 00:11:46.469 }' 00:11:46.469 20:25:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:46.469 20:25:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:11:46.469 pt2' 00:11:46.469 20:25:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:46.469 20:25:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:11:46.469 20:25:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:46.727 20:25:38 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:46.727 "name": "pt1", 00:11:46.727 "aliases": [ 00:11:46.727 "00000000-0000-0000-0000-000000000001" 00:11:46.727 ], 00:11:46.727 "product_name": "passthru", 00:11:46.727 "block_size": 512, 00:11:46.727 "num_blocks": 65536, 00:11:46.727 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:46.727 "assigned_rate_limits": { 00:11:46.727 "rw_ios_per_sec": 0, 00:11:46.727 "rw_mbytes_per_sec": 0, 00:11:46.728 "r_mbytes_per_sec": 0, 00:11:46.728 "w_mbytes_per_sec": 0 00:11:46.728 }, 00:11:46.728 "claimed": true, 00:11:46.728 "claim_type": "exclusive_write", 00:11:46.728 "zoned": false, 00:11:46.728 "supported_io_types": { 00:11:46.728 "read": true, 00:11:46.728 "write": true, 00:11:46.728 "unmap": true, 00:11:46.728 "flush": true, 00:11:46.728 "reset": true, 00:11:46.728 "nvme_admin": false, 00:11:46.728 "nvme_io": false, 00:11:46.728 "nvme_io_md": false, 00:11:46.728 "write_zeroes": true, 00:11:46.728 "zcopy": true, 00:11:46.728 "get_zone_info": false, 00:11:46.728 "zone_management": false, 00:11:46.728 "zone_append": false, 00:11:46.728 "compare": false, 00:11:46.728 "compare_and_write": false, 00:11:46.728 "abort": true, 00:11:46.728 "seek_hole": false, 00:11:46.728 "seek_data": false, 00:11:46.728 "copy": true, 00:11:46.728 "nvme_iov_md": false 00:11:46.728 }, 00:11:46.728 "memory_domains": [ 00:11:46.728 { 00:11:46.728 "dma_device_id": "system", 00:11:46.728 "dma_device_type": 1 00:11:46.728 }, 00:11:46.728 { 00:11:46.728 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:46.728 "dma_device_type": 2 00:11:46.728 } 00:11:46.728 ], 00:11:46.728 "driver_specific": { 00:11:46.728 "passthru": { 00:11:46.728 "name": "pt1", 00:11:46.728 "base_bdev_name": "malloc1" 00:11:46.728 } 00:11:46.728 } 00:11:46.728 }' 00:11:46.728 20:25:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:46.728 20:25:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq 
.block_size 00:11:46.728 20:25:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:46.728 20:25:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:46.728 20:25:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:46.728 20:25:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:46.728 20:25:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:46.986 20:25:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:46.986 20:25:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:46.986 20:25:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:46.986 20:25:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:46.986 20:25:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:46.986 20:25:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:46.986 20:25:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:11:46.986 20:25:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:47.245 20:25:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:47.245 "name": "pt2", 00:11:47.245 "aliases": [ 00:11:47.245 "00000000-0000-0000-0000-000000000002" 00:11:47.245 ], 00:11:47.245 "product_name": "passthru", 00:11:47.245 "block_size": 512, 00:11:47.245 "num_blocks": 65536, 00:11:47.245 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:47.245 "assigned_rate_limits": { 00:11:47.245 "rw_ios_per_sec": 0, 00:11:47.245 "rw_mbytes_per_sec": 0, 00:11:47.245 "r_mbytes_per_sec": 0, 00:11:47.245 "w_mbytes_per_sec": 0 00:11:47.245 }, 
00:11:47.245 "claimed": true, 00:11:47.245 "claim_type": "exclusive_write", 00:11:47.245 "zoned": false, 00:11:47.245 "supported_io_types": { 00:11:47.245 "read": true, 00:11:47.245 "write": true, 00:11:47.245 "unmap": true, 00:11:47.245 "flush": true, 00:11:47.245 "reset": true, 00:11:47.245 "nvme_admin": false, 00:11:47.245 "nvme_io": false, 00:11:47.245 "nvme_io_md": false, 00:11:47.245 "write_zeroes": true, 00:11:47.245 "zcopy": true, 00:11:47.245 "get_zone_info": false, 00:11:47.245 "zone_management": false, 00:11:47.245 "zone_append": false, 00:11:47.245 "compare": false, 00:11:47.245 "compare_and_write": false, 00:11:47.245 "abort": true, 00:11:47.245 "seek_hole": false, 00:11:47.245 "seek_data": false, 00:11:47.245 "copy": true, 00:11:47.245 "nvme_iov_md": false 00:11:47.245 }, 00:11:47.245 "memory_domains": [ 00:11:47.245 { 00:11:47.245 "dma_device_id": "system", 00:11:47.245 "dma_device_type": 1 00:11:47.245 }, 00:11:47.245 { 00:11:47.245 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:47.245 "dma_device_type": 2 00:11:47.245 } 00:11:47.245 ], 00:11:47.245 "driver_specific": { 00:11:47.245 "passthru": { 00:11:47.245 "name": "pt2", 00:11:47.245 "base_bdev_name": "malloc2" 00:11:47.245 } 00:11:47.245 } 00:11:47.245 }' 00:11:47.245 20:25:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:47.245 20:25:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:47.245 20:25:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:47.245 20:25:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:47.503 20:25:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:47.503 20:25:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:47.503 20:25:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:47.503 20:25:39 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:47.503 20:25:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:47.503 20:25:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:47.503 20:25:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:47.503 20:25:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:47.503 20:25:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:47.503 20:25:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:11:47.761 [2024-07-15 20:25:40.087103] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:47.761 20:25:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 703edfff-31e2-48a9-a785-0b1106565922 '!=' 703edfff-31e2-48a9-a785-0b1106565922 ']' 00:11:47.761 20:25:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid0 00:11:47.761 20:25:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:47.761 20:25:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:11:47.761 20:25:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 1351068 00:11:47.761 20:25:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 1351068 ']' 00:11:47.761 20:25:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 1351068 00:11:47.761 20:25:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:11:47.761 20:25:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:47.761 20:25:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1351068 00:11:48.020 
20:25:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:48.020 20:25:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:48.020 20:25:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1351068' 00:11:48.020 killing process with pid 1351068 00:11:48.020 20:25:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 1351068 00:11:48.020 [2024-07-15 20:25:40.158797] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:48.020 [2024-07-15 20:25:40.158858] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:48.020 [2024-07-15 20:25:40.158911] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:48.020 [2024-07-15 20:25:40.158931] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c38ec0 name raid_bdev1, state offline 00:11:48.020 20:25:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 1351068 00:11:48.020 [2024-07-15 20:25:40.193418] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:48.280 20:25:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:11:48.280 00:11:48.280 real 0m13.764s 00:11:48.280 user 0m24.605s 00:11:48.280 sys 0m2.348s 00:11:48.280 20:25:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:48.280 20:25:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:48.280 ************************************ 00:11:48.280 END TEST raid_superblock_test 00:11:48.280 ************************************ 00:11:48.280 20:25:40 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:11:48.280 20:25:40 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid0 2 read 00:11:48.280 20:25:40 bdev_raid -- 
common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:11:48.280 20:25:40 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:48.280 20:25:40 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:48.571 ************************************ 00:11:48.571 START TEST raid_read_error_test 00:11:48.571 ************************************ 00:11:48.571 20:25:40 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 2 read 00:11:48.571 20:25:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:11:48.571 20:25:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:11:48.571 20:25:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:11:48.571 20:25:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:11:48.571 20:25:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:48.571 20:25:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:11:48.571 20:25:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:11:48.571 20:25:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:48.571 20:25:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:11:48.571 20:25:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:11:48.571 20:25:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:48.571 20:25:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:11:48.571 20:25:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:11:48.571 20:25:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:11:48.571 20:25:40 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@793 -- # local strip_size 00:11:48.571 20:25:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:11:48.571 20:25:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:11:48.571 20:25:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:11:48.571 20:25:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:11:48.571 20:25:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:11:48.571 20:25:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:11:48.571 20:25:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:11:48.571 20:25:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.qxOZhStZAo 00:11:48.571 20:25:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1353151 00:11:48.571 20:25:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1353151 /var/tmp/spdk-raid.sock 00:11:48.571 20:25:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:11:48.571 20:25:40 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 1353151 ']' 00:11:48.571 20:25:40 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:48.571 20:25:40 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:48.571 20:25:40 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
00:11:48.571 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:48.571 20:25:40 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:48.571 20:25:40 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:48.571 [2024-07-15 20:25:40.759961] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 00:11:48.571 [2024-07-15 20:25:40.760032] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1353151 ] 00:11:48.571 [2024-07-15 20:25:40.892458] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:48.837 [2024-07-15 20:25:40.996021] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:48.837 [2024-07-15 20:25:41.060036] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:48.837 [2024-07-15 20:25:41.060078] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:49.400 20:25:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:49.400 20:25:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:11:49.400 20:25:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:11:49.401 20:25:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:11:49.658 BaseBdev1_malloc 00:11:49.658 20:25:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:11:49.915 true 00:11:49.915 20:25:42 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:11:50.173 [2024-07-15 20:25:42.298244] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:11:50.173 [2024-07-15 20:25:42.298292] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:50.173 [2024-07-15 20:25:42.298312] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a060d0 00:11:50.173 [2024-07-15 20:25:42.298325] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:50.173 [2024-07-15 20:25:42.300029] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:50.173 [2024-07-15 20:25:42.300059] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:11:50.173 BaseBdev1 00:11:50.173 20:25:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:11:50.173 20:25:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:11:50.173 BaseBdev2_malloc 00:11:50.173 20:25:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:11:50.430 true 00:11:50.430 20:25:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:11:50.687 [2024-07-15 20:25:42.828199] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:11:50.687 [2024-07-15 20:25:42.828241] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:50.688 [2024-07-15 20:25:42.828262] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a0a910 00:11:50.688 [2024-07-15 20:25:42.828280] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:50.688 [2024-07-15 20:25:42.829768] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:50.688 [2024-07-15 20:25:42.829799] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:11:50.688 BaseBdev2 00:11:50.688 20:25:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:11:50.688 [2024-07-15 20:25:43.000681] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:50.688 [2024-07-15 20:25:43.001867] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:50.688 [2024-07-15 20:25:43.002062] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1a0c320 00:11:50.688 [2024-07-15 20:25:43.002076] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:11:50.688 [2024-07-15 20:25:43.002254] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a0b270 00:11:50.688 [2024-07-15 20:25:43.002395] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1a0c320 00:11:50.688 [2024-07-15 20:25:43.002405] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1a0c320 00:11:50.688 [2024-07-15 20:25:43.002502] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:50.688 20:25:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:11:50.688 20:25:43 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:50.688 20:25:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:50.688 20:25:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:50.688 20:25:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:50.688 20:25:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:50.688 20:25:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:50.688 20:25:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:50.688 20:25:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:50.688 20:25:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:50.688 20:25:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:50.688 20:25:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:50.945 20:25:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:50.945 "name": "raid_bdev1", 00:11:50.945 "uuid": "f0cfeb2a-bf84-48e5-8dd6-a39a9a5b8858", 00:11:50.945 "strip_size_kb": 64, 00:11:50.945 "state": "online", 00:11:50.945 "raid_level": "raid0", 00:11:50.945 "superblock": true, 00:11:50.945 "num_base_bdevs": 2, 00:11:50.945 "num_base_bdevs_discovered": 2, 00:11:50.945 "num_base_bdevs_operational": 2, 00:11:50.945 "base_bdevs_list": [ 00:11:50.945 { 00:11:50.945 "name": "BaseBdev1", 00:11:50.945 "uuid": "6df0a2cb-ec55-5099-973e-e4d4c55bf10e", 00:11:50.945 "is_configured": true, 00:11:50.945 "data_offset": 2048, 00:11:50.945 "data_size": 63488 00:11:50.945 }, 
00:11:50.945 { 00:11:50.945 "name": "BaseBdev2", 00:11:50.945 "uuid": "6285d41b-6f65-5c71-92d1-57c2e4696a3c", 00:11:50.945 "is_configured": true, 00:11:50.945 "data_offset": 2048, 00:11:50.945 "data_size": 63488 00:11:50.945 } 00:11:50.945 ] 00:11:50.945 }' 00:11:50.945 20:25:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:50.945 20:25:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:51.874 20:25:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:11:51.874 20:25:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:11:51.874 [2024-07-15 20:25:44.252296] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a079b0 00:11:52.821 20:25:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:11:53.081 20:25:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:11:53.081 20:25:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:11:53.081 20:25:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:11:53.081 20:25:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:11:53.081 20:25:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:53.081 20:25:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:53.081 20:25:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:53.081 20:25:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local 
strip_size=64 00:11:53.081 20:25:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:53.081 20:25:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:53.081 20:25:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:53.081 20:25:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:53.081 20:25:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:53.081 20:25:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:53.081 20:25:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:53.341 20:25:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:53.341 "name": "raid_bdev1", 00:11:53.341 "uuid": "f0cfeb2a-bf84-48e5-8dd6-a39a9a5b8858", 00:11:53.341 "strip_size_kb": 64, 00:11:53.341 "state": "online", 00:11:53.341 "raid_level": "raid0", 00:11:53.341 "superblock": true, 00:11:53.341 "num_base_bdevs": 2, 00:11:53.341 "num_base_bdevs_discovered": 2, 00:11:53.341 "num_base_bdevs_operational": 2, 00:11:53.341 "base_bdevs_list": [ 00:11:53.341 { 00:11:53.341 "name": "BaseBdev1", 00:11:53.341 "uuid": "6df0a2cb-ec55-5099-973e-e4d4c55bf10e", 00:11:53.341 "is_configured": true, 00:11:53.341 "data_offset": 2048, 00:11:53.341 "data_size": 63488 00:11:53.341 }, 00:11:53.341 { 00:11:53.341 "name": "BaseBdev2", 00:11:53.341 "uuid": "6285d41b-6f65-5c71-92d1-57c2e4696a3c", 00:11:53.341 "is_configured": true, 00:11:53.341 "data_offset": 2048, 00:11:53.341 "data_size": 63488 00:11:53.341 } 00:11:53.341 ] 00:11:53.341 }' 00:11:53.341 20:25:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:53.341 20:25:45 bdev_raid.raid_read_error_test 
-- common/autotest_common.sh@10 -- # set +x 00:11:53.908 20:25:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:11:54.166 [2024-07-15 20:25:46.392082] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:54.166 [2024-07-15 20:25:46.392146] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:54.166 [2024-07-15 20:25:46.395327] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:54.166 [2024-07-15 20:25:46.395361] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:54.166 [2024-07-15 20:25:46.395388] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:54.166 [2024-07-15 20:25:46.395400] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a0c320 name raid_bdev1, state offline 00:11:54.166 0 00:11:54.166 20:25:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1353151 00:11:54.166 20:25:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 1353151 ']' 00:11:54.166 20:25:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 1353151 00:11:54.166 20:25:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:11:54.166 20:25:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:54.166 20:25:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1353151 00:11:54.166 20:25:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:54.166 20:25:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:54.166 20:25:46 bdev_raid.raid_read_error_test -- 
common/autotest_common.sh@966 -- # echo 'killing process with pid 1353151' 00:11:54.166 killing process with pid 1353151 00:11:54.166 20:25:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 1353151 00:11:54.166 [2024-07-15 20:25:46.473357] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:54.166 20:25:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 1353151 00:11:54.166 [2024-07-15 20:25:46.484357] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:54.424 20:25:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.qxOZhStZAo 00:11:54.424 20:25:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:11:54.424 20:25:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:11:54.424 20:25:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.47 00:11:54.424 20:25:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:11:54.424 20:25:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:54.424 20:25:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:11:54.424 20:25:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.47 != \0\.\0\0 ]] 00:11:54.424 00:11:54.424 real 0m6.041s 00:11:54.424 user 0m9.436s 00:11:54.424 sys 0m1.078s 00:11:54.424 20:25:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:54.424 20:25:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:54.424 ************************************ 00:11:54.424 END TEST raid_read_error_test 00:11:54.424 ************************************ 00:11:54.424 20:25:46 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:11:54.424 20:25:46 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid0 2 write 
00:11:54.424 20:25:46 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:11:54.424 20:25:46 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:54.424 20:25:46 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:54.683 ************************************ 00:11:54.683 START TEST raid_write_error_test 00:11:54.683 ************************************ 00:11:54.683 20:25:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 2 write 00:11:54.683 20:25:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:11:54.683 20:25:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:11:54.683 20:25:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:11:54.683 20:25:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:11:54.683 20:25:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:54.683 20:25:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:11:54.683 20:25:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:11:54.683 20:25:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:54.683 20:25:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:11:54.683 20:25:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:11:54.683 20:25:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:54.683 20:25:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:11:54.683 20:25:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:11:54.683 20:25:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 
00:11:54.683 20:25:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:11:54.683 20:25:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:11:54.683 20:25:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:11:54.683 20:25:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:11:54.683 20:25:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:11:54.683 20:25:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:11:54.683 20:25:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:11:54.683 20:25:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:11:54.683 20:25:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.Zy3pm1Wxf5 00:11:54.683 20:25:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1353962 00:11:54.683 20:25:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1353962 /var/tmp/spdk-raid.sock 00:11:54.683 20:25:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:11:54.683 20:25:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 1353962 ']' 00:11:54.683 20:25:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:54.683 20:25:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:54.683 20:25:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
00:11:54.683 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:54.683 20:25:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:54.683 20:25:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:54.683 [2024-07-15 20:25:46.893119] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 00:11:54.683 [2024-07-15 20:25:46.893190] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1353962 ] 00:11:54.683 [2024-07-15 20:25:47.022660] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:54.941 [2024-07-15 20:25:47.132050] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:54.941 [2024-07-15 20:25:47.198664] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:54.941 [2024-07-15 20:25:47.198706] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:55.508 20:25:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:55.508 20:25:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:11:55.508 20:25:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:11:55.508 20:25:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:11:55.766 BaseBdev1_malloc 00:11:55.767 20:25:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:11:56.025 true 00:11:56.025 20:25:48 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:11:56.283 [2024-07-15 20:25:48.543009] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:11:56.283 [2024-07-15 20:25:48.543057] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:56.283 [2024-07-15 20:25:48.543080] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e5a0d0 00:11:56.283 [2024-07-15 20:25:48.543093] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:56.283 [2024-07-15 20:25:48.545004] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:56.283 [2024-07-15 20:25:48.545036] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:11:56.283 BaseBdev1 00:11:56.283 20:25:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:11:56.283 20:25:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:11:56.542 BaseBdev2_malloc 00:11:56.542 20:25:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:11:56.800 true 00:11:56.800 20:25:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:11:57.058 [2024-07-15 20:25:49.278866] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:11:57.058 [2024-07-15 20:25:49.278911] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:57.058 [2024-07-15 20:25:49.278938] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e5e910 00:11:57.058 [2024-07-15 20:25:49.278952] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:57.058 [2024-07-15 20:25:49.280504] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:57.058 [2024-07-15 20:25:49.280535] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:11:57.058 BaseBdev2 00:11:57.058 20:25:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:11:57.316 [2024-07-15 20:25:49.519538] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:57.316 [2024-07-15 20:25:49.520910] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:57.316 [2024-07-15 20:25:49.521113] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1e60320 00:11:57.316 [2024-07-15 20:25:49.521127] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:11:57.316 [2024-07-15 20:25:49.521330] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1e5f270 00:11:57.316 [2024-07-15 20:25:49.521477] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1e60320 00:11:57.316 [2024-07-15 20:25:49.521487] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1e60320 00:11:57.316 [2024-07-15 20:25:49.521595] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:57.316 20:25:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:11:57.316 20:25:49 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:57.316 20:25:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:57.316 20:25:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:57.316 20:25:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:57.316 20:25:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:57.316 20:25:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:57.316 20:25:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:57.316 20:25:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:57.316 20:25:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:57.316 20:25:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:57.316 20:25:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:57.574 20:25:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:57.574 "name": "raid_bdev1", 00:11:57.574 "uuid": "d3861650-0635-42d2-b6ef-509e301b86ef", 00:11:57.574 "strip_size_kb": 64, 00:11:57.574 "state": "online", 00:11:57.574 "raid_level": "raid0", 00:11:57.574 "superblock": true, 00:11:57.574 "num_base_bdevs": 2, 00:11:57.574 "num_base_bdevs_discovered": 2, 00:11:57.574 "num_base_bdevs_operational": 2, 00:11:57.574 "base_bdevs_list": [ 00:11:57.574 { 00:11:57.574 "name": "BaseBdev1", 00:11:57.574 "uuid": "13ba2ed6-42f9-5c0b-8f30-7e2431370416", 00:11:57.574 "is_configured": true, 00:11:57.574 "data_offset": 2048, 00:11:57.574 "data_size": 63488 00:11:57.574 
}, 00:11:57.574 { 00:11:57.574 "name": "BaseBdev2", 00:11:57.574 "uuid": "301f741d-1347-50c7-9c45-c595e15f0fa6", 00:11:57.574 "is_configured": true, 00:11:57.574 "data_offset": 2048, 00:11:57.574 "data_size": 63488 00:11:57.574 } 00:11:57.574 ] 00:11:57.574 }' 00:11:57.574 20:25:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:57.574 20:25:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:58.140 20:25:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:11:58.140 20:25:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:11:58.140 [2024-07-15 20:25:50.498409] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1e5b9b0 00:11:59.074 20:25:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:11:59.332 20:25:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:11:59.332 20:25:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:11:59.332 20:25:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:11:59.332 20:25:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:11:59.332 20:25:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:59.332 20:25:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:59.332 20:25:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:59.332 20:25:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # 
local strip_size=64 00:11:59.332 20:25:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:59.332 20:25:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:59.333 20:25:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:59.333 20:25:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:59.333 20:25:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:59.333 20:25:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:59.333 20:25:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:59.591 20:25:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:59.591 "name": "raid_bdev1", 00:11:59.591 "uuid": "d3861650-0635-42d2-b6ef-509e301b86ef", 00:11:59.591 "strip_size_kb": 64, 00:11:59.591 "state": "online", 00:11:59.591 "raid_level": "raid0", 00:11:59.591 "superblock": true, 00:11:59.591 "num_base_bdevs": 2, 00:11:59.591 "num_base_bdevs_discovered": 2, 00:11:59.591 "num_base_bdevs_operational": 2, 00:11:59.591 "base_bdevs_list": [ 00:11:59.591 { 00:11:59.591 "name": "BaseBdev1", 00:11:59.591 "uuid": "13ba2ed6-42f9-5c0b-8f30-7e2431370416", 00:11:59.591 "is_configured": true, 00:11:59.591 "data_offset": 2048, 00:11:59.591 "data_size": 63488 00:11:59.591 }, 00:11:59.591 { 00:11:59.591 "name": "BaseBdev2", 00:11:59.591 "uuid": "301f741d-1347-50c7-9c45-c595e15f0fa6", 00:11:59.591 "is_configured": true, 00:11:59.591 "data_offset": 2048, 00:11:59.591 "data_size": 63488 00:11:59.591 } 00:11:59.591 ] 00:11:59.591 }' 00:11:59.591 20:25:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:59.591 20:25:51 
bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:00.526 20:25:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:01.092 [2024-07-15 20:25:53.286298] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:01.092 [2024-07-15 20:25:53.286344] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:01.092 [2024-07-15 20:25:53.289516] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:01.092 [2024-07-15 20:25:53.289545] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:01.092 [2024-07-15 20:25:53.289573] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:01.092 [2024-07-15 20:25:53.289585] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1e60320 name raid_bdev1, state offline 00:12:01.092 0 00:12:01.092 20:25:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1353962 00:12:01.092 20:25:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 1353962 ']' 00:12:01.092 20:25:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 1353962 00:12:01.092 20:25:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:12:01.092 20:25:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:01.092 20:25:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1353962 00:12:01.092 20:25:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:01.092 20:25:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:01.092 20:25:53 
bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1353962' 00:12:01.092 killing process with pid 1353962 00:12:01.092 20:25:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 1353962 00:12:01.092 [2024-07-15 20:25:53.389737] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:01.092 20:25:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 1353962 00:12:01.092 [2024-07-15 20:25:53.401954] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:01.351 20:25:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.Zy3pm1Wxf5 00:12:01.351 20:25:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:12:01.351 20:25:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:12:01.351 20:25:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.36 00:12:01.351 20:25:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:12:01.351 20:25:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:01.351 20:25:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:12:01.351 20:25:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.36 != \0\.\0\0 ]] 00:12:01.351 00:12:01.351 real 0m6.818s 00:12:01.351 user 0m10.887s 00:12:01.351 sys 0m1.139s 00:12:01.351 20:25:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:01.351 20:25:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:01.351 ************************************ 00:12:01.351 END TEST raid_write_error_test 00:12:01.351 ************************************ 00:12:01.351 20:25:53 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:01.351 20:25:53 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in 
raid0 concat raid1 00:12:01.351 20:25:53 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test concat 2 false 00:12:01.351 20:25:53 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:12:01.351 20:25:53 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:01.351 20:25:53 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:01.351 ************************************ 00:12:01.351 START TEST raid_state_function_test 00:12:01.351 ************************************ 00:12:01.351 20:25:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 2 false 00:12:01.351 20:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:12:01.351 20:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:12:01.351 20:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:12:01.351 20:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:12:01.351 20:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:12:01.351 20:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:01.351 20:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:12:01.351 20:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:01.351 20:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:01.351 20:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:12:01.351 20:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:01.351 20:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:01.351 20:25:53 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:12:01.351 20:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:12:01.351 20:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:12:01.351 20:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:12:01.351 20:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:12:01.351 20:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:12:01.351 20:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:12:01.351 20:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:12:01.351 20:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:12:01.351 20:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:12:01.351 20:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:12:01.351 20:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=1354945 00:12:01.351 20:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1354945' 00:12:01.351 Process raid pid: 1354945 00:12:01.351 20:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:12:01.351 20:25:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 1354945 /var/tmp/spdk-raid.sock 00:12:01.351 20:25:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 1354945 ']' 00:12:01.351 20:25:53 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:01.351 20:25:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:01.351 20:25:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:01.351 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:01.351 20:25:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:01.351 20:25:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:01.610 [2024-07-15 20:25:53.785011] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 00:12:01.610 [2024-07-15 20:25:53.785076] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:12:01.610 [2024-07-15 20:25:53.903868] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:01.868 [2024-07-15 20:25:54.010244] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:01.868 [2024-07-15 20:25:54.077754] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:01.868 [2024-07-15 20:25:54.077787] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:01.868 20:25:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:01.868 20:25:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:12:01.868 20:25:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n 
Existed_Raid 00:12:02.434 [2024-07-15 20:25:54.728869] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:02.434 [2024-07-15 20:25:54.728912] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:02.434 [2024-07-15 20:25:54.728924] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:02.434 [2024-07-15 20:25:54.728944] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:02.434 20:25:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:12:02.434 20:25:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:02.434 20:25:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:02.434 20:25:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:02.434 20:25:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:02.434 20:25:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:02.434 20:25:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:02.434 20:25:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:02.434 20:25:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:02.434 20:25:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:02.434 20:25:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:02.434 20:25:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | 
select(.name == "Existed_Raid")' 00:12:03.033 20:25:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:03.033 "name": "Existed_Raid", 00:12:03.033 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:03.033 "strip_size_kb": 64, 00:12:03.033 "state": "configuring", 00:12:03.033 "raid_level": "concat", 00:12:03.033 "superblock": false, 00:12:03.033 "num_base_bdevs": 2, 00:12:03.033 "num_base_bdevs_discovered": 0, 00:12:03.033 "num_base_bdevs_operational": 2, 00:12:03.033 "base_bdevs_list": [ 00:12:03.033 { 00:12:03.033 "name": "BaseBdev1", 00:12:03.033 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:03.033 "is_configured": false, 00:12:03.033 "data_offset": 0, 00:12:03.033 "data_size": 0 00:12:03.033 }, 00:12:03.033 { 00:12:03.033 "name": "BaseBdev2", 00:12:03.033 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:03.033 "is_configured": false, 00:12:03.033 "data_offset": 0, 00:12:03.033 "data_size": 0 00:12:03.033 } 00:12:03.033 ] 00:12:03.033 }' 00:12:03.033 20:25:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:03.033 20:25:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:03.967 20:25:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:04.225 [2024-07-15 20:25:56.361011] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:04.225 [2024-07-15 20:25:56.361042] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x915a80 name Existed_Raid, state configuring 00:12:04.225 20:25:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:04.483 [2024-07-15 20:25:56.609690] 
bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:04.483 [2024-07-15 20:25:56.609722] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:04.483 [2024-07-15 20:25:56.609733] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:04.483 [2024-07-15 20:25:56.609744] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:04.483 20:25:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:12:04.741 [2024-07-15 20:25:56.864219] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:04.741 BaseBdev1 00:12:04.741 20:25:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:12:04.741 20:25:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:12:04.741 20:25:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:04.741 20:25:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:12:04.741 20:25:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:04.741 20:25:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:04.741 20:25:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:04.999 20:25:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:12:04.999 [ 00:12:04.999 { 00:12:04.999 "name": 
"BaseBdev1", 00:12:04.999 "aliases": [ 00:12:04.999 "8821f0dd-d7d4-4590-8d4c-cd8b56cc18a3" 00:12:04.999 ], 00:12:04.999 "product_name": "Malloc disk", 00:12:04.999 "block_size": 512, 00:12:04.999 "num_blocks": 65536, 00:12:04.999 "uuid": "8821f0dd-d7d4-4590-8d4c-cd8b56cc18a3", 00:12:04.999 "assigned_rate_limits": { 00:12:04.999 "rw_ios_per_sec": 0, 00:12:04.999 "rw_mbytes_per_sec": 0, 00:12:04.999 "r_mbytes_per_sec": 0, 00:12:04.999 "w_mbytes_per_sec": 0 00:12:04.999 }, 00:12:04.999 "claimed": true, 00:12:04.999 "claim_type": "exclusive_write", 00:12:04.999 "zoned": false, 00:12:04.999 "supported_io_types": { 00:12:04.999 "read": true, 00:12:04.999 "write": true, 00:12:04.999 "unmap": true, 00:12:04.999 "flush": true, 00:12:04.999 "reset": true, 00:12:04.999 "nvme_admin": false, 00:12:04.999 "nvme_io": false, 00:12:04.999 "nvme_io_md": false, 00:12:04.999 "write_zeroes": true, 00:12:04.999 "zcopy": true, 00:12:04.999 "get_zone_info": false, 00:12:04.999 "zone_management": false, 00:12:04.999 "zone_append": false, 00:12:04.999 "compare": false, 00:12:04.999 "compare_and_write": false, 00:12:04.999 "abort": true, 00:12:04.999 "seek_hole": false, 00:12:04.999 "seek_data": false, 00:12:04.999 "copy": true, 00:12:04.999 "nvme_iov_md": false 00:12:04.999 }, 00:12:04.999 "memory_domains": [ 00:12:04.999 { 00:12:04.999 "dma_device_id": "system", 00:12:04.999 "dma_device_type": 1 00:12:04.999 }, 00:12:04.999 { 00:12:04.999 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:04.999 "dma_device_type": 2 00:12:04.999 } 00:12:04.999 ], 00:12:04.999 "driver_specific": {} 00:12:04.999 } 00:12:04.999 ] 00:12:05.258 20:25:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:12:05.258 20:25:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:12:05.258 20:25:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:05.258 
20:25:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:05.258 20:25:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:05.258 20:25:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:05.258 20:25:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:05.258 20:25:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:05.258 20:25:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:05.258 20:25:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:05.258 20:25:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:05.258 20:25:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:05.258 20:25:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:05.517 20:25:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:05.517 "name": "Existed_Raid", 00:12:05.517 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:05.517 "strip_size_kb": 64, 00:12:05.517 "state": "configuring", 00:12:05.517 "raid_level": "concat", 00:12:05.517 "superblock": false, 00:12:05.517 "num_base_bdevs": 2, 00:12:05.517 "num_base_bdevs_discovered": 1, 00:12:05.517 "num_base_bdevs_operational": 2, 00:12:05.517 "base_bdevs_list": [ 00:12:05.517 { 00:12:05.517 "name": "BaseBdev1", 00:12:05.517 "uuid": "8821f0dd-d7d4-4590-8d4c-cd8b56cc18a3", 00:12:05.517 "is_configured": true, 00:12:05.517 "data_offset": 0, 00:12:05.517 "data_size": 65536 00:12:05.517 }, 00:12:05.517 { 00:12:05.517 "name": "BaseBdev2", 
00:12:05.517 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:05.517 "is_configured": false, 00:12:05.517 "data_offset": 0, 00:12:05.517 "data_size": 0 00:12:05.517 } 00:12:05.517 ] 00:12:05.517 }' 00:12:05.517 20:25:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:05.517 20:25:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:06.450 20:25:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:06.450 [2024-07-15 20:25:58.725139] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:06.450 [2024-07-15 20:25:58.725178] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x915350 name Existed_Raid, state configuring 00:12:06.450 20:25:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:06.708 [2024-07-15 20:25:58.957788] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:06.708 [2024-07-15 20:25:58.959325] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:06.708 [2024-07-15 20:25:58.959358] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:06.708 20:25:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:12:06.708 20:25:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:06.708 20:25:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:12:06.708 20:25:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 
00:12:06.708 20:25:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:06.709 20:25:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:06.709 20:25:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:06.709 20:25:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:06.709 20:25:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:06.709 20:25:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:06.709 20:25:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:06.709 20:25:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:06.709 20:25:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:06.709 20:25:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:06.967 20:25:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:06.967 "name": "Existed_Raid", 00:12:06.967 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:06.967 "strip_size_kb": 64, 00:12:06.967 "state": "configuring", 00:12:06.967 "raid_level": "concat", 00:12:06.967 "superblock": false, 00:12:06.967 "num_base_bdevs": 2, 00:12:06.967 "num_base_bdevs_discovered": 1, 00:12:06.967 "num_base_bdevs_operational": 2, 00:12:06.967 "base_bdevs_list": [ 00:12:06.967 { 00:12:06.967 "name": "BaseBdev1", 00:12:06.967 "uuid": "8821f0dd-d7d4-4590-8d4c-cd8b56cc18a3", 00:12:06.967 "is_configured": true, 00:12:06.967 "data_offset": 0, 00:12:06.967 "data_size": 65536 00:12:06.967 }, 00:12:06.967 { 00:12:06.967 "name": 
"BaseBdev2", 00:12:06.967 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:06.967 "is_configured": false, 00:12:06.967 "data_offset": 0, 00:12:06.967 "data_size": 0 00:12:06.967 } 00:12:06.967 ] 00:12:06.967 }' 00:12:06.967 20:25:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:06.967 20:25:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:07.532 20:25:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:12:07.789 [2024-07-15 20:26:00.048491] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:07.789 [2024-07-15 20:26:00.048528] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x916000 00:12:07.789 [2024-07-15 20:26:00.048536] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:12:07.789 [2024-07-15 20:26:00.048730] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x8300c0 00:12:07.789 [2024-07-15 20:26:00.048849] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x916000 00:12:07.789 [2024-07-15 20:26:00.048859] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x916000 00:12:07.789 [2024-07-15 20:26:00.049031] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:07.789 BaseBdev2 00:12:07.789 20:26:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:12:07.790 20:26:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:12:07.790 20:26:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:07.790 20:26:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:12:07.790 
20:26:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:07.790 20:26:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:07.790 20:26:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:08.355 20:26:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:12:08.921 [ 00:12:08.921 { 00:12:08.921 "name": "BaseBdev2", 00:12:08.921 "aliases": [ 00:12:08.921 "32ffa022-4890-4d4b-979c-1502e4c34955" 00:12:08.922 ], 00:12:08.922 "product_name": "Malloc disk", 00:12:08.922 "block_size": 512, 00:12:08.922 "num_blocks": 65536, 00:12:08.922 "uuid": "32ffa022-4890-4d4b-979c-1502e4c34955", 00:12:08.922 "assigned_rate_limits": { 00:12:08.922 "rw_ios_per_sec": 0, 00:12:08.922 "rw_mbytes_per_sec": 0, 00:12:08.922 "r_mbytes_per_sec": 0, 00:12:08.922 "w_mbytes_per_sec": 0 00:12:08.922 }, 00:12:08.922 "claimed": true, 00:12:08.922 "claim_type": "exclusive_write", 00:12:08.922 "zoned": false, 00:12:08.922 "supported_io_types": { 00:12:08.922 "read": true, 00:12:08.922 "write": true, 00:12:08.922 "unmap": true, 00:12:08.922 "flush": true, 00:12:08.922 "reset": true, 00:12:08.922 "nvme_admin": false, 00:12:08.922 "nvme_io": false, 00:12:08.922 "nvme_io_md": false, 00:12:08.922 "write_zeroes": true, 00:12:08.922 "zcopy": true, 00:12:08.922 "get_zone_info": false, 00:12:08.922 "zone_management": false, 00:12:08.922 "zone_append": false, 00:12:08.922 "compare": false, 00:12:08.922 "compare_and_write": false, 00:12:08.922 "abort": true, 00:12:08.922 "seek_hole": false, 00:12:08.922 "seek_data": false, 00:12:08.922 "copy": true, 00:12:08.922 "nvme_iov_md": false 00:12:08.922 }, 00:12:08.922 "memory_domains": [ 
00:12:08.922 { 00:12:08.922 "dma_device_id": "system", 00:12:08.922 "dma_device_type": 1 00:12:08.922 }, 00:12:08.922 { 00:12:08.922 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:08.922 "dma_device_type": 2 00:12:08.922 } 00:12:08.922 ], 00:12:08.922 "driver_specific": {} 00:12:08.922 } 00:12:08.922 ] 00:12:08.922 20:26:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:12:08.922 20:26:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:12:08.922 20:26:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:08.922 20:26:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 2 00:12:08.922 20:26:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:08.922 20:26:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:08.922 20:26:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:08.922 20:26:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:08.922 20:26:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:08.922 20:26:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:08.922 20:26:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:08.922 20:26:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:08.922 20:26:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:08.922 20:26:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:08.922 20:26:01 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:09.180 20:26:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:09.180 "name": "Existed_Raid", 00:12:09.180 "uuid": "a53775fa-89c7-4680-b104-ef91446e123f", 00:12:09.180 "strip_size_kb": 64, 00:12:09.180 "state": "online", 00:12:09.180 "raid_level": "concat", 00:12:09.180 "superblock": false, 00:12:09.180 "num_base_bdevs": 2, 00:12:09.180 "num_base_bdevs_discovered": 2, 00:12:09.180 "num_base_bdevs_operational": 2, 00:12:09.180 "base_bdevs_list": [ 00:12:09.180 { 00:12:09.180 "name": "BaseBdev1", 00:12:09.180 "uuid": "8821f0dd-d7d4-4590-8d4c-cd8b56cc18a3", 00:12:09.180 "is_configured": true, 00:12:09.180 "data_offset": 0, 00:12:09.180 "data_size": 65536 00:12:09.180 }, 00:12:09.180 { 00:12:09.180 "name": "BaseBdev2", 00:12:09.180 "uuid": "32ffa022-4890-4d4b-979c-1502e4c34955", 00:12:09.180 "is_configured": true, 00:12:09.180 "data_offset": 0, 00:12:09.180 "data_size": 65536 00:12:09.180 } 00:12:09.180 ] 00:12:09.180 }' 00:12:09.180 20:26:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:09.180 20:26:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:09.745 20:26:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:12:09.745 20:26:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:12:09.745 20:26:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:09.745 20:26:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:09.745 20:26:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:09.745 20:26:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:09.745 20:26:01 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:12:09.745 20:26:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:10.002 [2024-07-15 20:26:02.126283] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:10.002 20:26:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:10.002 "name": "Existed_Raid", 00:12:10.002 "aliases": [ 00:12:10.002 "a53775fa-89c7-4680-b104-ef91446e123f" 00:12:10.002 ], 00:12:10.002 "product_name": "Raid Volume", 00:12:10.002 "block_size": 512, 00:12:10.002 "num_blocks": 131072, 00:12:10.002 "uuid": "a53775fa-89c7-4680-b104-ef91446e123f", 00:12:10.002 "assigned_rate_limits": { 00:12:10.002 "rw_ios_per_sec": 0, 00:12:10.002 "rw_mbytes_per_sec": 0, 00:12:10.002 "r_mbytes_per_sec": 0, 00:12:10.002 "w_mbytes_per_sec": 0 00:12:10.002 }, 00:12:10.002 "claimed": false, 00:12:10.002 "zoned": false, 00:12:10.002 "supported_io_types": { 00:12:10.002 "read": true, 00:12:10.002 "write": true, 00:12:10.002 "unmap": true, 00:12:10.002 "flush": true, 00:12:10.002 "reset": true, 00:12:10.002 "nvme_admin": false, 00:12:10.002 "nvme_io": false, 00:12:10.002 "nvme_io_md": false, 00:12:10.002 "write_zeroes": true, 00:12:10.002 "zcopy": false, 00:12:10.002 "get_zone_info": false, 00:12:10.002 "zone_management": false, 00:12:10.002 "zone_append": false, 00:12:10.002 "compare": false, 00:12:10.002 "compare_and_write": false, 00:12:10.002 "abort": false, 00:12:10.002 "seek_hole": false, 00:12:10.002 "seek_data": false, 00:12:10.002 "copy": false, 00:12:10.002 "nvme_iov_md": false 00:12:10.002 }, 00:12:10.002 "memory_domains": [ 00:12:10.002 { 00:12:10.002 "dma_device_id": "system", 00:12:10.002 "dma_device_type": 1 00:12:10.003 }, 00:12:10.003 { 00:12:10.003 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:10.003 
"dma_device_type": 2 00:12:10.003 }, 00:12:10.003 { 00:12:10.003 "dma_device_id": "system", 00:12:10.003 "dma_device_type": 1 00:12:10.003 }, 00:12:10.003 { 00:12:10.003 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:10.003 "dma_device_type": 2 00:12:10.003 } 00:12:10.003 ], 00:12:10.003 "driver_specific": { 00:12:10.003 "raid": { 00:12:10.003 "uuid": "a53775fa-89c7-4680-b104-ef91446e123f", 00:12:10.003 "strip_size_kb": 64, 00:12:10.003 "state": "online", 00:12:10.003 "raid_level": "concat", 00:12:10.003 "superblock": false, 00:12:10.003 "num_base_bdevs": 2, 00:12:10.003 "num_base_bdevs_discovered": 2, 00:12:10.003 "num_base_bdevs_operational": 2, 00:12:10.003 "base_bdevs_list": [ 00:12:10.003 { 00:12:10.003 "name": "BaseBdev1", 00:12:10.003 "uuid": "8821f0dd-d7d4-4590-8d4c-cd8b56cc18a3", 00:12:10.003 "is_configured": true, 00:12:10.003 "data_offset": 0, 00:12:10.003 "data_size": 65536 00:12:10.003 }, 00:12:10.003 { 00:12:10.003 "name": "BaseBdev2", 00:12:10.003 "uuid": "32ffa022-4890-4d4b-979c-1502e4c34955", 00:12:10.003 "is_configured": true, 00:12:10.003 "data_offset": 0, 00:12:10.003 "data_size": 65536 00:12:10.003 } 00:12:10.003 ] 00:12:10.003 } 00:12:10.003 } 00:12:10.003 }' 00:12:10.003 20:26:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:10.003 20:26:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:12:10.003 BaseBdev2' 00:12:10.003 20:26:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:10.003 20:26:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:12:10.003 20:26:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:10.260 20:26:02 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:10.260 "name": "BaseBdev1", 00:12:10.260 "aliases": [ 00:12:10.260 "8821f0dd-d7d4-4590-8d4c-cd8b56cc18a3" 00:12:10.260 ], 00:12:10.260 "product_name": "Malloc disk", 00:12:10.260 "block_size": 512, 00:12:10.260 "num_blocks": 65536, 00:12:10.260 "uuid": "8821f0dd-d7d4-4590-8d4c-cd8b56cc18a3", 00:12:10.260 "assigned_rate_limits": { 00:12:10.260 "rw_ios_per_sec": 0, 00:12:10.260 "rw_mbytes_per_sec": 0, 00:12:10.261 "r_mbytes_per_sec": 0, 00:12:10.261 "w_mbytes_per_sec": 0 00:12:10.261 }, 00:12:10.261 "claimed": true, 00:12:10.261 "claim_type": "exclusive_write", 00:12:10.261 "zoned": false, 00:12:10.261 "supported_io_types": { 00:12:10.261 "read": true, 00:12:10.261 "write": true, 00:12:10.261 "unmap": true, 00:12:10.261 "flush": true, 00:12:10.261 "reset": true, 00:12:10.261 "nvme_admin": false, 00:12:10.261 "nvme_io": false, 00:12:10.261 "nvme_io_md": false, 00:12:10.261 "write_zeroes": true, 00:12:10.261 "zcopy": true, 00:12:10.261 "get_zone_info": false, 00:12:10.261 "zone_management": false, 00:12:10.261 "zone_append": false, 00:12:10.261 "compare": false, 00:12:10.261 "compare_and_write": false, 00:12:10.261 "abort": true, 00:12:10.261 "seek_hole": false, 00:12:10.261 "seek_data": false, 00:12:10.261 "copy": true, 00:12:10.261 "nvme_iov_md": false 00:12:10.261 }, 00:12:10.261 "memory_domains": [ 00:12:10.261 { 00:12:10.261 "dma_device_id": "system", 00:12:10.261 "dma_device_type": 1 00:12:10.261 }, 00:12:10.261 { 00:12:10.261 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:10.261 "dma_device_type": 2 00:12:10.261 } 00:12:10.261 ], 00:12:10.261 "driver_specific": {} 00:12:10.261 }' 00:12:10.261 20:26:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:10.261 20:26:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:10.261 20:26:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:10.261 20:26:02 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:10.261 20:26:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:10.261 20:26:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:10.261 20:26:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:10.518 20:26:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:10.518 20:26:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:10.518 20:26:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:10.518 20:26:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:10.518 20:26:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:10.518 20:26:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:10.518 20:26:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:12:10.518 20:26:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:10.776 20:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:10.776 "name": "BaseBdev2", 00:12:10.776 "aliases": [ 00:12:10.776 "32ffa022-4890-4d4b-979c-1502e4c34955" 00:12:10.776 ], 00:12:10.776 "product_name": "Malloc disk", 00:12:10.776 "block_size": 512, 00:12:10.776 "num_blocks": 65536, 00:12:10.776 "uuid": "32ffa022-4890-4d4b-979c-1502e4c34955", 00:12:10.776 "assigned_rate_limits": { 00:12:10.776 "rw_ios_per_sec": 0, 00:12:10.776 "rw_mbytes_per_sec": 0, 00:12:10.776 "r_mbytes_per_sec": 0, 00:12:10.776 "w_mbytes_per_sec": 0 00:12:10.776 }, 00:12:10.776 "claimed": true, 00:12:10.776 "claim_type": "exclusive_write", 
00:12:10.776 "zoned": false, 00:12:10.776 "supported_io_types": { 00:12:10.776 "read": true, 00:12:10.776 "write": true, 00:12:10.777 "unmap": true, 00:12:10.777 "flush": true, 00:12:10.777 "reset": true, 00:12:10.777 "nvme_admin": false, 00:12:10.777 "nvme_io": false, 00:12:10.777 "nvme_io_md": false, 00:12:10.777 "write_zeroes": true, 00:12:10.777 "zcopy": true, 00:12:10.777 "get_zone_info": false, 00:12:10.777 "zone_management": false, 00:12:10.777 "zone_append": false, 00:12:10.777 "compare": false, 00:12:10.777 "compare_and_write": false, 00:12:10.777 "abort": true, 00:12:10.777 "seek_hole": false, 00:12:10.777 "seek_data": false, 00:12:10.777 "copy": true, 00:12:10.777 "nvme_iov_md": false 00:12:10.777 }, 00:12:10.777 "memory_domains": [ 00:12:10.777 { 00:12:10.777 "dma_device_id": "system", 00:12:10.777 "dma_device_type": 1 00:12:10.777 }, 00:12:10.777 { 00:12:10.777 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:10.777 "dma_device_type": 2 00:12:10.777 } 00:12:10.777 ], 00:12:10.777 "driver_specific": {} 00:12:10.777 }' 00:12:10.777 20:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:10.777 20:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:10.777 20:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:10.777 20:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:11.035 20:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:11.035 20:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:11.035 20:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:11.035 20:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:11.035 20:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:11.035 20:26:03 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:11.035 20:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:11.035 20:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:11.035 20:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:12:11.294 [2024-07-15 20:26:03.622022] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:12:11.294 [2024-07-15 20:26:03.622050] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:11.294 [2024-07-15 20:26:03.622091] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:11.294 20:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:12:11.294 20:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:12:11.294 20:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:11.294 20:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:12:11.294 20:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:12:11.294 20:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 1 00:12:11.294 20:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:11.294 20:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:12:11.294 20:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:11.294 20:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:11.294 20:26:03 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:12:11.294 20:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:11.294 20:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:11.294 20:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:11.294 20:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:11.294 20:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:11.294 20:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:11.552 20:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:11.552 "name": "Existed_Raid", 00:12:11.552 "uuid": "a53775fa-89c7-4680-b104-ef91446e123f", 00:12:11.552 "strip_size_kb": 64, 00:12:11.552 "state": "offline", 00:12:11.552 "raid_level": "concat", 00:12:11.552 "superblock": false, 00:12:11.552 "num_base_bdevs": 2, 00:12:11.552 "num_base_bdevs_discovered": 1, 00:12:11.552 "num_base_bdevs_operational": 1, 00:12:11.552 "base_bdevs_list": [ 00:12:11.552 { 00:12:11.552 "name": null, 00:12:11.552 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:11.552 "is_configured": false, 00:12:11.552 "data_offset": 0, 00:12:11.552 "data_size": 65536 00:12:11.552 }, 00:12:11.552 { 00:12:11.552 "name": "BaseBdev2", 00:12:11.552 "uuid": "32ffa022-4890-4d4b-979c-1502e4c34955", 00:12:11.552 "is_configured": true, 00:12:11.552 "data_offset": 0, 00:12:11.552 "data_size": 65536 00:12:11.552 } 00:12:11.552 ] 00:12:11.552 }' 00:12:11.552 20:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:11.552 20:26:03 bdev_raid.raid_state_function_test 
-- common/autotest_common.sh@10 -- # set +x 00:12:12.117 20:26:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:12:12.117 20:26:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:12.117 20:26:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:12.117 20:26:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:12:12.374 20:26:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:12:12.374 20:26:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:12:12.374 20:26:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:12:12.941 [2024-07-15 20:26:05.216185] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:12:12.941 [2024-07-15 20:26:05.216235] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x916000 name Existed_Raid, state offline 00:12:12.941 20:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:12:12.941 20:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:12.941 20:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:12.941 20:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:12:13.198 20:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:12:13.198 20:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 
00:12:13.198 20:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:12:13.198 20:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 1354945 00:12:13.198 20:26:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 1354945 ']' 00:12:13.198 20:26:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 1354945 00:12:13.198 20:26:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:12:13.198 20:26:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:13.198 20:26:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1354945 00:12:13.198 20:26:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:13.198 20:26:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:13.198 20:26:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1354945' 00:12:13.198 killing process with pid 1354945 00:12:13.198 20:26:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 1354945 00:12:13.198 [2024-07-15 20:26:05.555534] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:13.198 20:26:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 1354945 00:12:13.198 [2024-07-15 20:26:05.556503] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:13.456 20:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:12:13.456 00:12:13.456 real 0m12.066s 00:12:13.456 user 0m21.945s 00:12:13.456 sys 0m2.246s 00:12:13.456 20:26:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:13.456 20:26:05 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@10 -- # set +x 00:12:13.456 ************************************ 00:12:13.456 END TEST raid_state_function_test 00:12:13.456 ************************************ 00:12:13.456 20:26:05 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:13.456 20:26:05 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test concat 2 true 00:12:13.456 20:26:05 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:12:13.456 20:26:05 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:13.456 20:26:05 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:13.715 ************************************ 00:12:13.715 START TEST raid_state_function_test_sb 00:12:13.715 ************************************ 00:12:13.715 20:26:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 2 true 00:12:13.715 20:26:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:12:13.715 20:26:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:12:13.715 20:26:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:12:13.715 20:26:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:12:13.715 20:26:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:12:13.715 20:26:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:13.715 20:26:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:12:13.715 20:26:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:13.715 20:26:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:13.715 20:26:05 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:12:13.715 20:26:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:13.715 20:26:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:13.715 20:26:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:12:13.715 20:26:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:12:13.715 20:26:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:12:13.715 20:26:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:12:13.715 20:26:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:12:13.715 20:26:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:12:13.715 20:26:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:12:13.715 20:26:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:12:13.715 20:26:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:12:13.715 20:26:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:12:13.715 20:26:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:12:13.715 20:26:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=1356747 00:12:13.715 20:26:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:12:13.715 20:26:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1356747' 00:12:13.715 Process raid pid: 
1356747 00:12:13.715 20:26:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 1356747 /var/tmp/spdk-raid.sock 00:12:13.715 20:26:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 1356747 ']' 00:12:13.715 20:26:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:13.715 20:26:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:13.715 20:26:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:13.715 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:13.715 20:26:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:13.715 20:26:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:13.715 [2024-07-15 20:26:05.926897] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
00:12:13.715 [2024-07-15 20:26:05.926967] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:12:13.715 [2024-07-15 20:26:06.053596] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:13.974 [2024-07-15 20:26:06.157318] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:13.974 [2024-07-15 20:26:06.221667] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:13.974 [2024-07-15 20:26:06.221704] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:14.540 20:26:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:14.540 20:26:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:12:14.540 20:26:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:14.799 [2024-07-15 20:26:07.087618] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:14.799 [2024-07-15 20:26:07.087662] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:14.799 [2024-07-15 20:26:07.087673] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:14.799 [2024-07-15 20:26:07.087686] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:14.799 20:26:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:12:14.799 20:26:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 
00:12:14.799 20:26:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:14.799 20:26:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:14.799 20:26:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:14.799 20:26:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:14.799 20:26:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:14.799 20:26:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:14.799 20:26:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:14.799 20:26:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:14.799 20:26:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:14.799 20:26:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:15.058 20:26:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:15.058 "name": "Existed_Raid", 00:12:15.058 "uuid": "abc5bd81-19ae-412f-bedd-295e439b491d", 00:12:15.058 "strip_size_kb": 64, 00:12:15.058 "state": "configuring", 00:12:15.058 "raid_level": "concat", 00:12:15.058 "superblock": true, 00:12:15.058 "num_base_bdevs": 2, 00:12:15.058 "num_base_bdevs_discovered": 0, 00:12:15.058 "num_base_bdevs_operational": 2, 00:12:15.058 "base_bdevs_list": [ 00:12:15.058 { 00:12:15.058 "name": "BaseBdev1", 00:12:15.058 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:15.058 "is_configured": false, 00:12:15.058 "data_offset": 0, 00:12:15.058 "data_size": 0 00:12:15.058 }, 00:12:15.058 { 
00:12:15.058 "name": "BaseBdev2", 00:12:15.058 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:15.058 "is_configured": false, 00:12:15.058 "data_offset": 0, 00:12:15.058 "data_size": 0 00:12:15.058 } 00:12:15.058 ] 00:12:15.058 }' 00:12:15.058 20:26:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:15.058 20:26:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:15.624 20:26:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:15.882 [2024-07-15 20:26:08.162318] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:15.882 [2024-07-15 20:26:08.162346] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe2fa80 name Existed_Raid, state configuring 00:12:15.882 20:26:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:16.140 [2024-07-15 20:26:08.410997] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:16.140 [2024-07-15 20:26:08.411023] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:16.140 [2024-07-15 20:26:08.411033] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:16.140 [2024-07-15 20:26:08.411045] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:16.140 20:26:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:12:16.397 [2024-07-15 20:26:08.661511] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:16.397 BaseBdev1 00:12:16.397 20:26:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:12:16.397 20:26:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:12:16.397 20:26:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:16.397 20:26:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:12:16.397 20:26:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:16.397 20:26:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:16.397 20:26:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:16.655 20:26:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:12:16.913 [ 00:12:16.913 { 00:12:16.913 "name": "BaseBdev1", 00:12:16.913 "aliases": [ 00:12:16.913 "b88f268d-1628-41b3-b886-d4be32a70bc3" 00:12:16.913 ], 00:12:16.913 "product_name": "Malloc disk", 00:12:16.913 "block_size": 512, 00:12:16.913 "num_blocks": 65536, 00:12:16.913 "uuid": "b88f268d-1628-41b3-b886-d4be32a70bc3", 00:12:16.913 "assigned_rate_limits": { 00:12:16.913 "rw_ios_per_sec": 0, 00:12:16.913 "rw_mbytes_per_sec": 0, 00:12:16.913 "r_mbytes_per_sec": 0, 00:12:16.913 "w_mbytes_per_sec": 0 00:12:16.913 }, 00:12:16.913 "claimed": true, 00:12:16.913 "claim_type": "exclusive_write", 00:12:16.913 "zoned": false, 00:12:16.913 "supported_io_types": { 00:12:16.913 "read": true, 00:12:16.913 "write": true, 00:12:16.913 "unmap": true, 00:12:16.913 "flush": 
true, 00:12:16.913 "reset": true, 00:12:16.913 "nvme_admin": false, 00:12:16.913 "nvme_io": false, 00:12:16.913 "nvme_io_md": false, 00:12:16.913 "write_zeroes": true, 00:12:16.913 "zcopy": true, 00:12:16.913 "get_zone_info": false, 00:12:16.913 "zone_management": false, 00:12:16.913 "zone_append": false, 00:12:16.913 "compare": false, 00:12:16.913 "compare_and_write": false, 00:12:16.913 "abort": true, 00:12:16.913 "seek_hole": false, 00:12:16.913 "seek_data": false, 00:12:16.913 "copy": true, 00:12:16.913 "nvme_iov_md": false 00:12:16.913 }, 00:12:16.913 "memory_domains": [ 00:12:16.913 { 00:12:16.913 "dma_device_id": "system", 00:12:16.913 "dma_device_type": 1 00:12:16.913 }, 00:12:16.913 { 00:12:16.913 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:16.913 "dma_device_type": 2 00:12:16.913 } 00:12:16.913 ], 00:12:16.913 "driver_specific": {} 00:12:16.913 } 00:12:16.913 ] 00:12:16.913 20:26:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:12:16.913 20:26:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:12:16.913 20:26:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:16.913 20:26:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:16.913 20:26:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:16.913 20:26:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:16.913 20:26:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:16.913 20:26:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:16.913 20:26:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:16.913 20:26:09 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:16.913 20:26:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:16.914 20:26:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:16.914 20:26:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:17.172 20:26:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:17.172 "name": "Existed_Raid", 00:12:17.172 "uuid": "38f10827-cc2f-4a6c-acc6-53e55fa8f2fb", 00:12:17.172 "strip_size_kb": 64, 00:12:17.172 "state": "configuring", 00:12:17.172 "raid_level": "concat", 00:12:17.172 "superblock": true, 00:12:17.172 "num_base_bdevs": 2, 00:12:17.172 "num_base_bdevs_discovered": 1, 00:12:17.172 "num_base_bdevs_operational": 2, 00:12:17.172 "base_bdevs_list": [ 00:12:17.172 { 00:12:17.172 "name": "BaseBdev1", 00:12:17.172 "uuid": "b88f268d-1628-41b3-b886-d4be32a70bc3", 00:12:17.172 "is_configured": true, 00:12:17.172 "data_offset": 2048, 00:12:17.172 "data_size": 63488 00:12:17.172 }, 00:12:17.172 { 00:12:17.172 "name": "BaseBdev2", 00:12:17.172 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:17.172 "is_configured": false, 00:12:17.172 "data_offset": 0, 00:12:17.172 "data_size": 0 00:12:17.172 } 00:12:17.172 ] 00:12:17.172 }' 00:12:17.172 20:26:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:17.172 20:26:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:17.772 20:26:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:18.030 [2024-07-15 20:26:10.257763] 
bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:18.030 [2024-07-15 20:26:10.257804] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe2f350 name Existed_Raid, state configuring 00:12:18.030 20:26:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:18.289 [2024-07-15 20:26:10.506454] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:18.289 [2024-07-15 20:26:10.507947] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:18.289 [2024-07-15 20:26:10.507984] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:18.289 20:26:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:12:18.289 20:26:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:18.289 20:26:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:12:18.289 20:26:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:18.289 20:26:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:18.289 20:26:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:18.289 20:26:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:18.289 20:26:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:18.289 20:26:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:18.289 20:26:10 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:18.289 20:26:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:18.289 20:26:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:18.289 20:26:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:18.289 20:26:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:18.548 20:26:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:18.548 "name": "Existed_Raid", 00:12:18.548 "uuid": "763228bd-3688-4957-a04c-7af9149c3b2a", 00:12:18.548 "strip_size_kb": 64, 00:12:18.548 "state": "configuring", 00:12:18.548 "raid_level": "concat", 00:12:18.548 "superblock": true, 00:12:18.548 "num_base_bdevs": 2, 00:12:18.548 "num_base_bdevs_discovered": 1, 00:12:18.548 "num_base_bdevs_operational": 2, 00:12:18.548 "base_bdevs_list": [ 00:12:18.548 { 00:12:18.548 "name": "BaseBdev1", 00:12:18.548 "uuid": "b88f268d-1628-41b3-b886-d4be32a70bc3", 00:12:18.548 "is_configured": true, 00:12:18.548 "data_offset": 2048, 00:12:18.548 "data_size": 63488 00:12:18.548 }, 00:12:18.548 { 00:12:18.548 "name": "BaseBdev2", 00:12:18.548 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:18.548 "is_configured": false, 00:12:18.548 "data_offset": 0, 00:12:18.548 "data_size": 0 00:12:18.548 } 00:12:18.548 ] 00:12:18.548 }' 00:12:18.548 20:26:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:18.548 20:26:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:19.115 20:26:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:12:19.115 [2024-07-15 20:26:11.480287] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:19.115 [2024-07-15 20:26:11.480431] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xe30000 00:12:19.115 [2024-07-15 20:26:11.480444] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:12:19.115 [2024-07-15 20:26:11.480612] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd4a0c0 00:12:19.115 [2024-07-15 20:26:11.480726] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xe30000 00:12:19.115 [2024-07-15 20:26:11.480736] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xe30000 00:12:19.115 [2024-07-15 20:26:11.480824] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:19.115 BaseBdev2 00:12:19.373 20:26:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:12:19.373 20:26:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:12:19.373 20:26:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:19.373 20:26:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:12:19.373 20:26:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:19.373 20:26:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:19.373 20:26:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:19.373 20:26:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:12:19.631 [ 00:12:19.632 { 00:12:19.632 "name": "BaseBdev2", 00:12:19.632 "aliases": [ 00:12:19.632 "35e4cded-b5dc-4c3a-9913-6a3284e710b8" 00:12:19.632 ], 00:12:19.632 "product_name": "Malloc disk", 00:12:19.632 "block_size": 512, 00:12:19.632 "num_blocks": 65536, 00:12:19.632 "uuid": "35e4cded-b5dc-4c3a-9913-6a3284e710b8", 00:12:19.632 "assigned_rate_limits": { 00:12:19.632 "rw_ios_per_sec": 0, 00:12:19.632 "rw_mbytes_per_sec": 0, 00:12:19.632 "r_mbytes_per_sec": 0, 00:12:19.632 "w_mbytes_per_sec": 0 00:12:19.632 }, 00:12:19.632 "claimed": true, 00:12:19.632 "claim_type": "exclusive_write", 00:12:19.632 "zoned": false, 00:12:19.632 "supported_io_types": { 00:12:19.632 "read": true, 00:12:19.632 "write": true, 00:12:19.632 "unmap": true, 00:12:19.632 "flush": true, 00:12:19.632 "reset": true, 00:12:19.632 "nvme_admin": false, 00:12:19.632 "nvme_io": false, 00:12:19.632 "nvme_io_md": false, 00:12:19.632 "write_zeroes": true, 00:12:19.632 "zcopy": true, 00:12:19.632 "get_zone_info": false, 00:12:19.632 "zone_management": false, 00:12:19.632 "zone_append": false, 00:12:19.632 "compare": false, 00:12:19.632 "compare_and_write": false, 00:12:19.632 "abort": true, 00:12:19.632 "seek_hole": false, 00:12:19.632 "seek_data": false, 00:12:19.632 "copy": true, 00:12:19.632 "nvme_iov_md": false 00:12:19.632 }, 00:12:19.632 "memory_domains": [ 00:12:19.632 { 00:12:19.632 "dma_device_id": "system", 00:12:19.632 "dma_device_type": 1 00:12:19.632 }, 00:12:19.632 { 00:12:19.632 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:19.632 "dma_device_type": 2 00:12:19.632 } 00:12:19.632 ], 00:12:19.632 "driver_specific": {} 00:12:19.632 } 00:12:19.632 ] 00:12:19.632 20:26:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:12:19.632 20:26:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 
00:12:19.632 20:26:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:19.632 20:26:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 2 00:12:19.632 20:26:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:19.632 20:26:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:19.632 20:26:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:19.632 20:26:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:19.632 20:26:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:19.632 20:26:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:19.632 20:26:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:19.632 20:26:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:19.632 20:26:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:19.632 20:26:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:19.632 20:26:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:19.890 20:26:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:19.890 "name": "Existed_Raid", 00:12:19.890 "uuid": "763228bd-3688-4957-a04c-7af9149c3b2a", 00:12:19.890 "strip_size_kb": 64, 00:12:19.890 "state": "online", 00:12:19.890 "raid_level": "concat", 00:12:19.890 "superblock": true, 00:12:19.890 
"num_base_bdevs": 2, 00:12:19.890 "num_base_bdevs_discovered": 2, 00:12:19.890 "num_base_bdevs_operational": 2, 00:12:19.890 "base_bdevs_list": [ 00:12:19.890 { 00:12:19.890 "name": "BaseBdev1", 00:12:19.890 "uuid": "b88f268d-1628-41b3-b886-d4be32a70bc3", 00:12:19.890 "is_configured": true, 00:12:19.890 "data_offset": 2048, 00:12:19.890 "data_size": 63488 00:12:19.890 }, 00:12:19.890 { 00:12:19.890 "name": "BaseBdev2", 00:12:19.890 "uuid": "35e4cded-b5dc-4c3a-9913-6a3284e710b8", 00:12:19.890 "is_configured": true, 00:12:19.890 "data_offset": 2048, 00:12:19.890 "data_size": 63488 00:12:19.890 } 00:12:19.890 ] 00:12:19.890 }' 00:12:19.890 20:26:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:19.890 20:26:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:20.455 20:26:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:12:20.455 20:26:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:12:20.455 20:26:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:20.455 20:26:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:20.455 20:26:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:20.455 20:26:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:12:20.455 20:26:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:12:20.455 20:26:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:20.713 [2024-07-15 20:26:12.916362] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:20.713 20:26:12 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:20.713 "name": "Existed_Raid", 00:12:20.713 "aliases": [ 00:12:20.713 "763228bd-3688-4957-a04c-7af9149c3b2a" 00:12:20.713 ], 00:12:20.713 "product_name": "Raid Volume", 00:12:20.713 "block_size": 512, 00:12:20.713 "num_blocks": 126976, 00:12:20.713 "uuid": "763228bd-3688-4957-a04c-7af9149c3b2a", 00:12:20.713 "assigned_rate_limits": { 00:12:20.713 "rw_ios_per_sec": 0, 00:12:20.713 "rw_mbytes_per_sec": 0, 00:12:20.713 "r_mbytes_per_sec": 0, 00:12:20.713 "w_mbytes_per_sec": 0 00:12:20.713 }, 00:12:20.713 "claimed": false, 00:12:20.713 "zoned": false, 00:12:20.713 "supported_io_types": { 00:12:20.713 "read": true, 00:12:20.713 "write": true, 00:12:20.713 "unmap": true, 00:12:20.713 "flush": true, 00:12:20.713 "reset": true, 00:12:20.713 "nvme_admin": false, 00:12:20.713 "nvme_io": false, 00:12:20.713 "nvme_io_md": false, 00:12:20.713 "write_zeroes": true, 00:12:20.713 "zcopy": false, 00:12:20.713 "get_zone_info": false, 00:12:20.713 "zone_management": false, 00:12:20.713 "zone_append": false, 00:12:20.713 "compare": false, 00:12:20.713 "compare_and_write": false, 00:12:20.713 "abort": false, 00:12:20.713 "seek_hole": false, 00:12:20.713 "seek_data": false, 00:12:20.713 "copy": false, 00:12:20.713 "nvme_iov_md": false 00:12:20.713 }, 00:12:20.713 "memory_domains": [ 00:12:20.713 { 00:12:20.713 "dma_device_id": "system", 00:12:20.713 "dma_device_type": 1 00:12:20.713 }, 00:12:20.713 { 00:12:20.713 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:20.713 "dma_device_type": 2 00:12:20.713 }, 00:12:20.713 { 00:12:20.713 "dma_device_id": "system", 00:12:20.713 "dma_device_type": 1 00:12:20.713 }, 00:12:20.713 { 00:12:20.713 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:20.714 "dma_device_type": 2 00:12:20.714 } 00:12:20.714 ], 00:12:20.714 "driver_specific": { 00:12:20.714 "raid": { 00:12:20.714 "uuid": "763228bd-3688-4957-a04c-7af9149c3b2a", 00:12:20.714 "strip_size_kb": 64, 
00:12:20.714 "state": "online", 00:12:20.714 "raid_level": "concat", 00:12:20.714 "superblock": true, 00:12:20.714 "num_base_bdevs": 2, 00:12:20.714 "num_base_bdevs_discovered": 2, 00:12:20.714 "num_base_bdevs_operational": 2, 00:12:20.714 "base_bdevs_list": [ 00:12:20.714 { 00:12:20.714 "name": "BaseBdev1", 00:12:20.714 "uuid": "b88f268d-1628-41b3-b886-d4be32a70bc3", 00:12:20.714 "is_configured": true, 00:12:20.714 "data_offset": 2048, 00:12:20.714 "data_size": 63488 00:12:20.714 }, 00:12:20.714 { 00:12:20.714 "name": "BaseBdev2", 00:12:20.714 "uuid": "35e4cded-b5dc-4c3a-9913-6a3284e710b8", 00:12:20.714 "is_configured": true, 00:12:20.714 "data_offset": 2048, 00:12:20.714 "data_size": 63488 00:12:20.714 } 00:12:20.714 ] 00:12:20.714 } 00:12:20.714 } 00:12:20.714 }' 00:12:20.714 20:26:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:20.714 20:26:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:12:20.714 BaseBdev2' 00:12:20.714 20:26:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:20.714 20:26:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:12:20.714 20:26:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:20.972 20:26:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:20.972 "name": "BaseBdev1", 00:12:20.972 "aliases": [ 00:12:20.972 "b88f268d-1628-41b3-b886-d4be32a70bc3" 00:12:20.972 ], 00:12:20.972 "product_name": "Malloc disk", 00:12:20.972 "block_size": 512, 00:12:20.972 "num_blocks": 65536, 00:12:20.972 "uuid": "b88f268d-1628-41b3-b886-d4be32a70bc3", 00:12:20.972 "assigned_rate_limits": { 00:12:20.972 "rw_ios_per_sec": 0, 
00:12:20.972 "rw_mbytes_per_sec": 0, 00:12:20.972 "r_mbytes_per_sec": 0, 00:12:20.972 "w_mbytes_per_sec": 0 00:12:20.972 }, 00:12:20.972 "claimed": true, 00:12:20.972 "claim_type": "exclusive_write", 00:12:20.972 "zoned": false, 00:12:20.972 "supported_io_types": { 00:12:20.972 "read": true, 00:12:20.972 "write": true, 00:12:20.972 "unmap": true, 00:12:20.972 "flush": true, 00:12:20.972 "reset": true, 00:12:20.972 "nvme_admin": false, 00:12:20.972 "nvme_io": false, 00:12:20.972 "nvme_io_md": false, 00:12:20.972 "write_zeroes": true, 00:12:20.972 "zcopy": true, 00:12:20.972 "get_zone_info": false, 00:12:20.972 "zone_management": false, 00:12:20.972 "zone_append": false, 00:12:20.972 "compare": false, 00:12:20.972 "compare_and_write": false, 00:12:20.972 "abort": true, 00:12:20.972 "seek_hole": false, 00:12:20.972 "seek_data": false, 00:12:20.972 "copy": true, 00:12:20.972 "nvme_iov_md": false 00:12:20.972 }, 00:12:20.972 "memory_domains": [ 00:12:20.972 { 00:12:20.972 "dma_device_id": "system", 00:12:20.972 "dma_device_type": 1 00:12:20.972 }, 00:12:20.972 { 00:12:20.972 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:20.972 "dma_device_type": 2 00:12:20.972 } 00:12:20.972 ], 00:12:20.972 "driver_specific": {} 00:12:20.972 }' 00:12:20.972 20:26:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:20.972 20:26:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:21.229 20:26:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:21.229 20:26:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:21.229 20:26:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:21.229 20:26:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:21.229 20:26:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:21.229 
20:26:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:21.229 20:26:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:21.230 20:26:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:21.230 20:26:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:21.488 20:26:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:21.488 20:26:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:21.488 20:26:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:12:21.488 20:26:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:21.744 20:26:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:21.744 "name": "BaseBdev2", 00:12:21.744 "aliases": [ 00:12:21.744 "35e4cded-b5dc-4c3a-9913-6a3284e710b8" 00:12:21.744 ], 00:12:21.744 "product_name": "Malloc disk", 00:12:21.744 "block_size": 512, 00:12:21.744 "num_blocks": 65536, 00:12:21.744 "uuid": "35e4cded-b5dc-4c3a-9913-6a3284e710b8", 00:12:21.744 "assigned_rate_limits": { 00:12:21.744 "rw_ios_per_sec": 0, 00:12:21.744 "rw_mbytes_per_sec": 0, 00:12:21.744 "r_mbytes_per_sec": 0, 00:12:21.744 "w_mbytes_per_sec": 0 00:12:21.744 }, 00:12:21.744 "claimed": true, 00:12:21.744 "claim_type": "exclusive_write", 00:12:21.744 "zoned": false, 00:12:21.744 "supported_io_types": { 00:12:21.744 "read": true, 00:12:21.744 "write": true, 00:12:21.744 "unmap": true, 00:12:21.744 "flush": true, 00:12:21.744 "reset": true, 00:12:21.744 "nvme_admin": false, 00:12:21.745 "nvme_io": false, 00:12:21.745 "nvme_io_md": false, 00:12:21.745 "write_zeroes": true, 00:12:21.745 "zcopy": true, 
00:12:21.745 "get_zone_info": false, 00:12:21.745 "zone_management": false, 00:12:21.745 "zone_append": false, 00:12:21.745 "compare": false, 00:12:21.745 "compare_and_write": false, 00:12:21.745 "abort": true, 00:12:21.745 "seek_hole": false, 00:12:21.745 "seek_data": false, 00:12:21.745 "copy": true, 00:12:21.745 "nvme_iov_md": false 00:12:21.745 }, 00:12:21.745 "memory_domains": [ 00:12:21.745 { 00:12:21.745 "dma_device_id": "system", 00:12:21.745 "dma_device_type": 1 00:12:21.745 }, 00:12:21.745 { 00:12:21.745 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:21.745 "dma_device_type": 2 00:12:21.745 } 00:12:21.745 ], 00:12:21.745 "driver_specific": {} 00:12:21.745 }' 00:12:21.745 20:26:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:21.745 20:26:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:21.745 20:26:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:21.745 20:26:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:21.745 20:26:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:21.745 20:26:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:21.745 20:26:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:21.745 20:26:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:22.001 20:26:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:22.001 20:26:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:22.001 20:26:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:22.001 20:26:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:22.001 20:26:14 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:12:22.259 [2024-07-15 20:26:14.448216] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:12:22.259 [2024-07-15 20:26:14.448244] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:22.259 [2024-07-15 20:26:14.448286] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:22.259 20:26:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:12:22.259 20:26:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:12:22.259 20:26:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:22.259 20:26:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:12:22.259 20:26:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:12:22.259 20:26:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 1 00:12:22.259 20:26:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:22.259 20:26:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:12:22.259 20:26:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:22.259 20:26:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:22.259 20:26:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:12:22.259 20:26:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:22.259 20:26:14 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:22.259 20:26:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:22.259 20:26:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:22.259 20:26:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:22.259 20:26:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:22.516 20:26:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:22.516 "name": "Existed_Raid", 00:12:22.516 "uuid": "763228bd-3688-4957-a04c-7af9149c3b2a", 00:12:22.516 "strip_size_kb": 64, 00:12:22.516 "state": "offline", 00:12:22.516 "raid_level": "concat", 00:12:22.516 "superblock": true, 00:12:22.516 "num_base_bdevs": 2, 00:12:22.516 "num_base_bdevs_discovered": 1, 00:12:22.516 "num_base_bdevs_operational": 1, 00:12:22.516 "base_bdevs_list": [ 00:12:22.516 { 00:12:22.516 "name": null, 00:12:22.516 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:22.516 "is_configured": false, 00:12:22.516 "data_offset": 2048, 00:12:22.516 "data_size": 63488 00:12:22.516 }, 00:12:22.516 { 00:12:22.516 "name": "BaseBdev2", 00:12:22.516 "uuid": "35e4cded-b5dc-4c3a-9913-6a3284e710b8", 00:12:22.516 "is_configured": true, 00:12:22.516 "data_offset": 2048, 00:12:22.516 "data_size": 63488 00:12:22.516 } 00:12:22.516 ] 00:12:22.516 }' 00:12:22.516 20:26:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:22.516 20:26:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:23.080 20:26:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:12:23.080 20:26:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < 
num_base_bdevs )) 00:12:23.080 20:26:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:23.080 20:26:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:12:23.338 20:26:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:12:23.338 20:26:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:12:23.338 20:26:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:12:23.596 [2024-07-15 20:26:15.829782] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:12:23.596 [2024-07-15 20:26:15.829831] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe30000 name Existed_Raid, state offline 00:12:23.596 20:26:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:12:23.596 20:26:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:23.596 20:26:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:23.596 20:26:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:12:23.854 20:26:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:12:23.854 20:26:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:12:23.854 20:26:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:12:23.854 20:26:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 
1356747 00:12:23.854 20:26:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 1356747 ']' 00:12:23.854 20:26:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 1356747 00:12:23.854 20:26:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:12:23.854 20:26:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:23.854 20:26:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1356747 00:12:23.854 20:26:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:23.854 20:26:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:23.854 20:26:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1356747' 00:12:23.854 killing process with pid 1356747 00:12:23.854 20:26:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 1356747 00:12:23.854 [2024-07-15 20:26:16.163436] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:23.854 20:26:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 1356747 00:12:23.854 [2024-07-15 20:26:16.164415] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:24.113 20:26:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:12:24.113 00:12:24.113 real 0m10.522s 00:12:24.113 user 0m18.664s 00:12:24.113 sys 0m1.980s 00:12:24.113 20:26:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:24.113 20:26:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:24.113 ************************************ 00:12:24.113 END TEST raid_state_function_test_sb 00:12:24.113 
************************************ 00:12:24.113 20:26:16 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:24.113 20:26:16 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test concat 2 00:12:24.113 20:26:16 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:12:24.113 20:26:16 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:24.113 20:26:16 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:24.113 ************************************ 00:12:24.113 START TEST raid_superblock_test 00:12:24.113 ************************************ 00:12:24.113 20:26:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test concat 2 00:12:24.113 20:26:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=concat 00:12:24.113 20:26:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:12:24.113 20:26:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:12:24.113 20:26:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:12:24.113 20:26:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:12:24.113 20:26:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:12:24.113 20:26:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:12:24.113 20:26:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:12:24.113 20:26:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:12:24.113 20:26:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:12:24.113 20:26:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:12:24.113 20:26:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 
-- # local raid_bdev_uuid 00:12:24.113 20:26:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:12:24.113 20:26:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' concat '!=' raid1 ']' 00:12:24.113 20:26:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:12:24.113 20:26:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:12:24.113 20:26:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=1358381 00:12:24.113 20:26:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 1358381 /var/tmp/spdk-raid.sock 00:12:24.113 20:26:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:12:24.113 20:26:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 1358381 ']' 00:12:24.113 20:26:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:24.113 20:26:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:24.113 20:26:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:24.113 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:24.113 20:26:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:24.113 20:26:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:24.371 [2024-07-15 20:26:16.530552] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
00:12:24.371 [2024-07-15 20:26:16.530620] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1358381 ] 00:12:24.371 [2024-07-15 20:26:16.660638] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:24.630 [2024-07-15 20:26:16.763139] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:24.630 [2024-07-15 20:26:16.817259] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:24.630 [2024-07-15 20:26:16.817287] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:25.195 20:26:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:25.195 20:26:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:12:25.195 20:26:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:12:25.195 20:26:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:12:25.195 20:26:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:12:25.195 20:26:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:12:25.195 20:26:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:12:25.195 20:26:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:12:25.195 20:26:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:12:25.195 20:26:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:12:25.195 20:26:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:12:25.453 malloc1 00:12:25.453 20:26:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:12:25.710 [2024-07-15 20:26:17.936968] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:12:25.710 [2024-07-15 20:26:17.937016] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:25.710 [2024-07-15 20:26:17.937038] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1331570 00:12:25.710 [2024-07-15 20:26:17.937051] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:25.710 [2024-07-15 20:26:17.938766] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:25.710 [2024-07-15 20:26:17.938796] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:12:25.710 pt1 00:12:25.710 20:26:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:12:25.710 20:26:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:12:25.710 20:26:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:12:25.710 20:26:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:12:25.710 20:26:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:12:25.710 20:26:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:12:25.710 20:26:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:12:25.710 20:26:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:12:25.710 20:26:17 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:12:25.967 malloc2 00:12:25.967 20:26:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:12:26.224 [2024-07-15 20:26:18.427432] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:12:26.224 [2024-07-15 20:26:18.427477] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:26.224 [2024-07-15 20:26:18.427494] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1332970 00:12:26.224 [2024-07-15 20:26:18.427507] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:26.224 [2024-07-15 20:26:18.429104] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:26.224 [2024-07-15 20:26:18.429134] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:12:26.224 pt2 00:12:26.224 20:26:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:12:26.224 20:26:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:12:26.224 20:26:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2' -n raid_bdev1 -s 00:12:26.481 [2024-07-15 20:26:18.672109] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:12:26.481 [2024-07-15 20:26:18.673474] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:12:26.481 [2024-07-15 20:26:18.673626] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x14d5270 
00:12:26.481 [2024-07-15 20:26:18.673639] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:12:26.481 [2024-07-15 20:26:18.673845] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x14cac10 00:12:26.481 [2024-07-15 20:26:18.674003] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x14d5270 00:12:26.481 [2024-07-15 20:26:18.674014] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x14d5270 00:12:26.481 [2024-07-15 20:26:18.674122] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:26.481 20:26:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:12:26.481 20:26:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:26.481 20:26:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:26.481 20:26:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:26.481 20:26:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:26.481 20:26:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:26.481 20:26:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:26.481 20:26:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:26.481 20:26:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:26.481 20:26:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:26.481 20:26:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:26.481 20:26:18 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:26.738 20:26:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:26.738 "name": "raid_bdev1", 00:12:26.738 "uuid": "8de465bc-6976-4c0b-a43f-a0b97b2d75ab", 00:12:26.738 "strip_size_kb": 64, 00:12:26.738 "state": "online", 00:12:26.738 "raid_level": "concat", 00:12:26.738 "superblock": true, 00:12:26.738 "num_base_bdevs": 2, 00:12:26.738 "num_base_bdevs_discovered": 2, 00:12:26.738 "num_base_bdevs_operational": 2, 00:12:26.738 "base_bdevs_list": [ 00:12:26.738 { 00:12:26.738 "name": "pt1", 00:12:26.738 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:26.738 "is_configured": true, 00:12:26.738 "data_offset": 2048, 00:12:26.738 "data_size": 63488 00:12:26.738 }, 00:12:26.738 { 00:12:26.738 "name": "pt2", 00:12:26.738 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:26.738 "is_configured": true, 00:12:26.738 "data_offset": 2048, 00:12:26.738 "data_size": 63488 00:12:26.738 } 00:12:26.738 ] 00:12:26.738 }' 00:12:26.738 20:26:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:26.738 20:26:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:27.302 20:26:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:12:27.302 20:26:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:12:27.302 20:26:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:27.302 20:26:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:27.302 20:26:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:27.302 20:26:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:27.302 20:26:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:27.302 20:26:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:27.302 [2024-07-15 20:26:19.666968] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:27.560 20:26:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:27.560 "name": "raid_bdev1", 00:12:27.560 "aliases": [ 00:12:27.560 "8de465bc-6976-4c0b-a43f-a0b97b2d75ab" 00:12:27.560 ], 00:12:27.560 "product_name": "Raid Volume", 00:12:27.560 "block_size": 512, 00:12:27.560 "num_blocks": 126976, 00:12:27.560 "uuid": "8de465bc-6976-4c0b-a43f-a0b97b2d75ab", 00:12:27.560 "assigned_rate_limits": { 00:12:27.560 "rw_ios_per_sec": 0, 00:12:27.560 "rw_mbytes_per_sec": 0, 00:12:27.560 "r_mbytes_per_sec": 0, 00:12:27.560 "w_mbytes_per_sec": 0 00:12:27.560 }, 00:12:27.560 "claimed": false, 00:12:27.560 "zoned": false, 00:12:27.560 "supported_io_types": { 00:12:27.560 "read": true, 00:12:27.560 "write": true, 00:12:27.560 "unmap": true, 00:12:27.560 "flush": true, 00:12:27.560 "reset": true, 00:12:27.560 "nvme_admin": false, 00:12:27.560 "nvme_io": false, 00:12:27.560 "nvme_io_md": false, 00:12:27.560 "write_zeroes": true, 00:12:27.560 "zcopy": false, 00:12:27.560 "get_zone_info": false, 00:12:27.560 "zone_management": false, 00:12:27.560 "zone_append": false, 00:12:27.560 "compare": false, 00:12:27.560 "compare_and_write": false, 00:12:27.560 "abort": false, 00:12:27.560 "seek_hole": false, 00:12:27.560 "seek_data": false, 00:12:27.560 "copy": false, 00:12:27.560 "nvme_iov_md": false 00:12:27.560 }, 00:12:27.560 "memory_domains": [ 00:12:27.560 { 00:12:27.560 "dma_device_id": "system", 00:12:27.560 "dma_device_type": 1 00:12:27.560 }, 00:12:27.560 { 00:12:27.560 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:27.560 "dma_device_type": 2 00:12:27.560 }, 00:12:27.560 { 00:12:27.560 "dma_device_id": "system", 
00:12:27.560 "dma_device_type": 1 00:12:27.560 }, 00:12:27.560 { 00:12:27.560 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:27.560 "dma_device_type": 2 00:12:27.560 } 00:12:27.560 ], 00:12:27.560 "driver_specific": { 00:12:27.560 "raid": { 00:12:27.560 "uuid": "8de465bc-6976-4c0b-a43f-a0b97b2d75ab", 00:12:27.560 "strip_size_kb": 64, 00:12:27.560 "state": "online", 00:12:27.560 "raid_level": "concat", 00:12:27.560 "superblock": true, 00:12:27.560 "num_base_bdevs": 2, 00:12:27.560 "num_base_bdevs_discovered": 2, 00:12:27.560 "num_base_bdevs_operational": 2, 00:12:27.560 "base_bdevs_list": [ 00:12:27.560 { 00:12:27.560 "name": "pt1", 00:12:27.560 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:27.560 "is_configured": true, 00:12:27.560 "data_offset": 2048, 00:12:27.560 "data_size": 63488 00:12:27.560 }, 00:12:27.560 { 00:12:27.560 "name": "pt2", 00:12:27.560 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:27.560 "is_configured": true, 00:12:27.560 "data_offset": 2048, 00:12:27.560 "data_size": 63488 00:12:27.560 } 00:12:27.560 ] 00:12:27.560 } 00:12:27.560 } 00:12:27.560 }' 00:12:27.560 20:26:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:27.560 20:26:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:12:27.560 pt2' 00:12:27.560 20:26:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:27.560 20:26:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:12:27.560 20:26:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:27.818 20:26:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:27.818 "name": "pt1", 00:12:27.818 "aliases": [ 00:12:27.818 "00000000-0000-0000-0000-000000000001" 
00:12:27.818 ], 00:12:27.818 "product_name": "passthru", 00:12:27.818 "block_size": 512, 00:12:27.818 "num_blocks": 65536, 00:12:27.818 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:27.818 "assigned_rate_limits": { 00:12:27.818 "rw_ios_per_sec": 0, 00:12:27.818 "rw_mbytes_per_sec": 0, 00:12:27.818 "r_mbytes_per_sec": 0, 00:12:27.818 "w_mbytes_per_sec": 0 00:12:27.818 }, 00:12:27.818 "claimed": true, 00:12:27.818 "claim_type": "exclusive_write", 00:12:27.818 "zoned": false, 00:12:27.818 "supported_io_types": { 00:12:27.818 "read": true, 00:12:27.818 "write": true, 00:12:27.818 "unmap": true, 00:12:27.818 "flush": true, 00:12:27.818 "reset": true, 00:12:27.818 "nvme_admin": false, 00:12:27.818 "nvme_io": false, 00:12:27.818 "nvme_io_md": false, 00:12:27.818 "write_zeroes": true, 00:12:27.818 "zcopy": true, 00:12:27.818 "get_zone_info": false, 00:12:27.818 "zone_management": false, 00:12:27.818 "zone_append": false, 00:12:27.818 "compare": false, 00:12:27.818 "compare_and_write": false, 00:12:27.818 "abort": true, 00:12:27.818 "seek_hole": false, 00:12:27.818 "seek_data": false, 00:12:27.818 "copy": true, 00:12:27.818 "nvme_iov_md": false 00:12:27.818 }, 00:12:27.818 "memory_domains": [ 00:12:27.818 { 00:12:27.818 "dma_device_id": "system", 00:12:27.818 "dma_device_type": 1 00:12:27.818 }, 00:12:27.818 { 00:12:27.818 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:27.818 "dma_device_type": 2 00:12:27.818 } 00:12:27.818 ], 00:12:27.818 "driver_specific": { 00:12:27.818 "passthru": { 00:12:27.818 "name": "pt1", 00:12:27.818 "base_bdev_name": "malloc1" 00:12:27.818 } 00:12:27.818 } 00:12:27.818 }' 00:12:27.818 20:26:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:27.818 20:26:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:27.818 20:26:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:27.818 20:26:20 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:27.818 20:26:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:28.076 20:26:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:28.076 20:26:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:28.076 20:26:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:28.076 20:26:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:28.076 20:26:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:28.076 20:26:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:28.076 20:26:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:28.076 20:26:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:28.076 20:26:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:12:28.076 20:26:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:28.335 20:26:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:28.335 "name": "pt2", 00:12:28.335 "aliases": [ 00:12:28.335 "00000000-0000-0000-0000-000000000002" 00:12:28.335 ], 00:12:28.335 "product_name": "passthru", 00:12:28.335 "block_size": 512, 00:12:28.335 "num_blocks": 65536, 00:12:28.335 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:28.335 "assigned_rate_limits": { 00:12:28.335 "rw_ios_per_sec": 0, 00:12:28.335 "rw_mbytes_per_sec": 0, 00:12:28.335 "r_mbytes_per_sec": 0, 00:12:28.335 "w_mbytes_per_sec": 0 00:12:28.335 }, 00:12:28.335 "claimed": true, 00:12:28.335 "claim_type": "exclusive_write", 00:12:28.335 "zoned": false, 00:12:28.335 "supported_io_types": { 00:12:28.335 "read": true, 
00:12:28.335 "write": true, 00:12:28.335 "unmap": true, 00:12:28.335 "flush": true, 00:12:28.335 "reset": true, 00:12:28.335 "nvme_admin": false, 00:12:28.335 "nvme_io": false, 00:12:28.335 "nvme_io_md": false, 00:12:28.335 "write_zeroes": true, 00:12:28.335 "zcopy": true, 00:12:28.335 "get_zone_info": false, 00:12:28.335 "zone_management": false, 00:12:28.335 "zone_append": false, 00:12:28.335 "compare": false, 00:12:28.335 "compare_and_write": false, 00:12:28.335 "abort": true, 00:12:28.335 "seek_hole": false, 00:12:28.335 "seek_data": false, 00:12:28.335 "copy": true, 00:12:28.335 "nvme_iov_md": false 00:12:28.335 }, 00:12:28.335 "memory_domains": [ 00:12:28.335 { 00:12:28.335 "dma_device_id": "system", 00:12:28.335 "dma_device_type": 1 00:12:28.335 }, 00:12:28.335 { 00:12:28.335 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:28.335 "dma_device_type": 2 00:12:28.335 } 00:12:28.335 ], 00:12:28.335 "driver_specific": { 00:12:28.335 "passthru": { 00:12:28.335 "name": "pt2", 00:12:28.335 "base_bdev_name": "malloc2" 00:12:28.335 } 00:12:28.335 } 00:12:28.335 }' 00:12:28.335 20:26:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:28.335 20:26:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:28.335 20:26:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:28.593 20:26:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:28.593 20:26:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:28.593 20:26:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:28.593 20:26:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:28.593 20:26:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:28.593 20:26:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:28.593 20:26:20 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:28.593 20:26:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:28.851 20:26:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:28.851 20:26:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:28.851 20:26:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:12:28.851 [2024-07-15 20:26:21.195028] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:28.851 20:26:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=8de465bc-6976-4c0b-a43f-a0b97b2d75ab 00:12:28.851 20:26:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 8de465bc-6976-4c0b-a43f-a0b97b2d75ab ']' 00:12:28.851 20:26:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:29.109 [2024-07-15 20:26:21.447447] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:29.109 [2024-07-15 20:26:21.447466] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:29.109 [2024-07-15 20:26:21.447517] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:29.109 [2024-07-15 20:26:21.447560] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:29.109 [2024-07-15 20:26:21.447572] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x14d5270 name raid_bdev1, state offline 00:12:29.109 20:26:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:29.109 20:26:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:12:29.367 20:26:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:12:29.367 20:26:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:12:29.367 20:26:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:12:29.367 20:26:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:12:29.625 20:26:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:12:29.625 20:26:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:12:29.884 20:26:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:12:29.884 20:26:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:12:30.142 20:26:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:12:30.142 20:26:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:12:30.142 20:26:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:12:30.142 20:26:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n 
raid_bdev1 00:12:30.142 20:26:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:30.142 20:26:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:12:30.142 20:26:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:30.142 20:26:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:12:30.142 20:26:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:30.142 20:26:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:12:30.142 20:26:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:30.142 20:26:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:12:30.142 20:26:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:12:30.401 [2024-07-15 20:26:22.674656] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:12:30.401 [2024-07-15 20:26:22.676059] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:12:30.401 [2024-07-15 20:26:22.676114] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:12:30.401 [2024-07-15 20:26:22.676155] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on 
bdev malloc2 00:12:30.401 [2024-07-15 20:26:22.676174] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:30.401 [2024-07-15 20:26:22.676184] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x14d4ff0 name raid_bdev1, state configuring 00:12:30.401 request: 00:12:30.401 { 00:12:30.401 "name": "raid_bdev1", 00:12:30.401 "raid_level": "concat", 00:12:30.401 "base_bdevs": [ 00:12:30.401 "malloc1", 00:12:30.401 "malloc2" 00:12:30.401 ], 00:12:30.401 "strip_size_kb": 64, 00:12:30.401 "superblock": false, 00:12:30.401 "method": "bdev_raid_create", 00:12:30.401 "req_id": 1 00:12:30.401 } 00:12:30.401 Got JSON-RPC error response 00:12:30.401 response: 00:12:30.401 { 00:12:30.401 "code": -17, 00:12:30.401 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:12:30.401 } 00:12:30.401 20:26:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:12:30.401 20:26:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:12:30.401 20:26:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:12:30.401 20:26:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:12:30.401 20:26:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:30.401 20:26:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:12:30.659 20:26:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:12:30.659 20:26:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:12:30.659 20:26:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:12:30.918 [2024-07-15 
20:26:23.163873] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:12:30.918 [2024-07-15 20:26:23.163914] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:30.918 [2024-07-15 20:26:23.163941] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x13317a0 00:12:30.918 [2024-07-15 20:26:23.163954] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:30.918 [2024-07-15 20:26:23.165527] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:30.918 [2024-07-15 20:26:23.165556] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:12:30.918 [2024-07-15 20:26:23.165624] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:12:30.918 [2024-07-15 20:26:23.165649] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:12:30.918 pt1 00:12:30.918 20:26:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 2 00:12:30.918 20:26:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:30.918 20:26:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:30.918 20:26:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:30.918 20:26:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:30.918 20:26:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:30.918 20:26:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:30.918 20:26:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:30.918 20:26:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:30.918 
20:26:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:30.918 20:26:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:30.918 20:26:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:31.183 20:26:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:31.183 "name": "raid_bdev1", 00:12:31.183 "uuid": "8de465bc-6976-4c0b-a43f-a0b97b2d75ab", 00:12:31.183 "strip_size_kb": 64, 00:12:31.183 "state": "configuring", 00:12:31.183 "raid_level": "concat", 00:12:31.183 "superblock": true, 00:12:31.183 "num_base_bdevs": 2, 00:12:31.183 "num_base_bdevs_discovered": 1, 00:12:31.183 "num_base_bdevs_operational": 2, 00:12:31.183 "base_bdevs_list": [ 00:12:31.183 { 00:12:31.183 "name": "pt1", 00:12:31.183 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:31.183 "is_configured": true, 00:12:31.183 "data_offset": 2048, 00:12:31.183 "data_size": 63488 00:12:31.183 }, 00:12:31.183 { 00:12:31.183 "name": null, 00:12:31.183 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:31.183 "is_configured": false, 00:12:31.183 "data_offset": 2048, 00:12:31.183 "data_size": 63488 00:12:31.183 } 00:12:31.183 ] 00:12:31.183 }' 00:12:31.183 20:26:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:31.183 20:26:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:31.795 20:26:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:12:31.795 20:26:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:12:31.795 20:26:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:12:31.795 20:26:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:12:32.053 [2024-07-15 20:26:24.246760] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:12:32.053 [2024-07-15 20:26:24.246806] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:32.053 [2024-07-15 20:26:24.246824] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14cb820 00:12:32.053 [2024-07-15 20:26:24.246837] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:32.053 [2024-07-15 20:26:24.247195] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:32.053 [2024-07-15 20:26:24.247217] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:12:32.053 [2024-07-15 20:26:24.247282] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:12:32.053 [2024-07-15 20:26:24.247301] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:12:32.053 [2024-07-15 20:26:24.247395] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1327ec0 00:12:32.054 [2024-07-15 20:26:24.247406] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:12:32.054 [2024-07-15 20:26:24.247573] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1328f00 00:12:32.054 [2024-07-15 20:26:24.247696] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1327ec0 00:12:32.054 [2024-07-15 20:26:24.247706] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1327ec0 00:12:32.054 [2024-07-15 20:26:24.247803] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:32.054 pt2 00:12:32.054 20:26:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 
00:12:32.054 20:26:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:12:32.054 20:26:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:12:32.054 20:26:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:32.054 20:26:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:32.054 20:26:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:32.054 20:26:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:32.054 20:26:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:32.054 20:26:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:32.054 20:26:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:32.054 20:26:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:32.054 20:26:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:32.054 20:26:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:32.054 20:26:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:32.312 20:26:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:32.312 "name": "raid_bdev1", 00:12:32.312 "uuid": "8de465bc-6976-4c0b-a43f-a0b97b2d75ab", 00:12:32.312 "strip_size_kb": 64, 00:12:32.312 "state": "online", 00:12:32.312 "raid_level": "concat", 00:12:32.312 "superblock": true, 00:12:32.312 "num_base_bdevs": 2, 00:12:32.312 "num_base_bdevs_discovered": 2, 00:12:32.312 "num_base_bdevs_operational": 2, 
00:12:32.312 "base_bdevs_list": [ 00:12:32.312 { 00:12:32.312 "name": "pt1", 00:12:32.312 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:32.312 "is_configured": true, 00:12:32.312 "data_offset": 2048, 00:12:32.312 "data_size": 63488 00:12:32.312 }, 00:12:32.312 { 00:12:32.312 "name": "pt2", 00:12:32.312 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:32.312 "is_configured": true, 00:12:32.312 "data_offset": 2048, 00:12:32.312 "data_size": 63488 00:12:32.312 } 00:12:32.312 ] 00:12:32.312 }' 00:12:32.312 20:26:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:32.312 20:26:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:32.899 20:26:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:12:32.899 20:26:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:12:32.899 20:26:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:32.899 20:26:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:32.899 20:26:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:32.899 20:26:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:32.899 20:26:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:32.899 20:26:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:33.158 [2024-07-15 20:26:25.345917] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:33.158 20:26:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:33.158 "name": "raid_bdev1", 00:12:33.158 "aliases": [ 00:12:33.158 "8de465bc-6976-4c0b-a43f-a0b97b2d75ab" 00:12:33.158 ], 
00:12:33.158 "product_name": "Raid Volume", 00:12:33.158 "block_size": 512, 00:12:33.158 "num_blocks": 126976, 00:12:33.158 "uuid": "8de465bc-6976-4c0b-a43f-a0b97b2d75ab", 00:12:33.158 "assigned_rate_limits": { 00:12:33.158 "rw_ios_per_sec": 0, 00:12:33.158 "rw_mbytes_per_sec": 0, 00:12:33.158 "r_mbytes_per_sec": 0, 00:12:33.158 "w_mbytes_per_sec": 0 00:12:33.158 }, 00:12:33.158 "claimed": false, 00:12:33.158 "zoned": false, 00:12:33.158 "supported_io_types": { 00:12:33.158 "read": true, 00:12:33.158 "write": true, 00:12:33.158 "unmap": true, 00:12:33.158 "flush": true, 00:12:33.158 "reset": true, 00:12:33.158 "nvme_admin": false, 00:12:33.158 "nvme_io": false, 00:12:33.158 "nvme_io_md": false, 00:12:33.158 "write_zeroes": true, 00:12:33.158 "zcopy": false, 00:12:33.158 "get_zone_info": false, 00:12:33.158 "zone_management": false, 00:12:33.158 "zone_append": false, 00:12:33.158 "compare": false, 00:12:33.158 "compare_and_write": false, 00:12:33.158 "abort": false, 00:12:33.158 "seek_hole": false, 00:12:33.158 "seek_data": false, 00:12:33.158 "copy": false, 00:12:33.158 "nvme_iov_md": false 00:12:33.158 }, 00:12:33.158 "memory_domains": [ 00:12:33.158 { 00:12:33.158 "dma_device_id": "system", 00:12:33.158 "dma_device_type": 1 00:12:33.158 }, 00:12:33.158 { 00:12:33.158 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:33.158 "dma_device_type": 2 00:12:33.158 }, 00:12:33.158 { 00:12:33.158 "dma_device_id": "system", 00:12:33.158 "dma_device_type": 1 00:12:33.158 }, 00:12:33.158 { 00:12:33.158 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:33.158 "dma_device_type": 2 00:12:33.158 } 00:12:33.158 ], 00:12:33.158 "driver_specific": { 00:12:33.158 "raid": { 00:12:33.158 "uuid": "8de465bc-6976-4c0b-a43f-a0b97b2d75ab", 00:12:33.158 "strip_size_kb": 64, 00:12:33.158 "state": "online", 00:12:33.158 "raid_level": "concat", 00:12:33.158 "superblock": true, 00:12:33.158 "num_base_bdevs": 2, 00:12:33.158 "num_base_bdevs_discovered": 2, 00:12:33.158 "num_base_bdevs_operational": 
2, 00:12:33.158 "base_bdevs_list": [ 00:12:33.158 { 00:12:33.158 "name": "pt1", 00:12:33.158 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:33.158 "is_configured": true, 00:12:33.158 "data_offset": 2048, 00:12:33.158 "data_size": 63488 00:12:33.158 }, 00:12:33.158 { 00:12:33.158 "name": "pt2", 00:12:33.158 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:33.158 "is_configured": true, 00:12:33.158 "data_offset": 2048, 00:12:33.158 "data_size": 63488 00:12:33.158 } 00:12:33.158 ] 00:12:33.158 } 00:12:33.158 } 00:12:33.158 }' 00:12:33.158 20:26:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:33.158 20:26:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:12:33.158 pt2' 00:12:33.158 20:26:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:33.158 20:26:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:12:33.158 20:26:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:33.416 20:26:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:33.416 "name": "pt1", 00:12:33.416 "aliases": [ 00:12:33.416 "00000000-0000-0000-0000-000000000001" 00:12:33.416 ], 00:12:33.416 "product_name": "passthru", 00:12:33.416 "block_size": 512, 00:12:33.416 "num_blocks": 65536, 00:12:33.416 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:33.416 "assigned_rate_limits": { 00:12:33.416 "rw_ios_per_sec": 0, 00:12:33.416 "rw_mbytes_per_sec": 0, 00:12:33.416 "r_mbytes_per_sec": 0, 00:12:33.416 "w_mbytes_per_sec": 0 00:12:33.416 }, 00:12:33.416 "claimed": true, 00:12:33.416 "claim_type": "exclusive_write", 00:12:33.416 "zoned": false, 00:12:33.416 "supported_io_types": { 00:12:33.416 "read": true, 
00:12:33.416 "write": true, 00:12:33.416 "unmap": true, 00:12:33.416 "flush": true, 00:12:33.416 "reset": true, 00:12:33.416 "nvme_admin": false, 00:12:33.416 "nvme_io": false, 00:12:33.416 "nvme_io_md": false, 00:12:33.416 "write_zeroes": true, 00:12:33.416 "zcopy": true, 00:12:33.416 "get_zone_info": false, 00:12:33.416 "zone_management": false, 00:12:33.416 "zone_append": false, 00:12:33.416 "compare": false, 00:12:33.416 "compare_and_write": false, 00:12:33.416 "abort": true, 00:12:33.416 "seek_hole": false, 00:12:33.416 "seek_data": false, 00:12:33.416 "copy": true, 00:12:33.416 "nvme_iov_md": false 00:12:33.416 }, 00:12:33.416 "memory_domains": [ 00:12:33.416 { 00:12:33.416 "dma_device_id": "system", 00:12:33.416 "dma_device_type": 1 00:12:33.416 }, 00:12:33.416 { 00:12:33.416 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:33.416 "dma_device_type": 2 00:12:33.416 } 00:12:33.416 ], 00:12:33.416 "driver_specific": { 00:12:33.416 "passthru": { 00:12:33.416 "name": "pt1", 00:12:33.416 "base_bdev_name": "malloc1" 00:12:33.416 } 00:12:33.416 } 00:12:33.416 }' 00:12:33.416 20:26:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:33.416 20:26:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:33.416 20:26:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:33.416 20:26:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:33.675 20:26:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:33.675 20:26:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:33.675 20:26:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:33.675 20:26:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:33.675 20:26:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:33.675 20:26:25 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:33.675 20:26:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:33.675 20:26:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:33.675 20:26:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:33.675 20:26:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:12:33.675 20:26:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:33.933 20:26:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:33.933 "name": "pt2", 00:12:33.933 "aliases": [ 00:12:33.933 "00000000-0000-0000-0000-000000000002" 00:12:33.933 ], 00:12:33.933 "product_name": "passthru", 00:12:33.933 "block_size": 512, 00:12:33.933 "num_blocks": 65536, 00:12:33.934 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:33.934 "assigned_rate_limits": { 00:12:33.934 "rw_ios_per_sec": 0, 00:12:33.934 "rw_mbytes_per_sec": 0, 00:12:33.934 "r_mbytes_per_sec": 0, 00:12:33.934 "w_mbytes_per_sec": 0 00:12:33.934 }, 00:12:33.934 "claimed": true, 00:12:33.934 "claim_type": "exclusive_write", 00:12:33.934 "zoned": false, 00:12:33.934 "supported_io_types": { 00:12:33.934 "read": true, 00:12:33.934 "write": true, 00:12:33.934 "unmap": true, 00:12:33.934 "flush": true, 00:12:33.934 "reset": true, 00:12:33.934 "nvme_admin": false, 00:12:33.934 "nvme_io": false, 00:12:33.934 "nvme_io_md": false, 00:12:33.934 "write_zeroes": true, 00:12:33.934 "zcopy": true, 00:12:33.934 "get_zone_info": false, 00:12:33.934 "zone_management": false, 00:12:33.934 "zone_append": false, 00:12:33.934 "compare": false, 00:12:33.934 "compare_and_write": false, 00:12:33.934 "abort": true, 00:12:33.934 "seek_hole": false, 00:12:33.934 "seek_data": false, 00:12:33.934 "copy": 
true, 00:12:33.934 "nvme_iov_md": false 00:12:33.934 }, 00:12:33.934 "memory_domains": [ 00:12:33.934 { 00:12:33.934 "dma_device_id": "system", 00:12:33.934 "dma_device_type": 1 00:12:33.934 }, 00:12:33.934 { 00:12:33.934 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:33.934 "dma_device_type": 2 00:12:33.934 } 00:12:33.934 ], 00:12:33.934 "driver_specific": { 00:12:33.934 "passthru": { 00:12:33.934 "name": "pt2", 00:12:33.934 "base_bdev_name": "malloc2" 00:12:33.934 } 00:12:33.934 } 00:12:33.934 }' 00:12:33.934 20:26:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:34.192 20:26:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:34.192 20:26:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:34.192 20:26:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:34.192 20:26:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:34.192 20:26:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:34.192 20:26:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:34.192 20:26:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:34.192 20:26:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:34.192 20:26:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:34.450 20:26:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:34.450 20:26:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:34.450 20:26:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:34.450 20:26:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 
00:12:34.708 [2024-07-15 20:26:26.857917] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:34.709 20:26:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 8de465bc-6976-4c0b-a43f-a0b97b2d75ab '!=' 8de465bc-6976-4c0b-a43f-a0b97b2d75ab ']' 00:12:34.709 20:26:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy concat 00:12:34.709 20:26:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:34.709 20:26:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:12:34.709 20:26:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 1358381 00:12:34.709 20:26:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 1358381 ']' 00:12:34.709 20:26:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 1358381 00:12:34.709 20:26:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:12:34.709 20:26:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:34.709 20:26:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1358381 00:12:34.709 20:26:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:34.709 20:26:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:34.709 20:26:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1358381' 00:12:34.709 killing process with pid 1358381 00:12:34.709 20:26:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 1358381 00:12:34.709 [2024-07-15 20:26:26.934406] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:34.709 [2024-07-15 20:26:26.934460] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:34.709 [2024-07-15 
20:26:26.934508] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:34.709 [2024-07-15 20:26:26.934520] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1327ec0 name raid_bdev1, state offline 00:12:34.709 20:26:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 1358381 00:12:34.709 [2024-07-15 20:26:26.952277] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:34.968 20:26:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:12:34.968 00:12:34.968 real 0m10.704s 00:12:34.968 user 0m19.187s 00:12:34.968 sys 0m1.886s 00:12:34.968 20:26:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:34.968 20:26:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:34.968 ************************************ 00:12:34.968 END TEST raid_superblock_test 00:12:34.968 ************************************ 00:12:34.968 20:26:27 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:34.968 20:26:27 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test concat 2 read 00:12:34.968 20:26:27 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:12:34.968 20:26:27 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:34.968 20:26:27 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:34.968 ************************************ 00:12:34.968 START TEST raid_read_error_test 00:12:34.968 ************************************ 00:12:34.968 20:26:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 2 read 00:12:34.968 20:26:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:12:34.968 20:26:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:12:34.968 20:26:27 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:12:34.969 20:26:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:12:34.969 20:26:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:34.969 20:26:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:12:34.969 20:26:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:34.969 20:26:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:34.969 20:26:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:12:34.969 20:26:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:34.969 20:26:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:34.969 20:26:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:12:34.969 20:26:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:12:34.969 20:26:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:12:34.969 20:26:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:12:34.969 20:26:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:12:34.969 20:26:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:12:34.969 20:26:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:12:34.969 20:26:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:12:34.969 20:26:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:12:34.969 20:26:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:12:34.969 20:26:27 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:12:34.969 20:26:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.iWn4AAcIiE 00:12:34.969 20:26:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1360007 00:12:34.969 20:26:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1360007 /var/tmp/spdk-raid.sock 00:12:34.969 20:26:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:12:34.969 20:26:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 1360007 ']' 00:12:34.969 20:26:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:34.969 20:26:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:34.969 20:26:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:34.969 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:34.969 20:26:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:34.969 20:26:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:34.969 [2024-07-15 20:26:27.337008] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
00:12:34.969 [2024-07-15 20:26:27.337080] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1360007 ] 00:12:35.228 [2024-07-15 20:26:27.467990] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:35.228 [2024-07-15 20:26:27.573869] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:35.486 [2024-07-15 20:26:27.632427] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:35.486 [2024-07-15 20:26:27.632453] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:36.052 20:26:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:36.052 20:26:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:12:36.052 20:26:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:36.052 20:26:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:12:36.311 BaseBdev1_malloc 00:12:36.311 20:26:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:12:36.568 true 00:12:36.569 20:26:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:12:36.826 [2024-07-15 20:26:29.005230] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:12:36.826 [2024-07-15 20:26:29.005274] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev 
opened 00:12:36.826 [2024-07-15 20:26:29.005292] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x15cb0d0 00:12:36.826 [2024-07-15 20:26:29.005305] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:36.826 [2024-07-15 20:26:29.006991] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:36.826 [2024-07-15 20:26:29.007020] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:12:36.826 BaseBdev1 00:12:36.826 20:26:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:36.826 20:26:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:12:37.084 BaseBdev2_malloc 00:12:37.084 20:26:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:12:37.341 true 00:12:37.341 20:26:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:12:37.599 [2024-07-15 20:26:29.751853] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:12:37.599 [2024-07-15 20:26:29.751900] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:37.599 [2024-07-15 20:26:29.751921] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x15cf910 00:12:37.599 [2024-07-15 20:26:29.751943] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:37.599 [2024-07-15 20:26:29.753423] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:37.599 [2024-07-15 20:26:29.753452] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:12:37.599 BaseBdev2 00:12:37.599 20:26:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:12:37.857 [2024-07-15 20:26:30.000538] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:37.857 [2024-07-15 20:26:30.001738] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:37.857 [2024-07-15 20:26:30.001922] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x15d1320 00:12:37.857 [2024-07-15 20:26:30.001945] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:12:37.857 [2024-07-15 20:26:30.002129] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x15d2290 00:12:37.857 [2024-07-15 20:26:30.002271] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x15d1320 00:12:37.857 [2024-07-15 20:26:30.002281] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x15d1320 00:12:37.857 [2024-07-15 20:26:30.002379] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:37.857 20:26:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:12:37.857 20:26:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:37.857 20:26:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:37.857 20:26:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:37.857 20:26:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:37.857 20:26:30 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:37.857 20:26:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:37.857 20:26:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:37.857 20:26:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:37.857 20:26:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:37.857 20:26:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:37.857 20:26:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:38.115 20:26:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:38.115 "name": "raid_bdev1", 00:12:38.115 "uuid": "aad0d5ee-306a-425d-9dd8-64d456354d17", 00:12:38.115 "strip_size_kb": 64, 00:12:38.115 "state": "online", 00:12:38.115 "raid_level": "concat", 00:12:38.115 "superblock": true, 00:12:38.115 "num_base_bdevs": 2, 00:12:38.115 "num_base_bdevs_discovered": 2, 00:12:38.115 "num_base_bdevs_operational": 2, 00:12:38.115 "base_bdevs_list": [ 00:12:38.115 { 00:12:38.115 "name": "BaseBdev1", 00:12:38.115 "uuid": "c87b2814-6456-5515-b485-fc7fc94d1269", 00:12:38.115 "is_configured": true, 00:12:38.115 "data_offset": 2048, 00:12:38.115 "data_size": 63488 00:12:38.115 }, 00:12:38.115 { 00:12:38.115 "name": "BaseBdev2", 00:12:38.115 "uuid": "99614470-c15b-57b0-aec4-52c23e98db21", 00:12:38.115 "is_configured": true, 00:12:38.115 "data_offset": 2048, 00:12:38.115 "data_size": 63488 00:12:38.115 } 00:12:38.115 ] 00:12:38.115 }' 00:12:38.115 20:26:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:38.115 20:26:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:38.682 20:26:30 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:12:38.682 20:26:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:12:38.682 [2024-07-15 20:26:31.043610] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x15cc9b0 00:12:39.617 20:26:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:12:39.875 20:26:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:12:39.876 20:26:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:12:39.876 20:26:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:12:39.876 20:26:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:12:39.876 20:26:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:39.876 20:26:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:39.876 20:26:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:39.876 20:26:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:39.876 20:26:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:39.876 20:26:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:39.876 20:26:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:39.876 20:26:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:39.876 20:26:32 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:39.876 20:26:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:39.876 20:26:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:40.134 20:26:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:40.134 "name": "raid_bdev1", 00:12:40.134 "uuid": "aad0d5ee-306a-425d-9dd8-64d456354d17", 00:12:40.134 "strip_size_kb": 64, 00:12:40.134 "state": "online", 00:12:40.134 "raid_level": "concat", 00:12:40.134 "superblock": true, 00:12:40.134 "num_base_bdevs": 2, 00:12:40.134 "num_base_bdevs_discovered": 2, 00:12:40.134 "num_base_bdevs_operational": 2, 00:12:40.134 "base_bdevs_list": [ 00:12:40.134 { 00:12:40.134 "name": "BaseBdev1", 00:12:40.134 "uuid": "c87b2814-6456-5515-b485-fc7fc94d1269", 00:12:40.134 "is_configured": true, 00:12:40.134 "data_offset": 2048, 00:12:40.134 "data_size": 63488 00:12:40.134 }, 00:12:40.134 { 00:12:40.134 "name": "BaseBdev2", 00:12:40.134 "uuid": "99614470-c15b-57b0-aec4-52c23e98db21", 00:12:40.134 "is_configured": true, 00:12:40.134 "data_offset": 2048, 00:12:40.134 "data_size": 63488 00:12:40.134 } 00:12:40.134 ] 00:12:40.134 }' 00:12:40.134 20:26:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:40.134 20:26:32 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:40.700 20:26:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:40.700 [2024-07-15 20:26:33.058195] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:40.700 [2024-07-15 20:26:33.058240] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing 
from online to offline 00:12:40.700 [2024-07-15 20:26:33.061409] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:40.700 [2024-07-15 20:26:33.061440] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:40.700 [2024-07-15 20:26:33.061468] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:40.700 [2024-07-15 20:26:33.061479] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x15d1320 name raid_bdev1, state offline 00:12:40.700 0 00:12:40.959 20:26:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1360007 00:12:40.959 20:26:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 1360007 ']' 00:12:40.959 20:26:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 1360007 00:12:40.959 20:26:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:12:40.959 20:26:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:40.959 20:26:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1360007 00:12:40.959 20:26:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:40.959 20:26:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:40.959 20:26:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1360007' 00:12:40.959 killing process with pid 1360007 00:12:40.959 20:26:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 1360007 00:12:40.959 [2024-07-15 20:26:33.122827] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:40.959 20:26:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 1360007 00:12:40.959 [2024-07-15 20:26:33.133517] 
bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:41.218 20:26:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.iWn4AAcIiE 00:12:41.218 20:26:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:12:41.218 20:26:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:12:41.218 20:26:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.50 00:12:41.218 20:26:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:12:41.218 20:26:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:41.218 20:26:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:12:41.218 20:26:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.50 != \0\.\0\0 ]] 00:12:41.218 00:12:41.218 real 0m6.118s 00:12:41.218 user 0m9.525s 00:12:41.218 sys 0m1.077s 00:12:41.218 20:26:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:41.218 20:26:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:41.218 ************************************ 00:12:41.218 END TEST raid_read_error_test 00:12:41.218 ************************************ 00:12:41.218 20:26:33 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:41.218 20:26:33 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test concat 2 write 00:12:41.218 20:26:33 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:12:41.218 20:26:33 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:41.218 20:26:33 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:41.218 ************************************ 00:12:41.218 START TEST raid_write_error_test 00:12:41.218 ************************************ 00:12:41.218 20:26:33 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@1123 -- # raid_io_error_test concat 2 write 00:12:41.218 20:26:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:12:41.218 20:26:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:12:41.218 20:26:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:12:41.218 20:26:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:12:41.218 20:26:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:41.218 20:26:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:12:41.218 20:26:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:41.218 20:26:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:41.218 20:26:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:12:41.218 20:26:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:41.218 20:26:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:41.218 20:26:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:12:41.218 20:26:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:12:41.218 20:26:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:12:41.218 20:26:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:12:41.218 20:26:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:12:41.218 20:26:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:12:41.218 20:26:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:12:41.218 20:26:33 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:12:41.218 20:26:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:12:41.218 20:26:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:12:41.218 20:26:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:12:41.218 20:26:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.zZczHoBphb 00:12:41.218 20:26:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1360933 00:12:41.218 20:26:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1360933 /var/tmp/spdk-raid.sock 00:12:41.218 20:26:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:12:41.218 20:26:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 1360933 ']' 00:12:41.218 20:26:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:41.219 20:26:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:41.219 20:26:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:41.219 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:41.219 20:26:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:41.219 20:26:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:41.219 [2024-07-15 20:26:33.537000] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
00:12:41.219 [2024-07-15 20:26:33.537067] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1360933 ] 00:12:41.477 [2024-07-15 20:26:33.668302] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:41.477 [2024-07-15 20:26:33.774701] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:41.477 [2024-07-15 20:26:33.846117] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:41.477 [2024-07-15 20:26:33.846149] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:42.413 20:26:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:42.413 20:26:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:12:42.413 20:26:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:42.413 20:26:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:12:42.413 BaseBdev1_malloc 00:12:42.413 20:26:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:12:42.671 true 00:12:42.671 20:26:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:12:42.929 [2024-07-15 20:26:35.241991] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:12:42.929 [2024-07-15 20:26:35.242036] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base 
bdev opened 00:12:42.929 [2024-07-15 20:26:35.242057] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1cbe0d0 00:12:42.929 [2024-07-15 20:26:35.242070] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:42.929 [2024-07-15 20:26:35.243967] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:42.929 [2024-07-15 20:26:35.243996] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:12:42.929 BaseBdev1 00:12:42.929 20:26:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:42.929 20:26:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:12:43.187 BaseBdev2_malloc 00:12:43.188 20:26:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:12:43.446 true 00:12:43.446 20:26:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:12:43.704 [2024-07-15 20:26:35.969038] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:12:43.704 [2024-07-15 20:26:35.969083] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:43.704 [2024-07-15 20:26:35.969103] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1cc2910 00:12:43.704 [2024-07-15 20:26:35.969116] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:43.704 [2024-07-15 20:26:35.970695] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:43.704 [2024-07-15 20:26:35.970724] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:12:43.704 BaseBdev2 00:12:43.704 20:26:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:12:43.963 [2024-07-15 20:26:36.213716] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:43.963 [2024-07-15 20:26:36.215102] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:43.963 [2024-07-15 20:26:36.215293] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1cc4320 00:12:43.963 [2024-07-15 20:26:36.215307] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:12:43.963 [2024-07-15 20:26:36.215510] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1cc5290 00:12:43.963 [2024-07-15 20:26:36.215660] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1cc4320 00:12:43.963 [2024-07-15 20:26:36.215670] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1cc4320 00:12:43.963 [2024-07-15 20:26:36.215776] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:43.963 20:26:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:12:43.963 20:26:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:43.963 20:26:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:43.963 20:26:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:43.963 20:26:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:43.963 20:26:36 bdev_raid.raid_write_error_test 
-- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:43.963 20:26:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:43.963 20:26:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:43.963 20:26:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:43.963 20:26:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:43.963 20:26:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:43.963 20:26:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:44.221 20:26:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:44.221 "name": "raid_bdev1", 00:12:44.221 "uuid": "63552d84-5760-4886-bf5d-f1cac0fd998a", 00:12:44.221 "strip_size_kb": 64, 00:12:44.221 "state": "online", 00:12:44.221 "raid_level": "concat", 00:12:44.221 "superblock": true, 00:12:44.221 "num_base_bdevs": 2, 00:12:44.221 "num_base_bdevs_discovered": 2, 00:12:44.221 "num_base_bdevs_operational": 2, 00:12:44.221 "base_bdevs_list": [ 00:12:44.221 { 00:12:44.221 "name": "BaseBdev1", 00:12:44.222 "uuid": "9c04ae33-85f6-5f73-9406-015f19ad6668", 00:12:44.222 "is_configured": true, 00:12:44.222 "data_offset": 2048, 00:12:44.222 "data_size": 63488 00:12:44.222 }, 00:12:44.222 { 00:12:44.222 "name": "BaseBdev2", 00:12:44.222 "uuid": "c4802c76-6ff8-5f41-89ac-9ab83e99a2e4", 00:12:44.222 "is_configured": true, 00:12:44.222 "data_offset": 2048, 00:12:44.222 "data_size": 63488 00:12:44.222 } 00:12:44.222 ] 00:12:44.222 }' 00:12:44.222 20:26:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:44.222 20:26:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:44.789 
20:26:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:12:44.789 20:26:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:12:44.789 [2024-07-15 20:26:37.156498] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1cbf9b0 00:12:45.726 20:26:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:12:45.999 20:26:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:12:45.999 20:26:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:12:45.999 20:26:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:12:45.999 20:26:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:12:45.999 20:26:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:45.999 20:26:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:45.999 20:26:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:45.999 20:26:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:45.999 20:26:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:45.999 20:26:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:45.999 20:26:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:45.999 20:26:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 
00:12:45.999 20:26:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:45.999 20:26:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:45.999 20:26:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:46.271 20:26:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:46.271 "name": "raid_bdev1", 00:12:46.271 "uuid": "63552d84-5760-4886-bf5d-f1cac0fd998a", 00:12:46.271 "strip_size_kb": 64, 00:12:46.271 "state": "online", 00:12:46.271 "raid_level": "concat", 00:12:46.271 "superblock": true, 00:12:46.271 "num_base_bdevs": 2, 00:12:46.271 "num_base_bdevs_discovered": 2, 00:12:46.271 "num_base_bdevs_operational": 2, 00:12:46.271 "base_bdevs_list": [ 00:12:46.271 { 00:12:46.271 "name": "BaseBdev1", 00:12:46.271 "uuid": "9c04ae33-85f6-5f73-9406-015f19ad6668", 00:12:46.271 "is_configured": true, 00:12:46.271 "data_offset": 2048, 00:12:46.271 "data_size": 63488 00:12:46.271 }, 00:12:46.271 { 00:12:46.271 "name": "BaseBdev2", 00:12:46.271 "uuid": "c4802c76-6ff8-5f41-89ac-9ab83e99a2e4", 00:12:46.271 "is_configured": true, 00:12:46.271 "data_offset": 2048, 00:12:46.271 "data_size": 63488 00:12:46.271 } 00:12:46.271 ] 00:12:46.271 }' 00:12:46.271 20:26:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:46.271 20:26:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:46.836 20:26:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:47.095 [2024-07-15 20:26:39.275979] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:47.095 [2024-07-15 20:26:39.276026] bdev_raid.c:1844:raid_bdev_deconfigure: 
*DEBUG*: raid bdev state changing from online to offline 00:12:47.095 [2024-07-15 20:26:39.279317] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:47.095 [2024-07-15 20:26:39.279348] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:47.095 [2024-07-15 20:26:39.279376] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:47.095 [2024-07-15 20:26:39.279387] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1cc4320 name raid_bdev1, state offline 00:12:47.095 0 00:12:47.095 20:26:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1360933 00:12:47.095 20:26:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 1360933 ']' 00:12:47.095 20:26:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 1360933 00:12:47.095 20:26:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:12:47.095 20:26:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:47.095 20:26:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1360933 00:12:47.095 20:26:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:47.095 20:26:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:47.095 20:26:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1360933' 00:12:47.095 killing process with pid 1360933 00:12:47.095 20:26:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 1360933 00:12:47.095 [2024-07-15 20:26:39.359966] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:47.095 20:26:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 1360933 
00:12:47.095 [2024-07-15 20:26:39.370725] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:47.354 20:26:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.zZczHoBphb 00:12:47.354 20:26:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:12:47.354 20:26:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:12:47.354 20:26:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.47 00:12:47.354 20:26:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:12:47.354 20:26:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:47.354 20:26:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:12:47.354 20:26:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.47 != \0\.\0\0 ]] 00:12:47.354 00:12:47.354 real 0m6.153s 00:12:47.354 user 0m9.518s 00:12:47.354 sys 0m1.118s 00:12:47.354 20:26:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:47.354 20:26:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:47.354 ************************************ 00:12:47.354 END TEST raid_write_error_test 00:12:47.354 ************************************ 00:12:47.354 20:26:39 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:47.354 20:26:39 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:12:47.354 20:26:39 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid1 2 false 00:12:47.354 20:26:39 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:12:47.354 20:26:39 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:47.354 20:26:39 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:47.354 ************************************ 00:12:47.354 START TEST 
raid_state_function_test 00:12:47.354 ************************************ 00:12:47.354 20:26:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 false 00:12:47.354 20:26:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:12:47.354 20:26:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:12:47.354 20:26:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:12:47.354 20:26:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:12:47.354 20:26:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:12:47.354 20:26:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:47.354 20:26:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:12:47.354 20:26:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:47.354 20:26:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:47.354 20:26:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:12:47.354 20:26:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:47.354 20:26:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:47.354 20:26:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:12:47.354 20:26:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:12:47.354 20:26:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:12:47.354 20:26:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:12:47.354 20:26:39 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:12:47.354 20:26:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:12:47.354 20:26:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:12:47.354 20:26:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:12:47.354 20:26:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:12:47.354 20:26:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:12:47.354 20:26:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=1361801 00:12:47.354 20:26:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1361801' 00:12:47.354 Process raid pid: 1361801 00:12:47.354 20:26:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:12:47.354 20:26:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 1361801 /var/tmp/spdk-raid.sock 00:12:47.354 20:26:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 1361801 ']' 00:12:47.354 20:26:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:47.354 20:26:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:47.354 20:26:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:47.354 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:12:47.354 20:26:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:47.354 20:26:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:47.613 [2024-07-15 20:26:39.760584] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 00:12:47.613 [2024-07-15 20:26:39.760648] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:12:47.613 [2024-07-15 20:26:39.889953] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:47.613 [2024-07-15 20:26:39.988184] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:47.871 [2024-07-15 20:26:40.059030] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:47.871 [2024-07-15 20:26:40.059064] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:48.438 20:26:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:48.438 20:26:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:12:48.438 20:26:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:49.006 [2024-07-15 20:26:41.186572] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:49.006 [2024-07-15 20:26:41.186614] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:49.006 [2024-07-15 20:26:41.186625] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:49.006 [2024-07-15 20:26:41.186637] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base 
bdev BaseBdev2 doesn't exist now 00:12:49.006 20:26:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:12:49.006 20:26:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:49.006 20:26:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:49.006 20:26:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:49.006 20:26:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:49.006 20:26:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:49.006 20:26:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:49.006 20:26:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:49.006 20:26:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:49.006 20:26:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:49.006 20:26:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:49.006 20:26:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:49.584 20:26:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:49.584 "name": "Existed_Raid", 00:12:49.584 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:49.584 "strip_size_kb": 0, 00:12:49.584 "state": "configuring", 00:12:49.584 "raid_level": "raid1", 00:12:49.584 "superblock": false, 00:12:49.584 "num_base_bdevs": 2, 00:12:49.584 "num_base_bdevs_discovered": 0, 00:12:49.584 "num_base_bdevs_operational": 2, 
00:12:49.584 "base_bdevs_list": [ 00:12:49.584 { 00:12:49.584 "name": "BaseBdev1", 00:12:49.584 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:49.584 "is_configured": false, 00:12:49.584 "data_offset": 0, 00:12:49.584 "data_size": 0 00:12:49.584 }, 00:12:49.584 { 00:12:49.584 "name": "BaseBdev2", 00:12:49.584 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:49.584 "is_configured": false, 00:12:49.584 "data_offset": 0, 00:12:49.584 "data_size": 0 00:12:49.584 } 00:12:49.584 ] 00:12:49.584 }' 00:12:49.584 20:26:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:49.584 20:26:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:50.151 20:26:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:50.424 [2024-07-15 20:26:42.554056] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:50.424 [2024-07-15 20:26:42.554085] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf76a80 name Existed_Raid, state configuring 00:12:50.424 20:26:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:50.424 [2024-07-15 20:26:42.802723] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:50.424 [2024-07-15 20:26:42.802753] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:50.424 [2024-07-15 20:26:42.802762] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:50.424 [2024-07-15 20:26:42.802774] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:50.683 20:26:42 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:12:50.683 [2024-07-15 20:26:43.053237] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:50.683 BaseBdev1 00:12:50.942 20:26:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:12:50.942 20:26:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:12:50.942 20:26:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:50.942 20:26:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:12:50.942 20:26:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:50.942 20:26:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:50.942 20:26:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:51.201 20:26:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:12:51.201 [ 00:12:51.201 { 00:12:51.201 "name": "BaseBdev1", 00:12:51.201 "aliases": [ 00:12:51.201 "2d4fe97f-bd5d-4c7c-b5bf-7e4d492b36ff" 00:12:51.201 ], 00:12:51.201 "product_name": "Malloc disk", 00:12:51.201 "block_size": 512, 00:12:51.201 "num_blocks": 65536, 00:12:51.201 "uuid": "2d4fe97f-bd5d-4c7c-b5bf-7e4d492b36ff", 00:12:51.201 "assigned_rate_limits": { 00:12:51.201 "rw_ios_per_sec": 0, 00:12:51.201 "rw_mbytes_per_sec": 0, 00:12:51.201 "r_mbytes_per_sec": 0, 00:12:51.201 "w_mbytes_per_sec": 0 00:12:51.201 }, 00:12:51.201 "claimed": true, 
00:12:51.201 "claim_type": "exclusive_write", 00:12:51.201 "zoned": false, 00:12:51.201 "supported_io_types": { 00:12:51.201 "read": true, 00:12:51.201 "write": true, 00:12:51.201 "unmap": true, 00:12:51.201 "flush": true, 00:12:51.201 "reset": true, 00:12:51.201 "nvme_admin": false, 00:12:51.201 "nvme_io": false, 00:12:51.201 "nvme_io_md": false, 00:12:51.201 "write_zeroes": true, 00:12:51.201 "zcopy": true, 00:12:51.201 "get_zone_info": false, 00:12:51.201 "zone_management": false, 00:12:51.201 "zone_append": false, 00:12:51.201 "compare": false, 00:12:51.201 "compare_and_write": false, 00:12:51.201 "abort": true, 00:12:51.201 "seek_hole": false, 00:12:51.201 "seek_data": false, 00:12:51.201 "copy": true, 00:12:51.201 "nvme_iov_md": false 00:12:51.201 }, 00:12:51.201 "memory_domains": [ 00:12:51.201 { 00:12:51.201 "dma_device_id": "system", 00:12:51.201 "dma_device_type": 1 00:12:51.201 }, 00:12:51.201 { 00:12:51.201 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:51.201 "dma_device_type": 2 00:12:51.201 } 00:12:51.201 ], 00:12:51.201 "driver_specific": {} 00:12:51.201 } 00:12:51.201 ] 00:12:51.201 20:26:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:12:51.201 20:26:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:12:51.201 20:26:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:51.201 20:26:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:51.201 20:26:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:51.201 20:26:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:51.201 20:26:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:51.201 20:26:43 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:51.201 20:26:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:51.201 20:26:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:51.201 20:26:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:51.459 20:26:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:51.459 20:26:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:51.459 20:26:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:51.459 "name": "Existed_Raid", 00:12:51.459 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:51.459 "strip_size_kb": 0, 00:12:51.459 "state": "configuring", 00:12:51.459 "raid_level": "raid1", 00:12:51.459 "superblock": false, 00:12:51.459 "num_base_bdevs": 2, 00:12:51.459 "num_base_bdevs_discovered": 1, 00:12:51.459 "num_base_bdevs_operational": 2, 00:12:51.459 "base_bdevs_list": [ 00:12:51.459 { 00:12:51.459 "name": "BaseBdev1", 00:12:51.459 "uuid": "2d4fe97f-bd5d-4c7c-b5bf-7e4d492b36ff", 00:12:51.459 "is_configured": true, 00:12:51.459 "data_offset": 0, 00:12:51.459 "data_size": 65536 00:12:51.459 }, 00:12:51.459 { 00:12:51.459 "name": "BaseBdev2", 00:12:51.459 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:51.459 "is_configured": false, 00:12:51.459 "data_offset": 0, 00:12:51.459 "data_size": 0 00:12:51.459 } 00:12:51.459 ] 00:12:51.459 }' 00:12:51.459 20:26:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:51.459 20:26:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:52.396 20:26:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:52.396 [2024-07-15 20:26:44.681675] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:52.396 [2024-07-15 20:26:44.681715] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf76350 name Existed_Raid, state configuring 00:12:52.396 20:26:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:52.964 [2024-07-15 20:26:45.183019] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:52.964 [2024-07-15 20:26:45.184550] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:52.964 [2024-07-15 20:26:45.184583] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:52.964 20:26:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:12:52.964 20:26:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:52.964 20:26:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:12:52.964 20:26:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:52.964 20:26:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:52.964 20:26:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:52.964 20:26:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:52.964 20:26:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:52.964 20:26:45 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:52.964 20:26:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:52.964 20:26:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:52.964 20:26:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:52.964 20:26:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:52.964 20:26:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:53.222 20:26:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:53.222 "name": "Existed_Raid", 00:12:53.222 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:53.222 "strip_size_kb": 0, 00:12:53.222 "state": "configuring", 00:12:53.222 "raid_level": "raid1", 00:12:53.222 "superblock": false, 00:12:53.222 "num_base_bdevs": 2, 00:12:53.222 "num_base_bdevs_discovered": 1, 00:12:53.222 "num_base_bdevs_operational": 2, 00:12:53.222 "base_bdevs_list": [ 00:12:53.222 { 00:12:53.222 "name": "BaseBdev1", 00:12:53.222 "uuid": "2d4fe97f-bd5d-4c7c-b5bf-7e4d492b36ff", 00:12:53.222 "is_configured": true, 00:12:53.222 "data_offset": 0, 00:12:53.222 "data_size": 65536 00:12:53.222 }, 00:12:53.222 { 00:12:53.222 "name": "BaseBdev2", 00:12:53.222 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:53.222 "is_configured": false, 00:12:53.223 "data_offset": 0, 00:12:53.223 "data_size": 0 00:12:53.223 } 00:12:53.223 ] 00:12:53.223 }' 00:12:53.223 20:26:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:53.223 20:26:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:53.789 20:26:46 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:12:54.048 [2024-07-15 20:26:46.297293] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:54.048 [2024-07-15 20:26:46.297333] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xf77000 00:12:54.048 [2024-07-15 20:26:46.297342] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:12:54.048 [2024-07-15 20:26:46.297527] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe910c0 00:12:54.048 [2024-07-15 20:26:46.297643] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xf77000 00:12:54.048 [2024-07-15 20:26:46.297653] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xf77000 00:12:54.048 [2024-07-15 20:26:46.297813] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:54.048 BaseBdev2 00:12:54.048 20:26:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:12:54.048 20:26:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:12:54.048 20:26:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:54.048 20:26:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:12:54.048 20:26:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:54.048 20:26:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:54.048 20:26:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:54.307 20:26:46 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:12:54.566 [ 00:12:54.566 { 00:12:54.566 "name": "BaseBdev2", 00:12:54.566 "aliases": [ 00:12:54.566 "3d2d20b5-e9d9-4557-a445-da0a5ab2c2eb" 00:12:54.566 ], 00:12:54.566 "product_name": "Malloc disk", 00:12:54.566 "block_size": 512, 00:12:54.566 "num_blocks": 65536, 00:12:54.566 "uuid": "3d2d20b5-e9d9-4557-a445-da0a5ab2c2eb", 00:12:54.566 "assigned_rate_limits": { 00:12:54.566 "rw_ios_per_sec": 0, 00:12:54.566 "rw_mbytes_per_sec": 0, 00:12:54.566 "r_mbytes_per_sec": 0, 00:12:54.566 "w_mbytes_per_sec": 0 00:12:54.566 }, 00:12:54.566 "claimed": true, 00:12:54.566 "claim_type": "exclusive_write", 00:12:54.566 "zoned": false, 00:12:54.566 "supported_io_types": { 00:12:54.566 "read": true, 00:12:54.566 "write": true, 00:12:54.566 "unmap": true, 00:12:54.566 "flush": true, 00:12:54.566 "reset": true, 00:12:54.566 "nvme_admin": false, 00:12:54.566 "nvme_io": false, 00:12:54.566 "nvme_io_md": false, 00:12:54.566 "write_zeroes": true, 00:12:54.566 "zcopy": true, 00:12:54.566 "get_zone_info": false, 00:12:54.566 "zone_management": false, 00:12:54.566 "zone_append": false, 00:12:54.566 "compare": false, 00:12:54.566 "compare_and_write": false, 00:12:54.566 "abort": true, 00:12:54.566 "seek_hole": false, 00:12:54.566 "seek_data": false, 00:12:54.566 "copy": true, 00:12:54.566 "nvme_iov_md": false 00:12:54.566 }, 00:12:54.566 "memory_domains": [ 00:12:54.566 { 00:12:54.566 "dma_device_id": "system", 00:12:54.566 "dma_device_type": 1 00:12:54.566 }, 00:12:54.566 { 00:12:54.566 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:54.566 "dma_device_type": 2 00:12:54.566 } 00:12:54.566 ], 00:12:54.566 "driver_specific": {} 00:12:54.566 } 00:12:54.566 ] 00:12:54.566 20:26:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:12:54.566 20:26:46 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@265 -- # (( i++ )) 00:12:54.566 20:26:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:54.566 20:26:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:12:54.566 20:26:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:54.566 20:26:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:54.566 20:26:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:54.566 20:26:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:54.566 20:26:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:54.566 20:26:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:54.566 20:26:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:54.566 20:26:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:54.566 20:26:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:54.566 20:26:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:54.566 20:26:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:54.825 20:26:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:54.825 "name": "Existed_Raid", 00:12:54.825 "uuid": "26b58c5b-b210-40aa-bb4b-d3f5ebaf98a7", 00:12:54.825 "strip_size_kb": 0, 00:12:54.825 "state": "online", 00:12:54.825 "raid_level": "raid1", 00:12:54.825 "superblock": false, 00:12:54.825 "num_base_bdevs": 
2, 00:12:54.825 "num_base_bdevs_discovered": 2, 00:12:54.825 "num_base_bdevs_operational": 2, 00:12:54.825 "base_bdevs_list": [ 00:12:54.825 { 00:12:54.825 "name": "BaseBdev1", 00:12:54.825 "uuid": "2d4fe97f-bd5d-4c7c-b5bf-7e4d492b36ff", 00:12:54.825 "is_configured": true, 00:12:54.825 "data_offset": 0, 00:12:54.825 "data_size": 65536 00:12:54.825 }, 00:12:54.825 { 00:12:54.825 "name": "BaseBdev2", 00:12:54.825 "uuid": "3d2d20b5-e9d9-4557-a445-da0a5ab2c2eb", 00:12:54.825 "is_configured": true, 00:12:54.825 "data_offset": 0, 00:12:54.825 "data_size": 65536 00:12:54.825 } 00:12:54.825 ] 00:12:54.825 }' 00:12:54.825 20:26:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:54.825 20:26:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:55.759 20:26:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:12:55.759 20:26:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:12:55.759 20:26:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:55.759 20:26:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:55.759 20:26:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:55.759 20:26:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:55.759 20:26:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:12:55.759 20:26:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:55.759 [2024-07-15 20:26:48.082306] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:55.759 20:26:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # 
raid_bdev_info='{ 00:12:55.759 "name": "Existed_Raid", 00:12:55.759 "aliases": [ 00:12:55.759 "26b58c5b-b210-40aa-bb4b-d3f5ebaf98a7" 00:12:55.759 ], 00:12:55.759 "product_name": "Raid Volume", 00:12:55.759 "block_size": 512, 00:12:55.760 "num_blocks": 65536, 00:12:55.760 "uuid": "26b58c5b-b210-40aa-bb4b-d3f5ebaf98a7", 00:12:55.760 "assigned_rate_limits": { 00:12:55.760 "rw_ios_per_sec": 0, 00:12:55.760 "rw_mbytes_per_sec": 0, 00:12:55.760 "r_mbytes_per_sec": 0, 00:12:55.760 "w_mbytes_per_sec": 0 00:12:55.760 }, 00:12:55.760 "claimed": false, 00:12:55.760 "zoned": false, 00:12:55.760 "supported_io_types": { 00:12:55.760 "read": true, 00:12:55.760 "write": true, 00:12:55.760 "unmap": false, 00:12:55.760 "flush": false, 00:12:55.760 "reset": true, 00:12:55.760 "nvme_admin": false, 00:12:55.760 "nvme_io": false, 00:12:55.760 "nvme_io_md": false, 00:12:55.760 "write_zeroes": true, 00:12:55.760 "zcopy": false, 00:12:55.760 "get_zone_info": false, 00:12:55.760 "zone_management": false, 00:12:55.760 "zone_append": false, 00:12:55.760 "compare": false, 00:12:55.760 "compare_and_write": false, 00:12:55.760 "abort": false, 00:12:55.760 "seek_hole": false, 00:12:55.760 "seek_data": false, 00:12:55.760 "copy": false, 00:12:55.760 "nvme_iov_md": false 00:12:55.760 }, 00:12:55.760 "memory_domains": [ 00:12:55.760 { 00:12:55.760 "dma_device_id": "system", 00:12:55.760 "dma_device_type": 1 00:12:55.760 }, 00:12:55.760 { 00:12:55.760 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:55.760 "dma_device_type": 2 00:12:55.760 }, 00:12:55.760 { 00:12:55.760 "dma_device_id": "system", 00:12:55.760 "dma_device_type": 1 00:12:55.760 }, 00:12:55.760 { 00:12:55.760 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:55.760 "dma_device_type": 2 00:12:55.760 } 00:12:55.760 ], 00:12:55.760 "driver_specific": { 00:12:55.760 "raid": { 00:12:55.760 "uuid": "26b58c5b-b210-40aa-bb4b-d3f5ebaf98a7", 00:12:55.760 "strip_size_kb": 0, 00:12:55.760 "state": "online", 00:12:55.760 "raid_level": "raid1", 
00:12:55.760 "superblock": false, 00:12:55.760 "num_base_bdevs": 2, 00:12:55.760 "num_base_bdevs_discovered": 2, 00:12:55.760 "num_base_bdevs_operational": 2, 00:12:55.760 "base_bdevs_list": [ 00:12:55.760 { 00:12:55.760 "name": "BaseBdev1", 00:12:55.760 "uuid": "2d4fe97f-bd5d-4c7c-b5bf-7e4d492b36ff", 00:12:55.760 "is_configured": true, 00:12:55.760 "data_offset": 0, 00:12:55.760 "data_size": 65536 00:12:55.760 }, 00:12:55.760 { 00:12:55.760 "name": "BaseBdev2", 00:12:55.760 "uuid": "3d2d20b5-e9d9-4557-a445-da0a5ab2c2eb", 00:12:55.760 "is_configured": true, 00:12:55.760 "data_offset": 0, 00:12:55.760 "data_size": 65536 00:12:55.760 } 00:12:55.760 ] 00:12:55.760 } 00:12:55.760 } 00:12:55.760 }' 00:12:55.760 20:26:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:56.018 20:26:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:12:56.018 BaseBdev2' 00:12:56.018 20:26:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:56.018 20:26:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:12:56.018 20:26:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:56.277 20:26:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:56.277 "name": "BaseBdev1", 00:12:56.277 "aliases": [ 00:12:56.277 "2d4fe97f-bd5d-4c7c-b5bf-7e4d492b36ff" 00:12:56.277 ], 00:12:56.277 "product_name": "Malloc disk", 00:12:56.277 "block_size": 512, 00:12:56.277 "num_blocks": 65536, 00:12:56.277 "uuid": "2d4fe97f-bd5d-4c7c-b5bf-7e4d492b36ff", 00:12:56.277 "assigned_rate_limits": { 00:12:56.277 "rw_ios_per_sec": 0, 00:12:56.277 "rw_mbytes_per_sec": 0, 00:12:56.277 "r_mbytes_per_sec": 0, 00:12:56.277 
"w_mbytes_per_sec": 0 00:12:56.277 }, 00:12:56.277 "claimed": true, 00:12:56.277 "claim_type": "exclusive_write", 00:12:56.277 "zoned": false, 00:12:56.277 "supported_io_types": { 00:12:56.277 "read": true, 00:12:56.277 "write": true, 00:12:56.277 "unmap": true, 00:12:56.277 "flush": true, 00:12:56.277 "reset": true, 00:12:56.277 "nvme_admin": false, 00:12:56.277 "nvme_io": false, 00:12:56.277 "nvme_io_md": false, 00:12:56.277 "write_zeroes": true, 00:12:56.277 "zcopy": true, 00:12:56.277 "get_zone_info": false, 00:12:56.277 "zone_management": false, 00:12:56.277 "zone_append": false, 00:12:56.277 "compare": false, 00:12:56.277 "compare_and_write": false, 00:12:56.277 "abort": true, 00:12:56.277 "seek_hole": false, 00:12:56.277 "seek_data": false, 00:12:56.277 "copy": true, 00:12:56.277 "nvme_iov_md": false 00:12:56.277 }, 00:12:56.277 "memory_domains": [ 00:12:56.277 { 00:12:56.277 "dma_device_id": "system", 00:12:56.277 "dma_device_type": 1 00:12:56.277 }, 00:12:56.277 { 00:12:56.277 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:56.277 "dma_device_type": 2 00:12:56.277 } 00:12:56.277 ], 00:12:56.277 "driver_specific": {} 00:12:56.277 }' 00:12:56.277 20:26:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:56.277 20:26:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:56.277 20:26:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:56.277 20:26:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:56.277 20:26:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:56.277 20:26:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:56.277 20:26:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:56.277 20:26:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:56.536 
20:26:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:56.536 20:26:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:56.536 20:26:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:56.536 20:26:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:56.536 20:26:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:56.536 20:26:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:12:56.536 20:26:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:56.795 20:26:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:56.795 "name": "BaseBdev2", 00:12:56.795 "aliases": [ 00:12:56.795 "3d2d20b5-e9d9-4557-a445-da0a5ab2c2eb" 00:12:56.795 ], 00:12:56.795 "product_name": "Malloc disk", 00:12:56.795 "block_size": 512, 00:12:56.795 "num_blocks": 65536, 00:12:56.795 "uuid": "3d2d20b5-e9d9-4557-a445-da0a5ab2c2eb", 00:12:56.795 "assigned_rate_limits": { 00:12:56.795 "rw_ios_per_sec": 0, 00:12:56.795 "rw_mbytes_per_sec": 0, 00:12:56.795 "r_mbytes_per_sec": 0, 00:12:56.795 "w_mbytes_per_sec": 0 00:12:56.795 }, 00:12:56.795 "claimed": true, 00:12:56.795 "claim_type": "exclusive_write", 00:12:56.795 "zoned": false, 00:12:56.795 "supported_io_types": { 00:12:56.795 "read": true, 00:12:56.795 "write": true, 00:12:56.795 "unmap": true, 00:12:56.795 "flush": true, 00:12:56.795 "reset": true, 00:12:56.795 "nvme_admin": false, 00:12:56.795 "nvme_io": false, 00:12:56.795 "nvme_io_md": false, 00:12:56.795 "write_zeroes": true, 00:12:56.795 "zcopy": true, 00:12:56.795 "get_zone_info": false, 00:12:56.795 "zone_management": false, 00:12:56.795 "zone_append": false, 00:12:56.795 "compare": 
false, 00:12:56.795 "compare_and_write": false, 00:12:56.795 "abort": true, 00:12:56.795 "seek_hole": false, 00:12:56.795 "seek_data": false, 00:12:56.795 "copy": true, 00:12:56.795 "nvme_iov_md": false 00:12:56.795 }, 00:12:56.796 "memory_domains": [ 00:12:56.796 { 00:12:56.796 "dma_device_id": "system", 00:12:56.796 "dma_device_type": 1 00:12:56.796 }, 00:12:56.796 { 00:12:56.796 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:56.796 "dma_device_type": 2 00:12:56.796 } 00:12:56.796 ], 00:12:56.796 "driver_specific": {} 00:12:56.796 }' 00:12:56.796 20:26:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:56.796 20:26:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:56.796 20:26:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:56.796 20:26:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:56.796 20:26:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:57.053 20:26:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:57.053 20:26:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:57.053 20:26:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:57.053 20:26:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:57.053 20:26:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:57.053 20:26:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:57.053 20:26:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:57.053 20:26:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:12:57.310 
[2024-07-15 20:26:49.578066] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:12:57.310 20:26:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:12:57.310 20:26:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:12:57.310 20:26:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:57.310 20:26:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:12:57.310 20:26:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:12:57.310 20:26:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:12:57.310 20:26:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:57.310 20:26:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:57.310 20:26:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:57.310 20:26:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:57.310 20:26:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:12:57.310 20:26:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:57.310 20:26:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:57.310 20:26:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:57.310 20:26:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:57.310 20:26:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:57.310 20:26:49 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:57.567 20:26:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:57.567 "name": "Existed_Raid", 00:12:57.567 "uuid": "26b58c5b-b210-40aa-bb4b-d3f5ebaf98a7", 00:12:57.567 "strip_size_kb": 0, 00:12:57.567 "state": "online", 00:12:57.567 "raid_level": "raid1", 00:12:57.567 "superblock": false, 00:12:57.567 "num_base_bdevs": 2, 00:12:57.567 "num_base_bdevs_discovered": 1, 00:12:57.567 "num_base_bdevs_operational": 1, 00:12:57.567 "base_bdevs_list": [ 00:12:57.567 { 00:12:57.567 "name": null, 00:12:57.567 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:57.567 "is_configured": false, 00:12:57.567 "data_offset": 0, 00:12:57.567 "data_size": 65536 00:12:57.567 }, 00:12:57.567 { 00:12:57.567 "name": "BaseBdev2", 00:12:57.567 "uuid": "3d2d20b5-e9d9-4557-a445-da0a5ab2c2eb", 00:12:57.567 "is_configured": true, 00:12:57.567 "data_offset": 0, 00:12:57.567 "data_size": 65536 00:12:57.567 } 00:12:57.567 ] 00:12:57.567 }' 00:12:57.567 20:26:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:57.567 20:26:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:58.499 20:26:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:12:58.499 20:26:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:58.499 20:26:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:58.499 20:26:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:12:58.499 20:26:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:12:58.499 20:26:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- 
# '[' Existed_Raid '!=' Existed_Raid ']' 00:12:58.499 20:26:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:12:58.757 [2024-07-15 20:26:50.942725] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:12:58.757 [2024-07-15 20:26:50.942805] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:58.757 [2024-07-15 20:26:50.955572] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:58.757 [2024-07-15 20:26:50.955613] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:58.757 [2024-07-15 20:26:50.955627] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf77000 name Existed_Raid, state offline 00:12:58.757 20:26:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:12:58.757 20:26:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:58.757 20:26:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:58.757 20:26:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:12:59.016 20:26:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:12:59.016 20:26:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:12:59.016 20:26:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:12:59.016 20:26:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 1361801 00:12:59.016 20:26:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 1361801 ']' 00:12:59.016 20:26:51 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 1361801 00:12:59.016 20:26:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:12:59.016 20:26:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:59.016 20:26:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1361801 00:12:59.016 20:26:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:59.016 20:26:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:59.016 20:26:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1361801' 00:12:59.016 killing process with pid 1361801 00:12:59.016 20:26:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 1361801 00:12:59.016 [2024-07-15 20:26:51.278841] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:59.016 20:26:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 1361801 00:12:59.016 [2024-07-15 20:26:51.279815] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:59.275 20:26:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:12:59.275 00:12:59.275 real 0m11.809s 00:12:59.275 user 0m21.143s 00:12:59.275 sys 0m2.076s 00:12:59.275 20:26:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:59.275 20:26:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:59.275 ************************************ 00:12:59.275 END TEST raid_state_function_test 00:12:59.275 ************************************ 00:12:59.275 20:26:51 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:59.275 20:26:51 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test 
raid_state_function_test_sb raid_state_function_test raid1 2 true 00:12:59.275 20:26:51 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:12:59.275 20:26:51 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:59.275 20:26:51 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:59.275 ************************************ 00:12:59.275 START TEST raid_state_function_test_sb 00:12:59.275 ************************************ 00:12:59.275 20:26:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 true 00:12:59.275 20:26:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:12:59.275 20:26:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:12:59.275 20:26:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:12:59.275 20:26:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:12:59.275 20:26:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:12:59.275 20:26:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:59.275 20:26:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:12:59.275 20:26:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:59.275 20:26:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:59.275 20:26:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:12:59.275 20:26:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:59.275 20:26:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:59.275 20:26:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # 
base_bdevs=('BaseBdev1' 'BaseBdev2') 00:12:59.275 20:26:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:12:59.275 20:26:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:12:59.275 20:26:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:12:59.275 20:26:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:12:59.275 20:26:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:12:59.275 20:26:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:12:59.275 20:26:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:12:59.275 20:26:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:12:59.275 20:26:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:12:59.275 20:26:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=1363604 00:12:59.275 20:26:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1363604' 00:12:59.275 Process raid pid: 1363604 00:12:59.275 20:26:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:12:59.275 20:26:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 1363604 /var/tmp/spdk-raid.sock 00:12:59.275 20:26:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 1363604 ']' 00:12:59.275 20:26:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:59.275 20:26:51 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@834 -- # local max_retries=100 00:12:59.275 20:26:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:59.275 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:59.275 20:26:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:59.275 20:26:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:59.533 [2024-07-15 20:26:51.660407] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 00:12:59.533 [2024-07-15 20:26:51.660469] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:12:59.533 [2024-07-15 20:26:51.791666] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:59.533 [2024-07-15 20:26:51.896806] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:59.791 [2024-07-15 20:26:51.953975] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:59.791 [2024-07-15 20:26:51.954005] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:59.791 20:26:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:59.791 20:26:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:12:59.791 20:26:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:13:00.049 [2024-07-15 20:26:52.348671] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with 
name: BaseBdev1 00:13:00.049 [2024-07-15 20:26:52.348712] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:00.049 [2024-07-15 20:26:52.348723] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:00.049 [2024-07-15 20:26:52.348734] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:00.049 20:26:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:13:00.049 20:26:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:00.049 20:26:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:00.049 20:26:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:00.049 20:26:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:00.049 20:26:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:00.049 20:26:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:00.049 20:26:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:00.049 20:26:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:00.049 20:26:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:00.049 20:26:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:00.049 20:26:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:00.307 20:26:52 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:00.307 "name": "Existed_Raid", 00:13:00.307 "uuid": "7a1655f3-f93d-4d29-a63d-6c3915bee18a", 00:13:00.307 "strip_size_kb": 0, 00:13:00.307 "state": "configuring", 00:13:00.307 "raid_level": "raid1", 00:13:00.307 "superblock": true, 00:13:00.307 "num_base_bdevs": 2, 00:13:00.307 "num_base_bdevs_discovered": 0, 00:13:00.307 "num_base_bdevs_operational": 2, 00:13:00.307 "base_bdevs_list": [ 00:13:00.307 { 00:13:00.307 "name": "BaseBdev1", 00:13:00.307 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:00.307 "is_configured": false, 00:13:00.307 "data_offset": 0, 00:13:00.307 "data_size": 0 00:13:00.307 }, 00:13:00.307 { 00:13:00.307 "name": "BaseBdev2", 00:13:00.307 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:00.307 "is_configured": false, 00:13:00.307 "data_offset": 0, 00:13:00.307 "data_size": 0 00:13:00.307 } 00:13:00.307 ] 00:13:00.307 }' 00:13:00.307 20:26:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:00.307 20:26:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:00.886 20:26:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:01.143 [2024-07-15 20:26:53.443442] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:01.143 [2024-07-15 20:26:53.443469] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd66a80 name Existed_Raid, state configuring 00:13:01.143 20:26:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:13:01.400 [2024-07-15 20:26:53.692126] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:01.400 
[2024-07-15 20:26:53.692157] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:01.400 [2024-07-15 20:26:53.692166] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:01.400 [2024-07-15 20:26:53.692178] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:01.400 20:26:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:01.658 [2024-07-15 20:26:53.946713] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:01.658 BaseBdev1 00:13:01.658 20:26:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:13:01.658 20:26:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:13:01.658 20:26:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:01.658 20:26:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:13:01.658 20:26:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:01.658 20:26:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:01.658 20:26:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:01.916 20:26:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:02.173 [ 00:13:02.173 { 00:13:02.173 "name": "BaseBdev1", 00:13:02.173 "aliases": [ 00:13:02.173 
"82c9e7c1-26e5-4710-8255-b0c1e5c75337" 00:13:02.173 ], 00:13:02.173 "product_name": "Malloc disk", 00:13:02.173 "block_size": 512, 00:13:02.173 "num_blocks": 65536, 00:13:02.173 "uuid": "82c9e7c1-26e5-4710-8255-b0c1e5c75337", 00:13:02.173 "assigned_rate_limits": { 00:13:02.173 "rw_ios_per_sec": 0, 00:13:02.173 "rw_mbytes_per_sec": 0, 00:13:02.173 "r_mbytes_per_sec": 0, 00:13:02.173 "w_mbytes_per_sec": 0 00:13:02.173 }, 00:13:02.173 "claimed": true, 00:13:02.173 "claim_type": "exclusive_write", 00:13:02.173 "zoned": false, 00:13:02.173 "supported_io_types": { 00:13:02.173 "read": true, 00:13:02.173 "write": true, 00:13:02.173 "unmap": true, 00:13:02.173 "flush": true, 00:13:02.173 "reset": true, 00:13:02.173 "nvme_admin": false, 00:13:02.173 "nvme_io": false, 00:13:02.173 "nvme_io_md": false, 00:13:02.173 "write_zeroes": true, 00:13:02.173 "zcopy": true, 00:13:02.173 "get_zone_info": false, 00:13:02.173 "zone_management": false, 00:13:02.173 "zone_append": false, 00:13:02.173 "compare": false, 00:13:02.173 "compare_and_write": false, 00:13:02.173 "abort": true, 00:13:02.173 "seek_hole": false, 00:13:02.173 "seek_data": false, 00:13:02.173 "copy": true, 00:13:02.173 "nvme_iov_md": false 00:13:02.173 }, 00:13:02.173 "memory_domains": [ 00:13:02.174 { 00:13:02.174 "dma_device_id": "system", 00:13:02.174 "dma_device_type": 1 00:13:02.174 }, 00:13:02.174 { 00:13:02.174 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:02.174 "dma_device_type": 2 00:13:02.174 } 00:13:02.174 ], 00:13:02.174 "driver_specific": {} 00:13:02.174 } 00:13:02.174 ] 00:13:02.174 20:26:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:13:02.174 20:26:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:13:02.174 20:26:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:02.174 20:26:54 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:02.174 20:26:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:02.174 20:26:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:02.174 20:26:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:02.174 20:26:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:02.174 20:26:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:02.174 20:26:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:02.174 20:26:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:02.174 20:26:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:02.174 20:26:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:02.431 20:26:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:02.431 "name": "Existed_Raid", 00:13:02.431 "uuid": "fffd7721-b6f1-46ae-8043-03abb9639f3a", 00:13:02.431 "strip_size_kb": 0, 00:13:02.431 "state": "configuring", 00:13:02.431 "raid_level": "raid1", 00:13:02.431 "superblock": true, 00:13:02.431 "num_base_bdevs": 2, 00:13:02.431 "num_base_bdevs_discovered": 1, 00:13:02.431 "num_base_bdevs_operational": 2, 00:13:02.431 "base_bdevs_list": [ 00:13:02.431 { 00:13:02.431 "name": "BaseBdev1", 00:13:02.431 "uuid": "82c9e7c1-26e5-4710-8255-b0c1e5c75337", 00:13:02.431 "is_configured": true, 00:13:02.431 "data_offset": 2048, 00:13:02.431 "data_size": 63488 00:13:02.431 }, 00:13:02.431 { 00:13:02.431 "name": "BaseBdev2", 00:13:02.431 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:13:02.431 "is_configured": false, 00:13:02.431 "data_offset": 0, 00:13:02.431 "data_size": 0 00:13:02.431 } 00:13:02.431 ] 00:13:02.431 }' 00:13:02.431 20:26:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:02.431 20:26:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:02.996 20:26:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:03.253 [2024-07-15 20:26:55.534912] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:03.253 [2024-07-15 20:26:55.534956] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd66350 name Existed_Raid, state configuring 00:13:03.253 20:26:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:13:03.510 [2024-07-15 20:26:55.783615] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:03.510 [2024-07-15 20:26:55.785102] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:03.510 [2024-07-15 20:26:55.785133] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:03.510 20:26:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:13:03.510 20:26:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:03.510 20:26:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:13:03.510 20:26:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 
00:13:03.510 20:26:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:03.510 20:26:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:03.510 20:26:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:03.510 20:26:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:03.510 20:26:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:03.510 20:26:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:03.510 20:26:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:03.510 20:26:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:03.510 20:26:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:03.510 20:26:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:03.768 20:26:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:03.768 "name": "Existed_Raid", 00:13:03.768 "uuid": "e0246565-5734-45d9-8256-3da4ff31907b", 00:13:03.768 "strip_size_kb": 0, 00:13:03.768 "state": "configuring", 00:13:03.768 "raid_level": "raid1", 00:13:03.768 "superblock": true, 00:13:03.768 "num_base_bdevs": 2, 00:13:03.768 "num_base_bdevs_discovered": 1, 00:13:03.768 "num_base_bdevs_operational": 2, 00:13:03.768 "base_bdevs_list": [ 00:13:03.768 { 00:13:03.768 "name": "BaseBdev1", 00:13:03.768 "uuid": "82c9e7c1-26e5-4710-8255-b0c1e5c75337", 00:13:03.768 "is_configured": true, 00:13:03.768 "data_offset": 2048, 00:13:03.768 "data_size": 63488 00:13:03.768 }, 00:13:03.768 
{ 00:13:03.768 "name": "BaseBdev2", 00:13:03.768 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:03.768 "is_configured": false, 00:13:03.768 "data_offset": 0, 00:13:03.768 "data_size": 0 00:13:03.768 } 00:13:03.768 ] 00:13:03.768 }' 00:13:03.768 20:26:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:03.768 20:26:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:04.333 20:26:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:04.591 [2024-07-15 20:26:56.919150] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:04.591 [2024-07-15 20:26:56.919317] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xd67000 00:13:04.591 [2024-07-15 20:26:56.919330] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:13:04.591 [2024-07-15 20:26:56.919500] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc810c0 00:13:04.591 [2024-07-15 20:26:56.919620] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xd67000 00:13:04.591 [2024-07-15 20:26:56.919630] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xd67000 00:13:04.591 [2024-07-15 20:26:56.919722] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:04.591 BaseBdev2 00:13:04.591 20:26:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:13:04.591 20:26:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:13:04.591 20:26:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:04.591 20:26:56 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@899 -- # local i 00:13:04.591 20:26:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:04.591 20:26:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:04.591 20:26:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:04.848 20:26:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:05.415 [ 00:13:05.415 { 00:13:05.415 "name": "BaseBdev2", 00:13:05.415 "aliases": [ 00:13:05.415 "176e3f24-f560-4d3b-8511-51e1b8f7b626" 00:13:05.415 ], 00:13:05.415 "product_name": "Malloc disk", 00:13:05.415 "block_size": 512, 00:13:05.415 "num_blocks": 65536, 00:13:05.415 "uuid": "176e3f24-f560-4d3b-8511-51e1b8f7b626", 00:13:05.415 "assigned_rate_limits": { 00:13:05.415 "rw_ios_per_sec": 0, 00:13:05.415 "rw_mbytes_per_sec": 0, 00:13:05.415 "r_mbytes_per_sec": 0, 00:13:05.415 "w_mbytes_per_sec": 0 00:13:05.415 }, 00:13:05.415 "claimed": true, 00:13:05.415 "claim_type": "exclusive_write", 00:13:05.415 "zoned": false, 00:13:05.415 "supported_io_types": { 00:13:05.415 "read": true, 00:13:05.415 "write": true, 00:13:05.415 "unmap": true, 00:13:05.415 "flush": true, 00:13:05.415 "reset": true, 00:13:05.415 "nvme_admin": false, 00:13:05.415 "nvme_io": false, 00:13:05.415 "nvme_io_md": false, 00:13:05.415 "write_zeroes": true, 00:13:05.415 "zcopy": true, 00:13:05.415 "get_zone_info": false, 00:13:05.415 "zone_management": false, 00:13:05.415 "zone_append": false, 00:13:05.415 "compare": false, 00:13:05.415 "compare_and_write": false, 00:13:05.415 "abort": true, 00:13:05.415 "seek_hole": false, 00:13:05.415 "seek_data": false, 00:13:05.415 "copy": true, 00:13:05.415 
"nvme_iov_md": false 00:13:05.415 }, 00:13:05.415 "memory_domains": [ 00:13:05.415 { 00:13:05.415 "dma_device_id": "system", 00:13:05.415 "dma_device_type": 1 00:13:05.415 }, 00:13:05.415 { 00:13:05.415 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:05.415 "dma_device_type": 2 00:13:05.415 } 00:13:05.415 ], 00:13:05.415 "driver_specific": {} 00:13:05.415 } 00:13:05.415 ] 00:13:05.415 20:26:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:13:05.415 20:26:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:13:05.415 20:26:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:05.415 20:26:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:13:05.415 20:26:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:05.415 20:26:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:05.415 20:26:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:05.415 20:26:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:05.415 20:26:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:05.415 20:26:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:05.415 20:26:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:05.415 20:26:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:05.415 20:26:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:05.415 20:26:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:05.415 20:26:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:05.674 20:26:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:05.674 "name": "Existed_Raid", 00:13:05.674 "uuid": "e0246565-5734-45d9-8256-3da4ff31907b", 00:13:05.674 "strip_size_kb": 0, 00:13:05.674 "state": "online", 00:13:05.674 "raid_level": "raid1", 00:13:05.674 "superblock": true, 00:13:05.674 "num_base_bdevs": 2, 00:13:05.674 "num_base_bdevs_discovered": 2, 00:13:05.674 "num_base_bdevs_operational": 2, 00:13:05.674 "base_bdevs_list": [ 00:13:05.674 { 00:13:05.674 "name": "BaseBdev1", 00:13:05.674 "uuid": "82c9e7c1-26e5-4710-8255-b0c1e5c75337", 00:13:05.674 "is_configured": true, 00:13:05.674 "data_offset": 2048, 00:13:05.674 "data_size": 63488 00:13:05.674 }, 00:13:05.674 { 00:13:05.674 "name": "BaseBdev2", 00:13:05.674 "uuid": "176e3f24-f560-4d3b-8511-51e1b8f7b626", 00:13:05.674 "is_configured": true, 00:13:05.674 "data_offset": 2048, 00:13:05.674 "data_size": 63488 00:13:05.674 } 00:13:05.674 ] 00:13:05.674 }' 00:13:05.674 20:26:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:05.674 20:26:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:06.611 20:26:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:13:06.611 20:26:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:13:06.611 20:26:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:06.611 20:26:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:06.611 20:26:58 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:06.611 20:26:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:13:06.611 20:26:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:06.611 20:26:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:13:06.611 [2024-07-15 20:26:58.988924] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:06.871 20:26:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:06.871 "name": "Existed_Raid", 00:13:06.871 "aliases": [ 00:13:06.871 "e0246565-5734-45d9-8256-3da4ff31907b" 00:13:06.871 ], 00:13:06.871 "product_name": "Raid Volume", 00:13:06.871 "block_size": 512, 00:13:06.871 "num_blocks": 63488, 00:13:06.871 "uuid": "e0246565-5734-45d9-8256-3da4ff31907b", 00:13:06.871 "assigned_rate_limits": { 00:13:06.871 "rw_ios_per_sec": 0, 00:13:06.871 "rw_mbytes_per_sec": 0, 00:13:06.871 "r_mbytes_per_sec": 0, 00:13:06.871 "w_mbytes_per_sec": 0 00:13:06.871 }, 00:13:06.871 "claimed": false, 00:13:06.871 "zoned": false, 00:13:06.871 "supported_io_types": { 00:13:06.871 "read": true, 00:13:06.871 "write": true, 00:13:06.871 "unmap": false, 00:13:06.871 "flush": false, 00:13:06.871 "reset": true, 00:13:06.871 "nvme_admin": false, 00:13:06.871 "nvme_io": false, 00:13:06.871 "nvme_io_md": false, 00:13:06.871 "write_zeroes": true, 00:13:06.871 "zcopy": false, 00:13:06.871 "get_zone_info": false, 00:13:06.871 "zone_management": false, 00:13:06.871 "zone_append": false, 00:13:06.871 "compare": false, 00:13:06.871 "compare_and_write": false, 00:13:06.871 "abort": false, 00:13:06.871 "seek_hole": false, 00:13:06.871 "seek_data": false, 00:13:06.871 "copy": false, 00:13:06.871 "nvme_iov_md": false 00:13:06.871 }, 00:13:06.871 "memory_domains": [ 00:13:06.871 { 
00:13:06.871 "dma_device_id": "system", 00:13:06.871 "dma_device_type": 1 00:13:06.871 }, 00:13:06.871 { 00:13:06.871 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:06.871 "dma_device_type": 2 00:13:06.871 }, 00:13:06.871 { 00:13:06.871 "dma_device_id": "system", 00:13:06.871 "dma_device_type": 1 00:13:06.871 }, 00:13:06.871 { 00:13:06.871 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:06.871 "dma_device_type": 2 00:13:06.871 } 00:13:06.871 ], 00:13:06.871 "driver_specific": { 00:13:06.871 "raid": { 00:13:06.871 "uuid": "e0246565-5734-45d9-8256-3da4ff31907b", 00:13:06.871 "strip_size_kb": 0, 00:13:06.871 "state": "online", 00:13:06.871 "raid_level": "raid1", 00:13:06.871 "superblock": true, 00:13:06.871 "num_base_bdevs": 2, 00:13:06.871 "num_base_bdevs_discovered": 2, 00:13:06.871 "num_base_bdevs_operational": 2, 00:13:06.871 "base_bdevs_list": [ 00:13:06.871 { 00:13:06.871 "name": "BaseBdev1", 00:13:06.871 "uuid": "82c9e7c1-26e5-4710-8255-b0c1e5c75337", 00:13:06.871 "is_configured": true, 00:13:06.871 "data_offset": 2048, 00:13:06.871 "data_size": 63488 00:13:06.871 }, 00:13:06.871 { 00:13:06.871 "name": "BaseBdev2", 00:13:06.871 "uuid": "176e3f24-f560-4d3b-8511-51e1b8f7b626", 00:13:06.871 "is_configured": true, 00:13:06.871 "data_offset": 2048, 00:13:06.871 "data_size": 63488 00:13:06.871 } 00:13:06.871 ] 00:13:06.871 } 00:13:06.871 } 00:13:06.871 }' 00:13:06.871 20:26:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:06.871 20:26:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:13:06.871 BaseBdev2' 00:13:06.871 20:26:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:06.871 20:26:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_get_bdevs -b BaseBdev1 00:13:06.871 20:26:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:07.440 20:26:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:07.440 "name": "BaseBdev1", 00:13:07.440 "aliases": [ 00:13:07.440 "82c9e7c1-26e5-4710-8255-b0c1e5c75337" 00:13:07.440 ], 00:13:07.440 "product_name": "Malloc disk", 00:13:07.440 "block_size": 512, 00:13:07.440 "num_blocks": 65536, 00:13:07.440 "uuid": "82c9e7c1-26e5-4710-8255-b0c1e5c75337", 00:13:07.440 "assigned_rate_limits": { 00:13:07.440 "rw_ios_per_sec": 0, 00:13:07.440 "rw_mbytes_per_sec": 0, 00:13:07.440 "r_mbytes_per_sec": 0, 00:13:07.440 "w_mbytes_per_sec": 0 00:13:07.440 }, 00:13:07.440 "claimed": true, 00:13:07.440 "claim_type": "exclusive_write", 00:13:07.440 "zoned": false, 00:13:07.440 "supported_io_types": { 00:13:07.440 "read": true, 00:13:07.440 "write": true, 00:13:07.440 "unmap": true, 00:13:07.440 "flush": true, 00:13:07.440 "reset": true, 00:13:07.440 "nvme_admin": false, 00:13:07.440 "nvme_io": false, 00:13:07.440 "nvme_io_md": false, 00:13:07.440 "write_zeroes": true, 00:13:07.440 "zcopy": true, 00:13:07.440 "get_zone_info": false, 00:13:07.440 "zone_management": false, 00:13:07.440 "zone_append": false, 00:13:07.440 "compare": false, 00:13:07.440 "compare_and_write": false, 00:13:07.440 "abort": true, 00:13:07.440 "seek_hole": false, 00:13:07.440 "seek_data": false, 00:13:07.440 "copy": true, 00:13:07.440 "nvme_iov_md": false 00:13:07.440 }, 00:13:07.440 "memory_domains": [ 00:13:07.440 { 00:13:07.440 "dma_device_id": "system", 00:13:07.440 "dma_device_type": 1 00:13:07.440 }, 00:13:07.440 { 00:13:07.440 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:07.440 "dma_device_type": 2 00:13:07.440 } 00:13:07.440 ], 00:13:07.440 "driver_specific": {} 00:13:07.440 }' 00:13:07.440 20:26:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:07.440 20:26:59 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:07.440 20:26:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:07.440 20:26:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:07.440 20:26:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:07.698 20:26:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:07.698 20:26:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:07.698 20:26:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:07.698 20:26:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:07.699 20:26:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:07.699 20:27:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:07.699 20:27:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:07.699 20:27:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:07.699 20:27:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:13:07.699 20:27:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:08.266 20:27:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:08.266 "name": "BaseBdev2", 00:13:08.266 "aliases": [ 00:13:08.266 "176e3f24-f560-4d3b-8511-51e1b8f7b626" 00:13:08.266 ], 00:13:08.266 "product_name": "Malloc disk", 00:13:08.266 "block_size": 512, 00:13:08.266 "num_blocks": 65536, 00:13:08.266 "uuid": "176e3f24-f560-4d3b-8511-51e1b8f7b626", 00:13:08.266 
"assigned_rate_limits": { 00:13:08.266 "rw_ios_per_sec": 0, 00:13:08.266 "rw_mbytes_per_sec": 0, 00:13:08.266 "r_mbytes_per_sec": 0, 00:13:08.266 "w_mbytes_per_sec": 0 00:13:08.266 }, 00:13:08.266 "claimed": true, 00:13:08.266 "claim_type": "exclusive_write", 00:13:08.266 "zoned": false, 00:13:08.266 "supported_io_types": { 00:13:08.266 "read": true, 00:13:08.266 "write": true, 00:13:08.266 "unmap": true, 00:13:08.266 "flush": true, 00:13:08.266 "reset": true, 00:13:08.266 "nvme_admin": false, 00:13:08.266 "nvme_io": false, 00:13:08.266 "nvme_io_md": false, 00:13:08.266 "write_zeroes": true, 00:13:08.266 "zcopy": true, 00:13:08.266 "get_zone_info": false, 00:13:08.266 "zone_management": false, 00:13:08.266 "zone_append": false, 00:13:08.266 "compare": false, 00:13:08.266 "compare_and_write": false, 00:13:08.266 "abort": true, 00:13:08.266 "seek_hole": false, 00:13:08.266 "seek_data": false, 00:13:08.266 "copy": true, 00:13:08.266 "nvme_iov_md": false 00:13:08.266 }, 00:13:08.266 "memory_domains": [ 00:13:08.266 { 00:13:08.266 "dma_device_id": "system", 00:13:08.266 "dma_device_type": 1 00:13:08.266 }, 00:13:08.266 { 00:13:08.266 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:08.266 "dma_device_type": 2 00:13:08.266 } 00:13:08.266 ], 00:13:08.266 "driver_specific": {} 00:13:08.266 }' 00:13:08.266 20:27:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:08.266 20:27:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:08.266 20:27:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:08.266 20:27:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:08.266 20:27:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:08.266 20:27:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:08.266 20:27:00 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:08.266 20:27:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:08.525 20:27:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:08.525 20:27:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:08.525 20:27:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:08.525 20:27:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:08.525 20:27:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:13:08.784 [2024-07-15 20:27:01.038160] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:08.785 20:27:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:13:08.785 20:27:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:13:08.785 20:27:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:08.785 20:27:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:13:08.785 20:27:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:13:08.785 20:27:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:13:08.785 20:27:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:08.785 20:27:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:08.785 20:27:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:08.785 20:27:01 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:08.785 20:27:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:13:08.785 20:27:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:08.785 20:27:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:08.785 20:27:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:08.785 20:27:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:08.785 20:27:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:08.785 20:27:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:09.353 20:27:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:09.353 "name": "Existed_Raid", 00:13:09.353 "uuid": "e0246565-5734-45d9-8256-3da4ff31907b", 00:13:09.353 "strip_size_kb": 0, 00:13:09.353 "state": "online", 00:13:09.353 "raid_level": "raid1", 00:13:09.353 "superblock": true, 00:13:09.353 "num_base_bdevs": 2, 00:13:09.353 "num_base_bdevs_discovered": 1, 00:13:09.353 "num_base_bdevs_operational": 1, 00:13:09.353 "base_bdevs_list": [ 00:13:09.353 { 00:13:09.353 "name": null, 00:13:09.353 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:09.353 "is_configured": false, 00:13:09.353 "data_offset": 2048, 00:13:09.353 "data_size": 63488 00:13:09.353 }, 00:13:09.353 { 00:13:09.353 "name": "BaseBdev2", 00:13:09.353 "uuid": "176e3f24-f560-4d3b-8511-51e1b8f7b626", 00:13:09.353 "is_configured": true, 00:13:09.353 "data_offset": 2048, 00:13:09.353 "data_size": 63488 00:13:09.353 } 00:13:09.353 ] 00:13:09.353 }' 00:13:09.353 20:27:01 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:09.353 20:27:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:09.920 20:27:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:13:09.920 20:27:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:09.920 20:27:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:09.920 20:27:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:13:10.487 20:27:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:13:10.487 20:27:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:13:10.487 20:27:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:13:10.746 [2024-07-15 20:27:02.920802] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:10.746 [2024-07-15 20:27:02.920890] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:10.746 [2024-07-15 20:27:02.933375] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:10.746 [2024-07-15 20:27:02.933412] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:10.746 [2024-07-15 20:27:02.933425] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd67000 name Existed_Raid, state offline 00:13:10.746 20:27:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:13:10.746 20:27:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < 
num_base_bdevs )) 00:13:10.746 20:27:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:10.746 20:27:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:13:11.317 20:27:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:13:11.317 20:27:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:13:11.317 20:27:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:13:11.317 20:27:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 1363604 00:13:11.317 20:27:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 1363604 ']' 00:13:11.317 20:27:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 1363604 00:13:11.317 20:27:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:13:11.317 20:27:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:13:11.317 20:27:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1363604 00:13:11.317 20:27:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:13:11.317 20:27:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:13:11.317 20:27:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1363604' 00:13:11.317 killing process with pid 1363604 00:13:11.317 20:27:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 1363604 00:13:11.317 [2024-07-15 20:27:03.519792] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: 
raid_bdev_fini_start 00:13:11.317 20:27:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 1363604 00:13:11.317 [2024-07-15 20:27:03.520650] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:11.575 20:27:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:13:11.575 00:13:11.575 real 0m12.134s 00:13:11.575 user 0m22.247s 00:13:11.575 sys 0m2.193s 00:13:11.575 20:27:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:11.575 20:27:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:11.575 ************************************ 00:13:11.576 END TEST raid_state_function_test_sb 00:13:11.576 ************************************ 00:13:11.576 20:27:03 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:13:11.576 20:27:03 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid1 2 00:13:11.576 20:27:03 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:13:11.576 20:27:03 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:13:11.576 20:27:03 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:11.576 ************************************ 00:13:11.576 START TEST raid_superblock_test 00:13:11.576 ************************************ 00:13:11.576 20:27:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 2 00:13:11.576 20:27:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:13:11.576 20:27:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:13:11.576 20:27:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:13:11.576 20:27:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:13:11.576 20:27:03 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:13:11.576 20:27:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:13:11.576 20:27:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:13:11.576 20:27:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:13:11.576 20:27:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:13:11.576 20:27:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:13:11.576 20:27:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:13:11.576 20:27:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:13:11.576 20:27:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:13:11.576 20:27:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:13:11.576 20:27:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:13:11.576 20:27:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=1365536 00:13:11.576 20:27:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 1365536 /var/tmp/spdk-raid.sock 00:13:11.576 20:27:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:13:11.576 20:27:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 1365536 ']' 00:13:11.576 20:27:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:11.576 20:27:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:13:11.576 20:27:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start 
up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:11.576 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:11.576 20:27:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:13:11.576 20:27:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:11.576 [2024-07-15 20:27:03.868325] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 00:13:11.576 [2024-07-15 20:27:03.868388] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1365536 ] 00:13:11.835 [2024-07-15 20:27:03.997475] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:11.835 [2024-07-15 20:27:04.105500] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:11.835 [2024-07-15 20:27:04.172172] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:11.835 [2024-07-15 20:27:04.172214] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:12.094 20:27:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:13:12.094 20:27:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:13:12.094 20:27:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:13:12.094 20:27:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:13:12.094 20:27:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:13:12.094 20:27:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:13:12.094 20:27:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:13:12.094 
20:27:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:13:12.094 20:27:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:13:12.094 20:27:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:13:12.094 20:27:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:13:12.353 malloc1 00:13:12.353 20:27:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:13:12.921 [2024-07-15 20:27:05.115339] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:13:12.921 [2024-07-15 20:27:05.115389] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:12.921 [2024-07-15 20:27:05.115413] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c33570 00:13:12.921 [2024-07-15 20:27:05.115426] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:12.921 [2024-07-15 20:27:05.117139] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:12.921 [2024-07-15 20:27:05.117167] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:13:12.921 pt1 00:13:12.921 20:27:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:13:12.921 20:27:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:13:12.921 20:27:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:13:12.921 20:27:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:13:12.921 20:27:05 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:13:12.921 20:27:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:13:12.921 20:27:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:13:12.921 20:27:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:13:12.921 20:27:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:13:13.180 malloc2 00:13:13.180 20:27:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:13:13.749 [2024-07-15 20:27:05.926445] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:13:13.749 [2024-07-15 20:27:05.926495] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:13.749 [2024-07-15 20:27:05.926514] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c34970 00:13:13.749 [2024-07-15 20:27:05.926526] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:13.749 [2024-07-15 20:27:05.928201] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:13.749 [2024-07-15 20:27:05.928229] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:13:13.749 pt2 00:13:13.749 20:27:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:13:13.749 20:27:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:13:13.749 20:27:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:13:14.009 [2024-07-15 20:27:06.231255] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:13:14.009 [2024-07-15 20:27:06.232591] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:13:14.009 [2024-07-15 20:27:06.232742] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1dd7270 00:13:14.009 [2024-07-15 20:27:06.232756] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:13:14.009 [2024-07-15 20:27:06.232967] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c2b0e0 00:13:14.009 [2024-07-15 20:27:06.233112] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1dd7270 00:13:14.009 [2024-07-15 20:27:06.233122] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1dd7270 00:13:14.009 [2024-07-15 20:27:06.233219] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:14.009 20:27:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:13:14.009 20:27:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:14.009 20:27:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:14.009 20:27:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:14.009 20:27:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:14.009 20:27:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:14.009 20:27:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:14.009 20:27:06 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:14.009 20:27:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:14.009 20:27:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:14.009 20:27:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:14.009 20:27:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:14.578 20:27:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:14.578 "name": "raid_bdev1", 00:13:14.578 "uuid": "d98bddab-4d8c-4410-8511-f14a3a07202f", 00:13:14.578 "strip_size_kb": 0, 00:13:14.578 "state": "online", 00:13:14.578 "raid_level": "raid1", 00:13:14.578 "superblock": true, 00:13:14.578 "num_base_bdevs": 2, 00:13:14.578 "num_base_bdevs_discovered": 2, 00:13:14.578 "num_base_bdevs_operational": 2, 00:13:14.578 "base_bdevs_list": [ 00:13:14.578 { 00:13:14.578 "name": "pt1", 00:13:14.578 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:14.578 "is_configured": true, 00:13:14.578 "data_offset": 2048, 00:13:14.578 "data_size": 63488 00:13:14.578 }, 00:13:14.578 { 00:13:14.578 "name": "pt2", 00:13:14.578 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:14.578 "is_configured": true, 00:13:14.578 "data_offset": 2048, 00:13:14.578 "data_size": 63488 00:13:14.578 } 00:13:14.578 ] 00:13:14.578 }' 00:13:14.578 20:27:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:14.578 20:27:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:15.567 20:27:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:13:15.568 20:27:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:13:15.568 20:27:07 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:15.568 20:27:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:15.568 20:27:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:15.568 20:27:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:13:15.568 20:27:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:15.568 20:27:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:15.568 [2024-07-15 20:27:07.787603] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:15.568 20:27:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:15.568 "name": "raid_bdev1", 00:13:15.568 "aliases": [ 00:13:15.568 "d98bddab-4d8c-4410-8511-f14a3a07202f" 00:13:15.568 ], 00:13:15.568 "product_name": "Raid Volume", 00:13:15.568 "block_size": 512, 00:13:15.568 "num_blocks": 63488, 00:13:15.568 "uuid": "d98bddab-4d8c-4410-8511-f14a3a07202f", 00:13:15.568 "assigned_rate_limits": { 00:13:15.568 "rw_ios_per_sec": 0, 00:13:15.568 "rw_mbytes_per_sec": 0, 00:13:15.568 "r_mbytes_per_sec": 0, 00:13:15.568 "w_mbytes_per_sec": 0 00:13:15.568 }, 00:13:15.568 "claimed": false, 00:13:15.568 "zoned": false, 00:13:15.568 "supported_io_types": { 00:13:15.568 "read": true, 00:13:15.568 "write": true, 00:13:15.568 "unmap": false, 00:13:15.568 "flush": false, 00:13:15.568 "reset": true, 00:13:15.568 "nvme_admin": false, 00:13:15.568 "nvme_io": false, 00:13:15.568 "nvme_io_md": false, 00:13:15.568 "write_zeroes": true, 00:13:15.568 "zcopy": false, 00:13:15.568 "get_zone_info": false, 00:13:15.568 "zone_management": false, 00:13:15.568 "zone_append": false, 00:13:15.568 "compare": false, 00:13:15.568 "compare_and_write": false, 00:13:15.568 
"abort": false, 00:13:15.568 "seek_hole": false, 00:13:15.568 "seek_data": false, 00:13:15.568 "copy": false, 00:13:15.568 "nvme_iov_md": false 00:13:15.568 }, 00:13:15.568 "memory_domains": [ 00:13:15.568 { 00:13:15.568 "dma_device_id": "system", 00:13:15.568 "dma_device_type": 1 00:13:15.568 }, 00:13:15.568 { 00:13:15.568 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:15.568 "dma_device_type": 2 00:13:15.568 }, 00:13:15.568 { 00:13:15.568 "dma_device_id": "system", 00:13:15.568 "dma_device_type": 1 00:13:15.568 }, 00:13:15.568 { 00:13:15.568 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:15.568 "dma_device_type": 2 00:13:15.568 } 00:13:15.568 ], 00:13:15.568 "driver_specific": { 00:13:15.568 "raid": { 00:13:15.568 "uuid": "d98bddab-4d8c-4410-8511-f14a3a07202f", 00:13:15.568 "strip_size_kb": 0, 00:13:15.568 "state": "online", 00:13:15.568 "raid_level": "raid1", 00:13:15.568 "superblock": true, 00:13:15.568 "num_base_bdevs": 2, 00:13:15.568 "num_base_bdevs_discovered": 2, 00:13:15.568 "num_base_bdevs_operational": 2, 00:13:15.568 "base_bdevs_list": [ 00:13:15.568 { 00:13:15.568 "name": "pt1", 00:13:15.568 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:15.568 "is_configured": true, 00:13:15.568 "data_offset": 2048, 00:13:15.568 "data_size": 63488 00:13:15.568 }, 00:13:15.568 { 00:13:15.568 "name": "pt2", 00:13:15.568 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:15.568 "is_configured": true, 00:13:15.568 "data_offset": 2048, 00:13:15.568 "data_size": 63488 00:13:15.568 } 00:13:15.568 ] 00:13:15.568 } 00:13:15.568 } 00:13:15.568 }' 00:13:15.568 20:27:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:15.568 20:27:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:13:15.568 pt2' 00:13:15.568 20:27:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:15.568 20:27:07 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:13:15.568 20:27:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:16.135 20:27:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:16.135 "name": "pt1", 00:13:16.135 "aliases": [ 00:13:16.135 "00000000-0000-0000-0000-000000000001" 00:13:16.135 ], 00:13:16.135 "product_name": "passthru", 00:13:16.135 "block_size": 512, 00:13:16.135 "num_blocks": 65536, 00:13:16.135 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:16.135 "assigned_rate_limits": { 00:13:16.135 "rw_ios_per_sec": 0, 00:13:16.135 "rw_mbytes_per_sec": 0, 00:13:16.135 "r_mbytes_per_sec": 0, 00:13:16.135 "w_mbytes_per_sec": 0 00:13:16.135 }, 00:13:16.135 "claimed": true, 00:13:16.135 "claim_type": "exclusive_write", 00:13:16.135 "zoned": false, 00:13:16.135 "supported_io_types": { 00:13:16.135 "read": true, 00:13:16.135 "write": true, 00:13:16.135 "unmap": true, 00:13:16.135 "flush": true, 00:13:16.135 "reset": true, 00:13:16.135 "nvme_admin": false, 00:13:16.135 "nvme_io": false, 00:13:16.135 "nvme_io_md": false, 00:13:16.135 "write_zeroes": true, 00:13:16.135 "zcopy": true, 00:13:16.135 "get_zone_info": false, 00:13:16.135 "zone_management": false, 00:13:16.135 "zone_append": false, 00:13:16.135 "compare": false, 00:13:16.135 "compare_and_write": false, 00:13:16.135 "abort": true, 00:13:16.135 "seek_hole": false, 00:13:16.135 "seek_data": false, 00:13:16.135 "copy": true, 00:13:16.135 "nvme_iov_md": false 00:13:16.135 }, 00:13:16.135 "memory_domains": [ 00:13:16.135 { 00:13:16.135 "dma_device_id": "system", 00:13:16.135 "dma_device_type": 1 00:13:16.135 }, 00:13:16.135 { 00:13:16.135 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:16.135 "dma_device_type": 2 00:13:16.135 } 00:13:16.135 ], 00:13:16.135 "driver_specific": { 00:13:16.135 "passthru": { 00:13:16.135 
"name": "pt1", 00:13:16.135 "base_bdev_name": "malloc1" 00:13:16.135 } 00:13:16.135 } 00:13:16.135 }' 00:13:16.135 20:27:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:16.135 20:27:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:16.393 20:27:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:16.393 20:27:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:16.393 20:27:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:16.393 20:27:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:16.393 20:27:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:16.393 20:27:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:16.650 20:27:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:16.650 20:27:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:16.650 20:27:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:16.650 20:27:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:16.650 20:27:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:16.650 20:27:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:13:16.650 20:27:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:16.908 20:27:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:16.908 "name": "pt2", 00:13:16.908 "aliases": [ 00:13:16.908 "00000000-0000-0000-0000-000000000002" 00:13:16.908 ], 00:13:16.908 "product_name": "passthru", 00:13:16.908 "block_size": 512, 00:13:16.908 
"num_blocks": 65536, 00:13:16.908 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:16.908 "assigned_rate_limits": { 00:13:16.908 "rw_ios_per_sec": 0, 00:13:16.908 "rw_mbytes_per_sec": 0, 00:13:16.908 "r_mbytes_per_sec": 0, 00:13:16.908 "w_mbytes_per_sec": 0 00:13:16.908 }, 00:13:16.908 "claimed": true, 00:13:16.908 "claim_type": "exclusive_write", 00:13:16.908 "zoned": false, 00:13:16.908 "supported_io_types": { 00:13:16.908 "read": true, 00:13:16.908 "write": true, 00:13:16.908 "unmap": true, 00:13:16.908 "flush": true, 00:13:16.908 "reset": true, 00:13:16.908 "nvme_admin": false, 00:13:16.908 "nvme_io": false, 00:13:16.908 "nvme_io_md": false, 00:13:16.908 "write_zeroes": true, 00:13:16.908 "zcopy": true, 00:13:16.908 "get_zone_info": false, 00:13:16.908 "zone_management": false, 00:13:16.908 "zone_append": false, 00:13:16.908 "compare": false, 00:13:16.908 "compare_and_write": false, 00:13:16.908 "abort": true, 00:13:16.908 "seek_hole": false, 00:13:16.908 "seek_data": false, 00:13:16.908 "copy": true, 00:13:16.908 "nvme_iov_md": false 00:13:16.908 }, 00:13:16.908 "memory_domains": [ 00:13:16.908 { 00:13:16.908 "dma_device_id": "system", 00:13:16.908 "dma_device_type": 1 00:13:16.908 }, 00:13:16.908 { 00:13:16.908 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:16.908 "dma_device_type": 2 00:13:16.908 } 00:13:16.908 ], 00:13:16.908 "driver_specific": { 00:13:16.908 "passthru": { 00:13:16.908 "name": "pt2", 00:13:16.908 "base_bdev_name": "malloc2" 00:13:16.908 } 00:13:16.908 } 00:13:16.908 }' 00:13:16.908 20:27:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:16.908 20:27:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:16.908 20:27:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:16.908 20:27:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:17.270 20:27:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # 
jq .md_size 00:13:17.270 20:27:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:17.270 20:27:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:17.270 20:27:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:17.270 20:27:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:17.270 20:27:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:17.270 20:27:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:17.270 20:27:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:17.270 20:27:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:17.270 20:27:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:13:17.836 [2024-07-15 20:27:10.121794] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:17.836 20:27:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=d98bddab-4d8c-4410-8511-f14a3a07202f 00:13:17.836 20:27:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z d98bddab-4d8c-4410-8511-f14a3a07202f ']' 00:13:17.836 20:27:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:13:18.403 [2024-07-15 20:27:10.638900] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:18.403 [2024-07-15 20:27:10.638923] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:18.403 [2024-07-15 20:27:10.638981] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:18.403 [2024-07-15 
20:27:10.639038] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:18.403 [2024-07-15 20:27:10.639051] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1dd7270 name raid_bdev1, state offline 00:13:18.403 20:27:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:18.404 20:27:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:13:18.970 20:27:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:13:18.970 20:27:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:13:18.970 20:27:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:13:18.970 20:27:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:13:19.229 20:27:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:13:19.229 20:27:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:13:19.796 20:27:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:13:19.796 20:27:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:13:19.796 20:27:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:13:19.796 20:27:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:13:19.796 20:27:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:13:19.796 20:27:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:13:19.796 20:27:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:13:19.796 20:27:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:13:19.796 20:27:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:13:19.796 20:27:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:13:19.796 20:27:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:13:19.796 20:27:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:13:19.797 20:27:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:13:19.797 20:27:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:13:19.797 20:27:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:13:20.055 [2024-07-15 20:27:12.379418] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:13:20.055 [2024-07-15 20:27:12.380752] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:13:20.055 [2024-07-15 20:27:12.380804] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:13:20.055 [2024-07-15 20:27:12.380846] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:13:20.055 [2024-07-15 20:27:12.380865] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:20.055 [2024-07-15 20:27:12.380874] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1dd6ff0 name raid_bdev1, state configuring 00:13:20.055 request: 00:13:20.055 { 00:13:20.055 "name": "raid_bdev1", 00:13:20.055 "raid_level": "raid1", 00:13:20.055 "base_bdevs": [ 00:13:20.055 "malloc1", 00:13:20.055 "malloc2" 00:13:20.055 ], 00:13:20.055 "superblock": false, 00:13:20.055 "method": "bdev_raid_create", 00:13:20.055 "req_id": 1 00:13:20.055 } 00:13:20.055 Got JSON-RPC error response 00:13:20.055 response: 00:13:20.055 { 00:13:20.055 "code": -17, 00:13:20.055 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:13:20.055 } 00:13:20.055 20:27:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:13:20.055 20:27:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:13:20.055 20:27:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:13:20.055 20:27:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:13:20.055 20:27:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:20.055 20:27:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:13:20.622 20:27:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 
00:13:20.622 20:27:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:13:20.622 20:27:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:13:20.880 [2024-07-15 20:27:13.157406] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:13:20.880 [2024-07-15 20:27:13.157448] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:20.880 [2024-07-15 20:27:13.157468] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c337a0 00:13:20.880 [2024-07-15 20:27:13.157480] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:20.880 [2024-07-15 20:27:13.159058] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:20.880 [2024-07-15 20:27:13.159084] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:13:20.880 [2024-07-15 20:27:13.159146] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:13:20.880 [2024-07-15 20:27:13.159170] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:13:20.880 pt1 00:13:20.880 20:27:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:13:20.880 20:27:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:20.880 20:27:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:20.880 20:27:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:20.880 20:27:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:20.880 20:27:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=2 00:13:20.880 20:27:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:20.880 20:27:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:20.880 20:27:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:20.880 20:27:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:20.880 20:27:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:20.880 20:27:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:21.174 20:27:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:21.174 "name": "raid_bdev1", 00:13:21.174 "uuid": "d98bddab-4d8c-4410-8511-f14a3a07202f", 00:13:21.174 "strip_size_kb": 0, 00:13:21.174 "state": "configuring", 00:13:21.174 "raid_level": "raid1", 00:13:21.174 "superblock": true, 00:13:21.174 "num_base_bdevs": 2, 00:13:21.174 "num_base_bdevs_discovered": 1, 00:13:21.174 "num_base_bdevs_operational": 2, 00:13:21.174 "base_bdevs_list": [ 00:13:21.174 { 00:13:21.174 "name": "pt1", 00:13:21.174 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:21.174 "is_configured": true, 00:13:21.174 "data_offset": 2048, 00:13:21.174 "data_size": 63488 00:13:21.174 }, 00:13:21.174 { 00:13:21.174 "name": null, 00:13:21.174 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:21.174 "is_configured": false, 00:13:21.174 "data_offset": 2048, 00:13:21.174 "data_size": 63488 00:13:21.174 } 00:13:21.174 ] 00:13:21.174 }' 00:13:21.174 20:27:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:21.174 20:27:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:22.106 20:27:14 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:13:22.106 20:27:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:13:22.106 20:27:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:13:22.106 20:27:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:13:22.365 [2024-07-15 20:27:14.685454] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:13:22.365 [2024-07-15 20:27:14.685503] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:22.365 [2024-07-15 20:27:14.685522] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1dcb6f0 00:13:22.365 [2024-07-15 20:27:14.685535] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:22.365 [2024-07-15 20:27:14.685893] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:22.365 [2024-07-15 20:27:14.685910] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:13:22.365 [2024-07-15 20:27:14.685981] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:13:22.365 [2024-07-15 20:27:14.686001] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:13:22.365 [2024-07-15 20:27:14.686103] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1dcc590 00:13:22.365 [2024-07-15 20:27:14.686114] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:13:22.365 [2024-07-15 20:27:14.686284] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c2d540 00:13:22.365 [2024-07-15 20:27:14.686409] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1dcc590 00:13:22.365 [2024-07-15 20:27:14.686419] 
bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1dcc590 00:13:22.365 [2024-07-15 20:27:14.686513] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:22.365 pt2 00:13:22.365 20:27:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:13:22.365 20:27:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:13:22.365 20:27:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:13:22.365 20:27:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:22.365 20:27:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:22.365 20:27:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:22.365 20:27:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:22.365 20:27:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:22.365 20:27:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:22.365 20:27:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:22.365 20:27:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:22.365 20:27:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:22.365 20:27:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:22.365 20:27:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:22.930 20:27:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:22.930 "name": 
"raid_bdev1", 00:13:22.930 "uuid": "d98bddab-4d8c-4410-8511-f14a3a07202f", 00:13:22.930 "strip_size_kb": 0, 00:13:22.930 "state": "online", 00:13:22.930 "raid_level": "raid1", 00:13:22.930 "superblock": true, 00:13:22.930 "num_base_bdevs": 2, 00:13:22.930 "num_base_bdevs_discovered": 2, 00:13:22.930 "num_base_bdevs_operational": 2, 00:13:22.930 "base_bdevs_list": [ 00:13:22.930 { 00:13:22.930 "name": "pt1", 00:13:22.930 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:22.930 "is_configured": true, 00:13:22.930 "data_offset": 2048, 00:13:22.930 "data_size": 63488 00:13:22.930 }, 00:13:22.930 { 00:13:22.930 "name": "pt2", 00:13:22.930 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:22.930 "is_configured": true, 00:13:22.930 "data_offset": 2048, 00:13:22.930 "data_size": 63488 00:13:22.930 } 00:13:22.930 ] 00:13:22.930 }' 00:13:22.930 20:27:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:22.930 20:27:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:23.497 20:27:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:13:23.497 20:27:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:13:23.497 20:27:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:23.497 20:27:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:23.497 20:27:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:23.497 20:27:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:13:23.497 20:27:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:23.497 20:27:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:23.755 [2024-07-15 
20:27:15.961248] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:23.755 20:27:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:23.755 "name": "raid_bdev1", 00:13:23.755 "aliases": [ 00:13:23.755 "d98bddab-4d8c-4410-8511-f14a3a07202f" 00:13:23.755 ], 00:13:23.755 "product_name": "Raid Volume", 00:13:23.755 "block_size": 512, 00:13:23.755 "num_blocks": 63488, 00:13:23.755 "uuid": "d98bddab-4d8c-4410-8511-f14a3a07202f", 00:13:23.755 "assigned_rate_limits": { 00:13:23.755 "rw_ios_per_sec": 0, 00:13:23.755 "rw_mbytes_per_sec": 0, 00:13:23.755 "r_mbytes_per_sec": 0, 00:13:23.755 "w_mbytes_per_sec": 0 00:13:23.755 }, 00:13:23.755 "claimed": false, 00:13:23.755 "zoned": false, 00:13:23.755 "supported_io_types": { 00:13:23.755 "read": true, 00:13:23.755 "write": true, 00:13:23.755 "unmap": false, 00:13:23.755 "flush": false, 00:13:23.755 "reset": true, 00:13:23.755 "nvme_admin": false, 00:13:23.755 "nvme_io": false, 00:13:23.755 "nvme_io_md": false, 00:13:23.755 "write_zeroes": true, 00:13:23.755 "zcopy": false, 00:13:23.755 "get_zone_info": false, 00:13:23.755 "zone_management": false, 00:13:23.755 "zone_append": false, 00:13:23.755 "compare": false, 00:13:23.755 "compare_and_write": false, 00:13:23.755 "abort": false, 00:13:23.755 "seek_hole": false, 00:13:23.755 "seek_data": false, 00:13:23.755 "copy": false, 00:13:23.755 "nvme_iov_md": false 00:13:23.755 }, 00:13:23.755 "memory_domains": [ 00:13:23.755 { 00:13:23.755 "dma_device_id": "system", 00:13:23.755 "dma_device_type": 1 00:13:23.755 }, 00:13:23.755 { 00:13:23.755 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:23.755 "dma_device_type": 2 00:13:23.755 }, 00:13:23.755 { 00:13:23.755 "dma_device_id": "system", 00:13:23.755 "dma_device_type": 1 00:13:23.755 }, 00:13:23.755 { 00:13:23.755 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:23.755 "dma_device_type": 2 00:13:23.755 } 00:13:23.755 ], 00:13:23.755 "driver_specific": { 00:13:23.755 
"raid": { 00:13:23.755 "uuid": "d98bddab-4d8c-4410-8511-f14a3a07202f", 00:13:23.755 "strip_size_kb": 0, 00:13:23.755 "state": "online", 00:13:23.755 "raid_level": "raid1", 00:13:23.755 "superblock": true, 00:13:23.755 "num_base_bdevs": 2, 00:13:23.755 "num_base_bdevs_discovered": 2, 00:13:23.755 "num_base_bdevs_operational": 2, 00:13:23.755 "base_bdevs_list": [ 00:13:23.755 { 00:13:23.755 "name": "pt1", 00:13:23.755 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:23.755 "is_configured": true, 00:13:23.755 "data_offset": 2048, 00:13:23.755 "data_size": 63488 00:13:23.756 }, 00:13:23.756 { 00:13:23.756 "name": "pt2", 00:13:23.756 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:23.756 "is_configured": true, 00:13:23.756 "data_offset": 2048, 00:13:23.756 "data_size": 63488 00:13:23.756 } 00:13:23.756 ] 00:13:23.756 } 00:13:23.756 } 00:13:23.756 }' 00:13:23.756 20:27:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:23.756 20:27:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:13:23.756 pt2' 00:13:23.756 20:27:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:23.756 20:27:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:13:23.756 20:27:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:24.014 20:27:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:24.014 "name": "pt1", 00:13:24.014 "aliases": [ 00:13:24.014 "00000000-0000-0000-0000-000000000001" 00:13:24.014 ], 00:13:24.014 "product_name": "passthru", 00:13:24.014 "block_size": 512, 00:13:24.014 "num_blocks": 65536, 00:13:24.014 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:24.014 "assigned_rate_limits": { 
00:13:24.014 "rw_ios_per_sec": 0, 00:13:24.014 "rw_mbytes_per_sec": 0, 00:13:24.014 "r_mbytes_per_sec": 0, 00:13:24.014 "w_mbytes_per_sec": 0 00:13:24.014 }, 00:13:24.014 "claimed": true, 00:13:24.014 "claim_type": "exclusive_write", 00:13:24.014 "zoned": false, 00:13:24.014 "supported_io_types": { 00:13:24.014 "read": true, 00:13:24.014 "write": true, 00:13:24.014 "unmap": true, 00:13:24.014 "flush": true, 00:13:24.014 "reset": true, 00:13:24.014 "nvme_admin": false, 00:13:24.014 "nvme_io": false, 00:13:24.014 "nvme_io_md": false, 00:13:24.014 "write_zeroes": true, 00:13:24.014 "zcopy": true, 00:13:24.014 "get_zone_info": false, 00:13:24.014 "zone_management": false, 00:13:24.014 "zone_append": false, 00:13:24.014 "compare": false, 00:13:24.014 "compare_and_write": false, 00:13:24.014 "abort": true, 00:13:24.014 "seek_hole": false, 00:13:24.014 "seek_data": false, 00:13:24.014 "copy": true, 00:13:24.014 "nvme_iov_md": false 00:13:24.014 }, 00:13:24.014 "memory_domains": [ 00:13:24.014 { 00:13:24.014 "dma_device_id": "system", 00:13:24.014 "dma_device_type": 1 00:13:24.014 }, 00:13:24.014 { 00:13:24.014 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:24.014 "dma_device_type": 2 00:13:24.014 } 00:13:24.014 ], 00:13:24.014 "driver_specific": { 00:13:24.014 "passthru": { 00:13:24.014 "name": "pt1", 00:13:24.014 "base_bdev_name": "malloc1" 00:13:24.014 } 00:13:24.014 } 00:13:24.014 }' 00:13:24.014 20:27:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:24.014 20:27:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:24.014 20:27:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:24.014 20:27:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:24.014 20:27:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:24.273 20:27:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 
00:13:24.273 20:27:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:24.273 20:27:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:24.273 20:27:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:24.273 20:27:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:24.273 20:27:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:24.273 20:27:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:24.273 20:27:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:24.273 20:27:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:13:24.273 20:27:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:24.531 20:27:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:24.531 "name": "pt2", 00:13:24.531 "aliases": [ 00:13:24.531 "00000000-0000-0000-0000-000000000002" 00:13:24.531 ], 00:13:24.531 "product_name": "passthru", 00:13:24.531 "block_size": 512, 00:13:24.531 "num_blocks": 65536, 00:13:24.531 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:24.531 "assigned_rate_limits": { 00:13:24.531 "rw_ios_per_sec": 0, 00:13:24.531 "rw_mbytes_per_sec": 0, 00:13:24.531 "r_mbytes_per_sec": 0, 00:13:24.531 "w_mbytes_per_sec": 0 00:13:24.531 }, 00:13:24.531 "claimed": true, 00:13:24.531 "claim_type": "exclusive_write", 00:13:24.531 "zoned": false, 00:13:24.531 "supported_io_types": { 00:13:24.531 "read": true, 00:13:24.531 "write": true, 00:13:24.531 "unmap": true, 00:13:24.531 "flush": true, 00:13:24.531 "reset": true, 00:13:24.531 "nvme_admin": false, 00:13:24.531 "nvme_io": false, 00:13:24.531 "nvme_io_md": false, 00:13:24.531 "write_zeroes": true, 
00:13:24.531 "zcopy": true, 00:13:24.531 "get_zone_info": false, 00:13:24.531 "zone_management": false, 00:13:24.531 "zone_append": false, 00:13:24.531 "compare": false, 00:13:24.531 "compare_and_write": false, 00:13:24.531 "abort": true, 00:13:24.531 "seek_hole": false, 00:13:24.531 "seek_data": false, 00:13:24.531 "copy": true, 00:13:24.531 "nvme_iov_md": false 00:13:24.531 }, 00:13:24.531 "memory_domains": [ 00:13:24.531 { 00:13:24.531 "dma_device_id": "system", 00:13:24.531 "dma_device_type": 1 00:13:24.531 }, 00:13:24.531 { 00:13:24.531 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:24.531 "dma_device_type": 2 00:13:24.531 } 00:13:24.531 ], 00:13:24.531 "driver_specific": { 00:13:24.531 "passthru": { 00:13:24.531 "name": "pt2", 00:13:24.531 "base_bdev_name": "malloc2" 00:13:24.531 } 00:13:24.531 } 00:13:24.531 }' 00:13:24.531 20:27:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:24.531 20:27:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:24.790 20:27:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:24.790 20:27:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:24.790 20:27:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:24.790 20:27:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:24.790 20:27:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:24.790 20:27:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:24.790 20:27:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:24.790 20:27:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:24.790 20:27:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:25.050 20:27:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # 
[[ null == null ]] 00:13:25.050 20:27:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:13:25.050 20:27:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:25.050 [2024-07-15 20:27:17.413136] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:25.309 20:27:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' d98bddab-4d8c-4410-8511-f14a3a07202f '!=' d98bddab-4d8c-4410-8511-f14a3a07202f ']' 00:13:25.309 20:27:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:13:25.309 20:27:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:25.309 20:27:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:13:25.309 20:27:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:13:25.309 [2024-07-15 20:27:17.657549] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:13:25.309 20:27:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:13:25.309 20:27:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:25.309 20:27:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:25.309 20:27:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:25.309 20:27:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:25.309 20:27:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:13:25.309 20:27:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 
00:13:25.309 20:27:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:25.309 20:27:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:25.309 20:27:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:25.309 20:27:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:25.309 20:27:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:25.568 20:27:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:25.568 "name": "raid_bdev1", 00:13:25.568 "uuid": "d98bddab-4d8c-4410-8511-f14a3a07202f", 00:13:25.568 "strip_size_kb": 0, 00:13:25.568 "state": "online", 00:13:25.568 "raid_level": "raid1", 00:13:25.568 "superblock": true, 00:13:25.568 "num_base_bdevs": 2, 00:13:25.568 "num_base_bdevs_discovered": 1, 00:13:25.568 "num_base_bdevs_operational": 1, 00:13:25.568 "base_bdevs_list": [ 00:13:25.568 { 00:13:25.568 "name": null, 00:13:25.568 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:25.568 "is_configured": false, 00:13:25.568 "data_offset": 2048, 00:13:25.568 "data_size": 63488 00:13:25.568 }, 00:13:25.568 { 00:13:25.568 "name": "pt2", 00:13:25.568 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:25.568 "is_configured": true, 00:13:25.568 "data_offset": 2048, 00:13:25.568 "data_size": 63488 00:13:25.568 } 00:13:25.568 ] 00:13:25.568 }' 00:13:25.568 20:27:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:25.568 20:27:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:26.505 20:27:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 
00:13:26.505 [2024-07-15 20:27:18.752418] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:26.505 [2024-07-15 20:27:18.752445] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:26.505 [2024-07-15 20:27:18.752494] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:26.505 [2024-07-15 20:27:18.752536] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:26.505 [2024-07-15 20:27:18.752548] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1dcc590 name raid_bdev1, state offline 00:13:26.505 20:27:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:26.505 20:27:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:13:26.763 20:27:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:13:26.763 20:27:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:13:26.763 20:27:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:13:26.763 20:27:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:13:26.763 20:27:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:13:27.023 20:27:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:13:27.023 20:27:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:13:27.023 20:27:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:13:27.023 20:27:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:13:27.023 20:27:19 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@518 -- # i=1 00:13:27.023 20:27:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:13:27.282 [2024-07-15 20:27:19.518398] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:13:27.282 [2024-07-15 20:27:19.518439] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:27.282 [2024-07-15 20:27:19.518456] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c34160 00:13:27.282 [2024-07-15 20:27:19.518468] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:27.282 [2024-07-15 20:27:19.520062] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:27.282 [2024-07-15 20:27:19.520090] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:13:27.282 [2024-07-15 20:27:19.520151] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:13:27.282 [2024-07-15 20:27:19.520175] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:13:27.282 [2024-07-15 20:27:19.520262] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1c2a380 00:13:27.282 [2024-07-15 20:27:19.520273] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:13:27.282 [2024-07-15 20:27:19.520441] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c2ba80 00:13:27.282 [2024-07-15 20:27:19.520560] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1c2a380 00:13:27.282 [2024-07-15 20:27:19.520570] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1c2a380 00:13:27.282 [2024-07-15 20:27:19.520664] bdev_raid.c: 
331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:27.282 pt2 00:13:27.282 20:27:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:13:27.282 20:27:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:27.282 20:27:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:27.282 20:27:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:27.282 20:27:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:27.282 20:27:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:13:27.282 20:27:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:27.282 20:27:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:27.282 20:27:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:27.282 20:27:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:27.282 20:27:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:27.282 20:27:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:27.541 20:27:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:27.541 "name": "raid_bdev1", 00:13:27.541 "uuid": "d98bddab-4d8c-4410-8511-f14a3a07202f", 00:13:27.541 "strip_size_kb": 0, 00:13:27.541 "state": "online", 00:13:27.541 "raid_level": "raid1", 00:13:27.541 "superblock": true, 00:13:27.541 "num_base_bdevs": 2, 00:13:27.541 "num_base_bdevs_discovered": 1, 00:13:27.541 "num_base_bdevs_operational": 1, 00:13:27.541 "base_bdevs_list": [ 
00:13:27.541 { 00:13:27.541 "name": null, 00:13:27.541 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:27.541 "is_configured": false, 00:13:27.541 "data_offset": 2048, 00:13:27.541 "data_size": 63488 00:13:27.541 }, 00:13:27.541 { 00:13:27.541 "name": "pt2", 00:13:27.541 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:27.541 "is_configured": true, 00:13:27.541 "data_offset": 2048, 00:13:27.541 "data_size": 63488 00:13:27.541 } 00:13:27.541 ] 00:13:27.541 }' 00:13:27.541 20:27:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:27.541 20:27:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:28.107 20:27:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:13:28.366 [2024-07-15 20:27:20.605281] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:28.366 [2024-07-15 20:27:20.605305] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:28.366 [2024-07-15 20:27:20.605356] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:28.366 [2024-07-15 20:27:20.605399] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:28.366 [2024-07-15 20:27:20.605410] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c2a380 name raid_bdev1, state offline 00:13:28.366 20:27:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:28.366 20:27:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:13:28.625 20:27:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:13:28.625 20:27:20 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:13:28.625 20:27:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:13:28.625 20:27:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:13:28.883 [2024-07-15 20:27:21.110615] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:13:28.883 [2024-07-15 20:27:21.110657] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:28.883 [2024-07-15 20:27:21.110674] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1dd6520 00:13:28.883 [2024-07-15 20:27:21.110686] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:28.883 [2024-07-15 20:27:21.112277] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:28.883 [2024-07-15 20:27:21.112304] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:13:28.883 [2024-07-15 20:27:21.112366] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:13:28.883 [2024-07-15 20:27:21.112390] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:13:28.883 [2024-07-15 20:27:21.112485] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:13:28.883 [2024-07-15 20:27:21.112498] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:28.883 [2024-07-15 20:27:21.112510] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c2b3f0 name raid_bdev1, state configuring 00:13:28.883 [2024-07-15 20:27:21.112532] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:13:28.883 [2024-07-15 20:27:21.112591] 
bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1c2d2b0 00:13:28.883 [2024-07-15 20:27:21.112602] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:13:28.883 [2024-07-15 20:27:21.112759] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c2a350 00:13:28.883 [2024-07-15 20:27:21.112879] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1c2d2b0 00:13:28.883 [2024-07-15 20:27:21.112889] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1c2d2b0 00:13:28.883 [2024-07-15 20:27:21.112997] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:28.883 pt1 00:13:28.883 20:27:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:13:28.883 20:27:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:13:28.883 20:27:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:28.883 20:27:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:28.883 20:27:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:28.883 20:27:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:28.883 20:27:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:13:28.883 20:27:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:28.883 20:27:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:28.883 20:27:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:28.883 20:27:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:28.883 20:27:21 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:28.883 20:27:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:29.142 20:27:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:29.142 "name": "raid_bdev1", 00:13:29.142 "uuid": "d98bddab-4d8c-4410-8511-f14a3a07202f", 00:13:29.142 "strip_size_kb": 0, 00:13:29.142 "state": "online", 00:13:29.142 "raid_level": "raid1", 00:13:29.142 "superblock": true, 00:13:29.142 "num_base_bdevs": 2, 00:13:29.142 "num_base_bdevs_discovered": 1, 00:13:29.142 "num_base_bdevs_operational": 1, 00:13:29.142 "base_bdevs_list": [ 00:13:29.142 { 00:13:29.142 "name": null, 00:13:29.142 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:29.142 "is_configured": false, 00:13:29.142 "data_offset": 2048, 00:13:29.142 "data_size": 63488 00:13:29.142 }, 00:13:29.142 { 00:13:29.142 "name": "pt2", 00:13:29.142 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:29.142 "is_configured": true, 00:13:29.142 "data_offset": 2048, 00:13:29.142 "data_size": 63488 00:13:29.142 } 00:13:29.142 ] 00:13:29.142 }' 00:13:29.142 20:27:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:29.142 20:27:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:29.708 20:27:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:13:29.708 20:27:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:13:29.966 20:27:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:13:29.966 20:27:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:29.966 20:27:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:13:30.225 [2024-07-15 20:27:22.438359] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:30.225 20:27:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' d98bddab-4d8c-4410-8511-f14a3a07202f '!=' d98bddab-4d8c-4410-8511-f14a3a07202f ']' 00:13:30.225 20:27:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 1365536 00:13:30.225 20:27:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 1365536 ']' 00:13:30.225 20:27:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 1365536 00:13:30.225 20:27:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:13:30.225 20:27:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:13:30.225 20:27:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1365536 00:13:30.225 20:27:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:13:30.225 20:27:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:13:30.225 20:27:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1365536' 00:13:30.225 killing process with pid 1365536 00:13:30.225 20:27:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 1365536 00:13:30.225 [2024-07-15 20:27:22.506576] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:30.225 [2024-07-15 20:27:22.506626] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:30.225 [2024-07-15 20:27:22.506669] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: 
raid bdev base bdevs is 0, going to free all in destruct 00:13:30.225 [2024-07-15 20:27:22.506681] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c2d2b0 name raid_bdev1, state offline 00:13:30.225 20:27:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 1365536 00:13:30.225 [2024-07-15 20:27:22.525871] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:30.484 20:27:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:13:30.484 00:13:30.484 real 0m18.946s 00:13:30.484 user 0m34.887s 00:13:30.484 sys 0m3.390s 00:13:30.484 20:27:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:30.484 20:27:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:30.484 ************************************ 00:13:30.484 END TEST raid_superblock_test 00:13:30.484 ************************************ 00:13:30.484 20:27:22 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:13:30.484 20:27:22 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid1 2 read 00:13:30.484 20:27:22 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:13:30.484 20:27:22 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:13:30.484 20:27:22 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:30.484 ************************************ 00:13:30.484 START TEST raid_read_error_test 00:13:30.484 ************************************ 00:13:30.484 20:27:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 2 read 00:13:30.484 20:27:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:13:30.484 20:27:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:13:30.484 20:27:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:13:30.484 
20:27:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:13:30.484 20:27:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:30.484 20:27:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:13:30.484 20:27:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:30.484 20:27:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:30.484 20:27:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:13:30.484 20:27:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:30.484 20:27:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:30.484 20:27:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:13:30.484 20:27:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:13:30.484 20:27:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:13:30.484 20:27:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:13:30.484 20:27:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:13:30.484 20:27:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:13:30.484 20:27:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:13:30.484 20:27:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:13:30.484 20:27:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:13:30.484 20:27:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:13:30.484 20:27:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.c7QDgryXnL 00:13:30.484 20:27:22 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1368696 00:13:30.484 20:27:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1368696 /var/tmp/spdk-raid.sock 00:13:30.484 20:27:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:13:30.484 20:27:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 1368696 ']' 00:13:30.484 20:27:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:30.484 20:27:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:13:30.484 20:27:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:30.484 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:30.484 20:27:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:13:30.484 20:27:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:30.742 [2024-07-15 20:27:22.907883] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
00:13:30.742 [2024-07-15 20:27:22.907958] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1368696 ] 00:13:30.742 [2024-07-15 20:27:23.034937] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:31.000 [2024-07-15 20:27:23.133342] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:31.000 [2024-07-15 20:27:23.196258] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:31.000 [2024-07-15 20:27:23.196304] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:31.565 20:27:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:13:31.565 20:27:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:13:31.565 20:27:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:31.565 20:27:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:13:31.823 BaseBdev1_malloc 00:13:31.823 20:27:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:13:32.081 true 00:13:32.081 20:27:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:13:32.339 [2024-07-15 20:27:24.568073] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:13:32.339 [2024-07-15 20:27:24.568117] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev 
opened 00:13:32.339 [2024-07-15 20:27:24.568140] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14260d0 00:13:32.339 [2024-07-15 20:27:24.568153] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:32.339 [2024-07-15 20:27:24.570053] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:32.340 [2024-07-15 20:27:24.570083] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:13:32.340 BaseBdev1 00:13:32.340 20:27:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:32.340 20:27:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:13:32.598 BaseBdev2_malloc 00:13:32.598 20:27:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:13:32.856 true 00:13:32.856 20:27:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:13:33.114 [2024-07-15 20:27:25.295772] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:13:33.115 [2024-07-15 20:27:25.295816] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:33.115 [2024-07-15 20:27:25.295838] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x142a910 00:13:33.115 [2024-07-15 20:27:25.295850] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:33.115 [2024-07-15 20:27:25.297407] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:33.115 [2024-07-15 20:27:25.297435] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:13:33.115 BaseBdev2 00:13:33.115 20:27:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:13:33.373 [2024-07-15 20:27:25.536440] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:33.373 [2024-07-15 20:27:25.537800] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:33.373 [2024-07-15 20:27:25.537999] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x142c320 00:13:33.373 [2024-07-15 20:27:25.538014] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:13:33.373 [2024-07-15 20:27:25.538207] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1293d00 00:13:33.373 [2024-07-15 20:27:25.538361] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x142c320 00:13:33.373 [2024-07-15 20:27:25.538372] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x142c320 00:13:33.373 [2024-07-15 20:27:25.538485] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:33.373 20:27:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:13:33.373 20:27:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:33.373 20:27:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:33.373 20:27:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:33.373 20:27:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:33.373 20:27:25 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:33.373 20:27:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:33.373 20:27:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:33.373 20:27:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:33.373 20:27:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:33.373 20:27:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:33.373 20:27:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:33.632 20:27:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:33.632 "name": "raid_bdev1", 00:13:33.632 "uuid": "a76eeba9-7623-4acb-9daa-0d3bdce0681c", 00:13:33.632 "strip_size_kb": 0, 00:13:33.632 "state": "online", 00:13:33.632 "raid_level": "raid1", 00:13:33.632 "superblock": true, 00:13:33.632 "num_base_bdevs": 2, 00:13:33.632 "num_base_bdevs_discovered": 2, 00:13:33.632 "num_base_bdevs_operational": 2, 00:13:33.632 "base_bdevs_list": [ 00:13:33.632 { 00:13:33.632 "name": "BaseBdev1", 00:13:33.632 "uuid": "afeb1174-ac42-5256-bde3-560f3a1c4bf1", 00:13:33.632 "is_configured": true, 00:13:33.632 "data_offset": 2048, 00:13:33.632 "data_size": 63488 00:13:33.632 }, 00:13:33.632 { 00:13:33.632 "name": "BaseBdev2", 00:13:33.632 "uuid": "6ed6862f-2b15-5c3b-b98f-6b999bbd3c55", 00:13:33.632 "is_configured": true, 00:13:33.632 "data_offset": 2048, 00:13:33.632 "data_size": 63488 00:13:33.632 } 00:13:33.632 ] 00:13:33.632 }' 00:13:33.632 20:27:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:33.632 20:27:25 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:34.199 20:27:26 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:13:34.199 20:27:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:13:34.199 [2024-07-15 20:27:26.543390] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1427c70 00:13:35.134 20:27:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:13:35.392 20:27:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:13:35.392 20:27:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:13:35.392 20:27:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ read = \w\r\i\t\e ]] 00:13:35.392 20:27:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:13:35.392 20:27:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:13:35.392 20:27:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:35.392 20:27:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:35.392 20:27:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:35.392 20:27:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:35.392 20:27:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:35.392 20:27:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:35.392 20:27:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:35.392 20:27:27 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:35.392 20:27:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:35.392 20:27:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:35.392 20:27:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:35.651 20:27:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:35.651 "name": "raid_bdev1", 00:13:35.651 "uuid": "a76eeba9-7623-4acb-9daa-0d3bdce0681c", 00:13:35.651 "strip_size_kb": 0, 00:13:35.651 "state": "online", 00:13:35.651 "raid_level": "raid1", 00:13:35.651 "superblock": true, 00:13:35.651 "num_base_bdevs": 2, 00:13:35.651 "num_base_bdevs_discovered": 2, 00:13:35.651 "num_base_bdevs_operational": 2, 00:13:35.651 "base_bdevs_list": [ 00:13:35.651 { 00:13:35.651 "name": "BaseBdev1", 00:13:35.651 "uuid": "afeb1174-ac42-5256-bde3-560f3a1c4bf1", 00:13:35.651 "is_configured": true, 00:13:35.651 "data_offset": 2048, 00:13:35.651 "data_size": 63488 00:13:35.651 }, 00:13:35.651 { 00:13:35.651 "name": "BaseBdev2", 00:13:35.651 "uuid": "6ed6862f-2b15-5c3b-b98f-6b999bbd3c55", 00:13:35.651 "is_configured": true, 00:13:35.651 "data_offset": 2048, 00:13:35.651 "data_size": 63488 00:13:35.651 } 00:13:35.651 ] 00:13:35.651 }' 00:13:35.651 20:27:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:35.651 20:27:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:36.586 20:27:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:13:36.844 [2024-07-15 20:27:29.066595] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 
00:13:36.844 [2024-07-15 20:27:29.066637] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:36.844 [2024-07-15 20:27:29.069778] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:36.844 [2024-07-15 20:27:29.069810] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:36.844 [2024-07-15 20:27:29.069890] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:36.844 [2024-07-15 20:27:29.069902] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x142c320 name raid_bdev1, state offline 00:13:36.844 0 00:13:36.844 20:27:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1368696 00:13:36.844 20:27:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 1368696 ']' 00:13:36.844 20:27:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 1368696 00:13:36.844 20:27:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:13:36.844 20:27:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:13:36.844 20:27:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1368696 00:13:36.844 20:27:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:13:36.844 20:27:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:13:36.844 20:27:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1368696' 00:13:36.844 killing process with pid 1368696 00:13:36.844 20:27:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 1368696 00:13:36.844 [2024-07-15 20:27:29.141920] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:36.844 20:27:29 
bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 1368696 00:13:36.844 [2024-07-15 20:27:29.153271] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:37.101 20:27:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.c7QDgryXnL 00:13:37.101 20:27:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:13:37.101 20:27:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:13:37.101 20:27:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:13:37.101 20:27:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:13:37.101 20:27:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:37.101 20:27:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:13:37.101 20:27:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:13:37.101 00:13:37.101 real 0m6.565s 00:13:37.101 user 0m10.352s 00:13:37.101 sys 0m1.136s 00:13:37.101 20:27:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:37.101 20:27:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:37.101 ************************************ 00:13:37.101 END TEST raid_read_error_test 00:13:37.101 ************************************ 00:13:37.101 20:27:29 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:13:37.101 20:27:29 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid1 2 write 00:13:37.101 20:27:29 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:13:37.101 20:27:29 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:13:37.101 20:27:29 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:37.372 ************************************ 00:13:37.372 START TEST raid_write_error_test 00:13:37.372 
************************************ 00:13:37.372 20:27:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 2 write 00:13:37.372 20:27:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:13:37.372 20:27:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:13:37.372 20:27:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:13:37.372 20:27:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:13:37.372 20:27:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:37.372 20:27:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:13:37.372 20:27:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:37.372 20:27:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:37.372 20:27:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:13:37.372 20:27:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:37.372 20:27:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:37.372 20:27:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:13:37.372 20:27:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:13:37.372 20:27:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:13:37.372 20:27:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:13:37.372 20:27:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:13:37.372 20:27:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:13:37.372 20:27:29 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@796 -- # local fail_per_s 00:13:37.372 20:27:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:13:37.372 20:27:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:13:37.372 20:27:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:13:37.372 20:27:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.bWgKs5rhQy 00:13:37.372 20:27:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1369666 00:13:37.372 20:27:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1369666 /var/tmp/spdk-raid.sock 00:13:37.372 20:27:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:13:37.372 20:27:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 1369666 ']' 00:13:37.372 20:27:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:37.372 20:27:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:13:37.372 20:27:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:37.372 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:37.372 20:27:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:13:37.372 20:27:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:37.372 [2024-07-15 20:27:29.554472] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
00:13:37.372 [2024-07-15 20:27:29.554528] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1369666 ] 00:13:37.372 [2024-07-15 20:27:29.666302] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:37.630 [2024-07-15 20:27:29.772897] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:37.630 [2024-07-15 20:27:29.837705] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:37.630 [2024-07-15 20:27:29.837745] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:38.194 20:27:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:13:38.194 20:27:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:13:38.194 20:27:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:38.194 20:27:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:13:38.452 BaseBdev1_malloc 00:13:38.452 20:27:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:13:38.709 true 00:13:38.709 20:27:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:13:38.968 [2024-07-15 20:27:31.226797] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:13:38.968 [2024-07-15 20:27:31.226844] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base 
bdev opened 00:13:38.968 [2024-07-15 20:27:31.226863] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a370d0 00:13:38.968 [2024-07-15 20:27:31.226875] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:38.968 [2024-07-15 20:27:31.228560] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:38.968 [2024-07-15 20:27:31.228587] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:13:38.968 BaseBdev1 00:13:38.968 20:27:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:38.968 20:27:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:13:39.227 BaseBdev2_malloc 00:13:39.227 20:27:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:13:39.485 true 00:13:39.485 20:27:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:13:39.743 [2024-07-15 20:27:31.981385] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:13:39.743 [2024-07-15 20:27:31.981432] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:39.743 [2024-07-15 20:27:31.981452] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a3b910 00:13:39.743 [2024-07-15 20:27:31.981466] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:39.744 [2024-07-15 20:27:31.982969] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:39.744 [2024-07-15 20:27:31.982995] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:13:39.744 BaseBdev2 00:13:39.744 20:27:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:13:40.002 [2024-07-15 20:27:32.230065] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:40.002 [2024-07-15 20:27:32.231268] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:40.002 [2024-07-15 20:27:32.231456] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1a3d320 00:13:40.002 [2024-07-15 20:27:32.231469] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:13:40.002 [2024-07-15 20:27:32.231654] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18a4d00 00:13:40.002 [2024-07-15 20:27:32.231803] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1a3d320 00:13:40.002 [2024-07-15 20:27:32.231813] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1a3d320 00:13:40.002 [2024-07-15 20:27:32.231916] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:40.002 20:27:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:13:40.002 20:27:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:40.002 20:27:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:40.002 20:27:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:40.002 20:27:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:40.002 20:27:32 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:40.002 20:27:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:40.002 20:27:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:40.002 20:27:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:40.002 20:27:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:40.002 20:27:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:40.002 20:27:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:40.261 20:27:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:40.261 "name": "raid_bdev1", 00:13:40.261 "uuid": "c1c55af1-9dfb-410c-8e0c-22a32f5dd061", 00:13:40.261 "strip_size_kb": 0, 00:13:40.261 "state": "online", 00:13:40.261 "raid_level": "raid1", 00:13:40.261 "superblock": true, 00:13:40.261 "num_base_bdevs": 2, 00:13:40.261 "num_base_bdevs_discovered": 2, 00:13:40.261 "num_base_bdevs_operational": 2, 00:13:40.261 "base_bdevs_list": [ 00:13:40.261 { 00:13:40.261 "name": "BaseBdev1", 00:13:40.261 "uuid": "23f071da-89c7-5933-aa97-a6883f20cadf", 00:13:40.261 "is_configured": true, 00:13:40.261 "data_offset": 2048, 00:13:40.261 "data_size": 63488 00:13:40.261 }, 00:13:40.261 { 00:13:40.261 "name": "BaseBdev2", 00:13:40.261 "uuid": "37094340-8da6-5409-991f-62680f3c1d24", 00:13:40.261 "is_configured": true, 00:13:40.261 "data_offset": 2048, 00:13:40.261 "data_size": 63488 00:13:40.261 } 00:13:40.261 ] 00:13:40.261 }' 00:13:40.261 20:27:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:40.261 20:27:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:40.829 
20:27:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:13:40.829 20:27:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:13:40.829 [2024-07-15 20:27:33.204965] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a38c70 00:13:41.764 20:27:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:13:42.024 [2024-07-15 20:27:34.341772] bdev_raid.c:2221:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1' 00:13:42.024 [2024-07-15 20:27:34.341830] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:42.024 [2024-07-15 20:27:34.342016] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1a38c70 00:13:42.024 20:27:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:13:42.024 20:27:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:13:42.024 20:27:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ write = \w\r\i\t\e ]] 00:13:42.024 20:27:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # expected_num_base_bdevs=1 00:13:42.024 20:27:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:13:42.024 20:27:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:42.024 20:27:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:42.024 20:27:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:42.024 20:27:34 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:42.024 20:27:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:13:42.024 20:27:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:42.024 20:27:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:42.024 20:27:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:42.024 20:27:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:42.024 20:27:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:42.024 20:27:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:42.283 20:27:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:42.283 "name": "raid_bdev1", 00:13:42.283 "uuid": "c1c55af1-9dfb-410c-8e0c-22a32f5dd061", 00:13:42.283 "strip_size_kb": 0, 00:13:42.283 "state": "online", 00:13:42.283 "raid_level": "raid1", 00:13:42.283 "superblock": true, 00:13:42.284 "num_base_bdevs": 2, 00:13:42.284 "num_base_bdevs_discovered": 1, 00:13:42.284 "num_base_bdevs_operational": 1, 00:13:42.284 "base_bdevs_list": [ 00:13:42.284 { 00:13:42.284 "name": null, 00:13:42.284 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:42.284 "is_configured": false, 00:13:42.284 "data_offset": 2048, 00:13:42.284 "data_size": 63488 00:13:42.284 }, 00:13:42.284 { 00:13:42.284 "name": "BaseBdev2", 00:13:42.284 "uuid": "37094340-8da6-5409-991f-62680f3c1d24", 00:13:42.284 "is_configured": true, 00:13:42.284 "data_offset": 2048, 00:13:42.284 "data_size": 63488 00:13:42.284 } 00:13:42.284 ] 00:13:42.284 }' 00:13:42.284 20:27:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- 
# xtrace_disable 00:13:42.284 20:27:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:42.852 20:27:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:13:43.111 [2024-07-15 20:27:35.373442] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:43.111 [2024-07-15 20:27:35.373481] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:43.111 [2024-07-15 20:27:35.376607] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:43.111 [2024-07-15 20:27:35.376635] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:43.111 [2024-07-15 20:27:35.376688] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:43.111 [2024-07-15 20:27:35.376700] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a3d320 name raid_bdev1, state offline 00:13:43.111 0 00:13:43.111 20:27:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1369666 00:13:43.111 20:27:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 1369666 ']' 00:13:43.111 20:27:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 1369666 00:13:43.111 20:27:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:13:43.111 20:27:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:13:43.111 20:27:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1369666 00:13:43.111 20:27:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:13:43.112 20:27:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo 
']' 00:13:43.112 20:27:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1369666' 00:13:43.112 killing process with pid 1369666 00:13:43.112 20:27:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 1369666 00:13:43.112 [2024-07-15 20:27:35.447136] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:43.112 20:27:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 1369666 00:13:43.112 [2024-07-15 20:27:35.457266] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:43.371 20:27:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.bWgKs5rhQy 00:13:43.371 20:27:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:13:43.371 20:27:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:13:43.371 20:27:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:13:43.371 20:27:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:13:43.371 20:27:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:43.371 20:27:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:13:43.371 20:27:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:13:43.371 00:13:43.371 real 0m6.207s 00:13:43.371 user 0m9.741s 00:13:43.371 sys 0m1.049s 00:13:43.371 20:27:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:43.371 20:27:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:43.371 ************************************ 00:13:43.371 END TEST raid_write_error_test 00:13:43.371 ************************************ 00:13:43.371 20:27:35 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:13:43.371 20:27:35 bdev_raid -- 
bdev/bdev_raid.sh@865 -- # for n in {2..4} 00:13:43.371 20:27:35 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:13:43.371 20:27:35 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid0 3 false 00:13:43.371 20:27:35 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:13:43.371 20:27:35 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:13:43.371 20:27:35 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:43.640 ************************************ 00:13:43.640 START TEST raid_state_function_test 00:13:43.640 ************************************ 00:13:43.640 20:27:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 3 false 00:13:43.640 20:27:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:13:43.640 20:27:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:13:43.640 20:27:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:13:43.640 20:27:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:13:43.640 20:27:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:13:43.640 20:27:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:43.640 20:27:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:13:43.640 20:27:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:43.640 20:27:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:43.640 20:27:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:13:43.640 20:27:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:43.640 20:27:35 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:43.640 20:27:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:13:43.640 20:27:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:43.640 20:27:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:43.640 20:27:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:13:43.640 20:27:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:13:43.640 20:27:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:13:43.640 20:27:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:13:43.640 20:27:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:13:43.640 20:27:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:13:43.640 20:27:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:13:43.640 20:27:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:13:43.640 20:27:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:13:43.640 20:27:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:13:43.640 20:27:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:13:43.640 20:27:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=1370505 00:13:43.640 20:27:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:13:43.640 20:27:35 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1370505' 00:13:43.640 Process raid pid: 1370505 00:13:43.640 20:27:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 1370505 /var/tmp/spdk-raid.sock 00:13:43.641 20:27:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 1370505 ']' 00:13:43.641 20:27:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:43.641 20:27:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:13:43.641 20:27:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:43.641 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:43.641 20:27:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:13:43.641 20:27:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:43.641 [2024-07-15 20:27:35.848395] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
00:13:43.641 [2024-07-15 20:27:35.848475] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:13:43.641 [2024-07-15 20:27:35.991823] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:43.918 [2024-07-15 20:27:36.099384] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:43.918 [2024-07-15 20:27:36.161480] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:43.918 [2024-07-15 20:27:36.161510] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:44.502 20:27:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:13:44.503 20:27:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:13:44.503 20:27:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:45.072 [2024-07-15 20:27:37.204390] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:45.072 [2024-07-15 20:27:37.204433] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:45.072 [2024-07-15 20:27:37.204444] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:45.072 [2024-07-15 20:27:37.204456] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:45.072 [2024-07-15 20:27:37.204469] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:45.072 [2024-07-15 20:27:37.204480] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:45.072 20:27:37 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:45.072 20:27:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:45.072 20:27:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:45.072 20:27:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:45.072 20:27:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:45.072 20:27:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:45.072 20:27:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:45.072 20:27:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:45.072 20:27:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:45.072 20:27:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:45.072 20:27:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:45.072 20:27:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:45.331 20:27:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:45.331 "name": "Existed_Raid", 00:13:45.331 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:45.331 "strip_size_kb": 64, 00:13:45.331 "state": "configuring", 00:13:45.331 "raid_level": "raid0", 00:13:45.331 "superblock": false, 00:13:45.331 "num_base_bdevs": 3, 00:13:45.331 "num_base_bdevs_discovered": 0, 00:13:45.331 "num_base_bdevs_operational": 3, 00:13:45.331 "base_bdevs_list": [ 00:13:45.331 { 
00:13:45.331 "name": "BaseBdev1", 00:13:45.331 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:45.331 "is_configured": false, 00:13:45.331 "data_offset": 0, 00:13:45.331 "data_size": 0 00:13:45.331 }, 00:13:45.331 { 00:13:45.331 "name": "BaseBdev2", 00:13:45.331 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:45.331 "is_configured": false, 00:13:45.331 "data_offset": 0, 00:13:45.331 "data_size": 0 00:13:45.331 }, 00:13:45.331 { 00:13:45.331 "name": "BaseBdev3", 00:13:45.331 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:45.331 "is_configured": false, 00:13:45.331 "data_offset": 0, 00:13:45.331 "data_size": 0 00:13:45.331 } 00:13:45.331 ] 00:13:45.331 }' 00:13:45.331 20:27:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:45.331 20:27:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:45.901 20:27:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:46.160 [2024-07-15 20:27:38.299135] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:46.160 [2024-07-15 20:27:38.299167] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd9fa80 name Existed_Raid, state configuring 00:13:46.160 20:27:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:46.419 [2024-07-15 20:27:38.547804] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:46.419 [2024-07-15 20:27:38.547833] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:46.419 [2024-07-15 20:27:38.547843] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with 
name: BaseBdev2 00:13:46.419 [2024-07-15 20:27:38.547854] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:46.419 [2024-07-15 20:27:38.547863] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:46.419 [2024-07-15 20:27:38.547873] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:46.419 20:27:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:46.678 [2024-07-15 20:27:38.806336] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:46.678 BaseBdev1 00:13:46.678 20:27:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:13:46.678 20:27:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:13:46.678 20:27:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:46.678 20:27:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:46.678 20:27:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:46.678 20:27:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:46.678 20:27:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:46.938 20:27:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:46.938 [ 00:13:46.938 { 00:13:46.938 "name": "BaseBdev1", 00:13:46.938 "aliases": [ 00:13:46.938 
"59b5e457-dbb5-4d34-8b49-8d6a04be0906" 00:13:46.938 ], 00:13:46.938 "product_name": "Malloc disk", 00:13:46.938 "block_size": 512, 00:13:46.938 "num_blocks": 65536, 00:13:46.938 "uuid": "59b5e457-dbb5-4d34-8b49-8d6a04be0906", 00:13:46.938 "assigned_rate_limits": { 00:13:46.938 "rw_ios_per_sec": 0, 00:13:46.938 "rw_mbytes_per_sec": 0, 00:13:46.938 "r_mbytes_per_sec": 0, 00:13:46.938 "w_mbytes_per_sec": 0 00:13:46.938 }, 00:13:46.938 "claimed": true, 00:13:46.938 "claim_type": "exclusive_write", 00:13:46.938 "zoned": false, 00:13:46.938 "supported_io_types": { 00:13:46.938 "read": true, 00:13:46.938 "write": true, 00:13:46.938 "unmap": true, 00:13:46.938 "flush": true, 00:13:46.938 "reset": true, 00:13:46.938 "nvme_admin": false, 00:13:46.938 "nvme_io": false, 00:13:46.938 "nvme_io_md": false, 00:13:46.938 "write_zeroes": true, 00:13:46.938 "zcopy": true, 00:13:46.938 "get_zone_info": false, 00:13:46.938 "zone_management": false, 00:13:46.938 "zone_append": false, 00:13:46.938 "compare": false, 00:13:46.938 "compare_and_write": false, 00:13:46.938 "abort": true, 00:13:46.938 "seek_hole": false, 00:13:46.938 "seek_data": false, 00:13:46.938 "copy": true, 00:13:46.938 "nvme_iov_md": false 00:13:46.938 }, 00:13:46.938 "memory_domains": [ 00:13:46.938 { 00:13:46.938 "dma_device_id": "system", 00:13:46.938 "dma_device_type": 1 00:13:46.938 }, 00:13:46.938 { 00:13:46.938 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:46.938 "dma_device_type": 2 00:13:46.938 } 00:13:46.938 ], 00:13:46.938 "driver_specific": {} 00:13:46.938 } 00:13:46.938 ] 00:13:46.938 20:27:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:46.938 20:27:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:46.938 20:27:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:46.938 20:27:39 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:46.938 20:27:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:46.938 20:27:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:46.938 20:27:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:46.938 20:27:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:46.938 20:27:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:46.938 20:27:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:46.938 20:27:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:46.938 20:27:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:46.938 20:27:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:47.197 20:27:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:47.197 "name": "Existed_Raid", 00:13:47.197 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:47.197 "strip_size_kb": 64, 00:13:47.197 "state": "configuring", 00:13:47.197 "raid_level": "raid0", 00:13:47.197 "superblock": false, 00:13:47.197 "num_base_bdevs": 3, 00:13:47.197 "num_base_bdevs_discovered": 1, 00:13:47.197 "num_base_bdevs_operational": 3, 00:13:47.197 "base_bdevs_list": [ 00:13:47.197 { 00:13:47.197 "name": "BaseBdev1", 00:13:47.197 "uuid": "59b5e457-dbb5-4d34-8b49-8d6a04be0906", 00:13:47.197 "is_configured": true, 00:13:47.197 "data_offset": 0, 00:13:47.197 "data_size": 65536 00:13:47.197 }, 00:13:47.197 { 00:13:47.197 "name": "BaseBdev2", 00:13:47.197 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:13:47.197 "is_configured": false, 00:13:47.197 "data_offset": 0, 00:13:47.197 "data_size": 0 00:13:47.197 }, 00:13:47.197 { 00:13:47.197 "name": "BaseBdev3", 00:13:47.197 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:47.197 "is_configured": false, 00:13:47.197 "data_offset": 0, 00:13:47.197 "data_size": 0 00:13:47.197 } 00:13:47.197 ] 00:13:47.197 }' 00:13:47.197 20:27:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:47.197 20:27:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:48.134 20:27:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:48.134 [2024-07-15 20:27:40.414745] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:48.134 [2024-07-15 20:27:40.414787] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd9f310 name Existed_Raid, state configuring 00:13:48.134 20:27:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:48.392 [2024-07-15 20:27:40.655408] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:48.392 [2024-07-15 20:27:40.656867] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:48.392 [2024-07-15 20:27:40.656898] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:48.392 [2024-07-15 20:27:40.656908] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:48.392 [2024-07-15 20:27:40.656919] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 
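The repeated `verify_raid_bdev_state` calls in this test query `bdev_raid_get_bdevs all` and filter the result with `jq -r '.[] | select(.name == "Existed_Raid")'`, then compare the reported fields against the expected state. A minimal Python sketch of that same comparison (field names taken from the JSON dumps in this log; the sample dict is a stand-in for live RPC output, not a real RPC call):

```python
import json

def verify_raid_bdev_state(raid_bdev_info: dict, expected_state: str,
                           raid_level: str, strip_size_kb: int,
                           num_base_bdevs_operational: int) -> bool:
    """Sketch of bdev_raid.sh's verify_raid_bdev_state: compare the fields
    reported by `rpc.py bdev_raid_get_bdevs all` against expectations."""
    return (raid_bdev_info["state"] == expected_state
            and raid_bdev_info["raid_level"] == raid_level
            and raid_bdev_info["strip_size_kb"] == strip_size_kb
            and raid_bdev_info["num_base_bdevs_operational"]
                == num_base_bdevs_operational)

# Sample shaped like the "Existed_Raid" entry printed above at 20:27:37,
# with the log's timestamp prefixes stripped.
sample = json.loads("""{
  "name": "Existed_Raid",
  "strip_size_kb": 64,
  "state": "configuring",
  "raid_level": "raid0",
  "superblock": false,
  "num_base_bdevs": 3,
  "num_base_bdevs_discovered": 0,
  "num_base_bdevs_operational": 3
}""")

assert verify_raid_bdev_state(sample, "configuring", "raid0", 64, 3)
```

In the shell test the same check is driven by the positional arguments `Existed_Raid configuring raid0 64 3`, and flips to expecting `online` once all three base bdevs are discovered.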
00:13:48.392 20:27:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:13:48.392 20:27:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:48.392 20:27:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:48.392 20:27:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:48.392 20:27:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:48.392 20:27:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:48.393 20:27:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:48.393 20:27:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:48.393 20:27:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:48.393 20:27:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:48.393 20:27:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:48.393 20:27:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:48.393 20:27:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:48.393 20:27:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:48.652 20:27:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:48.652 "name": "Existed_Raid", 00:13:48.652 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:48.652 "strip_size_kb": 64, 00:13:48.652 "state": "configuring", 00:13:48.652 
"raid_level": "raid0", 00:13:48.652 "superblock": false, 00:13:48.652 "num_base_bdevs": 3, 00:13:48.652 "num_base_bdevs_discovered": 1, 00:13:48.652 "num_base_bdevs_operational": 3, 00:13:48.652 "base_bdevs_list": [ 00:13:48.652 { 00:13:48.652 "name": "BaseBdev1", 00:13:48.652 "uuid": "59b5e457-dbb5-4d34-8b49-8d6a04be0906", 00:13:48.652 "is_configured": true, 00:13:48.652 "data_offset": 0, 00:13:48.652 "data_size": 65536 00:13:48.652 }, 00:13:48.652 { 00:13:48.652 "name": "BaseBdev2", 00:13:48.652 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:48.652 "is_configured": false, 00:13:48.652 "data_offset": 0, 00:13:48.652 "data_size": 0 00:13:48.652 }, 00:13:48.652 { 00:13:48.652 "name": "BaseBdev3", 00:13:48.652 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:48.652 "is_configured": false, 00:13:48.652 "data_offset": 0, 00:13:48.652 "data_size": 0 00:13:48.652 } 00:13:48.652 ] 00:13:48.652 }' 00:13:48.652 20:27:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:48.652 20:27:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:49.220 20:27:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:49.479 [2024-07-15 20:27:41.705578] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:49.479 BaseBdev2 00:13:49.480 20:27:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:13:49.480 20:27:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:13:49.480 20:27:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:49.480 20:27:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:49.480 20:27:41 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:49.480 20:27:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:49.480 20:27:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:49.739 20:27:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:49.998 [ 00:13:49.998 { 00:13:49.998 "name": "BaseBdev2", 00:13:49.998 "aliases": [ 00:13:49.998 "d9a1bdde-03bc-4095-b6fe-b05057abc349" 00:13:49.998 ], 00:13:49.998 "product_name": "Malloc disk", 00:13:49.998 "block_size": 512, 00:13:49.998 "num_blocks": 65536, 00:13:49.998 "uuid": "d9a1bdde-03bc-4095-b6fe-b05057abc349", 00:13:49.998 "assigned_rate_limits": { 00:13:49.998 "rw_ios_per_sec": 0, 00:13:49.998 "rw_mbytes_per_sec": 0, 00:13:49.998 "r_mbytes_per_sec": 0, 00:13:49.998 "w_mbytes_per_sec": 0 00:13:49.998 }, 00:13:49.998 "claimed": true, 00:13:49.998 "claim_type": "exclusive_write", 00:13:49.998 "zoned": false, 00:13:49.998 "supported_io_types": { 00:13:49.998 "read": true, 00:13:49.998 "write": true, 00:13:49.998 "unmap": true, 00:13:49.998 "flush": true, 00:13:49.998 "reset": true, 00:13:49.998 "nvme_admin": false, 00:13:49.998 "nvme_io": false, 00:13:49.998 "nvme_io_md": false, 00:13:49.998 "write_zeroes": true, 00:13:49.998 "zcopy": true, 00:13:49.998 "get_zone_info": false, 00:13:49.998 "zone_management": false, 00:13:49.998 "zone_append": false, 00:13:49.998 "compare": false, 00:13:49.998 "compare_and_write": false, 00:13:49.998 "abort": true, 00:13:49.998 "seek_hole": false, 00:13:49.998 "seek_data": false, 00:13:49.998 "copy": true, 00:13:49.998 "nvme_iov_md": false 00:13:49.998 }, 00:13:49.998 "memory_domains": [ 00:13:49.998 { 00:13:49.998 "dma_device_id": "system", 
00:13:49.998 "dma_device_type": 1 00:13:49.998 }, 00:13:49.998 { 00:13:49.998 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:49.998 "dma_device_type": 2 00:13:49.998 } 00:13:49.998 ], 00:13:49.998 "driver_specific": {} 00:13:49.998 } 00:13:49.998 ] 00:13:49.998 20:27:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:49.998 20:27:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:13:49.998 20:27:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:49.998 20:27:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:49.998 20:27:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:49.998 20:27:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:49.998 20:27:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:49.998 20:27:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:49.998 20:27:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:49.998 20:27:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:49.998 20:27:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:49.998 20:27:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:49.998 20:27:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:49.998 20:27:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:49.998 20:27:42 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:50.257 20:27:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:50.257 "name": "Existed_Raid", 00:13:50.257 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:50.257 "strip_size_kb": 64, 00:13:50.257 "state": "configuring", 00:13:50.257 "raid_level": "raid0", 00:13:50.257 "superblock": false, 00:13:50.257 "num_base_bdevs": 3, 00:13:50.257 "num_base_bdevs_discovered": 2, 00:13:50.257 "num_base_bdevs_operational": 3, 00:13:50.257 "base_bdevs_list": [ 00:13:50.257 { 00:13:50.257 "name": "BaseBdev1", 00:13:50.257 "uuid": "59b5e457-dbb5-4d34-8b49-8d6a04be0906", 00:13:50.257 "is_configured": true, 00:13:50.258 "data_offset": 0, 00:13:50.258 "data_size": 65536 00:13:50.258 }, 00:13:50.258 { 00:13:50.258 "name": "BaseBdev2", 00:13:50.258 "uuid": "d9a1bdde-03bc-4095-b6fe-b05057abc349", 00:13:50.258 "is_configured": true, 00:13:50.258 "data_offset": 0, 00:13:50.258 "data_size": 65536 00:13:50.258 }, 00:13:50.258 { 00:13:50.258 "name": "BaseBdev3", 00:13:50.258 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:50.258 "is_configured": false, 00:13:50.258 "data_offset": 0, 00:13:50.258 "data_size": 0 00:13:50.258 } 00:13:50.258 ] 00:13:50.258 }' 00:13:50.258 20:27:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:50.258 20:27:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:50.824 20:27:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:13:51.083 [2024-07-15 20:27:43.301211] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:51.083 [2024-07-15 20:27:43.301248] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xda0400 00:13:51.083 [2024-07-15 20:27:43.301256] 
bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:13:51.083 [2024-07-15 20:27:43.301502] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd9fef0 00:13:51.083 [2024-07-15 20:27:43.301618] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xda0400 00:13:51.083 [2024-07-15 20:27:43.301628] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xda0400 00:13:51.083 [2024-07-15 20:27:43.301785] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:51.083 BaseBdev3 00:13:51.083 20:27:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:13:51.083 20:27:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:13:51.083 20:27:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:51.083 20:27:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:51.083 20:27:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:51.083 20:27:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:51.083 20:27:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:51.341 20:27:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:13:51.599 [ 00:13:51.599 { 00:13:51.599 "name": "BaseBdev3", 00:13:51.599 "aliases": [ 00:13:51.599 "140be601-9271-436e-a0ea-f1e944bff614" 00:13:51.599 ], 00:13:51.599 "product_name": "Malloc disk", 00:13:51.599 "block_size": 512, 00:13:51.599 "num_blocks": 65536, 00:13:51.599 "uuid": 
"140be601-9271-436e-a0ea-f1e944bff614", 00:13:51.599 "assigned_rate_limits": { 00:13:51.599 "rw_ios_per_sec": 0, 00:13:51.599 "rw_mbytes_per_sec": 0, 00:13:51.599 "r_mbytes_per_sec": 0, 00:13:51.599 "w_mbytes_per_sec": 0 00:13:51.599 }, 00:13:51.599 "claimed": true, 00:13:51.599 "claim_type": "exclusive_write", 00:13:51.599 "zoned": false, 00:13:51.599 "supported_io_types": { 00:13:51.599 "read": true, 00:13:51.599 "write": true, 00:13:51.599 "unmap": true, 00:13:51.599 "flush": true, 00:13:51.599 "reset": true, 00:13:51.599 "nvme_admin": false, 00:13:51.599 "nvme_io": false, 00:13:51.599 "nvme_io_md": false, 00:13:51.599 "write_zeroes": true, 00:13:51.599 "zcopy": true, 00:13:51.599 "get_zone_info": false, 00:13:51.599 "zone_management": false, 00:13:51.599 "zone_append": false, 00:13:51.599 "compare": false, 00:13:51.599 "compare_and_write": false, 00:13:51.599 "abort": true, 00:13:51.599 "seek_hole": false, 00:13:51.599 "seek_data": false, 00:13:51.599 "copy": true, 00:13:51.599 "nvme_iov_md": false 00:13:51.599 }, 00:13:51.599 "memory_domains": [ 00:13:51.599 { 00:13:51.599 "dma_device_id": "system", 00:13:51.599 "dma_device_type": 1 00:13:51.599 }, 00:13:51.599 { 00:13:51.599 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:51.599 "dma_device_type": 2 00:13:51.599 } 00:13:51.599 ], 00:13:51.599 "driver_specific": {} 00:13:51.599 } 00:13:51.599 ] 00:13:51.599 20:27:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:51.599 20:27:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:13:51.599 20:27:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:51.599 20:27:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:13:51.599 20:27:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:51.599 20:27:43 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:51.599 20:27:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:51.599 20:27:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:51.599 20:27:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:51.599 20:27:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:51.599 20:27:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:51.599 20:27:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:51.599 20:27:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:51.599 20:27:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:51.599 20:27:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:51.858 20:27:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:51.858 "name": "Existed_Raid", 00:13:51.858 "uuid": "f8072ab4-48f7-418f-a619-2185ee774b4c", 00:13:51.858 "strip_size_kb": 64, 00:13:51.858 "state": "online", 00:13:51.858 "raid_level": "raid0", 00:13:51.858 "superblock": false, 00:13:51.858 "num_base_bdevs": 3, 00:13:51.858 "num_base_bdevs_discovered": 3, 00:13:51.858 "num_base_bdevs_operational": 3, 00:13:51.858 "base_bdevs_list": [ 00:13:51.858 { 00:13:51.858 "name": "BaseBdev1", 00:13:51.858 "uuid": "59b5e457-dbb5-4d34-8b49-8d6a04be0906", 00:13:51.858 "is_configured": true, 00:13:51.858 "data_offset": 0, 00:13:51.858 "data_size": 65536 00:13:51.858 }, 00:13:51.858 { 00:13:51.858 "name": "BaseBdev2", 00:13:51.858 "uuid": 
"d9a1bdde-03bc-4095-b6fe-b05057abc349", 00:13:51.858 "is_configured": true, 00:13:51.858 "data_offset": 0, 00:13:51.858 "data_size": 65536 00:13:51.858 }, 00:13:51.858 { 00:13:51.858 "name": "BaseBdev3", 00:13:51.858 "uuid": "140be601-9271-436e-a0ea-f1e944bff614", 00:13:51.858 "is_configured": true, 00:13:51.858 "data_offset": 0, 00:13:51.858 "data_size": 65536 00:13:51.858 } 00:13:51.858 ] 00:13:51.858 }' 00:13:51.858 20:27:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:51.858 20:27:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:52.424 20:27:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:13:52.424 20:27:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:13:52.424 20:27:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:52.424 20:27:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:52.424 20:27:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:52.424 20:27:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:13:52.424 20:27:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:13:52.424 20:27:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:52.683 [2024-07-15 20:27:44.877665] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:52.683 20:27:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:52.683 "name": "Existed_Raid", 00:13:52.683 "aliases": [ 00:13:52.683 "f8072ab4-48f7-418f-a619-2185ee774b4c" 00:13:52.683 ], 00:13:52.683 "product_name": "Raid Volume", 
00:13:52.683 "block_size": 512, 00:13:52.683 "num_blocks": 196608, 00:13:52.683 "uuid": "f8072ab4-48f7-418f-a619-2185ee774b4c", 00:13:52.683 "assigned_rate_limits": { 00:13:52.683 "rw_ios_per_sec": 0, 00:13:52.683 "rw_mbytes_per_sec": 0, 00:13:52.683 "r_mbytes_per_sec": 0, 00:13:52.683 "w_mbytes_per_sec": 0 00:13:52.683 }, 00:13:52.683 "claimed": false, 00:13:52.683 "zoned": false, 00:13:52.683 "supported_io_types": { 00:13:52.683 "read": true, 00:13:52.683 "write": true, 00:13:52.683 "unmap": true, 00:13:52.683 "flush": true, 00:13:52.683 "reset": true, 00:13:52.683 "nvme_admin": false, 00:13:52.683 "nvme_io": false, 00:13:52.683 "nvme_io_md": false, 00:13:52.683 "write_zeroes": true, 00:13:52.683 "zcopy": false, 00:13:52.683 "get_zone_info": false, 00:13:52.683 "zone_management": false, 00:13:52.683 "zone_append": false, 00:13:52.683 "compare": false, 00:13:52.683 "compare_and_write": false, 00:13:52.683 "abort": false, 00:13:52.683 "seek_hole": false, 00:13:52.683 "seek_data": false, 00:13:52.683 "copy": false, 00:13:52.683 "nvme_iov_md": false 00:13:52.683 }, 00:13:52.683 "memory_domains": [ 00:13:52.683 { 00:13:52.683 "dma_device_id": "system", 00:13:52.683 "dma_device_type": 1 00:13:52.683 }, 00:13:52.683 { 00:13:52.683 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:52.683 "dma_device_type": 2 00:13:52.683 }, 00:13:52.683 { 00:13:52.683 "dma_device_id": "system", 00:13:52.683 "dma_device_type": 1 00:13:52.683 }, 00:13:52.683 { 00:13:52.683 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:52.683 "dma_device_type": 2 00:13:52.683 }, 00:13:52.683 { 00:13:52.683 "dma_device_id": "system", 00:13:52.683 "dma_device_type": 1 00:13:52.683 }, 00:13:52.683 { 00:13:52.683 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:52.683 "dma_device_type": 2 00:13:52.683 } 00:13:52.683 ], 00:13:52.683 "driver_specific": { 00:13:52.683 "raid": { 00:13:52.683 "uuid": "f8072ab4-48f7-418f-a619-2185ee774b4c", 00:13:52.683 "strip_size_kb": 64, 00:13:52.683 "state": "online", 00:13:52.683 
"raid_level": "raid0", 00:13:52.683 "superblock": false, 00:13:52.683 "num_base_bdevs": 3, 00:13:52.683 "num_base_bdevs_discovered": 3, 00:13:52.683 "num_base_bdevs_operational": 3, 00:13:52.683 "base_bdevs_list": [ 00:13:52.683 { 00:13:52.683 "name": "BaseBdev1", 00:13:52.683 "uuid": "59b5e457-dbb5-4d34-8b49-8d6a04be0906", 00:13:52.683 "is_configured": true, 00:13:52.683 "data_offset": 0, 00:13:52.683 "data_size": 65536 00:13:52.683 }, 00:13:52.683 { 00:13:52.683 "name": "BaseBdev2", 00:13:52.683 "uuid": "d9a1bdde-03bc-4095-b6fe-b05057abc349", 00:13:52.683 "is_configured": true, 00:13:52.683 "data_offset": 0, 00:13:52.683 "data_size": 65536 00:13:52.683 }, 00:13:52.683 { 00:13:52.683 "name": "BaseBdev3", 00:13:52.683 "uuid": "140be601-9271-436e-a0ea-f1e944bff614", 00:13:52.684 "is_configured": true, 00:13:52.684 "data_offset": 0, 00:13:52.684 "data_size": 65536 00:13:52.684 } 00:13:52.684 ] 00:13:52.684 } 00:13:52.684 } 00:13:52.684 }' 00:13:52.684 20:27:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:52.684 20:27:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:13:52.684 BaseBdev2 00:13:52.684 BaseBdev3' 00:13:52.684 20:27:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:52.684 20:27:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:13:52.684 20:27:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:52.943 20:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:52.943 "name": "BaseBdev1", 00:13:52.943 "aliases": [ 00:13:52.943 "59b5e457-dbb5-4d34-8b49-8d6a04be0906" 00:13:52.943 ], 00:13:52.943 "product_name": "Malloc disk", 00:13:52.943 
"block_size": 512, 00:13:52.943 "num_blocks": 65536, 00:13:52.943 "uuid": "59b5e457-dbb5-4d34-8b49-8d6a04be0906", 00:13:52.943 "assigned_rate_limits": { 00:13:52.943 "rw_ios_per_sec": 0, 00:13:52.943 "rw_mbytes_per_sec": 0, 00:13:52.943 "r_mbytes_per_sec": 0, 00:13:52.943 "w_mbytes_per_sec": 0 00:13:52.943 }, 00:13:52.943 "claimed": true, 00:13:52.943 "claim_type": "exclusive_write", 00:13:52.943 "zoned": false, 00:13:52.943 "supported_io_types": { 00:13:52.943 "read": true, 00:13:52.943 "write": true, 00:13:52.943 "unmap": true, 00:13:52.943 "flush": true, 00:13:52.943 "reset": true, 00:13:52.943 "nvme_admin": false, 00:13:52.943 "nvme_io": false, 00:13:52.943 "nvme_io_md": false, 00:13:52.943 "write_zeroes": true, 00:13:52.943 "zcopy": true, 00:13:52.943 "get_zone_info": false, 00:13:52.943 "zone_management": false, 00:13:52.943 "zone_append": false, 00:13:52.943 "compare": false, 00:13:52.943 "compare_and_write": false, 00:13:52.943 "abort": true, 00:13:52.943 "seek_hole": false, 00:13:52.943 "seek_data": false, 00:13:52.943 "copy": true, 00:13:52.943 "nvme_iov_md": false 00:13:52.943 }, 00:13:52.943 "memory_domains": [ 00:13:52.943 { 00:13:52.943 "dma_device_id": "system", 00:13:52.943 "dma_device_type": 1 00:13:52.943 }, 00:13:52.943 { 00:13:52.943 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:52.943 "dma_device_type": 2 00:13:52.943 } 00:13:52.943 ], 00:13:52.943 "driver_specific": {} 00:13:52.943 }' 00:13:52.943 20:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:52.943 20:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:52.943 20:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:52.943 20:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:53.201 20:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:53.201 20:27:45 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:53.201 20:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:53.201 20:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:53.201 20:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:53.201 20:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:53.201 20:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:53.201 20:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:53.201 20:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:53.201 20:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:13:53.201 20:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:53.460 20:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:53.460 "name": "BaseBdev2", 00:13:53.460 "aliases": [ 00:13:53.460 "d9a1bdde-03bc-4095-b6fe-b05057abc349" 00:13:53.460 ], 00:13:53.460 "product_name": "Malloc disk", 00:13:53.460 "block_size": 512, 00:13:53.460 "num_blocks": 65536, 00:13:53.460 "uuid": "d9a1bdde-03bc-4095-b6fe-b05057abc349", 00:13:53.460 "assigned_rate_limits": { 00:13:53.460 "rw_ios_per_sec": 0, 00:13:53.460 "rw_mbytes_per_sec": 0, 00:13:53.460 "r_mbytes_per_sec": 0, 00:13:53.460 "w_mbytes_per_sec": 0 00:13:53.460 }, 00:13:53.460 "claimed": true, 00:13:53.460 "claim_type": "exclusive_write", 00:13:53.460 "zoned": false, 00:13:53.460 "supported_io_types": { 00:13:53.460 "read": true, 00:13:53.460 "write": true, 00:13:53.460 "unmap": true, 00:13:53.460 "flush": true, 00:13:53.460 "reset": true, 00:13:53.460 "nvme_admin": 
false, 00:13:53.460 "nvme_io": false, 00:13:53.460 "nvme_io_md": false, 00:13:53.460 "write_zeroes": true, 00:13:53.460 "zcopy": true, 00:13:53.460 "get_zone_info": false, 00:13:53.460 "zone_management": false, 00:13:53.460 "zone_append": false, 00:13:53.460 "compare": false, 00:13:53.460 "compare_and_write": false, 00:13:53.460 "abort": true, 00:13:53.460 "seek_hole": false, 00:13:53.460 "seek_data": false, 00:13:53.460 "copy": true, 00:13:53.460 "nvme_iov_md": false 00:13:53.460 }, 00:13:53.460 "memory_domains": [ 00:13:53.460 { 00:13:53.460 "dma_device_id": "system", 00:13:53.460 "dma_device_type": 1 00:13:53.460 }, 00:13:53.460 { 00:13:53.460 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:53.460 "dma_device_type": 2 00:13:53.460 } 00:13:53.460 ], 00:13:53.460 "driver_specific": {} 00:13:53.460 }' 00:13:53.460 20:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:53.718 20:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:53.718 20:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:53.718 20:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:53.718 20:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:53.718 20:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:53.718 20:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:53.718 20:27:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:53.718 20:27:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:53.718 20:27:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:53.977 20:27:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:53.977 20:27:46 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:53.977 20:27:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:53.977 20:27:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:13:53.977 20:27:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:54.235 20:27:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:54.235 "name": "BaseBdev3", 00:13:54.235 "aliases": [ 00:13:54.235 "140be601-9271-436e-a0ea-f1e944bff614" 00:13:54.235 ], 00:13:54.235 "product_name": "Malloc disk", 00:13:54.235 "block_size": 512, 00:13:54.235 "num_blocks": 65536, 00:13:54.235 "uuid": "140be601-9271-436e-a0ea-f1e944bff614", 00:13:54.235 "assigned_rate_limits": { 00:13:54.235 "rw_ios_per_sec": 0, 00:13:54.235 "rw_mbytes_per_sec": 0, 00:13:54.235 "r_mbytes_per_sec": 0, 00:13:54.235 "w_mbytes_per_sec": 0 00:13:54.235 }, 00:13:54.235 "claimed": true, 00:13:54.235 "claim_type": "exclusive_write", 00:13:54.235 "zoned": false, 00:13:54.235 "supported_io_types": { 00:13:54.235 "read": true, 00:13:54.235 "write": true, 00:13:54.235 "unmap": true, 00:13:54.235 "flush": true, 00:13:54.235 "reset": true, 00:13:54.235 "nvme_admin": false, 00:13:54.235 "nvme_io": false, 00:13:54.235 "nvme_io_md": false, 00:13:54.235 "write_zeroes": true, 00:13:54.235 "zcopy": true, 00:13:54.235 "get_zone_info": false, 00:13:54.235 "zone_management": false, 00:13:54.235 "zone_append": false, 00:13:54.235 "compare": false, 00:13:54.235 "compare_and_write": false, 00:13:54.235 "abort": true, 00:13:54.235 "seek_hole": false, 00:13:54.235 "seek_data": false, 00:13:54.235 "copy": true, 00:13:54.235 "nvme_iov_md": false 00:13:54.235 }, 00:13:54.235 "memory_domains": [ 00:13:54.235 { 00:13:54.235 "dma_device_id": "system", 00:13:54.235 "dma_device_type": 1 00:13:54.235 
}, 00:13:54.235 { 00:13:54.235 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:54.235 "dma_device_type": 2 00:13:54.235 } 00:13:54.235 ], 00:13:54.235 "driver_specific": {} 00:13:54.235 }' 00:13:54.235 20:27:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:54.235 20:27:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:54.235 20:27:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:54.235 20:27:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:54.235 20:27:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:54.235 20:27:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:54.235 20:27:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:54.493 20:27:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:54.493 20:27:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:54.493 20:27:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:54.493 20:27:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:54.493 20:27:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:54.493 20:27:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:13:54.750 [2024-07-15 20:27:46.979001] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:54.750 [2024-07-15 20:27:46.979028] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:54.750 [2024-07-15 20:27:46.979067] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:54.750 
20:27:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:13:54.750 20:27:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:13:54.750 20:27:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:54.750 20:27:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:13:54.750 20:27:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:13:54.750 20:27:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 2 00:13:54.750 20:27:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:54.750 20:27:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:13:54.750 20:27:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:54.750 20:27:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:54.750 20:27:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:54.750 20:27:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:54.750 20:27:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:54.750 20:27:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:54.750 20:27:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:54.750 20:27:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:54.750 20:27:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 
00:13:55.008 20:27:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:55.008 "name": "Existed_Raid", 00:13:55.008 "uuid": "f8072ab4-48f7-418f-a619-2185ee774b4c", 00:13:55.008 "strip_size_kb": 64, 00:13:55.008 "state": "offline", 00:13:55.008 "raid_level": "raid0", 00:13:55.008 "superblock": false, 00:13:55.008 "num_base_bdevs": 3, 00:13:55.008 "num_base_bdevs_discovered": 2, 00:13:55.008 "num_base_bdevs_operational": 2, 00:13:55.008 "base_bdevs_list": [ 00:13:55.008 { 00:13:55.008 "name": null, 00:13:55.008 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:55.008 "is_configured": false, 00:13:55.008 "data_offset": 0, 00:13:55.008 "data_size": 65536 00:13:55.008 }, 00:13:55.008 { 00:13:55.008 "name": "BaseBdev2", 00:13:55.008 "uuid": "d9a1bdde-03bc-4095-b6fe-b05057abc349", 00:13:55.008 "is_configured": true, 00:13:55.008 "data_offset": 0, 00:13:55.008 "data_size": 65536 00:13:55.008 }, 00:13:55.008 { 00:13:55.008 "name": "BaseBdev3", 00:13:55.008 "uuid": "140be601-9271-436e-a0ea-f1e944bff614", 00:13:55.008 "is_configured": true, 00:13:55.008 "data_offset": 0, 00:13:55.008 "data_size": 65536 00:13:55.008 } 00:13:55.008 ] 00:13:55.008 }' 00:13:55.008 20:27:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:55.008 20:27:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:55.573 20:27:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:13:55.573 20:27:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:55.573 20:27:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:55.573 20:27:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:13:55.831 20:27:48 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:13:55.831 20:27:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:13:55.831 20:27:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:13:56.089 [2024-07-15 20:27:48.320454] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:56.089 20:27:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:13:56.089 20:27:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:56.089 20:27:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:56.089 20:27:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:13:56.347 20:27:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:13:56.347 20:27:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:13:56.347 20:27:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:13:56.669 [2024-07-15 20:27:48.813811] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:13:56.669 [2024-07-15 20:27:48.813854] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xda0400 name Existed_Raid, state offline 00:13:56.669 20:27:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:13:56.669 20:27:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:56.669 20:27:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:56.669 20:27:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:13:56.928 20:27:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:13:56.928 20:27:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:13:56.928 20:27:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:13:56.928 20:27:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:13:56.928 20:27:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:56.928 20:27:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:57.187 BaseBdev2 00:13:57.187 20:27:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:13:57.187 20:27:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:13:57.187 20:27:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:57.187 20:27:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:57.187 20:27:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:57.187 20:27:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:57.187 20:27:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:57.445 20:27:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:57.445 [ 00:13:57.445 { 00:13:57.445 "name": "BaseBdev2", 00:13:57.445 "aliases": [ 00:13:57.445 "854a7b9e-33d9-4dd7-8945-667cb425ff9e" 00:13:57.445 ], 00:13:57.445 "product_name": "Malloc disk", 00:13:57.445 "block_size": 512, 00:13:57.445 "num_blocks": 65536, 00:13:57.445 "uuid": "854a7b9e-33d9-4dd7-8945-667cb425ff9e", 00:13:57.445 "assigned_rate_limits": { 00:13:57.445 "rw_ios_per_sec": 0, 00:13:57.445 "rw_mbytes_per_sec": 0, 00:13:57.445 "r_mbytes_per_sec": 0, 00:13:57.445 "w_mbytes_per_sec": 0 00:13:57.445 }, 00:13:57.445 "claimed": false, 00:13:57.445 "zoned": false, 00:13:57.445 "supported_io_types": { 00:13:57.445 "read": true, 00:13:57.445 "write": true, 00:13:57.445 "unmap": true, 00:13:57.445 "flush": true, 00:13:57.445 "reset": true, 00:13:57.445 "nvme_admin": false, 00:13:57.445 "nvme_io": false, 00:13:57.445 "nvme_io_md": false, 00:13:57.445 "write_zeroes": true, 00:13:57.445 "zcopy": true, 00:13:57.445 "get_zone_info": false, 00:13:57.445 "zone_management": false, 00:13:57.445 "zone_append": false, 00:13:57.445 "compare": false, 00:13:57.445 "compare_and_write": false, 00:13:57.445 "abort": true, 00:13:57.445 "seek_hole": false, 00:13:57.445 "seek_data": false, 00:13:57.445 "copy": true, 00:13:57.445 "nvme_iov_md": false 00:13:57.445 }, 00:13:57.445 "memory_domains": [ 00:13:57.445 { 00:13:57.445 "dma_device_id": "system", 00:13:57.445 "dma_device_type": 1 00:13:57.445 }, 00:13:57.445 { 00:13:57.445 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:57.445 "dma_device_type": 2 00:13:57.445 } 00:13:57.445 ], 00:13:57.445 "driver_specific": {} 00:13:57.445 } 00:13:57.445 ] 00:13:57.445 20:27:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:57.445 20:27:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:13:57.445 20:27:49 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:57.445 20:27:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:13:57.704 BaseBdev3 00:13:57.704 20:27:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:13:57.704 20:27:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:13:57.704 20:27:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:57.704 20:27:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:57.704 20:27:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:57.704 20:27:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:57.704 20:27:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:57.963 20:27:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:13:58.222 [ 00:13:58.222 { 00:13:58.222 "name": "BaseBdev3", 00:13:58.222 "aliases": [ 00:13:58.222 "d3a0b304-b69b-4b34-a12f-a37897adcbb7" 00:13:58.222 ], 00:13:58.222 "product_name": "Malloc disk", 00:13:58.222 "block_size": 512, 00:13:58.222 "num_blocks": 65536, 00:13:58.222 "uuid": "d3a0b304-b69b-4b34-a12f-a37897adcbb7", 00:13:58.222 "assigned_rate_limits": { 00:13:58.222 "rw_ios_per_sec": 0, 00:13:58.222 "rw_mbytes_per_sec": 0, 00:13:58.222 "r_mbytes_per_sec": 0, 00:13:58.222 "w_mbytes_per_sec": 0 00:13:58.222 }, 00:13:58.222 "claimed": false, 00:13:58.222 "zoned": false, 00:13:58.222 
"supported_io_types": { 00:13:58.222 "read": true, 00:13:58.222 "write": true, 00:13:58.222 "unmap": true, 00:13:58.222 "flush": true, 00:13:58.222 "reset": true, 00:13:58.222 "nvme_admin": false, 00:13:58.222 "nvme_io": false, 00:13:58.222 "nvme_io_md": false, 00:13:58.222 "write_zeroes": true, 00:13:58.222 "zcopy": true, 00:13:58.222 "get_zone_info": false, 00:13:58.222 "zone_management": false, 00:13:58.222 "zone_append": false, 00:13:58.222 "compare": false, 00:13:58.222 "compare_and_write": false, 00:13:58.222 "abort": true, 00:13:58.222 "seek_hole": false, 00:13:58.222 "seek_data": false, 00:13:58.222 "copy": true, 00:13:58.222 "nvme_iov_md": false 00:13:58.222 }, 00:13:58.222 "memory_domains": [ 00:13:58.222 { 00:13:58.222 "dma_device_id": "system", 00:13:58.222 "dma_device_type": 1 00:13:58.222 }, 00:13:58.222 { 00:13:58.222 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:58.222 "dma_device_type": 2 00:13:58.222 } 00:13:58.222 ], 00:13:58.222 "driver_specific": {} 00:13:58.222 } 00:13:58.222 ] 00:13:58.222 20:27:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:58.222 20:27:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:13:58.222 20:27:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:58.222 20:27:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:58.481 [2024-07-15 20:27:50.800335] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:58.481 [2024-07-15 20:27:50.800376] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:58.481 [2024-07-15 20:27:50.800395] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:58.481 
[2024-07-15 20:27:50.801771] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:58.481 20:27:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:58.481 20:27:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:58.481 20:27:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:58.481 20:27:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:58.481 20:27:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:58.481 20:27:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:58.481 20:27:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:58.481 20:27:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:58.481 20:27:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:58.481 20:27:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:58.481 20:27:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:58.481 20:27:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:58.740 20:27:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:58.740 "name": "Existed_Raid", 00:13:58.740 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:58.740 "strip_size_kb": 64, 00:13:58.740 "state": "configuring", 00:13:58.740 "raid_level": "raid0", 00:13:58.740 "superblock": false, 00:13:58.740 "num_base_bdevs": 3, 00:13:58.740 
"num_base_bdevs_discovered": 2, 00:13:58.740 "num_base_bdevs_operational": 3, 00:13:58.740 "base_bdevs_list": [ 00:13:58.740 { 00:13:58.740 "name": "BaseBdev1", 00:13:58.740 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:58.740 "is_configured": false, 00:13:58.740 "data_offset": 0, 00:13:58.740 "data_size": 0 00:13:58.740 }, 00:13:58.740 { 00:13:58.740 "name": "BaseBdev2", 00:13:58.741 "uuid": "854a7b9e-33d9-4dd7-8945-667cb425ff9e", 00:13:58.741 "is_configured": true, 00:13:58.741 "data_offset": 0, 00:13:58.741 "data_size": 65536 00:13:58.741 }, 00:13:58.741 { 00:13:58.741 "name": "BaseBdev3", 00:13:58.741 "uuid": "d3a0b304-b69b-4b34-a12f-a37897adcbb7", 00:13:58.741 "is_configured": true, 00:13:58.741 "data_offset": 0, 00:13:58.741 "data_size": 65536 00:13:58.741 } 00:13:58.741 ] 00:13:58.741 }' 00:13:58.741 20:27:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:58.741 20:27:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:59.308 20:27:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:13:59.568 [2024-07-15 20:27:51.839067] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:59.568 20:27:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:59.568 20:27:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:59.568 20:27:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:59.568 20:27:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:59.568 20:27:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:59.568 20:27:51 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:59.568 20:27:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:59.568 20:27:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:59.568 20:27:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:59.568 20:27:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:59.568 20:27:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:59.568 20:27:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:59.827 20:27:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:59.827 "name": "Existed_Raid", 00:13:59.827 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:59.827 "strip_size_kb": 64, 00:13:59.827 "state": "configuring", 00:13:59.827 "raid_level": "raid0", 00:13:59.827 "superblock": false, 00:13:59.827 "num_base_bdevs": 3, 00:13:59.827 "num_base_bdevs_discovered": 1, 00:13:59.827 "num_base_bdevs_operational": 3, 00:13:59.827 "base_bdevs_list": [ 00:13:59.827 { 00:13:59.827 "name": "BaseBdev1", 00:13:59.827 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:59.827 "is_configured": false, 00:13:59.827 "data_offset": 0, 00:13:59.827 "data_size": 0 00:13:59.827 }, 00:13:59.827 { 00:13:59.827 "name": null, 00:13:59.827 "uuid": "854a7b9e-33d9-4dd7-8945-667cb425ff9e", 00:13:59.827 "is_configured": false, 00:13:59.827 "data_offset": 0, 00:13:59.827 "data_size": 65536 00:13:59.827 }, 00:13:59.827 { 00:13:59.827 "name": "BaseBdev3", 00:13:59.827 "uuid": "d3a0b304-b69b-4b34-a12f-a37897adcbb7", 00:13:59.827 "is_configured": true, 00:13:59.827 "data_offset": 0, 00:13:59.827 "data_size": 65536 00:13:59.827 } 
00:13:59.827 ] 00:13:59.827 }' 00:13:59.827 20:27:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:59.827 20:27:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:00.395 20:27:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:00.395 20:27:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:14:00.654 20:27:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:14:00.654 20:27:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:14:00.913 [2024-07-15 20:27:53.190569] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:00.913 BaseBdev1 00:14:00.913 20:27:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:14:00.913 20:27:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:14:00.913 20:27:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:00.913 20:27:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:00.913 20:27:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:00.913 20:27:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:00.913 20:27:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:01.171 20:27:53 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:14:01.430 [ 00:14:01.430 { 00:14:01.430 "name": "BaseBdev1", 00:14:01.430 "aliases": [ 00:14:01.430 "cabe3a13-47ac-4e2a-a5c0-5e941f7b2634" 00:14:01.430 ], 00:14:01.430 "product_name": "Malloc disk", 00:14:01.430 "block_size": 512, 00:14:01.430 "num_blocks": 65536, 00:14:01.430 "uuid": "cabe3a13-47ac-4e2a-a5c0-5e941f7b2634", 00:14:01.430 "assigned_rate_limits": { 00:14:01.430 "rw_ios_per_sec": 0, 00:14:01.430 "rw_mbytes_per_sec": 0, 00:14:01.430 "r_mbytes_per_sec": 0, 00:14:01.430 "w_mbytes_per_sec": 0 00:14:01.430 }, 00:14:01.430 "claimed": true, 00:14:01.430 "claim_type": "exclusive_write", 00:14:01.430 "zoned": false, 00:14:01.430 "supported_io_types": { 00:14:01.430 "read": true, 00:14:01.430 "write": true, 00:14:01.430 "unmap": true, 00:14:01.430 "flush": true, 00:14:01.430 "reset": true, 00:14:01.430 "nvme_admin": false, 00:14:01.430 "nvme_io": false, 00:14:01.430 "nvme_io_md": false, 00:14:01.430 "write_zeroes": true, 00:14:01.430 "zcopy": true, 00:14:01.430 "get_zone_info": false, 00:14:01.430 "zone_management": false, 00:14:01.430 "zone_append": false, 00:14:01.430 "compare": false, 00:14:01.430 "compare_and_write": false, 00:14:01.430 "abort": true, 00:14:01.430 "seek_hole": false, 00:14:01.430 "seek_data": false, 00:14:01.430 "copy": true, 00:14:01.430 "nvme_iov_md": false 00:14:01.430 }, 00:14:01.430 "memory_domains": [ 00:14:01.430 { 00:14:01.430 "dma_device_id": "system", 00:14:01.430 "dma_device_type": 1 00:14:01.430 }, 00:14:01.430 { 00:14:01.430 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:01.430 "dma_device_type": 2 00:14:01.430 } 00:14:01.430 ], 00:14:01.430 "driver_specific": {} 00:14:01.430 } 00:14:01.430 ] 00:14:01.430 20:27:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:01.430 20:27:53 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:01.430 20:27:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:01.430 20:27:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:01.430 20:27:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:01.430 20:27:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:01.430 20:27:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:01.430 20:27:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:01.430 20:27:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:01.430 20:27:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:01.430 20:27:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:01.430 20:27:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:01.430 20:27:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:01.688 20:27:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:01.688 "name": "Existed_Raid", 00:14:01.688 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:01.688 "strip_size_kb": 64, 00:14:01.688 "state": "configuring", 00:14:01.688 "raid_level": "raid0", 00:14:01.688 "superblock": false, 00:14:01.688 "num_base_bdevs": 3, 00:14:01.688 "num_base_bdevs_discovered": 2, 00:14:01.688 "num_base_bdevs_operational": 3, 00:14:01.688 "base_bdevs_list": [ 00:14:01.688 { 00:14:01.688 "name": "BaseBdev1", 00:14:01.688 
"uuid": "cabe3a13-47ac-4e2a-a5c0-5e941f7b2634", 00:14:01.688 "is_configured": true, 00:14:01.688 "data_offset": 0, 00:14:01.688 "data_size": 65536 00:14:01.688 }, 00:14:01.688 { 00:14:01.688 "name": null, 00:14:01.688 "uuid": "854a7b9e-33d9-4dd7-8945-667cb425ff9e", 00:14:01.688 "is_configured": false, 00:14:01.688 "data_offset": 0, 00:14:01.688 "data_size": 65536 00:14:01.688 }, 00:14:01.688 { 00:14:01.688 "name": "BaseBdev3", 00:14:01.688 "uuid": "d3a0b304-b69b-4b34-a12f-a37897adcbb7", 00:14:01.688 "is_configured": true, 00:14:01.688 "data_offset": 0, 00:14:01.688 "data_size": 65536 00:14:01.688 } 00:14:01.688 ] 00:14:01.688 }' 00:14:01.688 20:27:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:01.688 20:27:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:02.256 20:27:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:02.256 20:27:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:14:02.515 20:27:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:14:02.515 20:27:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:14:02.775 [2024-07-15 20:27:55.019446] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:14:02.775 20:27:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:02.775 20:27:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:02.775 20:27:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 
00:14:02.775 20:27:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:02.775 20:27:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:02.775 20:27:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:02.775 20:27:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:02.775 20:27:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:02.775 20:27:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:02.775 20:27:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:02.775 20:27:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:02.775 20:27:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:03.034 20:27:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:03.034 "name": "Existed_Raid", 00:14:03.034 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:03.034 "strip_size_kb": 64, 00:14:03.034 "state": "configuring", 00:14:03.034 "raid_level": "raid0", 00:14:03.034 "superblock": false, 00:14:03.034 "num_base_bdevs": 3, 00:14:03.034 "num_base_bdevs_discovered": 1, 00:14:03.034 "num_base_bdevs_operational": 3, 00:14:03.034 "base_bdevs_list": [ 00:14:03.034 { 00:14:03.034 "name": "BaseBdev1", 00:14:03.034 "uuid": "cabe3a13-47ac-4e2a-a5c0-5e941f7b2634", 00:14:03.034 "is_configured": true, 00:14:03.034 "data_offset": 0, 00:14:03.034 "data_size": 65536 00:14:03.034 }, 00:14:03.034 { 00:14:03.034 "name": null, 00:14:03.034 "uuid": "854a7b9e-33d9-4dd7-8945-667cb425ff9e", 00:14:03.034 "is_configured": false, 00:14:03.034 
"data_offset": 0, 00:14:03.034 "data_size": 65536 00:14:03.034 }, 00:14:03.034 { 00:14:03.034 "name": null, 00:14:03.034 "uuid": "d3a0b304-b69b-4b34-a12f-a37897adcbb7", 00:14:03.034 "is_configured": false, 00:14:03.034 "data_offset": 0, 00:14:03.034 "data_size": 65536 00:14:03.034 } 00:14:03.034 ] 00:14:03.034 }' 00:14:03.034 20:27:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:03.034 20:27:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:03.602 20:27:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:03.602 20:27:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:14:03.862 20:27:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:14:03.862 20:27:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:14:04.121 [2024-07-15 20:27:56.346985] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:04.121 20:27:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:04.121 20:27:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:04.121 20:27:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:04.121 20:27:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:04.121 20:27:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:04.121 20:27:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- 
# local num_base_bdevs_operational=3 00:14:04.121 20:27:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:04.121 20:27:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:04.121 20:27:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:04.121 20:27:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:04.121 20:27:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:04.121 20:27:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:04.380 20:27:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:04.380 "name": "Existed_Raid", 00:14:04.380 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:04.380 "strip_size_kb": 64, 00:14:04.380 "state": "configuring", 00:14:04.380 "raid_level": "raid0", 00:14:04.380 "superblock": false, 00:14:04.380 "num_base_bdevs": 3, 00:14:04.380 "num_base_bdevs_discovered": 2, 00:14:04.380 "num_base_bdevs_operational": 3, 00:14:04.380 "base_bdevs_list": [ 00:14:04.380 { 00:14:04.380 "name": "BaseBdev1", 00:14:04.380 "uuid": "cabe3a13-47ac-4e2a-a5c0-5e941f7b2634", 00:14:04.380 "is_configured": true, 00:14:04.380 "data_offset": 0, 00:14:04.380 "data_size": 65536 00:14:04.380 }, 00:14:04.380 { 00:14:04.380 "name": null, 00:14:04.380 "uuid": "854a7b9e-33d9-4dd7-8945-667cb425ff9e", 00:14:04.380 "is_configured": false, 00:14:04.380 "data_offset": 0, 00:14:04.380 "data_size": 65536 00:14:04.380 }, 00:14:04.380 { 00:14:04.380 "name": "BaseBdev3", 00:14:04.380 "uuid": "d3a0b304-b69b-4b34-a12f-a37897adcbb7", 00:14:04.380 "is_configured": true, 00:14:04.380 "data_offset": 0, 00:14:04.380 "data_size": 65536 00:14:04.380 } 00:14:04.380 ] 
00:14:04.380 }' 00:14:04.380 20:27:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:04.380 20:27:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:04.947 20:27:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:04.947 20:27:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:14:05.206 20:27:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:14:05.206 20:27:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:05.465 [2024-07-15 20:27:57.682532] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:05.465 20:27:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:05.465 20:27:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:05.465 20:27:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:05.465 20:27:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:05.465 20:27:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:05.465 20:27:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:05.465 20:27:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:05.465 20:27:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:05.465 20:27:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- 
# local num_base_bdevs_discovered 00:14:05.465 20:27:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:05.465 20:27:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:05.465 20:27:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:05.724 20:27:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:05.724 "name": "Existed_Raid", 00:14:05.724 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:05.724 "strip_size_kb": 64, 00:14:05.724 "state": "configuring", 00:14:05.724 "raid_level": "raid0", 00:14:05.724 "superblock": false, 00:14:05.724 "num_base_bdevs": 3, 00:14:05.724 "num_base_bdevs_discovered": 1, 00:14:05.724 "num_base_bdevs_operational": 3, 00:14:05.724 "base_bdevs_list": [ 00:14:05.724 { 00:14:05.724 "name": null, 00:14:05.724 "uuid": "cabe3a13-47ac-4e2a-a5c0-5e941f7b2634", 00:14:05.724 "is_configured": false, 00:14:05.724 "data_offset": 0, 00:14:05.724 "data_size": 65536 00:14:05.724 }, 00:14:05.724 { 00:14:05.724 "name": null, 00:14:05.724 "uuid": "854a7b9e-33d9-4dd7-8945-667cb425ff9e", 00:14:05.724 "is_configured": false, 00:14:05.724 "data_offset": 0, 00:14:05.724 "data_size": 65536 00:14:05.724 }, 00:14:05.724 { 00:14:05.724 "name": "BaseBdev3", 00:14:05.724 "uuid": "d3a0b304-b69b-4b34-a12f-a37897adcbb7", 00:14:05.724 "is_configured": true, 00:14:05.724 "data_offset": 0, 00:14:05.724 "data_size": 65536 00:14:05.724 } 00:14:05.724 ] 00:14:05.724 }' 00:14:05.724 20:27:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:05.724 20:27:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:06.291 20:27:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:06.291 20:27:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:14:06.548 20:27:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:14:06.548 20:27:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:14:06.806 [2024-07-15 20:27:59.058566] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:06.806 20:27:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:06.806 20:27:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:06.806 20:27:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:06.806 20:27:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:06.806 20:27:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:06.806 20:27:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:06.806 20:27:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:06.806 20:27:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:06.806 20:27:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:06.806 20:27:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:06.806 20:27:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:06.806 20:27:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:07.064 20:27:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:07.064 "name": "Existed_Raid", 00:14:07.064 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:07.064 "strip_size_kb": 64, 00:14:07.064 "state": "configuring", 00:14:07.064 "raid_level": "raid0", 00:14:07.064 "superblock": false, 00:14:07.064 "num_base_bdevs": 3, 00:14:07.064 "num_base_bdevs_discovered": 2, 00:14:07.064 "num_base_bdevs_operational": 3, 00:14:07.064 "base_bdevs_list": [ 00:14:07.064 { 00:14:07.064 "name": null, 00:14:07.064 "uuid": "cabe3a13-47ac-4e2a-a5c0-5e941f7b2634", 00:14:07.064 "is_configured": false, 00:14:07.064 "data_offset": 0, 00:14:07.064 "data_size": 65536 00:14:07.064 }, 00:14:07.064 { 00:14:07.064 "name": "BaseBdev2", 00:14:07.064 "uuid": "854a7b9e-33d9-4dd7-8945-667cb425ff9e", 00:14:07.064 "is_configured": true, 00:14:07.064 "data_offset": 0, 00:14:07.064 "data_size": 65536 00:14:07.064 }, 00:14:07.064 { 00:14:07.064 "name": "BaseBdev3", 00:14:07.064 "uuid": "d3a0b304-b69b-4b34-a12f-a37897adcbb7", 00:14:07.064 "is_configured": true, 00:14:07.064 "data_offset": 0, 00:14:07.064 "data_size": 65536 00:14:07.064 } 00:14:07.064 ] 00:14:07.064 }' 00:14:07.064 20:27:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:07.064 20:27:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:07.631 20:27:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:07.631 20:27:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:14:07.889 
20:28:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:14:07.889 20:28:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:07.889 20:28:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:14:08.148 20:28:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u cabe3a13-47ac-4e2a-a5c0-5e941f7b2634 00:14:08.407 [2024-07-15 20:28:00.647305] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:14:08.407 [2024-07-15 20:28:00.647344] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xd9e450 00:14:08.407 [2024-07-15 20:28:00.647353] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:14:08.407 [2024-07-15 20:28:00.647545] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd9fa50 00:14:08.407 [2024-07-15 20:28:00.647658] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xd9e450 00:14:08.407 [2024-07-15 20:28:00.647668] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xd9e450 00:14:08.407 [2024-07-15 20:28:00.647834] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:08.407 NewBaseBdev 00:14:08.407 20:28:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:14:08.407 20:28:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:14:08.407 20:28:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:08.407 20:28:00 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@899 -- # local i 00:14:08.407 20:28:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:08.407 20:28:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:08.407 20:28:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:08.666 20:28:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:14:08.925 [ 00:14:08.925 { 00:14:08.925 "name": "NewBaseBdev", 00:14:08.925 "aliases": [ 00:14:08.925 "cabe3a13-47ac-4e2a-a5c0-5e941f7b2634" 00:14:08.925 ], 00:14:08.925 "product_name": "Malloc disk", 00:14:08.925 "block_size": 512, 00:14:08.925 "num_blocks": 65536, 00:14:08.925 "uuid": "cabe3a13-47ac-4e2a-a5c0-5e941f7b2634", 00:14:08.925 "assigned_rate_limits": { 00:14:08.925 "rw_ios_per_sec": 0, 00:14:08.925 "rw_mbytes_per_sec": 0, 00:14:08.925 "r_mbytes_per_sec": 0, 00:14:08.925 "w_mbytes_per_sec": 0 00:14:08.925 }, 00:14:08.925 "claimed": true, 00:14:08.925 "claim_type": "exclusive_write", 00:14:08.925 "zoned": false, 00:14:08.925 "supported_io_types": { 00:14:08.925 "read": true, 00:14:08.925 "write": true, 00:14:08.925 "unmap": true, 00:14:08.925 "flush": true, 00:14:08.925 "reset": true, 00:14:08.925 "nvme_admin": false, 00:14:08.925 "nvme_io": false, 00:14:08.925 "nvme_io_md": false, 00:14:08.925 "write_zeroes": true, 00:14:08.925 "zcopy": true, 00:14:08.925 "get_zone_info": false, 00:14:08.925 "zone_management": false, 00:14:08.925 "zone_append": false, 00:14:08.925 "compare": false, 00:14:08.925 "compare_and_write": false, 00:14:08.925 "abort": true, 00:14:08.925 "seek_hole": false, 00:14:08.925 "seek_data": false, 00:14:08.925 "copy": true, 00:14:08.925 "nvme_iov_md": 
false 00:14:08.925 }, 00:14:08.925 "memory_domains": [ 00:14:08.925 { 00:14:08.925 "dma_device_id": "system", 00:14:08.925 "dma_device_type": 1 00:14:08.925 }, 00:14:08.925 { 00:14:08.925 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:08.925 "dma_device_type": 2 00:14:08.925 } 00:14:08.925 ], 00:14:08.925 "driver_specific": {} 00:14:08.925 } 00:14:08.925 ] 00:14:08.925 20:28:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:08.925 20:28:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:14:08.925 20:28:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:08.925 20:28:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:08.925 20:28:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:08.925 20:28:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:08.925 20:28:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:08.925 20:28:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:08.925 20:28:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:08.925 20:28:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:08.925 20:28:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:08.925 20:28:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:08.925 20:28:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:09.183 20:28:01 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:09.183 "name": "Existed_Raid", 00:14:09.183 "uuid": "0b50e713-be34-4e72-a1a0-0915147516b6", 00:14:09.183 "strip_size_kb": 64, 00:14:09.183 "state": "online", 00:14:09.183 "raid_level": "raid0", 00:14:09.183 "superblock": false, 00:14:09.183 "num_base_bdevs": 3, 00:14:09.183 "num_base_bdevs_discovered": 3, 00:14:09.183 "num_base_bdevs_operational": 3, 00:14:09.183 "base_bdevs_list": [ 00:14:09.183 { 00:14:09.183 "name": "NewBaseBdev", 00:14:09.183 "uuid": "cabe3a13-47ac-4e2a-a5c0-5e941f7b2634", 00:14:09.183 "is_configured": true, 00:14:09.183 "data_offset": 0, 00:14:09.183 "data_size": 65536 00:14:09.183 }, 00:14:09.183 { 00:14:09.183 "name": "BaseBdev2", 00:14:09.183 "uuid": "854a7b9e-33d9-4dd7-8945-667cb425ff9e", 00:14:09.183 "is_configured": true, 00:14:09.183 "data_offset": 0, 00:14:09.183 "data_size": 65536 00:14:09.183 }, 00:14:09.183 { 00:14:09.183 "name": "BaseBdev3", 00:14:09.183 "uuid": "d3a0b304-b69b-4b34-a12f-a37897adcbb7", 00:14:09.183 "is_configured": true, 00:14:09.183 "data_offset": 0, 00:14:09.183 "data_size": 65536 00:14:09.183 } 00:14:09.183 ] 00:14:09.183 }' 00:14:09.183 20:28:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:09.183 20:28:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:09.751 20:28:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:14:09.751 20:28:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:14:09.751 20:28:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:09.751 20:28:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:09.751 20:28:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:09.751 20:28:01 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:14:09.751 20:28:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:14:09.751 20:28:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:10.010 [2024-07-15 20:28:02.143551] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:10.010 20:28:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:10.010 "name": "Existed_Raid", 00:14:10.010 "aliases": [ 00:14:10.010 "0b50e713-be34-4e72-a1a0-0915147516b6" 00:14:10.010 ], 00:14:10.010 "product_name": "Raid Volume", 00:14:10.010 "block_size": 512, 00:14:10.010 "num_blocks": 196608, 00:14:10.010 "uuid": "0b50e713-be34-4e72-a1a0-0915147516b6", 00:14:10.010 "assigned_rate_limits": { 00:14:10.010 "rw_ios_per_sec": 0, 00:14:10.010 "rw_mbytes_per_sec": 0, 00:14:10.010 "r_mbytes_per_sec": 0, 00:14:10.010 "w_mbytes_per_sec": 0 00:14:10.010 }, 00:14:10.010 "claimed": false, 00:14:10.010 "zoned": false, 00:14:10.010 "supported_io_types": { 00:14:10.010 "read": true, 00:14:10.010 "write": true, 00:14:10.010 "unmap": true, 00:14:10.010 "flush": true, 00:14:10.010 "reset": true, 00:14:10.010 "nvme_admin": false, 00:14:10.010 "nvme_io": false, 00:14:10.010 "nvme_io_md": false, 00:14:10.010 "write_zeroes": true, 00:14:10.010 "zcopy": false, 00:14:10.010 "get_zone_info": false, 00:14:10.010 "zone_management": false, 00:14:10.010 "zone_append": false, 00:14:10.010 "compare": false, 00:14:10.010 "compare_and_write": false, 00:14:10.010 "abort": false, 00:14:10.010 "seek_hole": false, 00:14:10.010 "seek_data": false, 00:14:10.010 "copy": false, 00:14:10.010 "nvme_iov_md": false 00:14:10.010 }, 00:14:10.010 "memory_domains": [ 00:14:10.010 { 00:14:10.010 "dma_device_id": "system", 00:14:10.010 "dma_device_type": 1 00:14:10.010 }, 
00:14:10.010 { 00:14:10.010 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:10.010 "dma_device_type": 2 00:14:10.010 }, 00:14:10.010 { 00:14:10.010 "dma_device_id": "system", 00:14:10.010 "dma_device_type": 1 00:14:10.010 }, 00:14:10.010 { 00:14:10.010 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:10.010 "dma_device_type": 2 00:14:10.010 }, 00:14:10.010 { 00:14:10.010 "dma_device_id": "system", 00:14:10.010 "dma_device_type": 1 00:14:10.010 }, 00:14:10.010 { 00:14:10.010 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:10.010 "dma_device_type": 2 00:14:10.010 } 00:14:10.010 ], 00:14:10.010 "driver_specific": { 00:14:10.010 "raid": { 00:14:10.010 "uuid": "0b50e713-be34-4e72-a1a0-0915147516b6", 00:14:10.010 "strip_size_kb": 64, 00:14:10.010 "state": "online", 00:14:10.010 "raid_level": "raid0", 00:14:10.010 "superblock": false, 00:14:10.010 "num_base_bdevs": 3, 00:14:10.010 "num_base_bdevs_discovered": 3, 00:14:10.010 "num_base_bdevs_operational": 3, 00:14:10.010 "base_bdevs_list": [ 00:14:10.010 { 00:14:10.010 "name": "NewBaseBdev", 00:14:10.010 "uuid": "cabe3a13-47ac-4e2a-a5c0-5e941f7b2634", 00:14:10.010 "is_configured": true, 00:14:10.010 "data_offset": 0, 00:14:10.010 "data_size": 65536 00:14:10.010 }, 00:14:10.010 { 00:14:10.010 "name": "BaseBdev2", 00:14:10.010 "uuid": "854a7b9e-33d9-4dd7-8945-667cb425ff9e", 00:14:10.010 "is_configured": true, 00:14:10.010 "data_offset": 0, 00:14:10.010 "data_size": 65536 00:14:10.010 }, 00:14:10.010 { 00:14:10.010 "name": "BaseBdev3", 00:14:10.010 "uuid": "d3a0b304-b69b-4b34-a12f-a37897adcbb7", 00:14:10.010 "is_configured": true, 00:14:10.010 "data_offset": 0, 00:14:10.010 "data_size": 65536 00:14:10.010 } 00:14:10.010 ] 00:14:10.010 } 00:14:10.010 } 00:14:10.010 }' 00:14:10.010 20:28:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:10.010 20:28:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # 
base_bdev_names='NewBaseBdev 00:14:10.010 BaseBdev2 00:14:10.010 BaseBdev3' 00:14:10.010 20:28:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:10.010 20:28:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:14:10.010 20:28:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:10.268 20:28:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:10.268 "name": "NewBaseBdev", 00:14:10.268 "aliases": [ 00:14:10.268 "cabe3a13-47ac-4e2a-a5c0-5e941f7b2634" 00:14:10.268 ], 00:14:10.268 "product_name": "Malloc disk", 00:14:10.268 "block_size": 512, 00:14:10.268 "num_blocks": 65536, 00:14:10.268 "uuid": "cabe3a13-47ac-4e2a-a5c0-5e941f7b2634", 00:14:10.268 "assigned_rate_limits": { 00:14:10.268 "rw_ios_per_sec": 0, 00:14:10.268 "rw_mbytes_per_sec": 0, 00:14:10.268 "r_mbytes_per_sec": 0, 00:14:10.268 "w_mbytes_per_sec": 0 00:14:10.268 }, 00:14:10.268 "claimed": true, 00:14:10.268 "claim_type": "exclusive_write", 00:14:10.268 "zoned": false, 00:14:10.268 "supported_io_types": { 00:14:10.268 "read": true, 00:14:10.268 "write": true, 00:14:10.268 "unmap": true, 00:14:10.268 "flush": true, 00:14:10.268 "reset": true, 00:14:10.268 "nvme_admin": false, 00:14:10.268 "nvme_io": false, 00:14:10.268 "nvme_io_md": false, 00:14:10.268 "write_zeroes": true, 00:14:10.268 "zcopy": true, 00:14:10.268 "get_zone_info": false, 00:14:10.268 "zone_management": false, 00:14:10.268 "zone_append": false, 00:14:10.268 "compare": false, 00:14:10.268 "compare_and_write": false, 00:14:10.268 "abort": true, 00:14:10.268 "seek_hole": false, 00:14:10.268 "seek_data": false, 00:14:10.268 "copy": true, 00:14:10.268 "nvme_iov_md": false 00:14:10.268 }, 00:14:10.268 "memory_domains": [ 00:14:10.268 { 00:14:10.268 "dma_device_id": "system", 00:14:10.268 
"dma_device_type": 1 00:14:10.268 }, 00:14:10.268 { 00:14:10.268 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:10.268 "dma_device_type": 2 00:14:10.268 } 00:14:10.268 ], 00:14:10.268 "driver_specific": {} 00:14:10.268 }' 00:14:10.268 20:28:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:10.268 20:28:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:10.268 20:28:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:10.268 20:28:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:10.268 20:28:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:10.268 20:28:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:10.268 20:28:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:10.527 20:28:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:10.527 20:28:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:10.527 20:28:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:10.527 20:28:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:10.527 20:28:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:10.527 20:28:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:10.527 20:28:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:10.527 20:28:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:10.786 20:28:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:10.786 "name": 
"BaseBdev2", 00:14:10.786 "aliases": [ 00:14:10.786 "854a7b9e-33d9-4dd7-8945-667cb425ff9e" 00:14:10.786 ], 00:14:10.786 "product_name": "Malloc disk", 00:14:10.786 "block_size": 512, 00:14:10.786 "num_blocks": 65536, 00:14:10.786 "uuid": "854a7b9e-33d9-4dd7-8945-667cb425ff9e", 00:14:10.786 "assigned_rate_limits": { 00:14:10.786 "rw_ios_per_sec": 0, 00:14:10.786 "rw_mbytes_per_sec": 0, 00:14:10.786 "r_mbytes_per_sec": 0, 00:14:10.786 "w_mbytes_per_sec": 0 00:14:10.786 }, 00:14:10.786 "claimed": true, 00:14:10.786 "claim_type": "exclusive_write", 00:14:10.786 "zoned": false, 00:14:10.786 "supported_io_types": { 00:14:10.786 "read": true, 00:14:10.786 "write": true, 00:14:10.786 "unmap": true, 00:14:10.786 "flush": true, 00:14:10.786 "reset": true, 00:14:10.786 "nvme_admin": false, 00:14:10.786 "nvme_io": false, 00:14:10.786 "nvme_io_md": false, 00:14:10.786 "write_zeroes": true, 00:14:10.786 "zcopy": true, 00:14:10.786 "get_zone_info": false, 00:14:10.786 "zone_management": false, 00:14:10.786 "zone_append": false, 00:14:10.786 "compare": false, 00:14:10.786 "compare_and_write": false, 00:14:10.786 "abort": true, 00:14:10.786 "seek_hole": false, 00:14:10.786 "seek_data": false, 00:14:10.786 "copy": true, 00:14:10.786 "nvme_iov_md": false 00:14:10.786 }, 00:14:10.786 "memory_domains": [ 00:14:10.786 { 00:14:10.786 "dma_device_id": "system", 00:14:10.786 "dma_device_type": 1 00:14:10.786 }, 00:14:10.786 { 00:14:10.786 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:10.786 "dma_device_type": 2 00:14:10.786 } 00:14:10.786 ], 00:14:10.786 "driver_specific": {} 00:14:10.786 }' 00:14:10.786 20:28:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:10.786 20:28:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:10.786 20:28:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:10.786 20:28:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq 
.md_size 00:14:10.786 20:28:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:11.044 20:28:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:11.044 20:28:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:11.044 20:28:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:11.044 20:28:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:11.044 20:28:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:11.044 20:28:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:11.044 20:28:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:11.044 20:28:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:11.044 20:28:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:14:11.044 20:28:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:11.302 20:28:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:11.302 "name": "BaseBdev3", 00:14:11.302 "aliases": [ 00:14:11.302 "d3a0b304-b69b-4b34-a12f-a37897adcbb7" 00:14:11.302 ], 00:14:11.302 "product_name": "Malloc disk", 00:14:11.302 "block_size": 512, 00:14:11.302 "num_blocks": 65536, 00:14:11.302 "uuid": "d3a0b304-b69b-4b34-a12f-a37897adcbb7", 00:14:11.302 "assigned_rate_limits": { 00:14:11.302 "rw_ios_per_sec": 0, 00:14:11.302 "rw_mbytes_per_sec": 0, 00:14:11.302 "r_mbytes_per_sec": 0, 00:14:11.302 "w_mbytes_per_sec": 0 00:14:11.302 }, 00:14:11.302 "claimed": true, 00:14:11.302 "claim_type": "exclusive_write", 00:14:11.302 "zoned": false, 00:14:11.302 "supported_io_types": { 
00:14:11.302 "read": true, 00:14:11.302 "write": true, 00:14:11.302 "unmap": true, 00:14:11.302 "flush": true, 00:14:11.302 "reset": true, 00:14:11.302 "nvme_admin": false, 00:14:11.302 "nvme_io": false, 00:14:11.302 "nvme_io_md": false, 00:14:11.302 "write_zeroes": true, 00:14:11.302 "zcopy": true, 00:14:11.302 "get_zone_info": false, 00:14:11.302 "zone_management": false, 00:14:11.302 "zone_append": false, 00:14:11.302 "compare": false, 00:14:11.302 "compare_and_write": false, 00:14:11.302 "abort": true, 00:14:11.302 "seek_hole": false, 00:14:11.302 "seek_data": false, 00:14:11.302 "copy": true, 00:14:11.302 "nvme_iov_md": false 00:14:11.302 }, 00:14:11.302 "memory_domains": [ 00:14:11.302 { 00:14:11.302 "dma_device_id": "system", 00:14:11.302 "dma_device_type": 1 00:14:11.302 }, 00:14:11.302 { 00:14:11.302 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:11.302 "dma_device_type": 2 00:14:11.302 } 00:14:11.302 ], 00:14:11.302 "driver_specific": {} 00:14:11.302 }' 00:14:11.303 20:28:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:11.303 20:28:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:11.560 20:28:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:11.560 20:28:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:11.560 20:28:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:11.560 20:28:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:11.560 20:28:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:11.560 20:28:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:11.560 20:28:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:11.560 20:28:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:14:11.560 20:28:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:11.818 20:28:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:11.818 20:28:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:11.818 [2024-07-15 20:28:04.176662] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:11.818 [2024-07-15 20:28:04.176687] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:11.818 [2024-07-15 20:28:04.176739] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:11.818 [2024-07-15 20:28:04.176789] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:11.818 [2024-07-15 20:28:04.176800] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd9e450 name Existed_Raid, state offline 00:14:11.818 20:28:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 1370505 00:14:11.818 20:28:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 1370505 ']' 00:14:11.818 20:28:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 1370505 00:14:11.818 20:28:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:14:12.108 20:28:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:14:12.109 20:28:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1370505 00:14:12.109 20:28:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:14:12.109 20:28:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 
= sudo ']' 00:14:12.109 20:28:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1370505' 00:14:12.109 killing process with pid 1370505 00:14:12.109 20:28:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 1370505 00:14:12.109 [2024-07-15 20:28:04.244575] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:12.109 20:28:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 1370505 00:14:12.109 [2024-07-15 20:28:04.271146] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:12.109 20:28:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:14:12.109 00:14:12.109 real 0m28.700s 00:14:12.109 user 0m52.620s 00:14:12.109 sys 0m5.183s 00:14:12.109 20:28:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:14:12.109 20:28:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:12.109 ************************************ 00:14:12.109 END TEST raid_state_function_test 00:14:12.109 ************************************ 00:14:12.368 20:28:04 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:14:12.368 20:28:04 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 3 true 00:14:12.368 20:28:04 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:14:12.368 20:28:04 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:14:12.368 20:28:04 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:12.368 ************************************ 00:14:12.368 START TEST raid_state_function_test_sb 00:14:12.368 ************************************ 00:14:12.368 20:28:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 3 true 00:14:12.368 20:28:04 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:14:12.368 20:28:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:14:12.368 20:28:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:14:12.368 20:28:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:14:12.368 20:28:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:14:12.368 20:28:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:12.368 20:28:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:14:12.368 20:28:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:12.368 20:28:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:12.368 20:28:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:14:12.368 20:28:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:12.368 20:28:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:12.368 20:28:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:14:12.368 20:28:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:12.368 20:28:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:12.368 20:28:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:14:12.368 20:28:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:14:12.368 20:28:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:14:12.368 20:28:04 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@226 -- # local strip_size 00:14:12.368 20:28:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:14:12.368 20:28:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:14:12.368 20:28:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:14:12.368 20:28:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:14:12.368 20:28:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:14:12.368 20:28:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:14:12.368 20:28:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:14:12.368 20:28:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=1374920 00:14:12.368 20:28:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1374920' 00:14:12.368 Process raid pid: 1374920 00:14:12.368 20:28:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:14:12.368 20:28:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 1374920 /var/tmp/spdk-raid.sock 00:14:12.368 20:28:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 1374920 ']' 00:14:12.368 20:28:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:12.368 20:28:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:14:12.368 20:28:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain 
socket /var/tmp/spdk-raid.sock...' 00:14:12.368 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:12.368 20:28:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:14:12.368 20:28:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:12.368 [2024-07-15 20:28:04.627590] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 00:14:12.368 [2024-07-15 20:28:04.627654] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:14:12.626 [2024-07-15 20:28:04.757893] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:12.626 [2024-07-15 20:28:04.854875] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:12.626 [2024-07-15 20:28:04.919006] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:12.626 [2024-07-15 20:28:04.919042] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:13.193 20:28:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:14:13.193 20:28:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:14:13.193 20:28:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:13.451 [2024-07-15 20:28:05.781951] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:13.451 [2024-07-15 20:28:05.781993] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:13.451 [2024-07-15 20:28:05.782004] 
bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:13.451 [2024-07-15 20:28:05.782017] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:13.451 [2024-07-15 20:28:05.782026] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:13.451 [2024-07-15 20:28:05.782037] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:13.451 20:28:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:13.451 20:28:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:13.451 20:28:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:13.451 20:28:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:13.451 20:28:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:13.451 20:28:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:13.451 20:28:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:13.451 20:28:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:13.451 20:28:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:13.451 20:28:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:13.451 20:28:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:13.451 20:28:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == 
"Existed_Raid")' 00:14:13.709 20:28:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:13.709 "name": "Existed_Raid", 00:14:13.709 "uuid": "eca18687-9b82-47d1-8e35-b24369e1194a", 00:14:13.709 "strip_size_kb": 64, 00:14:13.709 "state": "configuring", 00:14:13.709 "raid_level": "raid0", 00:14:13.709 "superblock": true, 00:14:13.709 "num_base_bdevs": 3, 00:14:13.709 "num_base_bdevs_discovered": 0, 00:14:13.709 "num_base_bdevs_operational": 3, 00:14:13.709 "base_bdevs_list": [ 00:14:13.709 { 00:14:13.709 "name": "BaseBdev1", 00:14:13.709 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:13.709 "is_configured": false, 00:14:13.709 "data_offset": 0, 00:14:13.709 "data_size": 0 00:14:13.709 }, 00:14:13.709 { 00:14:13.709 "name": "BaseBdev2", 00:14:13.709 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:13.709 "is_configured": false, 00:14:13.709 "data_offset": 0, 00:14:13.709 "data_size": 0 00:14:13.709 }, 00:14:13.709 { 00:14:13.709 "name": "BaseBdev3", 00:14:13.709 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:13.709 "is_configured": false, 00:14:13.709 "data_offset": 0, 00:14:13.709 "data_size": 0 00:14:13.709 } 00:14:13.709 ] 00:14:13.709 }' 00:14:13.709 20:28:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:13.709 20:28:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:14.276 20:28:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:14.537 [2024-07-15 20:28:06.856648] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:14.537 [2024-07-15 20:28:06.856676] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1486a80 name Existed_Raid, state configuring 00:14:14.537 20:28:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:14.867 [2024-07-15 20:28:07.101330] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:14.867 [2024-07-15 20:28:07.101360] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:14.867 [2024-07-15 20:28:07.101370] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:14.867 [2024-07-15 20:28:07.101382] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:14.867 [2024-07-15 20:28:07.101390] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:14.867 [2024-07-15 20:28:07.101402] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:14.867 20:28:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:14:15.127 [2024-07-15 20:28:07.351661] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:15.127 BaseBdev1 00:14:15.127 20:28:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:14:15.127 20:28:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:14:15.127 20:28:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:15.127 20:28:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:14:15.127 20:28:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:15.127 20:28:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # 
bdev_timeout=2000 00:14:15.127 20:28:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:15.386 20:28:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:14:15.645 [ 00:14:15.645 { 00:14:15.645 "name": "BaseBdev1", 00:14:15.645 "aliases": [ 00:14:15.645 "da62073c-dbad-43d9-8bc0-1212f89059e3" 00:14:15.645 ], 00:14:15.645 "product_name": "Malloc disk", 00:14:15.645 "block_size": 512, 00:14:15.645 "num_blocks": 65536, 00:14:15.645 "uuid": "da62073c-dbad-43d9-8bc0-1212f89059e3", 00:14:15.645 "assigned_rate_limits": { 00:14:15.645 "rw_ios_per_sec": 0, 00:14:15.645 "rw_mbytes_per_sec": 0, 00:14:15.645 "r_mbytes_per_sec": 0, 00:14:15.645 "w_mbytes_per_sec": 0 00:14:15.645 }, 00:14:15.645 "claimed": true, 00:14:15.645 "claim_type": "exclusive_write", 00:14:15.645 "zoned": false, 00:14:15.645 "supported_io_types": { 00:14:15.645 "read": true, 00:14:15.645 "write": true, 00:14:15.645 "unmap": true, 00:14:15.645 "flush": true, 00:14:15.645 "reset": true, 00:14:15.645 "nvme_admin": false, 00:14:15.645 "nvme_io": false, 00:14:15.645 "nvme_io_md": false, 00:14:15.645 "write_zeroes": true, 00:14:15.645 "zcopy": true, 00:14:15.645 "get_zone_info": false, 00:14:15.645 "zone_management": false, 00:14:15.645 "zone_append": false, 00:14:15.645 "compare": false, 00:14:15.645 "compare_and_write": false, 00:14:15.645 "abort": true, 00:14:15.645 "seek_hole": false, 00:14:15.645 "seek_data": false, 00:14:15.645 "copy": true, 00:14:15.645 "nvme_iov_md": false 00:14:15.645 }, 00:14:15.645 "memory_domains": [ 00:14:15.645 { 00:14:15.645 "dma_device_id": "system", 00:14:15.645 "dma_device_type": 1 00:14:15.645 }, 00:14:15.645 { 00:14:15.645 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:15.645 
"dma_device_type": 2 00:14:15.645 } 00:14:15.645 ], 00:14:15.645 "driver_specific": {} 00:14:15.645 } 00:14:15.645 ] 00:14:15.645 20:28:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:14:15.645 20:28:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:15.645 20:28:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:15.645 20:28:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:15.645 20:28:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:15.645 20:28:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:15.645 20:28:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:15.645 20:28:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:15.645 20:28:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:15.645 20:28:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:15.645 20:28:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:15.645 20:28:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:15.645 20:28:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:15.904 20:28:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:15.905 "name": "Existed_Raid", 00:14:15.905 "uuid": "b7fb028d-7bcd-4a8d-8efc-e0e855cfcb80", 00:14:15.905 "strip_size_kb": 64, 
00:14:15.905 "state": "configuring", 00:14:15.905 "raid_level": "raid0", 00:14:15.905 "superblock": true, 00:14:15.905 "num_base_bdevs": 3, 00:14:15.905 "num_base_bdevs_discovered": 1, 00:14:15.905 "num_base_bdevs_operational": 3, 00:14:15.905 "base_bdevs_list": [ 00:14:15.905 { 00:14:15.905 "name": "BaseBdev1", 00:14:15.905 "uuid": "da62073c-dbad-43d9-8bc0-1212f89059e3", 00:14:15.905 "is_configured": true, 00:14:15.905 "data_offset": 2048, 00:14:15.905 "data_size": 63488 00:14:15.905 }, 00:14:15.905 { 00:14:15.905 "name": "BaseBdev2", 00:14:15.905 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:15.905 "is_configured": false, 00:14:15.905 "data_offset": 0, 00:14:15.905 "data_size": 0 00:14:15.905 }, 00:14:15.905 { 00:14:15.905 "name": "BaseBdev3", 00:14:15.905 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:15.905 "is_configured": false, 00:14:15.905 "data_offset": 0, 00:14:15.905 "data_size": 0 00:14:15.905 } 00:14:15.905 ] 00:14:15.905 }' 00:14:15.905 20:28:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:15.905 20:28:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:16.473 20:28:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:16.732 [2024-07-15 20:28:08.871697] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:16.732 [2024-07-15 20:28:08.871735] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1486310 name Existed_Raid, state configuring 00:14:16.732 20:28:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:16.991 [2024-07-15 20:28:09.120399] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:16.991 [2024-07-15 20:28:09.121955] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:16.991 [2024-07-15 20:28:09.121987] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:16.991 [2024-07-15 20:28:09.121997] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:16.991 [2024-07-15 20:28:09.122009] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:16.991 20:28:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:14:16.991 20:28:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:16.991 20:28:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:16.991 20:28:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:16.991 20:28:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:16.991 20:28:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:16.991 20:28:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:16.991 20:28:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:16.991 20:28:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:16.991 20:28:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:16.991 20:28:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:16.991 20:28:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- 
# local tmp 00:14:16.991 20:28:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:16.991 20:28:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:17.250 20:28:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:17.250 "name": "Existed_Raid", 00:14:17.250 "uuid": "1f0bd309-3d6b-45ad-be40-09475f1030da", 00:14:17.250 "strip_size_kb": 64, 00:14:17.250 "state": "configuring", 00:14:17.250 "raid_level": "raid0", 00:14:17.250 "superblock": true, 00:14:17.250 "num_base_bdevs": 3, 00:14:17.250 "num_base_bdevs_discovered": 1, 00:14:17.250 "num_base_bdevs_operational": 3, 00:14:17.250 "base_bdevs_list": [ 00:14:17.250 { 00:14:17.250 "name": "BaseBdev1", 00:14:17.250 "uuid": "da62073c-dbad-43d9-8bc0-1212f89059e3", 00:14:17.250 "is_configured": true, 00:14:17.250 "data_offset": 2048, 00:14:17.250 "data_size": 63488 00:14:17.250 }, 00:14:17.250 { 00:14:17.250 "name": "BaseBdev2", 00:14:17.250 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:17.250 "is_configured": false, 00:14:17.250 "data_offset": 0, 00:14:17.250 "data_size": 0 00:14:17.250 }, 00:14:17.250 { 00:14:17.250 "name": "BaseBdev3", 00:14:17.250 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:17.250 "is_configured": false, 00:14:17.250 "data_offset": 0, 00:14:17.250 "data_size": 0 00:14:17.250 } 00:14:17.250 ] 00:14:17.250 }' 00:14:17.250 20:28:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:17.250 20:28:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:17.509 20:28:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:14:17.768 
[2024-07-15 20:28:10.110474] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:17.768 BaseBdev2 00:14:17.768 20:28:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:14:17.768 20:28:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:14:17.768 20:28:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:17.768 20:28:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:14:17.768 20:28:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:17.768 20:28:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:17.768 20:28:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:18.027 20:28:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:14:18.287 [ 00:14:18.287 { 00:14:18.287 "name": "BaseBdev2", 00:14:18.287 "aliases": [ 00:14:18.287 "9bb281e2-d725-4869-bcaa-58a9185e84f3" 00:14:18.287 ], 00:14:18.287 "product_name": "Malloc disk", 00:14:18.287 "block_size": 512, 00:14:18.287 "num_blocks": 65536, 00:14:18.287 "uuid": "9bb281e2-d725-4869-bcaa-58a9185e84f3", 00:14:18.287 "assigned_rate_limits": { 00:14:18.287 "rw_ios_per_sec": 0, 00:14:18.287 "rw_mbytes_per_sec": 0, 00:14:18.287 "r_mbytes_per_sec": 0, 00:14:18.287 "w_mbytes_per_sec": 0 00:14:18.287 }, 00:14:18.287 "claimed": true, 00:14:18.287 "claim_type": "exclusive_write", 00:14:18.287 "zoned": false, 00:14:18.287 "supported_io_types": { 00:14:18.287 "read": true, 00:14:18.287 "write": true, 00:14:18.287 "unmap": 
true, 00:14:18.287 "flush": true, 00:14:18.287 "reset": true, 00:14:18.287 "nvme_admin": false, 00:14:18.287 "nvme_io": false, 00:14:18.287 "nvme_io_md": false, 00:14:18.287 "write_zeroes": true, 00:14:18.287 "zcopy": true, 00:14:18.287 "get_zone_info": false, 00:14:18.287 "zone_management": false, 00:14:18.287 "zone_append": false, 00:14:18.287 "compare": false, 00:14:18.287 "compare_and_write": false, 00:14:18.287 "abort": true, 00:14:18.287 "seek_hole": false, 00:14:18.287 "seek_data": false, 00:14:18.287 "copy": true, 00:14:18.287 "nvme_iov_md": false 00:14:18.287 }, 00:14:18.287 "memory_domains": [ 00:14:18.287 { 00:14:18.287 "dma_device_id": "system", 00:14:18.287 "dma_device_type": 1 00:14:18.287 }, 00:14:18.287 { 00:14:18.287 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:18.287 "dma_device_type": 2 00:14:18.287 } 00:14:18.287 ], 00:14:18.287 "driver_specific": {} 00:14:18.287 } 00:14:18.287 ] 00:14:18.287 20:28:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:14:18.287 20:28:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:14:18.287 20:28:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:18.287 20:28:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:18.287 20:28:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:18.287 20:28:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:18.287 20:28:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:18.287 20:28:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:18.287 20:28:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:18.287 
20:28:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:18.287 20:28:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:18.287 20:28:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:18.287 20:28:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:18.287 20:28:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:18.287 20:28:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:18.550 20:28:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:18.550 "name": "Existed_Raid", 00:14:18.550 "uuid": "1f0bd309-3d6b-45ad-be40-09475f1030da", 00:14:18.550 "strip_size_kb": 64, 00:14:18.550 "state": "configuring", 00:14:18.550 "raid_level": "raid0", 00:14:18.550 "superblock": true, 00:14:18.550 "num_base_bdevs": 3, 00:14:18.550 "num_base_bdevs_discovered": 2, 00:14:18.550 "num_base_bdevs_operational": 3, 00:14:18.550 "base_bdevs_list": [ 00:14:18.550 { 00:14:18.550 "name": "BaseBdev1", 00:14:18.550 "uuid": "da62073c-dbad-43d9-8bc0-1212f89059e3", 00:14:18.550 "is_configured": true, 00:14:18.550 "data_offset": 2048, 00:14:18.550 "data_size": 63488 00:14:18.550 }, 00:14:18.550 { 00:14:18.550 "name": "BaseBdev2", 00:14:18.550 "uuid": "9bb281e2-d725-4869-bcaa-58a9185e84f3", 00:14:18.550 "is_configured": true, 00:14:18.550 "data_offset": 2048, 00:14:18.550 "data_size": 63488 00:14:18.550 }, 00:14:18.550 { 00:14:18.550 "name": "BaseBdev3", 00:14:18.550 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:18.550 "is_configured": false, 00:14:18.550 "data_offset": 0, 00:14:18.550 "data_size": 0 00:14:18.550 } 00:14:18.550 ] 00:14:18.550 }' 00:14:18.550 
20:28:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:18.551 20:28:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:19.118 20:28:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:14:19.384 [2024-07-15 20:28:11.714147] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:19.384 [2024-07-15 20:28:11.714305] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1487400 00:14:19.384 [2024-07-15 20:28:11.714320] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:14:19.384 [2024-07-15 20:28:11.714490] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1486ef0 00:14:19.384 [2024-07-15 20:28:11.714602] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1487400 00:14:19.384 [2024-07-15 20:28:11.714612] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1487400 00:14:19.384 [2024-07-15 20:28:11.714702] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:19.384 BaseBdev3 00:14:19.384 20:28:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:14:19.384 20:28:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:14:19.384 20:28:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:19.384 20:28:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:14:19.384 20:28:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:19.384 20:28:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # 
bdev_timeout=2000 00:14:19.384 20:28:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:19.644 20:28:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:14:19.903 [ 00:14:19.903 { 00:14:19.903 "name": "BaseBdev3", 00:14:19.903 "aliases": [ 00:14:19.903 "6e3f9ad9-4ba2-4250-a866-5ff360f2e205" 00:14:19.903 ], 00:14:19.903 "product_name": "Malloc disk", 00:14:19.903 "block_size": 512, 00:14:19.903 "num_blocks": 65536, 00:14:19.903 "uuid": "6e3f9ad9-4ba2-4250-a866-5ff360f2e205", 00:14:19.903 "assigned_rate_limits": { 00:14:19.903 "rw_ios_per_sec": 0, 00:14:19.903 "rw_mbytes_per_sec": 0, 00:14:19.903 "r_mbytes_per_sec": 0, 00:14:19.903 "w_mbytes_per_sec": 0 00:14:19.903 }, 00:14:19.903 "claimed": true, 00:14:19.903 "claim_type": "exclusive_write", 00:14:19.903 "zoned": false, 00:14:19.903 "supported_io_types": { 00:14:19.903 "read": true, 00:14:19.903 "write": true, 00:14:19.903 "unmap": true, 00:14:19.903 "flush": true, 00:14:19.903 "reset": true, 00:14:19.903 "nvme_admin": false, 00:14:19.903 "nvme_io": false, 00:14:19.903 "nvme_io_md": false, 00:14:19.903 "write_zeroes": true, 00:14:19.903 "zcopy": true, 00:14:19.903 "get_zone_info": false, 00:14:19.903 "zone_management": false, 00:14:19.903 "zone_append": false, 00:14:19.903 "compare": false, 00:14:19.903 "compare_and_write": false, 00:14:19.903 "abort": true, 00:14:19.903 "seek_hole": false, 00:14:19.903 "seek_data": false, 00:14:19.903 "copy": true, 00:14:19.903 "nvme_iov_md": false 00:14:19.903 }, 00:14:19.903 "memory_domains": [ 00:14:19.903 { 00:14:19.903 "dma_device_id": "system", 00:14:19.903 "dma_device_type": 1 00:14:19.903 }, 00:14:19.903 { 00:14:19.903 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:19.903 
"dma_device_type": 2 00:14:19.903 } 00:14:19.903 ], 00:14:19.903 "driver_specific": {} 00:14:19.903 } 00:14:19.903 ] 00:14:19.903 20:28:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:14:19.903 20:28:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:14:19.903 20:28:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:19.903 20:28:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:14:19.903 20:28:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:19.903 20:28:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:19.903 20:28:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:19.903 20:28:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:19.903 20:28:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:19.903 20:28:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:19.903 20:28:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:19.903 20:28:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:19.903 20:28:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:19.903 20:28:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:19.903 20:28:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:20.162 20:28:12 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:20.162 "name": "Existed_Raid", 00:14:20.162 "uuid": "1f0bd309-3d6b-45ad-be40-09475f1030da", 00:14:20.162 "strip_size_kb": 64, 00:14:20.162 "state": "online", 00:14:20.162 "raid_level": "raid0", 00:14:20.162 "superblock": true, 00:14:20.162 "num_base_bdevs": 3, 00:14:20.162 "num_base_bdevs_discovered": 3, 00:14:20.162 "num_base_bdevs_operational": 3, 00:14:20.162 "base_bdevs_list": [ 00:14:20.162 { 00:14:20.162 "name": "BaseBdev1", 00:14:20.162 "uuid": "da62073c-dbad-43d9-8bc0-1212f89059e3", 00:14:20.162 "is_configured": true, 00:14:20.162 "data_offset": 2048, 00:14:20.162 "data_size": 63488 00:14:20.162 }, 00:14:20.162 { 00:14:20.162 "name": "BaseBdev2", 00:14:20.162 "uuid": "9bb281e2-d725-4869-bcaa-58a9185e84f3", 00:14:20.162 "is_configured": true, 00:14:20.162 "data_offset": 2048, 00:14:20.162 "data_size": 63488 00:14:20.162 }, 00:14:20.162 { 00:14:20.162 "name": "BaseBdev3", 00:14:20.162 "uuid": "6e3f9ad9-4ba2-4250-a866-5ff360f2e205", 00:14:20.162 "is_configured": true, 00:14:20.162 "data_offset": 2048, 00:14:20.162 "data_size": 63488 00:14:20.162 } 00:14:20.162 ] 00:14:20.162 }' 00:14:20.162 20:28:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:20.162 20:28:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:20.730 20:28:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:14:20.730 20:28:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:14:20.730 20:28:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:20.730 20:28:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:20.730 20:28:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 
00:14:20.730 20:28:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:14:20.730 20:28:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:14:20.730 20:28:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:20.990 [2024-07-15 20:28:13.290630] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:20.990 20:28:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:20.990 "name": "Existed_Raid", 00:14:20.990 "aliases": [ 00:14:20.990 "1f0bd309-3d6b-45ad-be40-09475f1030da" 00:14:20.990 ], 00:14:20.990 "product_name": "Raid Volume", 00:14:20.990 "block_size": 512, 00:14:20.990 "num_blocks": 190464, 00:14:20.990 "uuid": "1f0bd309-3d6b-45ad-be40-09475f1030da", 00:14:20.990 "assigned_rate_limits": { 00:14:20.990 "rw_ios_per_sec": 0, 00:14:20.990 "rw_mbytes_per_sec": 0, 00:14:20.990 "r_mbytes_per_sec": 0, 00:14:20.990 "w_mbytes_per_sec": 0 00:14:20.990 }, 00:14:20.990 "claimed": false, 00:14:20.990 "zoned": false, 00:14:20.990 "supported_io_types": { 00:14:20.990 "read": true, 00:14:20.990 "write": true, 00:14:20.990 "unmap": true, 00:14:20.990 "flush": true, 00:14:20.990 "reset": true, 00:14:20.990 "nvme_admin": false, 00:14:20.990 "nvme_io": false, 00:14:20.990 "nvme_io_md": false, 00:14:20.990 "write_zeroes": true, 00:14:20.990 "zcopy": false, 00:14:20.990 "get_zone_info": false, 00:14:20.990 "zone_management": false, 00:14:20.990 "zone_append": false, 00:14:20.990 "compare": false, 00:14:20.990 "compare_and_write": false, 00:14:20.990 "abort": false, 00:14:20.990 "seek_hole": false, 00:14:20.990 "seek_data": false, 00:14:20.990 "copy": false, 00:14:20.990 "nvme_iov_md": false 00:14:20.990 }, 00:14:20.990 "memory_domains": [ 00:14:20.990 { 00:14:20.990 "dma_device_id": "system", 00:14:20.990 
"dma_device_type": 1 00:14:20.990 }, 00:14:20.990 { 00:14:20.990 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:20.990 "dma_device_type": 2 00:14:20.990 }, 00:14:20.990 { 00:14:20.990 "dma_device_id": "system", 00:14:20.990 "dma_device_type": 1 00:14:20.990 }, 00:14:20.990 { 00:14:20.990 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:20.990 "dma_device_type": 2 00:14:20.990 }, 00:14:20.990 { 00:14:20.990 "dma_device_id": "system", 00:14:20.990 "dma_device_type": 1 00:14:20.990 }, 00:14:20.990 { 00:14:20.990 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:20.990 "dma_device_type": 2 00:14:20.990 } 00:14:20.990 ], 00:14:20.990 "driver_specific": { 00:14:20.990 "raid": { 00:14:20.990 "uuid": "1f0bd309-3d6b-45ad-be40-09475f1030da", 00:14:20.990 "strip_size_kb": 64, 00:14:20.990 "state": "online", 00:14:20.990 "raid_level": "raid0", 00:14:20.990 "superblock": true, 00:14:20.990 "num_base_bdevs": 3, 00:14:20.990 "num_base_bdevs_discovered": 3, 00:14:20.990 "num_base_bdevs_operational": 3, 00:14:20.990 "base_bdevs_list": [ 00:14:20.990 { 00:14:20.990 "name": "BaseBdev1", 00:14:20.990 "uuid": "da62073c-dbad-43d9-8bc0-1212f89059e3", 00:14:20.990 "is_configured": true, 00:14:20.990 "data_offset": 2048, 00:14:20.990 "data_size": 63488 00:14:20.990 }, 00:14:20.990 { 00:14:20.990 "name": "BaseBdev2", 00:14:20.990 "uuid": "9bb281e2-d725-4869-bcaa-58a9185e84f3", 00:14:20.990 "is_configured": true, 00:14:20.990 "data_offset": 2048, 00:14:20.990 "data_size": 63488 00:14:20.990 }, 00:14:20.990 { 00:14:20.990 "name": "BaseBdev3", 00:14:20.990 "uuid": "6e3f9ad9-4ba2-4250-a866-5ff360f2e205", 00:14:20.990 "is_configured": true, 00:14:20.990 "data_offset": 2048, 00:14:20.990 "data_size": 63488 00:14:20.990 } 00:14:20.990 ] 00:14:20.990 } 00:14:20.990 } 00:14:20.990 }' 00:14:20.990 20:28:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:20.990 20:28:13 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:14:20.990 BaseBdev2 00:14:20.990 BaseBdev3' 00:14:20.990 20:28:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:20.990 20:28:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:14:20.990 20:28:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:21.249 20:28:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:21.249 "name": "BaseBdev1", 00:14:21.249 "aliases": [ 00:14:21.249 "da62073c-dbad-43d9-8bc0-1212f89059e3" 00:14:21.249 ], 00:14:21.249 "product_name": "Malloc disk", 00:14:21.249 "block_size": 512, 00:14:21.249 "num_blocks": 65536, 00:14:21.249 "uuid": "da62073c-dbad-43d9-8bc0-1212f89059e3", 00:14:21.249 "assigned_rate_limits": { 00:14:21.249 "rw_ios_per_sec": 0, 00:14:21.249 "rw_mbytes_per_sec": 0, 00:14:21.249 "r_mbytes_per_sec": 0, 00:14:21.249 "w_mbytes_per_sec": 0 00:14:21.249 }, 00:14:21.249 "claimed": true, 00:14:21.249 "claim_type": "exclusive_write", 00:14:21.249 "zoned": false, 00:14:21.249 "supported_io_types": { 00:14:21.249 "read": true, 00:14:21.249 "write": true, 00:14:21.249 "unmap": true, 00:14:21.249 "flush": true, 00:14:21.249 "reset": true, 00:14:21.249 "nvme_admin": false, 00:14:21.249 "nvme_io": false, 00:14:21.249 "nvme_io_md": false, 00:14:21.249 "write_zeroes": true, 00:14:21.249 "zcopy": true, 00:14:21.249 "get_zone_info": false, 00:14:21.249 "zone_management": false, 00:14:21.249 "zone_append": false, 00:14:21.249 "compare": false, 00:14:21.249 "compare_and_write": false, 00:14:21.249 "abort": true, 00:14:21.249 "seek_hole": false, 00:14:21.249 "seek_data": false, 00:14:21.249 "copy": true, 00:14:21.249 "nvme_iov_md": false 00:14:21.249 }, 00:14:21.249 "memory_domains": 
[ 00:14:21.249 { 00:14:21.249 "dma_device_id": "system", 00:14:21.249 "dma_device_type": 1 00:14:21.249 }, 00:14:21.250 { 00:14:21.250 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:21.250 "dma_device_type": 2 00:14:21.250 } 00:14:21.250 ], 00:14:21.250 "driver_specific": {} 00:14:21.250 }' 00:14:21.250 20:28:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:21.250 20:28:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:21.250 20:28:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:21.508 20:28:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:21.508 20:28:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:21.508 20:28:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:21.508 20:28:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:21.508 20:28:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:21.508 20:28:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:21.508 20:28:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:21.508 20:28:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:21.508 20:28:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:21.508 20:28:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:21.508 20:28:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:21.508 20:28:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 
00:14:21.766 20:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:21.766 "name": "BaseBdev2", 00:14:21.766 "aliases": [ 00:14:21.766 "9bb281e2-d725-4869-bcaa-58a9185e84f3" 00:14:21.766 ], 00:14:21.766 "product_name": "Malloc disk", 00:14:21.766 "block_size": 512, 00:14:21.766 "num_blocks": 65536, 00:14:21.766 "uuid": "9bb281e2-d725-4869-bcaa-58a9185e84f3", 00:14:21.766 "assigned_rate_limits": { 00:14:21.766 "rw_ios_per_sec": 0, 00:14:21.766 "rw_mbytes_per_sec": 0, 00:14:21.766 "r_mbytes_per_sec": 0, 00:14:21.766 "w_mbytes_per_sec": 0 00:14:21.766 }, 00:14:21.766 "claimed": true, 00:14:21.766 "claim_type": "exclusive_write", 00:14:21.766 "zoned": false, 00:14:21.766 "supported_io_types": { 00:14:21.766 "read": true, 00:14:21.766 "write": true, 00:14:21.766 "unmap": true, 00:14:21.766 "flush": true, 00:14:21.766 "reset": true, 00:14:21.766 "nvme_admin": false, 00:14:21.766 "nvme_io": false, 00:14:21.766 "nvme_io_md": false, 00:14:21.766 "write_zeroes": true, 00:14:21.766 "zcopy": true, 00:14:21.766 "get_zone_info": false, 00:14:21.766 "zone_management": false, 00:14:21.766 "zone_append": false, 00:14:21.766 "compare": false, 00:14:21.766 "compare_and_write": false, 00:14:21.766 "abort": true, 00:14:21.766 "seek_hole": false, 00:14:21.766 "seek_data": false, 00:14:21.766 "copy": true, 00:14:21.766 "nvme_iov_md": false 00:14:21.766 }, 00:14:21.766 "memory_domains": [ 00:14:21.766 { 00:14:21.766 "dma_device_id": "system", 00:14:21.766 "dma_device_type": 1 00:14:21.766 }, 00:14:21.766 { 00:14:21.766 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:21.766 "dma_device_type": 2 00:14:21.766 } 00:14:21.766 ], 00:14:21.766 "driver_specific": {} 00:14:21.766 }' 00:14:21.766 20:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:22.025 20:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:22.025 20:28:14 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:22.025 20:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:22.025 20:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:22.025 20:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:22.025 20:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:22.025 20:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:22.025 20:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:22.025 20:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:22.283 20:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:22.283 20:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:22.283 20:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:22.283 20:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:14:22.283 20:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:22.283 20:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:22.283 "name": "BaseBdev3", 00:14:22.283 "aliases": [ 00:14:22.283 "6e3f9ad9-4ba2-4250-a866-5ff360f2e205" 00:14:22.283 ], 00:14:22.283 "product_name": "Malloc disk", 00:14:22.283 "block_size": 512, 00:14:22.283 "num_blocks": 65536, 00:14:22.283 "uuid": "6e3f9ad9-4ba2-4250-a866-5ff360f2e205", 00:14:22.283 "assigned_rate_limits": { 00:14:22.283 "rw_ios_per_sec": 0, 00:14:22.283 "rw_mbytes_per_sec": 0, 00:14:22.283 "r_mbytes_per_sec": 0, 00:14:22.283 
"w_mbytes_per_sec": 0 00:14:22.283 }, 00:14:22.283 "claimed": true, 00:14:22.283 "claim_type": "exclusive_write", 00:14:22.283 "zoned": false, 00:14:22.283 "supported_io_types": { 00:14:22.283 "read": true, 00:14:22.283 "write": true, 00:14:22.283 "unmap": true, 00:14:22.283 "flush": true, 00:14:22.283 "reset": true, 00:14:22.283 "nvme_admin": false, 00:14:22.283 "nvme_io": false, 00:14:22.283 "nvme_io_md": false, 00:14:22.283 "write_zeroes": true, 00:14:22.283 "zcopy": true, 00:14:22.283 "get_zone_info": false, 00:14:22.283 "zone_management": false, 00:14:22.283 "zone_append": false, 00:14:22.283 "compare": false, 00:14:22.283 "compare_and_write": false, 00:14:22.283 "abort": true, 00:14:22.283 "seek_hole": false, 00:14:22.283 "seek_data": false, 00:14:22.283 "copy": true, 00:14:22.283 "nvme_iov_md": false 00:14:22.283 }, 00:14:22.283 "memory_domains": [ 00:14:22.283 { 00:14:22.283 "dma_device_id": "system", 00:14:22.283 "dma_device_type": 1 00:14:22.283 }, 00:14:22.283 { 00:14:22.283 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:22.283 "dma_device_type": 2 00:14:22.283 } 00:14:22.283 ], 00:14:22.284 "driver_specific": {} 00:14:22.284 }' 00:14:22.284 20:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:22.284 20:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:22.542 20:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:22.542 20:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:22.542 20:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:22.542 20:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:22.542 20:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:22.542 20:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq 
.md_interleave 00:14:22.542 20:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:22.542 20:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:22.542 20:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:22.801 20:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:22.801 20:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:22.801 [2024-07-15 20:28:15.179391] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:22.801 [2024-07-15 20:28:15.179417] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:22.801 [2024-07-15 20:28:15.179458] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:23.060 20:28:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:14:23.060 20:28:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:14:23.060 20:28:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:23.060 20:28:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:14:23.060 20:28:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:14:23.060 20:28:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 2 00:14:23.060 20:28:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:23.060 20:28:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:14:23.060 20:28:15 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:23.060 20:28:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:23.060 20:28:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:23.060 20:28:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:23.060 20:28:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:23.061 20:28:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:23.061 20:28:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:23.061 20:28:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:23.061 20:28:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:23.321 20:28:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:23.321 "name": "Existed_Raid", 00:14:23.321 "uuid": "1f0bd309-3d6b-45ad-be40-09475f1030da", 00:14:23.321 "strip_size_kb": 64, 00:14:23.321 "state": "offline", 00:14:23.321 "raid_level": "raid0", 00:14:23.321 "superblock": true, 00:14:23.321 "num_base_bdevs": 3, 00:14:23.321 "num_base_bdevs_discovered": 2, 00:14:23.321 "num_base_bdevs_operational": 2, 00:14:23.321 "base_bdevs_list": [ 00:14:23.321 { 00:14:23.321 "name": null, 00:14:23.321 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:23.321 "is_configured": false, 00:14:23.321 "data_offset": 2048, 00:14:23.321 "data_size": 63488 00:14:23.321 }, 00:14:23.321 { 00:14:23.321 "name": "BaseBdev2", 00:14:23.321 "uuid": "9bb281e2-d725-4869-bcaa-58a9185e84f3", 00:14:23.321 "is_configured": true, 00:14:23.321 "data_offset": 2048, 00:14:23.321 "data_size": 
63488 00:14:23.321 }, 00:14:23.321 { 00:14:23.321 "name": "BaseBdev3", 00:14:23.321 "uuid": "6e3f9ad9-4ba2-4250-a866-5ff360f2e205", 00:14:23.321 "is_configured": true, 00:14:23.321 "data_offset": 2048, 00:14:23.321 "data_size": 63488 00:14:23.321 } 00:14:23.321 ] 00:14:23.321 }' 00:14:23.321 20:28:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:23.321 20:28:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:23.889 20:28:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:14:23.889 20:28:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:23.889 20:28:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:23.889 20:28:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:14:24.148 20:28:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:14:24.148 20:28:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:24.148 20:28:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:14:24.407 [2024-07-15 20:28:16.556145] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:14:24.407 20:28:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:14:24.407 20:28:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:24.407 20:28:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:14:24.407 20:28:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]'
00:14:24.666 20:28:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid
00:14:24.666 20:28:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']'
00:14:24.666 20:28:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3
00:14:24.925 [2024-07-15 20:28:17.058031] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3
00:14:24.925 [2024-07-15 20:28:17.058078] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1487400 name Existed_Raid, state offline
00:14:24.925 20:28:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ ))
00:14:24.925 20:28:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs ))
00:14:24.925 20:28:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:14:24.925 20:28:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)'
00:14:25.185 20:28:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev=
00:14:25.185 20:28:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']'
00:14:25.185 20:28:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']'
00:14:25.185 20:28:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 ))
00:14:25.185 20:28:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs ))
00:14:25.185 20:28:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2
00:14:25.443 BaseBdev2
00:14:25.443 20:28:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2
00:14:25.443 20:28:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2
00:14:25.443 20:28:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout=
00:14:25.443 20:28:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i
00:14:25.443 20:28:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]]
00:14:25.443 20:28:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000
00:14:25.443 20:28:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
00:14:25.443 20:28:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000
00:14:25.702 [
00:14:25.702 {
00:14:25.702 "name": "BaseBdev2",
00:14:25.702 "aliases": [
00:14:25.702 "6ef408ac-8c1e-436a-b11a-b1aff5d4e3e6"
00:14:25.702 ],
00:14:25.702 "product_name": "Malloc disk",
00:14:25.702 "block_size": 512,
00:14:25.702 "num_blocks": 65536,
00:14:25.702 "uuid": "6ef408ac-8c1e-436a-b11a-b1aff5d4e3e6",
00:14:25.702 "assigned_rate_limits": {
00:14:25.702 "rw_ios_per_sec": 0,
00:14:25.702 "rw_mbytes_per_sec": 0,
00:14:25.702 "r_mbytes_per_sec": 0,
00:14:25.702 "w_mbytes_per_sec": 0
00:14:25.702 },
00:14:25.702 "claimed": false,
00:14:25.702 "zoned": false,
00:14:25.702 "supported_io_types": {
00:14:25.702 "read": true,
00:14:25.702 "write": true,
00:14:25.702 "unmap": true,
00:14:25.702 "flush": true,
00:14:25.702 "reset": true,
00:14:25.702 "nvme_admin": false,
00:14:25.702 "nvme_io": false,
00:14:25.702 "nvme_io_md": false,
00:14:25.702 "write_zeroes": true,
00:14:25.702 "zcopy": true,
00:14:25.702 "get_zone_info": false,
00:14:25.702 "zone_management": false,
00:14:25.702 "zone_append": false,
00:14:25.702 "compare": false,
00:14:25.702 "compare_and_write": false,
00:14:25.702 "abort": true,
00:14:25.702 "seek_hole": false,
00:14:25.702 "seek_data": false,
00:14:25.702 "copy": true,
00:14:25.702 "nvme_iov_md": false
00:14:25.702 },
00:14:25.702 "memory_domains": [
00:14:25.702 {
00:14:25.702 "dma_device_id": "system",
00:14:25.702 "dma_device_type": 1
00:14:25.702 },
00:14:25.702 {
00:14:25.702 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:14:25.702 "dma_device_type": 2
00:14:25.702 }
00:14:25.702 ],
00:14:25.702 "driver_specific": {}
00:14:25.702 }
00:14:25.702 ]
00:14:25.702 20:28:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0
00:14:25.702 20:28:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ ))
00:14:25.702 20:28:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs ))
00:14:25.702 20:28:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3
00:14:25.961 BaseBdev3
00:14:25.961 20:28:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3
00:14:25.961 20:28:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3
00:14:25.961 20:28:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout=
00:14:25.961 20:28:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i
00:14:25.961 20:28:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]]
00:14:25.961 20:28:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000
00:14:25.961 20:28:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
00:14:26.221 20:28:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000
00:14:26.480 [
00:14:26.480 {
00:14:26.480 "name": "BaseBdev3",
00:14:26.480 "aliases": [
00:14:26.480 "f6360a13-8e10-4cdc-828a-86f31d39af91"
00:14:26.480 ],
00:14:26.480 "product_name": "Malloc disk",
00:14:26.480 "block_size": 512,
00:14:26.480 "num_blocks": 65536,
00:14:26.480 "uuid": "f6360a13-8e10-4cdc-828a-86f31d39af91",
00:14:26.480 "assigned_rate_limits": {
00:14:26.480 "rw_ios_per_sec": 0,
00:14:26.480 "rw_mbytes_per_sec": 0,
00:14:26.480 "r_mbytes_per_sec": 0,
00:14:26.480 "w_mbytes_per_sec": 0
00:14:26.480 },
00:14:26.480 "claimed": false,
00:14:26.480 "zoned": false,
00:14:26.480 "supported_io_types": {
00:14:26.480 "read": true,
00:14:26.480 "write": true,
00:14:26.480 "unmap": true,
00:14:26.480 "flush": true,
00:14:26.480 "reset": true,
00:14:26.480 "nvme_admin": false,
00:14:26.480 "nvme_io": false,
00:14:26.480 "nvme_io_md": false,
00:14:26.480 "write_zeroes": true,
00:14:26.480 "zcopy": true,
00:14:26.480 "get_zone_info": false,
00:14:26.480 "zone_management": false,
00:14:26.480 "zone_append": false,
00:14:26.480 "compare": false,
00:14:26.480 "compare_and_write": false,
00:14:26.480 "abort": true,
00:14:26.480 "seek_hole": false,
00:14:26.480 "seek_data": false,
00:14:26.480 "copy": true,
00:14:26.480 "nvme_iov_md": false
00:14:26.480 },
00:14:26.480 "memory_domains": [
00:14:26.480 {
00:14:26.480 "dma_device_id": "system",
00:14:26.480 "dma_device_type": 1
00:14:26.480 },
00:14:26.480 {
00:14:26.480 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:14:26.480 "dma_device_type": 2
00:14:26.480 }
00:14:26.480 ],
00:14:26.480 "driver_specific": {}
00:14:26.480 }
00:14:26.480 ]
00:14:26.480 20:28:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0
00:14:26.480 20:28:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ ))
00:14:26.480 20:28:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs ))
00:14:26.480 20:28:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid
00:14:26.739 [2024-07-15 20:28:19.037736] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1
00:14:26.739 [2024-07-15 20:28:19.037776] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now
00:14:26.739 [2024-07-15 20:28:19.037793] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed
00:14:26.739 [2024-07-15 20:28:19.039132] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed
00:14:26.739 20:28:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3
00:14:26.739 20:28:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:14:26.739 20:28:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:14:26.739 20:28:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0
00:14:26.739 20:28:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:14:26.739 20:28:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3
00:14:26.739 20:28:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:14:26.739 20:28:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:14:26.739 20:28:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:14:26.739 20:28:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp
00:14:26.739 20:28:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:14:26.739 20:28:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:14:26.999 20:28:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:14:26.999 "name": "Existed_Raid",
00:14:26.999 "uuid": "87b84023-b888-4bee-bdc5-8b1f957e29c7",
00:14:26.999 "strip_size_kb": 64,
00:14:26.999 "state": "configuring",
00:14:26.999 "raid_level": "raid0",
00:14:26.999 "superblock": true,
00:14:26.999 "num_base_bdevs": 3,
00:14:26.999 "num_base_bdevs_discovered": 2,
00:14:26.999 "num_base_bdevs_operational": 3,
00:14:26.999 "base_bdevs_list": [
00:14:26.999 {
00:14:26.999 "name": "BaseBdev1",
00:14:26.999 "uuid": "00000000-0000-0000-0000-000000000000",
00:14:26.999 "is_configured": false,
00:14:26.999 "data_offset": 0,
00:14:26.999 "data_size": 0
00:14:26.999 },
00:14:26.999 {
00:14:26.999 "name": "BaseBdev2",
00:14:26.999 "uuid": "6ef408ac-8c1e-436a-b11a-b1aff5d4e3e6",
00:14:26.999 "is_configured": true,
00:14:26.999 "data_offset": 2048,
00:14:26.999 "data_size": 63488
00:14:26.999 },
00:14:26.999 {
00:14:26.999 "name": "BaseBdev3",
00:14:26.999 "uuid": "f6360a13-8e10-4cdc-828a-86f31d39af91",
00:14:26.999 "is_configured": true,
00:14:26.999 "data_offset": 2048,
00:14:26.999 "data_size": 63488
00:14:26.999 }
00:14:26.999 ]
00:14:26.999 }'
00:14:26.999 20:28:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:14:26.999 20:28:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x
00:14:27.567 20:28:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2
00:14:27.826 [2024-07-15 20:28:20.144655] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2
00:14:27.826 20:28:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3
00:14:27.826 20:28:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:14:27.826 20:28:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:14:27.826 20:28:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0
00:14:27.826 20:28:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:14:27.826 20:28:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3
00:14:27.826 20:28:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:14:27.826 20:28:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:14:27.826 20:28:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:14:27.826 20:28:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp
00:14:27.826 20:28:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:14:27.826 20:28:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:14:28.085 20:28:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:14:28.085 "name": "Existed_Raid",
00:14:28.085 "uuid": "87b84023-b888-4bee-bdc5-8b1f957e29c7",
00:14:28.085 "strip_size_kb": 64,
00:14:28.085 "state": "configuring",
00:14:28.085 "raid_level": "raid0",
00:14:28.085 "superblock": true,
00:14:28.085 "num_base_bdevs": 3,
00:14:28.085 "num_base_bdevs_discovered": 1,
00:14:28.085 "num_base_bdevs_operational": 3,
00:14:28.085 "base_bdevs_list": [
00:14:28.085 {
00:14:28.085 "name": "BaseBdev1",
00:14:28.085 "uuid": "00000000-0000-0000-0000-000000000000",
00:14:28.085 "is_configured": false,
00:14:28.085 "data_offset": 0,
00:14:28.085 "data_size": 0
00:14:28.085 },
00:14:28.085 {
00:14:28.085 "name": null,
00:14:28.085 "uuid": "6ef408ac-8c1e-436a-b11a-b1aff5d4e3e6",
00:14:28.085 "is_configured": false,
00:14:28.085 "data_offset": 2048,
00:14:28.085 "data_size": 63488
00:14:28.085 },
00:14:28.085 {
00:14:28.085 "name": "BaseBdev3",
00:14:28.085 "uuid": "f6360a13-8e10-4cdc-828a-86f31d39af91",
00:14:28.085 "is_configured": true,
00:14:28.085 "data_offset": 2048,
00:14:28.085 "data_size": 63488
00:14:28.085 }
00:14:28.085 ]
00:14:28.085 }'
00:14:28.085 20:28:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:14:28.085 20:28:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x
00:14:29.022 20:28:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:14:29.022 20:28:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured'
00:14:29.280 20:28:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]]
00:14:29.280 20:28:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1
00:14:29.539 [2024-07-15 20:28:21.845720] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed
00:14:29.539 BaseBdev1
00:14:29.539 20:28:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1
00:14:29.539 20:28:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1
00:14:29.539 20:28:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout=
00:14:29.539 20:28:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i
00:14:29.539 20:28:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]]
00:14:29.539 20:28:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000
00:14:29.539 20:28:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
00:14:29.797 20:28:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000
00:14:30.056 [
00:14:30.056 {
00:14:30.056 "name": "BaseBdev1",
00:14:30.056 "aliases": [
00:14:30.056 "81963d16-88d3-44d5-a831-a9699e72ee06"
00:14:30.056 ],
00:14:30.056 "product_name": "Malloc disk",
00:14:30.056 "block_size": 512,
00:14:30.056 "num_blocks": 65536,
00:14:30.056 "uuid": "81963d16-88d3-44d5-a831-a9699e72ee06",
00:14:30.056 "assigned_rate_limits": {
00:14:30.056 "rw_ios_per_sec": 0,
00:14:30.056 "rw_mbytes_per_sec": 0,
00:14:30.056 "r_mbytes_per_sec": 0,
00:14:30.056 "w_mbytes_per_sec": 0
00:14:30.056 },
00:14:30.056 "claimed": true,
00:14:30.056 "claim_type": "exclusive_write",
00:14:30.056 "zoned": false,
00:14:30.056 "supported_io_types": {
00:14:30.056 "read": true,
00:14:30.056 "write": true,
00:14:30.056 "unmap": true,
00:14:30.056 "flush": true,
00:14:30.056 "reset": true,
00:14:30.056 "nvme_admin": false,
00:14:30.056 "nvme_io": false,
00:14:30.056 "nvme_io_md": false,
00:14:30.056 "write_zeroes": true,
00:14:30.056 "zcopy": true,
00:14:30.056 "get_zone_info": false,
00:14:30.056 "zone_management": false,
00:14:30.056 "zone_append": false,
00:14:30.056 "compare": false,
00:14:30.056 "compare_and_write": false,
00:14:30.056 "abort": true,
00:14:30.056 "seek_hole": false,
00:14:30.056 "seek_data": false,
00:14:30.056 "copy": true,
00:14:30.056 "nvme_iov_md": false
00:14:30.056 },
00:14:30.056 "memory_domains": [
00:14:30.056 {
00:14:30.056 "dma_device_id": "system",
00:14:30.056 "dma_device_type": 1
00:14:30.056 },
00:14:30.056 {
00:14:30.056 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:14:30.056 "dma_device_type": 2
00:14:30.056 }
00:14:30.056 ],
00:14:30.056 "driver_specific": {}
00:14:30.056 }
00:14:30.056 ]
00:14:30.056 20:28:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0
00:14:30.056 20:28:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3
00:14:30.056 20:28:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:14:30.056 20:28:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:14:30.056 20:28:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0
00:14:30.056 20:28:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:14:30.056 20:28:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3
00:14:30.056 20:28:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:14:30.056 20:28:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:14:30.056 20:28:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:14:30.056 20:28:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp
00:14:30.056 20:28:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:14:30.056 20:28:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:14:30.315 20:28:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:14:30.315 "name": "Existed_Raid",
00:14:30.315 "uuid": "87b84023-b888-4bee-bdc5-8b1f957e29c7",
00:14:30.315 "strip_size_kb": 64,
00:14:30.315 "state": "configuring",
00:14:30.315 "raid_level": "raid0",
00:14:30.315 "superblock": true,
00:14:30.315 "num_base_bdevs": 3,
00:14:30.315 "num_base_bdevs_discovered": 2,
00:14:30.315 "num_base_bdevs_operational": 3,
00:14:30.315 "base_bdevs_list": [
00:14:30.315 {
00:14:30.315 "name": "BaseBdev1",
00:14:30.315 "uuid": "81963d16-88d3-44d5-a831-a9699e72ee06",
00:14:30.315 "is_configured": true,
00:14:30.315 "data_offset": 2048,
00:14:30.315 "data_size": 63488
00:14:30.315 },
00:14:30.315 {
00:14:30.315 "name": null,
00:14:30.315 "uuid": "6ef408ac-8c1e-436a-b11a-b1aff5d4e3e6",
00:14:30.315 "is_configured": false,
00:14:30.315 "data_offset": 2048,
00:14:30.315 "data_size": 63488
00:14:30.315 },
00:14:30.315 {
00:14:30.315 "name": "BaseBdev3",
00:14:30.315 "uuid": "f6360a13-8e10-4cdc-828a-86f31d39af91",
00:14:30.315 "is_configured": true,
00:14:30.315 "data_offset": 2048,
00:14:30.315 "data_size": 63488
00:14:30.315 }
00:14:30.315 ]
00:14:30.315 }'
00:14:30.315 20:28:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:14:30.315 20:28:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x
00:14:31.298 20:28:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured'
00:14:31.298 20:28:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:14:31.555 20:28:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]]
00:14:31.555 20:28:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3
00:14:31.814 [2024-07-15 20:28:23.951340] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3
00:14:31.814 20:28:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3
00:14:31.814 20:28:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:14:31.814 20:28:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:14:31.814 20:28:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0
00:14:31.814 20:28:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:14:31.814 20:28:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3
00:14:31.814 20:28:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:14:31.814 20:28:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:14:31.814 20:28:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:14:31.814 20:28:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp
00:14:31.814 20:28:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:14:31.814 20:28:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:14:32.073 20:28:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:14:32.073 "name": "Existed_Raid",
00:14:32.073 "uuid": "87b84023-b888-4bee-bdc5-8b1f957e29c7",
00:14:32.073 "strip_size_kb": 64,
00:14:32.073 "state": "configuring",
00:14:32.073 "raid_level": "raid0",
00:14:32.073 "superblock": true,
00:14:32.073 "num_base_bdevs": 3,
00:14:32.073 "num_base_bdevs_discovered": 1,
00:14:32.073 "num_base_bdevs_operational": 3,
00:14:32.073 "base_bdevs_list": [
00:14:32.073 {
00:14:32.073 "name": "BaseBdev1",
00:14:32.073 "uuid": "81963d16-88d3-44d5-a831-a9699e72ee06",
00:14:32.073 "is_configured": true,
00:14:32.073 "data_offset": 2048,
00:14:32.073 "data_size": 63488
00:14:32.073 },
00:14:32.073 {
00:14:32.073 "name": null,
00:14:32.073 "uuid": "6ef408ac-8c1e-436a-b11a-b1aff5d4e3e6",
00:14:32.073 "is_configured": false,
00:14:32.073 "data_offset": 2048,
00:14:32.073 "data_size": 63488
00:14:32.073 },
00:14:32.073 {
00:14:32.073 "name": null,
00:14:32.073 "uuid": "f6360a13-8e10-4cdc-828a-86f31d39af91",
00:14:32.073 "is_configured": false,
00:14:32.073 "data_offset": 2048,
00:14:32.073 "data_size": 63488
00:14:32.073 }
00:14:32.073 ]
00:14:32.073 }'
00:14:32.073 20:28:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:14:32.073 20:28:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x
00:14:32.640 20:28:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:14:32.640 20:28:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured'
00:14:33.207 20:28:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]]
00:14:33.207 20:28:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3
00:14:33.465 [2024-07-15 20:28:25.643841] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed
00:14:33.465 20:28:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3
00:14:33.465 20:28:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:14:33.465 20:28:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:14:33.465 20:28:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0
00:14:33.465 20:28:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:14:33.465 20:28:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3
00:14:33.465 20:28:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:14:33.465 20:28:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:14:33.465 20:28:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:14:33.465 20:28:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp
00:14:33.465 20:28:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:14:33.465 20:28:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:14:33.723 20:28:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:14:33.723 "name": "Existed_Raid",
00:14:33.723 "uuid": "87b84023-b888-4bee-bdc5-8b1f957e29c7",
00:14:33.723 "strip_size_kb": 64,
00:14:33.723 "state": "configuring",
00:14:33.723 "raid_level": "raid0",
00:14:33.723 "superblock": true,
00:14:33.723 "num_base_bdevs": 3,
00:14:33.723 "num_base_bdevs_discovered": 2,
00:14:33.723 "num_base_bdevs_operational": 3,
00:14:33.723 "base_bdevs_list": [
00:14:33.723 {
00:14:33.723 "name": "BaseBdev1",
00:14:33.723 "uuid": "81963d16-88d3-44d5-a831-a9699e72ee06",
00:14:33.723 "is_configured": true,
00:14:33.723 "data_offset": 2048,
00:14:33.723 "data_size": 63488
00:14:33.723 },
00:14:33.723 {
00:14:33.723 "name": null,
00:14:33.723 "uuid": "6ef408ac-8c1e-436a-b11a-b1aff5d4e3e6",
00:14:33.723 "is_configured": false,
00:14:33.723 "data_offset": 2048,
00:14:33.723 "data_size": 63488
00:14:33.723 },
00:14:33.723 {
00:14:33.723 "name": "BaseBdev3",
00:14:33.723 "uuid": "f6360a13-8e10-4cdc-828a-86f31d39af91",
00:14:33.723 "is_configured": true,
00:14:33.723 "data_offset": 2048,
00:14:33.723 "data_size": 63488
00:14:33.723 }
00:14:33.723 ]
00:14:33.723 }'
00:14:33.723 20:28:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:14:33.723 20:28:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x
00:14:34.656 20:28:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:14:34.656 20:28:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured'
00:14:34.656 20:28:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]]
00:14:34.656 20:28:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1
00:14:34.916 [2024-07-15 20:28:27.236098] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1
00:14:34.916 20:28:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3
00:14:34.916 20:28:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:14:34.916 20:28:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:14:34.916 20:28:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0
00:14:34.916 20:28:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:14:34.916 20:28:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3
00:14:34.916 20:28:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:14:34.916 20:28:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:14:34.916 20:28:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:14:34.916 20:28:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp
00:14:34.916 20:28:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:14:34.916 20:28:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:14:35.175 20:28:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:14:35.175 "name": "Existed_Raid",
00:14:35.175 "uuid": "87b84023-b888-4bee-bdc5-8b1f957e29c7",
00:14:35.175 "strip_size_kb": 64,
00:14:35.175 "state": "configuring",
00:14:35.175 "raid_level": "raid0",
00:14:35.175 "superblock": true,
00:14:35.175 "num_base_bdevs": 3,
00:14:35.175 "num_base_bdevs_discovered": 1,
00:14:35.175 "num_base_bdevs_operational": 3,
00:14:35.175 "base_bdevs_list": [
00:14:35.175 {
00:14:35.175 "name": null,
00:14:35.175 "uuid": "81963d16-88d3-44d5-a831-a9699e72ee06",
00:14:35.175 "is_configured": false,
00:14:35.175 "data_offset": 2048,
00:14:35.175 "data_size": 63488
00:14:35.175 },
00:14:35.175 {
00:14:35.175 "name": null,
00:14:35.175 "uuid": "6ef408ac-8c1e-436a-b11a-b1aff5d4e3e6",
00:14:35.175 "is_configured": false,
00:14:35.175 "data_offset": 2048,
00:14:35.175 "data_size": 63488
00:14:35.175 },
00:14:35.175 {
00:14:35.175 "name": "BaseBdev3",
00:14:35.175 "uuid": "f6360a13-8e10-4cdc-828a-86f31d39af91",
00:14:35.175 "is_configured": true,
00:14:35.175 "data_offset": 2048,
00:14:35.175 "data_size": 63488
00:14:35.175 }
00:14:35.175 ]
00:14:35.175 }'
00:14:35.175 20:28:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:14:35.175 20:28:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x
00:14:35.743 20:28:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:14:35.743 20:28:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured'
00:14:36.002 20:28:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]]
00:14:36.002 20:28:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2
00:14:36.571 [2024-07-15 20:28:28.835123] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed
00:14:36.571 20:28:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3
00:14:36.571 20:28:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:14:36.571 20:28:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:14:36.571 20:28:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0
00:14:36.571 20:28:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:14:36.571 20:28:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3
00:14:36.571 20:28:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:14:36.571 20:28:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:14:36.571 20:28:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:14:36.571 20:28:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp
00:14:36.571 20:28:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:14:36.571 20:28:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:14:36.830 20:28:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:14:36.830 "name": "Existed_Raid",
00:14:36.830 "uuid": "87b84023-b888-4bee-bdc5-8b1f957e29c7",
00:14:36.830 "strip_size_kb": 64,
00:14:36.830 "state": "configuring",
00:14:36.830 "raid_level": "raid0",
00:14:36.830 "superblock": true,
00:14:36.830 "num_base_bdevs": 3,
00:14:36.830 "num_base_bdevs_discovered": 2,
00:14:36.830 "num_base_bdevs_operational": 3,
00:14:36.830 "base_bdevs_list": [
00:14:36.830 {
00:14:36.830 "name": null,
00:14:36.830 "uuid": "81963d16-88d3-44d5-a831-a9699e72ee06",
00:14:36.830 "is_configured": false,
00:14:36.830 "data_offset": 2048,
00:14:36.830 "data_size": 63488
00:14:36.830 },
00:14:36.830 {
00:14:36.830 "name": "BaseBdev2",
00:14:36.830 "uuid": "6ef408ac-8c1e-436a-b11a-b1aff5d4e3e6",
00:14:36.830 "is_configured": true,
00:14:36.830 "data_offset": 2048,
00:14:36.830 "data_size": 63488
00:14:36.830 },
00:14:36.830 {
00:14:36.830 "name": "BaseBdev3",
00:14:36.830 "uuid": "f6360a13-8e10-4cdc-828a-86f31d39af91",
00:14:36.830 "is_configured": true,
00:14:36.830 "data_offset": 2048,
00:14:36.830 "data_size": 63488
00:14:36.830 }
00:14:36.830 ]
00:14:36.830 }'
00:14:36.830 20:28:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:14:36.830 20:28:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x
00:14:37.767 20:28:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:14:37.767 20:28:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured'
00:14:38.025 20:28:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]]
00:14:38.025 20:28:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:14:38.025 20:28:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid'
00:14:38.284 20:28:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 81963d16-88d3-44d5-a831-a9699e72ee06
00:14:38.543 [2024-07-15 20:28:30.907938] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed
00:14:38.543 [2024-07-15 20:28:30.908085] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1485e90
00:14:38.543 [2024-07-15 20:28:30.908098] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512
00:14:38.543 [2024-07-15 20:28:30.908273] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x118c940
00:14:38.543 [2024-07-15 20:28:30.908385] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1485e90
00:14:38.543 [2024-07-15 20:28:30.908395] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1485e90
00:14:38.543 [2024-07-15 20:28:30.908485] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:14:38.543 NewBaseBdev
00:14:38.802 20:28:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev
00:14:38.802 20:28:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev
00:14:38.802 20:28:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout=
00:14:38.802 20:28:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i
00:14:38.802 20:28:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]]
00:14:38.802 20:28:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000
00:14:38.802 20:28:30
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:39.060 20:28:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:14:39.627 [ 00:14:39.627 { 00:14:39.627 "name": "NewBaseBdev", 00:14:39.627 "aliases": [ 00:14:39.627 "81963d16-88d3-44d5-a831-a9699e72ee06" 00:14:39.627 ], 00:14:39.627 "product_name": "Malloc disk", 00:14:39.627 "block_size": 512, 00:14:39.627 "num_blocks": 65536, 00:14:39.627 "uuid": "81963d16-88d3-44d5-a831-a9699e72ee06", 00:14:39.627 "assigned_rate_limits": { 00:14:39.627 "rw_ios_per_sec": 0, 00:14:39.627 "rw_mbytes_per_sec": 0, 00:14:39.627 "r_mbytes_per_sec": 0, 00:14:39.627 "w_mbytes_per_sec": 0 00:14:39.627 }, 00:14:39.627 "claimed": true, 00:14:39.627 "claim_type": "exclusive_write", 00:14:39.627 "zoned": false, 00:14:39.627 "supported_io_types": { 00:14:39.627 "read": true, 00:14:39.627 "write": true, 00:14:39.627 "unmap": true, 00:14:39.627 "flush": true, 00:14:39.627 "reset": true, 00:14:39.627 "nvme_admin": false, 00:14:39.627 "nvme_io": false, 00:14:39.627 "nvme_io_md": false, 00:14:39.627 "write_zeroes": true, 00:14:39.627 "zcopy": true, 00:14:39.627 "get_zone_info": false, 00:14:39.627 "zone_management": false, 00:14:39.627 "zone_append": false, 00:14:39.627 "compare": false, 00:14:39.627 "compare_and_write": false, 00:14:39.627 "abort": true, 00:14:39.627 "seek_hole": false, 00:14:39.627 "seek_data": false, 00:14:39.627 "copy": true, 00:14:39.627 "nvme_iov_md": false 00:14:39.627 }, 00:14:39.627 "memory_domains": [ 00:14:39.627 { 00:14:39.627 "dma_device_id": "system", 00:14:39.627 "dma_device_type": 1 00:14:39.627 }, 00:14:39.627 { 00:14:39.627 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:39.627 "dma_device_type": 2 00:14:39.627 } 
00:14:39.627 ], 00:14:39.627 "driver_specific": {} 00:14:39.627 } 00:14:39.627 ] 00:14:39.627 20:28:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:14:39.627 20:28:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:14:39.627 20:28:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:39.627 20:28:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:39.627 20:28:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:39.627 20:28:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:39.627 20:28:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:39.627 20:28:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:39.627 20:28:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:39.627 20:28:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:39.627 20:28:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:39.627 20:28:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:39.627 20:28:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:40.193 20:28:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:40.193 "name": "Existed_Raid", 00:14:40.193 "uuid": "87b84023-b888-4bee-bdc5-8b1f957e29c7", 00:14:40.193 "strip_size_kb": 64, 00:14:40.193 "state": "online", 00:14:40.193 
"raid_level": "raid0", 00:14:40.193 "superblock": true, 00:14:40.193 "num_base_bdevs": 3, 00:14:40.193 "num_base_bdevs_discovered": 3, 00:14:40.193 "num_base_bdevs_operational": 3, 00:14:40.193 "base_bdevs_list": [ 00:14:40.193 { 00:14:40.193 "name": "NewBaseBdev", 00:14:40.193 "uuid": "81963d16-88d3-44d5-a831-a9699e72ee06", 00:14:40.193 "is_configured": true, 00:14:40.193 "data_offset": 2048, 00:14:40.193 "data_size": 63488 00:14:40.193 }, 00:14:40.193 { 00:14:40.193 "name": "BaseBdev2", 00:14:40.193 "uuid": "6ef408ac-8c1e-436a-b11a-b1aff5d4e3e6", 00:14:40.193 "is_configured": true, 00:14:40.193 "data_offset": 2048, 00:14:40.193 "data_size": 63488 00:14:40.193 }, 00:14:40.193 { 00:14:40.193 "name": "BaseBdev3", 00:14:40.193 "uuid": "f6360a13-8e10-4cdc-828a-86f31d39af91", 00:14:40.193 "is_configured": true, 00:14:40.193 "data_offset": 2048, 00:14:40.193 "data_size": 63488 00:14:40.193 } 00:14:40.193 ] 00:14:40.193 }' 00:14:40.193 20:28:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:40.193 20:28:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:40.758 20:28:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:14:40.758 20:28:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:14:40.758 20:28:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:40.758 20:28:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:40.758 20:28:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:40.758 20:28:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:14:40.758 20:28:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:14:40.758 20:28:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:40.758 [2024-07-15 20:28:33.098059] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:40.758 20:28:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:40.758 "name": "Existed_Raid", 00:14:40.758 "aliases": [ 00:14:40.758 "87b84023-b888-4bee-bdc5-8b1f957e29c7" 00:14:40.758 ], 00:14:40.758 "product_name": "Raid Volume", 00:14:40.758 "block_size": 512, 00:14:40.758 "num_blocks": 190464, 00:14:40.758 "uuid": "87b84023-b888-4bee-bdc5-8b1f957e29c7", 00:14:40.758 "assigned_rate_limits": { 00:14:40.758 "rw_ios_per_sec": 0, 00:14:40.758 "rw_mbytes_per_sec": 0, 00:14:40.758 "r_mbytes_per_sec": 0, 00:14:40.758 "w_mbytes_per_sec": 0 00:14:40.758 }, 00:14:40.758 "claimed": false, 00:14:40.758 "zoned": false, 00:14:40.758 "supported_io_types": { 00:14:40.758 "read": true, 00:14:40.758 "write": true, 00:14:40.758 "unmap": true, 00:14:40.758 "flush": true, 00:14:40.758 "reset": true, 00:14:40.758 "nvme_admin": false, 00:14:40.758 "nvme_io": false, 00:14:40.758 "nvme_io_md": false, 00:14:40.758 "write_zeroes": true, 00:14:40.758 "zcopy": false, 00:14:40.758 "get_zone_info": false, 00:14:40.758 "zone_management": false, 00:14:40.758 "zone_append": false, 00:14:40.758 "compare": false, 00:14:40.758 "compare_and_write": false, 00:14:40.758 "abort": false, 00:14:40.758 "seek_hole": false, 00:14:40.758 "seek_data": false, 00:14:40.758 "copy": false, 00:14:40.758 "nvme_iov_md": false 00:14:40.758 }, 00:14:40.758 "memory_domains": [ 00:14:40.758 { 00:14:40.758 "dma_device_id": "system", 00:14:40.758 "dma_device_type": 1 00:14:40.758 }, 00:14:40.758 { 00:14:40.758 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:40.758 "dma_device_type": 2 00:14:40.758 }, 00:14:40.758 { 00:14:40.758 "dma_device_id": "system", 00:14:40.758 "dma_device_type": 1 00:14:40.758 
}, 00:14:40.758 { 00:14:40.758 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:40.758 "dma_device_type": 2 00:14:40.758 }, 00:14:40.758 { 00:14:40.758 "dma_device_id": "system", 00:14:40.758 "dma_device_type": 1 00:14:40.758 }, 00:14:40.758 { 00:14:40.758 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:40.758 "dma_device_type": 2 00:14:40.758 } 00:14:40.758 ], 00:14:40.758 "driver_specific": { 00:14:40.758 "raid": { 00:14:40.758 "uuid": "87b84023-b888-4bee-bdc5-8b1f957e29c7", 00:14:40.758 "strip_size_kb": 64, 00:14:40.758 "state": "online", 00:14:40.758 "raid_level": "raid0", 00:14:40.758 "superblock": true, 00:14:40.758 "num_base_bdevs": 3, 00:14:40.758 "num_base_bdevs_discovered": 3, 00:14:40.758 "num_base_bdevs_operational": 3, 00:14:40.758 "base_bdevs_list": [ 00:14:40.758 { 00:14:40.758 "name": "NewBaseBdev", 00:14:40.758 "uuid": "81963d16-88d3-44d5-a831-a9699e72ee06", 00:14:40.758 "is_configured": true, 00:14:40.758 "data_offset": 2048, 00:14:40.758 "data_size": 63488 00:14:40.758 }, 00:14:40.758 { 00:14:40.758 "name": "BaseBdev2", 00:14:40.758 "uuid": "6ef408ac-8c1e-436a-b11a-b1aff5d4e3e6", 00:14:40.758 "is_configured": true, 00:14:40.758 "data_offset": 2048, 00:14:40.758 "data_size": 63488 00:14:40.758 }, 00:14:40.758 { 00:14:40.758 "name": "BaseBdev3", 00:14:40.758 "uuid": "f6360a13-8e10-4cdc-828a-86f31d39af91", 00:14:40.758 "is_configured": true, 00:14:40.758 "data_offset": 2048, 00:14:40.758 "data_size": 63488 00:14:40.758 } 00:14:40.758 ] 00:14:40.758 } 00:14:40.758 } 00:14:40.758 }' 00:14:40.758 20:28:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:41.022 20:28:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:14:41.022 BaseBdev2 00:14:41.022 BaseBdev3' 00:14:41.022 20:28:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:41.022 
20:28:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:14:41.022 20:28:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:41.283 20:28:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:41.283 "name": "NewBaseBdev", 00:14:41.283 "aliases": [ 00:14:41.283 "81963d16-88d3-44d5-a831-a9699e72ee06" 00:14:41.283 ], 00:14:41.283 "product_name": "Malloc disk", 00:14:41.283 "block_size": 512, 00:14:41.283 "num_blocks": 65536, 00:14:41.283 "uuid": "81963d16-88d3-44d5-a831-a9699e72ee06", 00:14:41.283 "assigned_rate_limits": { 00:14:41.283 "rw_ios_per_sec": 0, 00:14:41.283 "rw_mbytes_per_sec": 0, 00:14:41.283 "r_mbytes_per_sec": 0, 00:14:41.283 "w_mbytes_per_sec": 0 00:14:41.283 }, 00:14:41.283 "claimed": true, 00:14:41.283 "claim_type": "exclusive_write", 00:14:41.283 "zoned": false, 00:14:41.283 "supported_io_types": { 00:14:41.283 "read": true, 00:14:41.283 "write": true, 00:14:41.283 "unmap": true, 00:14:41.283 "flush": true, 00:14:41.283 "reset": true, 00:14:41.283 "nvme_admin": false, 00:14:41.283 "nvme_io": false, 00:14:41.283 "nvme_io_md": false, 00:14:41.283 "write_zeroes": true, 00:14:41.283 "zcopy": true, 00:14:41.283 "get_zone_info": false, 00:14:41.283 "zone_management": false, 00:14:41.283 "zone_append": false, 00:14:41.283 "compare": false, 00:14:41.283 "compare_and_write": false, 00:14:41.283 "abort": true, 00:14:41.283 "seek_hole": false, 00:14:41.283 "seek_data": false, 00:14:41.283 "copy": true, 00:14:41.283 "nvme_iov_md": false 00:14:41.283 }, 00:14:41.283 "memory_domains": [ 00:14:41.283 { 00:14:41.283 "dma_device_id": "system", 00:14:41.283 "dma_device_type": 1 00:14:41.283 }, 00:14:41.283 { 00:14:41.283 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:41.283 "dma_device_type": 2 00:14:41.283 } 00:14:41.283 ], 00:14:41.283 
"driver_specific": {} 00:14:41.283 }' 00:14:41.283 20:28:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:41.283 20:28:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:41.283 20:28:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:41.283 20:28:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:41.283 20:28:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:41.283 20:28:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:41.283 20:28:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:41.283 20:28:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:41.542 20:28:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:41.542 20:28:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:41.542 20:28:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:41.542 20:28:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:41.542 20:28:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:41.542 20:28:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:41.542 20:28:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:41.801 20:28:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:41.801 "name": "BaseBdev2", 00:14:41.801 "aliases": [ 00:14:41.801 "6ef408ac-8c1e-436a-b11a-b1aff5d4e3e6" 00:14:41.801 ], 00:14:41.801 "product_name": 
"Malloc disk", 00:14:41.801 "block_size": 512, 00:14:41.801 "num_blocks": 65536, 00:14:41.801 "uuid": "6ef408ac-8c1e-436a-b11a-b1aff5d4e3e6", 00:14:41.801 "assigned_rate_limits": { 00:14:41.801 "rw_ios_per_sec": 0, 00:14:41.801 "rw_mbytes_per_sec": 0, 00:14:41.801 "r_mbytes_per_sec": 0, 00:14:41.801 "w_mbytes_per_sec": 0 00:14:41.801 }, 00:14:41.801 "claimed": true, 00:14:41.801 "claim_type": "exclusive_write", 00:14:41.801 "zoned": false, 00:14:41.801 "supported_io_types": { 00:14:41.801 "read": true, 00:14:41.801 "write": true, 00:14:41.801 "unmap": true, 00:14:41.801 "flush": true, 00:14:41.801 "reset": true, 00:14:41.801 "nvme_admin": false, 00:14:41.801 "nvme_io": false, 00:14:41.801 "nvme_io_md": false, 00:14:41.801 "write_zeroes": true, 00:14:41.801 "zcopy": true, 00:14:41.801 "get_zone_info": false, 00:14:41.801 "zone_management": false, 00:14:41.801 "zone_append": false, 00:14:41.801 "compare": false, 00:14:41.801 "compare_and_write": false, 00:14:41.801 "abort": true, 00:14:41.801 "seek_hole": false, 00:14:41.801 "seek_data": false, 00:14:41.801 "copy": true, 00:14:41.801 "nvme_iov_md": false 00:14:41.801 }, 00:14:41.801 "memory_domains": [ 00:14:41.801 { 00:14:41.801 "dma_device_id": "system", 00:14:41.801 "dma_device_type": 1 00:14:41.801 }, 00:14:41.801 { 00:14:41.801 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:41.801 "dma_device_type": 2 00:14:41.801 } 00:14:41.801 ], 00:14:41.801 "driver_specific": {} 00:14:41.801 }' 00:14:41.801 20:28:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:41.801 20:28:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:41.801 20:28:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:41.801 20:28:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:41.801 20:28:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:41.801 
20:28:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:41.801 20:28:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:42.059 20:28:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:42.059 20:28:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:42.059 20:28:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:42.059 20:28:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:42.059 20:28:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:42.059 20:28:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:42.059 20:28:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:14:42.059 20:28:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:42.317 20:28:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:42.317 "name": "BaseBdev3", 00:14:42.317 "aliases": [ 00:14:42.317 "f6360a13-8e10-4cdc-828a-86f31d39af91" 00:14:42.317 ], 00:14:42.317 "product_name": "Malloc disk", 00:14:42.317 "block_size": 512, 00:14:42.317 "num_blocks": 65536, 00:14:42.317 "uuid": "f6360a13-8e10-4cdc-828a-86f31d39af91", 00:14:42.317 "assigned_rate_limits": { 00:14:42.317 "rw_ios_per_sec": 0, 00:14:42.317 "rw_mbytes_per_sec": 0, 00:14:42.317 "r_mbytes_per_sec": 0, 00:14:42.317 "w_mbytes_per_sec": 0 00:14:42.317 }, 00:14:42.317 "claimed": true, 00:14:42.317 "claim_type": "exclusive_write", 00:14:42.317 "zoned": false, 00:14:42.317 "supported_io_types": { 00:14:42.317 "read": true, 00:14:42.317 "write": true, 00:14:42.317 "unmap": true, 
00:14:42.317 "flush": true, 00:14:42.317 "reset": true, 00:14:42.317 "nvme_admin": false, 00:14:42.317 "nvme_io": false, 00:14:42.317 "nvme_io_md": false, 00:14:42.317 "write_zeroes": true, 00:14:42.317 "zcopy": true, 00:14:42.317 "get_zone_info": false, 00:14:42.317 "zone_management": false, 00:14:42.317 "zone_append": false, 00:14:42.317 "compare": false, 00:14:42.317 "compare_and_write": false, 00:14:42.317 "abort": true, 00:14:42.317 "seek_hole": false, 00:14:42.317 "seek_data": false, 00:14:42.317 "copy": true, 00:14:42.317 "nvme_iov_md": false 00:14:42.317 }, 00:14:42.318 "memory_domains": [ 00:14:42.318 { 00:14:42.318 "dma_device_id": "system", 00:14:42.318 "dma_device_type": 1 00:14:42.318 }, 00:14:42.318 { 00:14:42.318 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:42.318 "dma_device_type": 2 00:14:42.318 } 00:14:42.318 ], 00:14:42.318 "driver_specific": {} 00:14:42.318 }' 00:14:42.318 20:28:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:42.318 20:28:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:42.575 20:28:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:42.575 20:28:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:42.575 20:28:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:42.575 20:28:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:42.575 20:28:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:42.575 20:28:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:42.575 20:28:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:42.575 20:28:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:42.575 20:28:34 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:42.834 20:28:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:42.834 20:28:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:42.834 [2024-07-15 20:28:35.187297] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:42.834 [2024-07-15 20:28:35.187325] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:42.834 [2024-07-15 20:28:35.187384] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:42.834 [2024-07-15 20:28:35.187437] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:42.834 [2024-07-15 20:28:35.187449] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1485e90 name Existed_Raid, state offline 00:14:42.834 20:28:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 1374920 00:14:42.834 20:28:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 1374920 ']' 00:14:42.834 20:28:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 1374920 00:14:42.834 20:28:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:14:42.834 20:28:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:14:43.092 20:28:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1374920 00:14:43.093 20:28:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:14:43.093 20:28:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' 
reactor_0 = sudo ']' 00:14:43.093 20:28:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1374920' 00:14:43.093 killing process with pid 1374920 00:14:43.093 20:28:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 1374920 00:14:43.093 [2024-07-15 20:28:35.256408] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:43.093 20:28:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 1374920 00:14:43.093 [2024-07-15 20:28:35.286984] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:43.352 20:28:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:14:43.352 00:14:43.352 real 0m30.951s 00:14:43.352 user 0m57.012s 00:14:43.352 sys 0m5.403s 00:14:43.352 20:28:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:14:43.352 20:28:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:43.352 ************************************ 00:14:43.352 END TEST raid_state_function_test_sb 00:14:43.352 ************************************ 00:14:43.352 20:28:35 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:14:43.352 20:28:35 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid0 3 00:14:43.352 20:28:35 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:14:43.352 20:28:35 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:14:43.352 20:28:35 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:43.352 ************************************ 00:14:43.352 START TEST raid_superblock_test 00:14:43.352 ************************************ 00:14:43.352 20:28:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid0 3 00:14:43.352 20:28:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # 
local raid_level=raid0 00:14:43.352 20:28:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=3 00:14:43.352 20:28:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:14:43.352 20:28:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:14:43.352 20:28:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:14:43.352 20:28:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:14:43.352 20:28:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:14:43.352 20:28:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:14:43.352 20:28:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:14:43.352 20:28:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:14:43.352 20:28:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:14:43.352 20:28:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:14:43.352 20:28:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:14:43.352 20:28:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid0 '!=' raid1 ']' 00:14:43.352 20:28:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:14:43.352 20:28:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:14:43.352 20:28:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=1379489 00:14:43.352 20:28:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 1379489 /var/tmp/spdk-raid.sock 00:14:43.352 20:28:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r 
/var/tmp/spdk-raid.sock -L bdev_raid 00:14:43.352 20:28:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 1379489 ']' 00:14:43.352 20:28:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:43.352 20:28:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:14:43.352 20:28:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:43.352 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:43.352 20:28:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:14:43.352 20:28:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:43.352 [2024-07-15 20:28:35.659843] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
00:14:43.352 [2024-07-15 20:28:35.659912] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1379489 ] 00:14:43.610 [2024-07-15 20:28:35.791408] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:43.610 [2024-07-15 20:28:35.892986] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:43.610 [2024-07-15 20:28:35.951194] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:43.610 [2024-07-15 20:28:35.951231] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:44.545 20:28:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:14:44.545 20:28:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:14:44.545 20:28:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:14:44.545 20:28:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:14:44.545 20:28:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:14:44.545 20:28:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:14:44.545 20:28:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:14:44.545 20:28:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:14:44.545 20:28:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:14:44.545 20:28:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:14:44.545 20:28:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:14:44.545 malloc1 00:14:44.545 20:28:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:14:44.803 [2024-07-15 20:28:37.070733] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:14:44.804 [2024-07-15 20:28:37.070785] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:44.804 [2024-07-15 20:28:37.070805] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xed8570 00:14:44.804 [2024-07-15 20:28:37.070817] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:44.804 [2024-07-15 20:28:37.072410] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:44.804 [2024-07-15 20:28:37.072437] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:14:44.804 pt1 00:14:44.804 20:28:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:14:44.804 20:28:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:14:44.804 20:28:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:14:44.804 20:28:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:14:44.804 20:28:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:14:44.804 20:28:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:14:44.804 20:28:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:14:44.804 20:28:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:14:44.804 20:28:37 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:14:45.062 malloc2 00:14:45.062 20:28:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:14:45.321 [2024-07-15 20:28:37.584984] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:14:45.321 [2024-07-15 20:28:37.585036] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:45.322 [2024-07-15 20:28:37.585060] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xed9970 00:14:45.322 [2024-07-15 20:28:37.585073] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:45.322 [2024-07-15 20:28:37.586611] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:45.322 [2024-07-15 20:28:37.586639] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:14:45.322 pt2 00:14:45.322 20:28:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:14:45.322 20:28:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:14:45.322 20:28:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:14:45.322 20:28:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:14:45.322 20:28:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:14:45.322 20:28:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:14:45.322 20:28:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:14:45.322 20:28:37 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:14:45.322 20:28:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:14:45.580 malloc3 00:14:45.580 20:28:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:14:45.839 [2024-07-15 20:28:38.091039] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:14:45.839 [2024-07-15 20:28:38.091088] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:45.839 [2024-07-15 20:28:38.091106] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1070340 00:14:45.839 [2024-07-15 20:28:38.091119] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:45.839 [2024-07-15 20:28:38.092523] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:45.839 [2024-07-15 20:28:38.092551] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:14:45.839 pt3 00:14:45.839 20:28:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:14:45.839 20:28:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:14:45.839 20:28:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:14:46.097 [2024-07-15 20:28:38.343725] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:14:46.097 [2024-07-15 20:28:38.345012] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 
00:14:46.097 [2024-07-15 20:28:38.345068] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:14:46.097 [2024-07-15 20:28:38.345221] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xed0ea0 00:14:46.097 [2024-07-15 20:28:38.345232] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:14:46.097 [2024-07-15 20:28:38.345431] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xed8240 00:14:46.097 [2024-07-15 20:28:38.345572] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xed0ea0 00:14:46.097 [2024-07-15 20:28:38.345582] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xed0ea0 00:14:46.097 [2024-07-15 20:28:38.345680] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:46.097 20:28:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:14:46.097 20:28:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:46.097 20:28:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:46.097 20:28:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:46.097 20:28:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:46.097 20:28:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:46.098 20:28:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:46.098 20:28:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:46.098 20:28:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:46.098 20:28:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:46.098 20:28:38 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:46.098 20:28:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:46.357 20:28:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:46.357 "name": "raid_bdev1", 00:14:46.357 "uuid": "3caf4869-bdfe-4560-a908-a381f7b4516d", 00:14:46.357 "strip_size_kb": 64, 00:14:46.357 "state": "online", 00:14:46.357 "raid_level": "raid0", 00:14:46.357 "superblock": true, 00:14:46.357 "num_base_bdevs": 3, 00:14:46.357 "num_base_bdevs_discovered": 3, 00:14:46.357 "num_base_bdevs_operational": 3, 00:14:46.357 "base_bdevs_list": [ 00:14:46.357 { 00:14:46.357 "name": "pt1", 00:14:46.357 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:46.357 "is_configured": true, 00:14:46.357 "data_offset": 2048, 00:14:46.357 "data_size": 63488 00:14:46.357 }, 00:14:46.357 { 00:14:46.357 "name": "pt2", 00:14:46.357 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:46.357 "is_configured": true, 00:14:46.357 "data_offset": 2048, 00:14:46.357 "data_size": 63488 00:14:46.357 }, 00:14:46.357 { 00:14:46.357 "name": "pt3", 00:14:46.357 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:46.357 "is_configured": true, 00:14:46.357 "data_offset": 2048, 00:14:46.357 "data_size": 63488 00:14:46.357 } 00:14:46.357 ] 00:14:46.357 }' 00:14:46.357 20:28:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:46.357 20:28:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:47.290 20:28:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:14:47.290 20:28:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:14:47.290 20:28:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 
-- # local raid_bdev_info 00:14:47.290 20:28:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:47.290 20:28:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:47.290 20:28:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:14:47.290 20:28:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:47.290 20:28:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:47.290 [2024-07-15 20:28:39.531147] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:47.290 20:28:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:47.290 "name": "raid_bdev1", 00:14:47.290 "aliases": [ 00:14:47.290 "3caf4869-bdfe-4560-a908-a381f7b4516d" 00:14:47.290 ], 00:14:47.290 "product_name": "Raid Volume", 00:14:47.290 "block_size": 512, 00:14:47.290 "num_blocks": 190464, 00:14:47.290 "uuid": "3caf4869-bdfe-4560-a908-a381f7b4516d", 00:14:47.290 "assigned_rate_limits": { 00:14:47.290 "rw_ios_per_sec": 0, 00:14:47.290 "rw_mbytes_per_sec": 0, 00:14:47.290 "r_mbytes_per_sec": 0, 00:14:47.290 "w_mbytes_per_sec": 0 00:14:47.290 }, 00:14:47.290 "claimed": false, 00:14:47.290 "zoned": false, 00:14:47.290 "supported_io_types": { 00:14:47.290 "read": true, 00:14:47.290 "write": true, 00:14:47.290 "unmap": true, 00:14:47.290 "flush": true, 00:14:47.290 "reset": true, 00:14:47.290 "nvme_admin": false, 00:14:47.290 "nvme_io": false, 00:14:47.290 "nvme_io_md": false, 00:14:47.290 "write_zeroes": true, 00:14:47.290 "zcopy": false, 00:14:47.290 "get_zone_info": false, 00:14:47.290 "zone_management": false, 00:14:47.290 "zone_append": false, 00:14:47.290 "compare": false, 00:14:47.290 "compare_and_write": false, 00:14:47.290 "abort": false, 00:14:47.290 "seek_hole": false, 00:14:47.290 
"seek_data": false, 00:14:47.290 "copy": false, 00:14:47.290 "nvme_iov_md": false 00:14:47.290 }, 00:14:47.290 "memory_domains": [ 00:14:47.290 { 00:14:47.290 "dma_device_id": "system", 00:14:47.290 "dma_device_type": 1 00:14:47.290 }, 00:14:47.290 { 00:14:47.290 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:47.290 "dma_device_type": 2 00:14:47.290 }, 00:14:47.290 { 00:14:47.290 "dma_device_id": "system", 00:14:47.290 "dma_device_type": 1 00:14:47.290 }, 00:14:47.290 { 00:14:47.290 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:47.290 "dma_device_type": 2 00:14:47.290 }, 00:14:47.290 { 00:14:47.290 "dma_device_id": "system", 00:14:47.290 "dma_device_type": 1 00:14:47.290 }, 00:14:47.290 { 00:14:47.290 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:47.290 "dma_device_type": 2 00:14:47.290 } 00:14:47.290 ], 00:14:47.290 "driver_specific": { 00:14:47.290 "raid": { 00:14:47.290 "uuid": "3caf4869-bdfe-4560-a908-a381f7b4516d", 00:14:47.290 "strip_size_kb": 64, 00:14:47.290 "state": "online", 00:14:47.290 "raid_level": "raid0", 00:14:47.290 "superblock": true, 00:14:47.290 "num_base_bdevs": 3, 00:14:47.290 "num_base_bdevs_discovered": 3, 00:14:47.290 "num_base_bdevs_operational": 3, 00:14:47.290 "base_bdevs_list": [ 00:14:47.290 { 00:14:47.290 "name": "pt1", 00:14:47.290 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:47.290 "is_configured": true, 00:14:47.290 "data_offset": 2048, 00:14:47.290 "data_size": 63488 00:14:47.290 }, 00:14:47.290 { 00:14:47.290 "name": "pt2", 00:14:47.290 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:47.290 "is_configured": true, 00:14:47.290 "data_offset": 2048, 00:14:47.290 "data_size": 63488 00:14:47.290 }, 00:14:47.290 { 00:14:47.291 "name": "pt3", 00:14:47.291 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:47.291 "is_configured": true, 00:14:47.291 "data_offset": 2048, 00:14:47.291 "data_size": 63488 00:14:47.291 } 00:14:47.291 ] 00:14:47.291 } 00:14:47.291 } 00:14:47.291 }' 00:14:47.291 20:28:39 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:47.291 20:28:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:14:47.291 pt2 00:14:47.291 pt3' 00:14:47.291 20:28:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:47.291 20:28:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:14:47.291 20:28:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:47.548 20:28:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:47.548 "name": "pt1", 00:14:47.548 "aliases": [ 00:14:47.548 "00000000-0000-0000-0000-000000000001" 00:14:47.548 ], 00:14:47.548 "product_name": "passthru", 00:14:47.548 "block_size": 512, 00:14:47.548 "num_blocks": 65536, 00:14:47.548 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:47.548 "assigned_rate_limits": { 00:14:47.548 "rw_ios_per_sec": 0, 00:14:47.548 "rw_mbytes_per_sec": 0, 00:14:47.548 "r_mbytes_per_sec": 0, 00:14:47.548 "w_mbytes_per_sec": 0 00:14:47.548 }, 00:14:47.548 "claimed": true, 00:14:47.548 "claim_type": "exclusive_write", 00:14:47.548 "zoned": false, 00:14:47.548 "supported_io_types": { 00:14:47.548 "read": true, 00:14:47.548 "write": true, 00:14:47.548 "unmap": true, 00:14:47.548 "flush": true, 00:14:47.548 "reset": true, 00:14:47.548 "nvme_admin": false, 00:14:47.548 "nvme_io": false, 00:14:47.548 "nvme_io_md": false, 00:14:47.549 "write_zeroes": true, 00:14:47.549 "zcopy": true, 00:14:47.549 "get_zone_info": false, 00:14:47.549 "zone_management": false, 00:14:47.549 "zone_append": false, 00:14:47.549 "compare": false, 00:14:47.549 "compare_and_write": false, 00:14:47.549 "abort": true, 00:14:47.549 "seek_hole": false, 00:14:47.549 "seek_data": false, 
00:14:47.549 "copy": true, 00:14:47.549 "nvme_iov_md": false 00:14:47.549 }, 00:14:47.549 "memory_domains": [ 00:14:47.549 { 00:14:47.549 "dma_device_id": "system", 00:14:47.549 "dma_device_type": 1 00:14:47.549 }, 00:14:47.549 { 00:14:47.549 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:47.549 "dma_device_type": 2 00:14:47.549 } 00:14:47.549 ], 00:14:47.549 "driver_specific": { 00:14:47.549 "passthru": { 00:14:47.549 "name": "pt1", 00:14:47.549 "base_bdev_name": "malloc1" 00:14:47.549 } 00:14:47.549 } 00:14:47.549 }' 00:14:47.549 20:28:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:47.806 20:28:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:47.806 20:28:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:47.806 20:28:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:47.806 20:28:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:47.806 20:28:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:47.806 20:28:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:47.806 20:28:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:48.064 20:28:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:48.064 20:28:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:48.064 20:28:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:48.064 20:28:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:48.064 20:28:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:48.064 20:28:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_get_bdevs -b pt2 00:14:48.064 20:28:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:48.360 20:28:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:48.360 "name": "pt2", 00:14:48.360 "aliases": [ 00:14:48.360 "00000000-0000-0000-0000-000000000002" 00:14:48.360 ], 00:14:48.360 "product_name": "passthru", 00:14:48.360 "block_size": 512, 00:14:48.360 "num_blocks": 65536, 00:14:48.360 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:48.360 "assigned_rate_limits": { 00:14:48.360 "rw_ios_per_sec": 0, 00:14:48.360 "rw_mbytes_per_sec": 0, 00:14:48.360 "r_mbytes_per_sec": 0, 00:14:48.360 "w_mbytes_per_sec": 0 00:14:48.360 }, 00:14:48.360 "claimed": true, 00:14:48.360 "claim_type": "exclusive_write", 00:14:48.361 "zoned": false, 00:14:48.361 "supported_io_types": { 00:14:48.361 "read": true, 00:14:48.361 "write": true, 00:14:48.361 "unmap": true, 00:14:48.361 "flush": true, 00:14:48.361 "reset": true, 00:14:48.361 "nvme_admin": false, 00:14:48.361 "nvme_io": false, 00:14:48.361 "nvme_io_md": false, 00:14:48.361 "write_zeroes": true, 00:14:48.361 "zcopy": true, 00:14:48.361 "get_zone_info": false, 00:14:48.361 "zone_management": false, 00:14:48.361 "zone_append": false, 00:14:48.361 "compare": false, 00:14:48.361 "compare_and_write": false, 00:14:48.361 "abort": true, 00:14:48.361 "seek_hole": false, 00:14:48.361 "seek_data": false, 00:14:48.361 "copy": true, 00:14:48.361 "nvme_iov_md": false 00:14:48.361 }, 00:14:48.361 "memory_domains": [ 00:14:48.361 { 00:14:48.361 "dma_device_id": "system", 00:14:48.361 "dma_device_type": 1 00:14:48.361 }, 00:14:48.361 { 00:14:48.361 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:48.361 "dma_device_type": 2 00:14:48.361 } 00:14:48.361 ], 00:14:48.361 "driver_specific": { 00:14:48.361 "passthru": { 00:14:48.361 "name": "pt2", 00:14:48.361 "base_bdev_name": "malloc2" 00:14:48.361 } 00:14:48.361 } 00:14:48.361 }' 00:14:48.361 20:28:40 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:48.361 20:28:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:48.361 20:28:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:48.361 20:28:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:48.633 20:28:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:48.633 20:28:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:48.633 20:28:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:48.633 20:28:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:48.633 20:28:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:48.633 20:28:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:48.633 20:28:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:48.892 20:28:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:48.892 20:28:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:48.892 20:28:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:48.892 20:28:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:14:48.892 20:28:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:48.892 "name": "pt3", 00:14:48.892 "aliases": [ 00:14:48.892 "00000000-0000-0000-0000-000000000003" 00:14:48.892 ], 00:14:48.892 "product_name": "passthru", 00:14:48.892 "block_size": 512, 00:14:48.892 "num_blocks": 65536, 00:14:48.892 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:48.892 "assigned_rate_limits": { 
00:14:48.892 "rw_ios_per_sec": 0, 00:14:48.892 "rw_mbytes_per_sec": 0, 00:14:48.892 "r_mbytes_per_sec": 0, 00:14:48.892 "w_mbytes_per_sec": 0 00:14:48.892 }, 00:14:48.892 "claimed": true, 00:14:48.892 "claim_type": "exclusive_write", 00:14:48.892 "zoned": false, 00:14:48.892 "supported_io_types": { 00:14:48.892 "read": true, 00:14:48.892 "write": true, 00:14:48.892 "unmap": true, 00:14:48.892 "flush": true, 00:14:48.892 "reset": true, 00:14:48.892 "nvme_admin": false, 00:14:48.892 "nvme_io": false, 00:14:48.892 "nvme_io_md": false, 00:14:48.892 "write_zeroes": true, 00:14:48.892 "zcopy": true, 00:14:48.892 "get_zone_info": false, 00:14:48.892 "zone_management": false, 00:14:48.892 "zone_append": false, 00:14:48.892 "compare": false, 00:14:48.892 "compare_and_write": false, 00:14:48.892 "abort": true, 00:14:48.892 "seek_hole": false, 00:14:48.892 "seek_data": false, 00:14:48.892 "copy": true, 00:14:48.892 "nvme_iov_md": false 00:14:48.892 }, 00:14:48.892 "memory_domains": [ 00:14:48.892 { 00:14:48.892 "dma_device_id": "system", 00:14:48.892 "dma_device_type": 1 00:14:48.892 }, 00:14:48.892 { 00:14:48.892 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:48.892 "dma_device_type": 2 00:14:48.892 } 00:14:48.892 ], 00:14:48.892 "driver_specific": { 00:14:48.892 "passthru": { 00:14:48.892 "name": "pt3", 00:14:48.892 "base_bdev_name": "malloc3" 00:14:48.892 } 00:14:48.892 } 00:14:48.892 }' 00:14:48.892 20:28:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:49.150 20:28:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:49.150 20:28:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:49.150 20:28:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:49.150 20:28:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:49.150 20:28:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 
00:14:49.150 20:28:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:49.150 20:28:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:49.408 20:28:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:49.408 20:28:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:49.408 20:28:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:49.408 20:28:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:49.408 20:28:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:49.408 20:28:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:14:49.666 [2024-07-15 20:28:41.869368] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:49.667 20:28:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=3caf4869-bdfe-4560-a908-a381f7b4516d 00:14:49.667 20:28:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 3caf4869-bdfe-4560-a908-a381f7b4516d ']' 00:14:49.667 20:28:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:14:49.925 [2024-07-15 20:28:42.113729] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:49.925 [2024-07-15 20:28:42.113751] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:49.925 [2024-07-15 20:28:42.113808] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:49.925 [2024-07-15 20:28:42.113862] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free 
all in destruct 00:14:49.925 [2024-07-15 20:28:42.113874] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xed0ea0 name raid_bdev1, state offline 00:14:49.925 20:28:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:49.925 20:28:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:14:50.183 20:28:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:14:50.183 20:28:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:14:50.183 20:28:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:14:50.183 20:28:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:14:50.442 20:28:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:14:50.443 20:28:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:14:50.702 20:28:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:14:50.702 20:28:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:14:50.961 20:28:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:14:50.961 20:28:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:14:51.220 20:28:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' 
false == true ']' 00:14:51.220 20:28:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:14:51.220 20:28:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:14:51.220 20:28:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:14:51.220 20:28:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:51.220 20:28:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:14:51.220 20:28:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:51.220 20:28:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:14:51.220 20:28:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:51.220 20:28:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:14:51.220 20:28:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:51.220 20:28:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:14:51.220 20:28:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:14:51.478 [2024-07-15 20:28:43.605701] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:14:51.478 [2024-07-15 20:28:43.607119] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:14:51.478 [2024-07-15 20:28:43.607166] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:14:51.478 [2024-07-15 20:28:43.607214] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:14:51.478 [2024-07-15 20:28:43.607255] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:14:51.478 [2024-07-15 20:28:43.607278] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:14:51.478 [2024-07-15 20:28:43.607297] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:51.478 [2024-07-15 20:28:43.607308] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x107bff0 name raid_bdev1, state configuring 00:14:51.478 request: 00:14:51.478 { 00:14:51.478 "name": "raid_bdev1", 00:14:51.478 "raid_level": "raid0", 00:14:51.478 "base_bdevs": [ 00:14:51.478 "malloc1", 00:14:51.478 "malloc2", 00:14:51.478 "malloc3" 00:14:51.478 ], 00:14:51.478 "strip_size_kb": 64, 00:14:51.478 "superblock": false, 00:14:51.478 "method": "bdev_raid_create", 00:14:51.478 "req_id": 1 00:14:51.478 } 00:14:51.478 Got JSON-RPC error response 00:14:51.478 response: 00:14:51.478 { 00:14:51.478 "code": -17, 00:14:51.478 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:14:51.478 } 00:14:51.478 20:28:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:14:51.478 20:28:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 
00:14:51.478 20:28:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:14:51.478 20:28:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:14:51.478 20:28:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:51.478 20:28:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:14:52.046 20:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:14:52.046 20:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:14:52.046 20:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:14:52.046 [2024-07-15 20:28:44.367637] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:14:52.046 [2024-07-15 20:28:44.367692] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:52.046 [2024-07-15 20:28:44.367713] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xed87a0 00:14:52.046 [2024-07-15 20:28:44.367726] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:52.046 [2024-07-15 20:28:44.369367] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:52.046 [2024-07-15 20:28:44.369397] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:14:52.046 [2024-07-15 20:28:44.369468] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:14:52.046 [2024-07-15 20:28:44.369494] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:14:52.046 pt1 00:14:52.046 20:28:44 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 3 00:14:52.046 20:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:52.046 20:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:52.046 20:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:52.046 20:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:52.046 20:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:52.046 20:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:52.046 20:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:52.046 20:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:52.046 20:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:52.046 20:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:52.046 20:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:52.305 20:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:52.305 "name": "raid_bdev1", 00:14:52.305 "uuid": "3caf4869-bdfe-4560-a908-a381f7b4516d", 00:14:52.305 "strip_size_kb": 64, 00:14:52.305 "state": "configuring", 00:14:52.305 "raid_level": "raid0", 00:14:52.305 "superblock": true, 00:14:52.305 "num_base_bdevs": 3, 00:14:52.305 "num_base_bdevs_discovered": 1, 00:14:52.305 "num_base_bdevs_operational": 3, 00:14:52.305 "base_bdevs_list": [ 00:14:52.305 { 00:14:52.305 "name": "pt1", 00:14:52.305 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:52.305 
"is_configured": true, 00:14:52.305 "data_offset": 2048, 00:14:52.305 "data_size": 63488 00:14:52.305 }, 00:14:52.305 { 00:14:52.305 "name": null, 00:14:52.305 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:52.305 "is_configured": false, 00:14:52.305 "data_offset": 2048, 00:14:52.305 "data_size": 63488 00:14:52.305 }, 00:14:52.305 { 00:14:52.305 "name": null, 00:14:52.305 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:52.305 "is_configured": false, 00:14:52.305 "data_offset": 2048, 00:14:52.305 "data_size": 63488 00:14:52.305 } 00:14:52.305 ] 00:14:52.305 }' 00:14:52.305 20:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:52.305 20:28:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:52.872 20:28:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 3 -gt 2 ']' 00:14:52.872 20:28:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:14:53.130 [2024-07-15 20:28:45.442513] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:14:53.130 [2024-07-15 20:28:45.442564] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:53.130 [2024-07-15 20:28:45.442585] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xecfc70 00:14:53.130 [2024-07-15 20:28:45.442598] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:53.130 [2024-07-15 20:28:45.442965] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:53.130 [2024-07-15 20:28:45.442983] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:14:53.130 [2024-07-15 20:28:45.443053] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:14:53.130 [2024-07-15 
20:28:45.443073] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:14:53.130 pt2 00:14:53.130 20:28:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:14:53.389 [2024-07-15 20:28:45.623008] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:14:53.389 20:28:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 3 00:14:53.389 20:28:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:53.389 20:28:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:53.389 20:28:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:53.389 20:28:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:53.389 20:28:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:53.389 20:28:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:53.389 20:28:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:53.389 20:28:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:53.389 20:28:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:53.389 20:28:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:53.389 20:28:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:53.649 20:28:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:53.649 "name": "raid_bdev1", 00:14:53.649 
"uuid": "3caf4869-bdfe-4560-a908-a381f7b4516d", 00:14:53.649 "strip_size_kb": 64, 00:14:53.649 "state": "configuring", 00:14:53.649 "raid_level": "raid0", 00:14:53.649 "superblock": true, 00:14:53.649 "num_base_bdevs": 3, 00:14:53.649 "num_base_bdevs_discovered": 1, 00:14:53.649 "num_base_bdevs_operational": 3, 00:14:53.649 "base_bdevs_list": [ 00:14:53.649 { 00:14:53.649 "name": "pt1", 00:14:53.649 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:53.649 "is_configured": true, 00:14:53.649 "data_offset": 2048, 00:14:53.649 "data_size": 63488 00:14:53.649 }, 00:14:53.649 { 00:14:53.649 "name": null, 00:14:53.649 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:53.649 "is_configured": false, 00:14:53.649 "data_offset": 2048, 00:14:53.649 "data_size": 63488 00:14:53.649 }, 00:14:53.649 { 00:14:53.649 "name": null, 00:14:53.649 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:53.649 "is_configured": false, 00:14:53.649 "data_offset": 2048, 00:14:53.649 "data_size": 63488 00:14:53.649 } 00:14:53.649 ] 00:14:53.649 }' 00:14:53.649 20:28:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:53.649 20:28:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:54.217 20:28:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:14:54.217 20:28:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:14:54.217 20:28:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:14:54.217 [2024-07-15 20:28:46.573620] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:14:54.217 [2024-07-15 20:28:46.573676] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:54.217 [2024-07-15 20:28:46.573695] vbdev_passthru.c: 
680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1070fa0 00:14:54.217 [2024-07-15 20:28:46.573707] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:54.217 [2024-07-15 20:28:46.574066] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:54.217 [2024-07-15 20:28:46.574084] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:14:54.217 [2024-07-15 20:28:46.574150] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:14:54.217 [2024-07-15 20:28:46.574169] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:14:54.217 pt2 00:14:54.476 20:28:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:14:54.476 20:28:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:14:54.476 20:28:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:14:54.476 [2024-07-15 20:28:46.822283] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:14:54.476 [2024-07-15 20:28:46.822326] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:54.476 [2024-07-15 20:28:46.822344] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1071b30 00:14:54.476 [2024-07-15 20:28:46.822356] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:54.476 [2024-07-15 20:28:46.822655] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:54.476 [2024-07-15 20:28:46.822671] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:14:54.476 [2024-07-15 20:28:46.822725] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:14:54.476 
[2024-07-15 20:28:46.822743] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:14:54.476 [2024-07-15 20:28:46.822847] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1072c00 00:14:54.476 [2024-07-15 20:28:46.822857] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:14:54.476 [2024-07-15 20:28:46.823031] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x107b9b0 00:14:54.476 [2024-07-15 20:28:46.823153] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1072c00 00:14:54.476 [2024-07-15 20:28:46.823162] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1072c00 00:14:54.476 [2024-07-15 20:28:46.823258] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:54.476 pt3 00:14:54.476 20:28:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:14:54.476 20:28:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:14:54.476 20:28:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:14:54.476 20:28:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:54.476 20:28:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:54.476 20:28:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:54.476 20:28:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:54.476 20:28:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:54.476 20:28:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:54.476 20:28:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:54.476 20:28:46 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:54.476 20:28:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:54.476 20:28:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:54.476 20:28:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:54.735 20:28:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:54.735 "name": "raid_bdev1", 00:14:54.735 "uuid": "3caf4869-bdfe-4560-a908-a381f7b4516d", 00:14:54.735 "strip_size_kb": 64, 00:14:54.735 "state": "online", 00:14:54.735 "raid_level": "raid0", 00:14:54.735 "superblock": true, 00:14:54.735 "num_base_bdevs": 3, 00:14:54.735 "num_base_bdevs_discovered": 3, 00:14:54.735 "num_base_bdevs_operational": 3, 00:14:54.735 "base_bdevs_list": [ 00:14:54.735 { 00:14:54.735 "name": "pt1", 00:14:54.735 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:54.735 "is_configured": true, 00:14:54.735 "data_offset": 2048, 00:14:54.735 "data_size": 63488 00:14:54.735 }, 00:14:54.735 { 00:14:54.735 "name": "pt2", 00:14:54.735 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:54.735 "is_configured": true, 00:14:54.735 "data_offset": 2048, 00:14:54.735 "data_size": 63488 00:14:54.735 }, 00:14:54.735 { 00:14:54.735 "name": "pt3", 00:14:54.735 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:54.735 "is_configured": true, 00:14:54.735 "data_offset": 2048, 00:14:54.735 "data_size": 63488 00:14:54.735 } 00:14:54.735 ] 00:14:54.735 }' 00:14:54.735 20:28:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:54.735 20:28:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:55.672 20:28:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # 
verify_raid_bdev_properties raid_bdev1 00:14:55.672 20:28:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:14:55.672 20:28:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:55.672 20:28:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:55.672 20:28:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:55.672 20:28:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:14:55.672 20:28:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:55.672 20:28:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:55.930 [2024-07-15 20:28:48.198403] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:55.930 20:28:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:55.930 "name": "raid_bdev1", 00:14:55.930 "aliases": [ 00:14:55.930 "3caf4869-bdfe-4560-a908-a381f7b4516d" 00:14:55.930 ], 00:14:55.930 "product_name": "Raid Volume", 00:14:55.930 "block_size": 512, 00:14:55.930 "num_blocks": 190464, 00:14:55.931 "uuid": "3caf4869-bdfe-4560-a908-a381f7b4516d", 00:14:55.931 "assigned_rate_limits": { 00:14:55.931 "rw_ios_per_sec": 0, 00:14:55.931 "rw_mbytes_per_sec": 0, 00:14:55.931 "r_mbytes_per_sec": 0, 00:14:55.931 "w_mbytes_per_sec": 0 00:14:55.931 }, 00:14:55.931 "claimed": false, 00:14:55.931 "zoned": false, 00:14:55.931 "supported_io_types": { 00:14:55.931 "read": true, 00:14:55.931 "write": true, 00:14:55.931 "unmap": true, 00:14:55.931 "flush": true, 00:14:55.931 "reset": true, 00:14:55.931 "nvme_admin": false, 00:14:55.931 "nvme_io": false, 00:14:55.931 "nvme_io_md": false, 00:14:55.931 "write_zeroes": true, 00:14:55.931 "zcopy": false, 00:14:55.931 
"get_zone_info": false, 00:14:55.931 "zone_management": false, 00:14:55.931 "zone_append": false, 00:14:55.931 "compare": false, 00:14:55.931 "compare_and_write": false, 00:14:55.931 "abort": false, 00:14:55.931 "seek_hole": false, 00:14:55.931 "seek_data": false, 00:14:55.931 "copy": false, 00:14:55.931 "nvme_iov_md": false 00:14:55.931 }, 00:14:55.931 "memory_domains": [ 00:14:55.931 { 00:14:55.931 "dma_device_id": "system", 00:14:55.931 "dma_device_type": 1 00:14:55.931 }, 00:14:55.931 { 00:14:55.931 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:55.931 "dma_device_type": 2 00:14:55.931 }, 00:14:55.931 { 00:14:55.931 "dma_device_id": "system", 00:14:55.931 "dma_device_type": 1 00:14:55.931 }, 00:14:55.931 { 00:14:55.931 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:55.931 "dma_device_type": 2 00:14:55.931 }, 00:14:55.931 { 00:14:55.931 "dma_device_id": "system", 00:14:55.931 "dma_device_type": 1 00:14:55.931 }, 00:14:55.931 { 00:14:55.931 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:55.931 "dma_device_type": 2 00:14:55.931 } 00:14:55.931 ], 00:14:55.931 "driver_specific": { 00:14:55.931 "raid": { 00:14:55.931 "uuid": "3caf4869-bdfe-4560-a908-a381f7b4516d", 00:14:55.931 "strip_size_kb": 64, 00:14:55.931 "state": "online", 00:14:55.931 "raid_level": "raid0", 00:14:55.931 "superblock": true, 00:14:55.931 "num_base_bdevs": 3, 00:14:55.931 "num_base_bdevs_discovered": 3, 00:14:55.931 "num_base_bdevs_operational": 3, 00:14:55.931 "base_bdevs_list": [ 00:14:55.931 { 00:14:55.931 "name": "pt1", 00:14:55.931 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:55.931 "is_configured": true, 00:14:55.931 "data_offset": 2048, 00:14:55.931 "data_size": 63488 00:14:55.931 }, 00:14:55.931 { 00:14:55.931 "name": "pt2", 00:14:55.931 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:55.931 "is_configured": true, 00:14:55.931 "data_offset": 2048, 00:14:55.931 "data_size": 63488 00:14:55.931 }, 00:14:55.931 { 00:14:55.931 "name": "pt3", 00:14:55.931 "uuid": 
"00000000-0000-0000-0000-000000000003", 00:14:55.931 "is_configured": true, 00:14:55.931 "data_offset": 2048, 00:14:55.931 "data_size": 63488 00:14:55.931 } 00:14:55.931 ] 00:14:55.931 } 00:14:55.931 } 00:14:55.931 }' 00:14:55.931 20:28:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:55.931 20:28:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:14:55.931 pt2 00:14:55.931 pt3' 00:14:55.931 20:28:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:56.189 20:28:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:14:56.189 20:28:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:56.189 20:28:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:56.189 "name": "pt1", 00:14:56.189 "aliases": [ 00:14:56.189 "00000000-0000-0000-0000-000000000001" 00:14:56.189 ], 00:14:56.189 "product_name": "passthru", 00:14:56.189 "block_size": 512, 00:14:56.189 "num_blocks": 65536, 00:14:56.189 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:56.189 "assigned_rate_limits": { 00:14:56.189 "rw_ios_per_sec": 0, 00:14:56.189 "rw_mbytes_per_sec": 0, 00:14:56.189 "r_mbytes_per_sec": 0, 00:14:56.189 "w_mbytes_per_sec": 0 00:14:56.189 }, 00:14:56.189 "claimed": true, 00:14:56.189 "claim_type": "exclusive_write", 00:14:56.189 "zoned": false, 00:14:56.189 "supported_io_types": { 00:14:56.189 "read": true, 00:14:56.189 "write": true, 00:14:56.189 "unmap": true, 00:14:56.189 "flush": true, 00:14:56.189 "reset": true, 00:14:56.189 "nvme_admin": false, 00:14:56.189 "nvme_io": false, 00:14:56.189 "nvme_io_md": false, 00:14:56.189 "write_zeroes": true, 00:14:56.189 "zcopy": true, 00:14:56.189 "get_zone_info": false, 
00:14:56.189 "zone_management": false, 00:14:56.189 "zone_append": false, 00:14:56.189 "compare": false, 00:14:56.189 "compare_and_write": false, 00:14:56.189 "abort": true, 00:14:56.189 "seek_hole": false, 00:14:56.189 "seek_data": false, 00:14:56.189 "copy": true, 00:14:56.189 "nvme_iov_md": false 00:14:56.189 }, 00:14:56.189 "memory_domains": [ 00:14:56.189 { 00:14:56.189 "dma_device_id": "system", 00:14:56.189 "dma_device_type": 1 00:14:56.189 }, 00:14:56.189 { 00:14:56.189 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:56.189 "dma_device_type": 2 00:14:56.189 } 00:14:56.189 ], 00:14:56.189 "driver_specific": { 00:14:56.189 "passthru": { 00:14:56.189 "name": "pt1", 00:14:56.189 "base_bdev_name": "malloc1" 00:14:56.189 } 00:14:56.189 } 00:14:56.189 }' 00:14:56.189 20:28:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:56.447 20:28:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:56.447 20:28:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:56.447 20:28:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:56.447 20:28:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:56.447 20:28:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:56.447 20:28:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:56.447 20:28:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:56.706 20:28:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:56.706 20:28:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:56.706 20:28:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:56.706 20:28:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:56.706 20:28:48 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:56.706 20:28:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:14:56.706 20:28:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:57.274 20:28:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:57.275 "name": "pt2", 00:14:57.275 "aliases": [ 00:14:57.275 "00000000-0000-0000-0000-000000000002" 00:14:57.275 ], 00:14:57.275 "product_name": "passthru", 00:14:57.275 "block_size": 512, 00:14:57.275 "num_blocks": 65536, 00:14:57.275 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:57.275 "assigned_rate_limits": { 00:14:57.275 "rw_ios_per_sec": 0, 00:14:57.275 "rw_mbytes_per_sec": 0, 00:14:57.275 "r_mbytes_per_sec": 0, 00:14:57.275 "w_mbytes_per_sec": 0 00:14:57.275 }, 00:14:57.275 "claimed": true, 00:14:57.275 "claim_type": "exclusive_write", 00:14:57.275 "zoned": false, 00:14:57.275 "supported_io_types": { 00:14:57.275 "read": true, 00:14:57.275 "write": true, 00:14:57.275 "unmap": true, 00:14:57.275 "flush": true, 00:14:57.275 "reset": true, 00:14:57.275 "nvme_admin": false, 00:14:57.275 "nvme_io": false, 00:14:57.275 "nvme_io_md": false, 00:14:57.275 "write_zeroes": true, 00:14:57.275 "zcopy": true, 00:14:57.275 "get_zone_info": false, 00:14:57.275 "zone_management": false, 00:14:57.275 "zone_append": false, 00:14:57.275 "compare": false, 00:14:57.275 "compare_and_write": false, 00:14:57.275 "abort": true, 00:14:57.275 "seek_hole": false, 00:14:57.275 "seek_data": false, 00:14:57.275 "copy": true, 00:14:57.275 "nvme_iov_md": false 00:14:57.275 }, 00:14:57.275 "memory_domains": [ 00:14:57.275 { 00:14:57.275 "dma_device_id": "system", 00:14:57.275 "dma_device_type": 1 00:14:57.275 }, 00:14:57.275 { 00:14:57.275 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:57.275 
"dma_device_type": 2 00:14:57.275 } 00:14:57.275 ], 00:14:57.275 "driver_specific": { 00:14:57.275 "passthru": { 00:14:57.275 "name": "pt2", 00:14:57.275 "base_bdev_name": "malloc2" 00:14:57.275 } 00:14:57.275 } 00:14:57.275 }' 00:14:57.275 20:28:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:57.275 20:28:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:57.275 20:28:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:57.275 20:28:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:57.533 20:28:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:57.533 20:28:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:57.533 20:28:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:57.533 20:28:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:57.533 20:28:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:57.533 20:28:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:57.533 20:28:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:57.791 20:28:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:57.791 20:28:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:57.791 20:28:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:14:57.791 20:28:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:58.050 20:28:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:58.050 "name": "pt3", 00:14:58.050 "aliases": [ 00:14:58.050 
"00000000-0000-0000-0000-000000000003" 00:14:58.050 ], 00:14:58.050 "product_name": "passthru", 00:14:58.050 "block_size": 512, 00:14:58.050 "num_blocks": 65536, 00:14:58.050 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:58.050 "assigned_rate_limits": { 00:14:58.050 "rw_ios_per_sec": 0, 00:14:58.050 "rw_mbytes_per_sec": 0, 00:14:58.050 "r_mbytes_per_sec": 0, 00:14:58.050 "w_mbytes_per_sec": 0 00:14:58.050 }, 00:14:58.050 "claimed": true, 00:14:58.050 "claim_type": "exclusive_write", 00:14:58.050 "zoned": false, 00:14:58.050 "supported_io_types": { 00:14:58.050 "read": true, 00:14:58.050 "write": true, 00:14:58.050 "unmap": true, 00:14:58.050 "flush": true, 00:14:58.050 "reset": true, 00:14:58.050 "nvme_admin": false, 00:14:58.050 "nvme_io": false, 00:14:58.050 "nvme_io_md": false, 00:14:58.050 "write_zeroes": true, 00:14:58.050 "zcopy": true, 00:14:58.050 "get_zone_info": false, 00:14:58.050 "zone_management": false, 00:14:58.050 "zone_append": false, 00:14:58.050 "compare": false, 00:14:58.050 "compare_and_write": false, 00:14:58.050 "abort": true, 00:14:58.050 "seek_hole": false, 00:14:58.050 "seek_data": false, 00:14:58.050 "copy": true, 00:14:58.050 "nvme_iov_md": false 00:14:58.050 }, 00:14:58.050 "memory_domains": [ 00:14:58.050 { 00:14:58.050 "dma_device_id": "system", 00:14:58.050 "dma_device_type": 1 00:14:58.050 }, 00:14:58.050 { 00:14:58.050 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:58.050 "dma_device_type": 2 00:14:58.050 } 00:14:58.050 ], 00:14:58.050 "driver_specific": { 00:14:58.050 "passthru": { 00:14:58.050 "name": "pt3", 00:14:58.050 "base_bdev_name": "malloc3" 00:14:58.050 } 00:14:58.050 } 00:14:58.050 }' 00:14:58.050 20:28:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:58.050 20:28:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:58.050 20:28:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:58.050 20:28:50 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:58.050 20:28:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:58.309 20:28:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:58.309 20:28:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:58.309 20:28:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:58.309 20:28:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:58.309 20:28:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:58.309 20:28:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:58.567 20:28:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:58.567 20:28:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:58.568 20:28:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:14:58.827 [2024-07-15 20:28:50.949725] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:58.827 20:28:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 3caf4869-bdfe-4560-a908-a381f7b4516d '!=' 3caf4869-bdfe-4560-a908-a381f7b4516d ']' 00:14:58.827 20:28:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid0 00:14:58.827 20:28:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:58.827 20:28:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:14:58.827 20:28:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 1379489 00:14:58.827 20:28:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 1379489 ']' 00:14:58.827 20:28:50 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 1379489 00:14:58.827 20:28:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:14:58.827 20:28:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:14:58.827 20:28:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1379489 00:14:58.827 20:28:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:14:58.827 20:28:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:14:58.827 20:28:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1379489' 00:14:58.827 killing process with pid 1379489 00:14:58.827 20:28:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 1379489 00:14:58.827 [2024-07-15 20:28:51.017675] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:58.827 [2024-07-15 20:28:51.017734] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:58.827 [2024-07-15 20:28:51.017784] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:58.827 [2024-07-15 20:28:51.017797] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1072c00 name raid_bdev1, state offline 00:14:58.827 20:28:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 1379489 00:14:58.827 [2024-07-15 20:28:51.044392] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:59.086 20:28:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:14:59.086 00:14:59.086 real 0m15.652s 00:14:59.086 user 0m28.563s 00:14:59.086 sys 0m2.624s 00:14:59.086 20:28:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:14:59.086 20:28:51 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:59.086 ************************************ 00:14:59.086 END TEST raid_superblock_test 00:14:59.086 ************************************ 00:14:59.086 20:28:51 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:14:59.086 20:28:51 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid0 3 read 00:14:59.086 20:28:51 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:14:59.086 20:28:51 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:14:59.086 20:28:51 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:59.086 ************************************ 00:14:59.086 START TEST raid_read_error_test 00:14:59.086 ************************************ 00:14:59.086 20:28:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 3 read 00:14:59.086 20:28:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:14:59.086 20:28:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:14:59.086 20:28:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:14:59.086 20:28:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:14:59.086 20:28:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:59.086 20:28:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:14:59.086 20:28:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:59.086 20:28:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:59.086 20:28:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:14:59.086 20:28:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:59.086 20:28:51 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:59.086 20:28:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:14:59.087 20:28:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:59.087 20:28:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:59.087 20:28:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:14:59.087 20:28:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:14:59.087 20:28:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:14:59.087 20:28:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:14:59.087 20:28:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:14:59.087 20:28:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:14:59.087 20:28:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:14:59.087 20:28:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:14:59.087 20:28:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:14:59.087 20:28:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:14:59.087 20:28:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:14:59.087 20:28:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.ngCAKy9LaA 00:14:59.087 20:28:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1381806 00:14:59.087 20:28:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1381806 /var/tmp/spdk-raid.sock 00:14:59.087 20:28:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:14:59.087 20:28:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 1381806 ']' 00:14:59.087 20:28:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:59.087 20:28:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:14:59.087 20:28:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:59.087 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:59.087 20:28:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:14:59.087 20:28:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:59.087 [2024-07-15 20:28:51.410710] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
00:14:59.087 [2024-07-15 20:28:51.410773] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1381806 ] 00:14:59.346 [2024-07-15 20:28:51.540975] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:59.346 [2024-07-15 20:28:51.637868] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:59.346 [2024-07-15 20:28:51.701334] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:59.346 [2024-07-15 20:28:51.701375] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:00.281 20:28:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:15:00.281 20:28:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:15:00.281 20:28:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:00.281 20:28:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:15:00.281 BaseBdev1_malloc 00:15:00.281 20:28:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:15:00.849 true 00:15:00.849 20:28:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:15:01.107 [2024-07-15 20:28:53.380675] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:15:01.107 [2024-07-15 20:28:53.380724] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev 
opened 00:15:01.107 [2024-07-15 20:28:53.380747] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xddd0d0 00:15:01.107 [2024-07-15 20:28:53.380760] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:01.107 [2024-07-15 20:28:53.382697] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:01.107 [2024-07-15 20:28:53.382727] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:15:01.107 BaseBdev1 00:15:01.107 20:28:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:01.107 20:28:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:15:01.364 BaseBdev2_malloc 00:15:01.364 20:28:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:15:01.622 true 00:15:01.622 20:28:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:15:01.880 [2024-07-15 20:28:54.156597] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:15:01.880 [2024-07-15 20:28:54.156642] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:01.880 [2024-07-15 20:28:54.156664] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xde1910 00:15:01.880 [2024-07-15 20:28:54.156677] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:01.880 [2024-07-15 20:28:54.158250] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:01.880 [2024-07-15 20:28:54.158279] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:15:01.880 BaseBdev2 00:15:01.880 20:28:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:01.880 20:28:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:15:02.139 BaseBdev3_malloc 00:15:02.139 20:28:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:15:02.397 true 00:15:02.397 20:28:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:15:02.656 [2024-07-15 20:28:54.927320] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:15:02.656 [2024-07-15 20:28:54.927366] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:02.656 [2024-07-15 20:28:54.927388] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xde3bd0 00:15:02.656 [2024-07-15 20:28:54.927400] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:02.656 [2024-07-15 20:28:54.928993] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:02.656 [2024-07-15 20:28:54.929021] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:15:02.656 BaseBdev3 00:15:02.656 20:28:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:15:02.915 [2024-07-15 20:28:55.220137] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:02.915 [2024-07-15 20:28:55.221514] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:02.915 [2024-07-15 20:28:55.221586] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:02.915 [2024-07-15 20:28:55.221798] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xde5280 00:15:02.915 [2024-07-15 20:28:55.221810] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:15:02.915 [2024-07-15 20:28:55.222027] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xde4e20 00:15:02.915 [2024-07-15 20:28:55.222180] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xde5280 00:15:02.915 [2024-07-15 20:28:55.222190] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xde5280 00:15:02.915 [2024-07-15 20:28:55.222300] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:02.915 20:28:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:15:02.915 20:28:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:02.915 20:28:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:02.915 20:28:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:02.915 20:28:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:02.915 20:28:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:02.915 20:28:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:02.915 20:28:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:02.915 
20:28:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:02.915 20:28:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:02.915 20:28:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:02.915 20:28:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:03.174 20:28:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:03.174 "name": "raid_bdev1", 00:15:03.174 "uuid": "27e08cfa-8534-41bc-8317-da72bc143179", 00:15:03.174 "strip_size_kb": 64, 00:15:03.174 "state": "online", 00:15:03.174 "raid_level": "raid0", 00:15:03.174 "superblock": true, 00:15:03.174 "num_base_bdevs": 3, 00:15:03.174 "num_base_bdevs_discovered": 3, 00:15:03.174 "num_base_bdevs_operational": 3, 00:15:03.174 "base_bdevs_list": [ 00:15:03.174 { 00:15:03.174 "name": "BaseBdev1", 00:15:03.174 "uuid": "d90d71de-cad6-5b2f-85e7-2c7b7b5136e5", 00:15:03.174 "is_configured": true, 00:15:03.174 "data_offset": 2048, 00:15:03.174 "data_size": 63488 00:15:03.174 }, 00:15:03.174 { 00:15:03.174 "name": "BaseBdev2", 00:15:03.174 "uuid": "7febf420-1f3c-5035-b464-846de9b88455", 00:15:03.174 "is_configured": true, 00:15:03.174 "data_offset": 2048, 00:15:03.174 "data_size": 63488 00:15:03.174 }, 00:15:03.174 { 00:15:03.174 "name": "BaseBdev3", 00:15:03.174 "uuid": "360d1976-3065-50f1-bdb5-af5743b03e55", 00:15:03.174 "is_configured": true, 00:15:03.174 "data_offset": 2048, 00:15:03.174 "data_size": 63488 00:15:03.174 } 00:15:03.174 ] 00:15:03.174 }' 00:15:03.174 20:28:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:03.174 20:28:55 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:04.110 20:28:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- 
# sleep 1 00:15:04.110 20:28:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:15:04.110 [2024-07-15 20:28:56.251139] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc335b0 00:15:05.049 20:28:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:15:05.049 20:28:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:15:05.049 20:28:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:15:05.049 20:28:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:15:05.049 20:28:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:15:05.049 20:28:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:05.049 20:28:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:05.049 20:28:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:05.049 20:28:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:05.049 20:28:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:05.049 20:28:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:05.049 20:28:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:05.049 20:28:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:05.049 20:28:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 
00:15:05.049 20:28:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:05.049 20:28:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:05.327 20:28:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:05.327 "name": "raid_bdev1", 00:15:05.327 "uuid": "27e08cfa-8534-41bc-8317-da72bc143179", 00:15:05.327 "strip_size_kb": 64, 00:15:05.327 "state": "online", 00:15:05.327 "raid_level": "raid0", 00:15:05.327 "superblock": true, 00:15:05.327 "num_base_bdevs": 3, 00:15:05.327 "num_base_bdevs_discovered": 3, 00:15:05.327 "num_base_bdevs_operational": 3, 00:15:05.327 "base_bdevs_list": [ 00:15:05.327 { 00:15:05.327 "name": "BaseBdev1", 00:15:05.327 "uuid": "d90d71de-cad6-5b2f-85e7-2c7b7b5136e5", 00:15:05.327 "is_configured": true, 00:15:05.327 "data_offset": 2048, 00:15:05.327 "data_size": 63488 00:15:05.327 }, 00:15:05.327 { 00:15:05.327 "name": "BaseBdev2", 00:15:05.327 "uuid": "7febf420-1f3c-5035-b464-846de9b88455", 00:15:05.327 "is_configured": true, 00:15:05.327 "data_offset": 2048, 00:15:05.327 "data_size": 63488 00:15:05.327 }, 00:15:05.327 { 00:15:05.327 "name": "BaseBdev3", 00:15:05.327 "uuid": "360d1976-3065-50f1-bdb5-af5743b03e55", 00:15:05.327 "is_configured": true, 00:15:05.327 "data_offset": 2048, 00:15:05.327 "data_size": 63488 00:15:05.327 } 00:15:05.327 ] 00:15:05.327 }' 00:15:05.327 20:28:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:05.327 20:28:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:05.903 20:28:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:15:06.162 [2024-07-15 20:28:58.477261] 
bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:06.162 [2024-07-15 20:28:58.477303] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:06.163 [2024-07-15 20:28:58.480654] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:06.163 [2024-07-15 20:28:58.480695] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:06.163 [2024-07-15 20:28:58.480738] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:06.163 [2024-07-15 20:28:58.480750] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xde5280 name raid_bdev1, state offline 00:15:06.163 0 00:15:06.163 20:28:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1381806 00:15:06.163 20:28:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 1381806 ']' 00:15:06.163 20:28:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 1381806 00:15:06.163 20:28:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:15:06.163 20:28:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:15:06.163 20:28:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1381806 00:15:06.422 20:28:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:15:06.422 20:28:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:15:06.422 20:28:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1381806' 00:15:06.422 killing process with pid 1381806 00:15:06.422 20:28:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 1381806 00:15:06.422 [2024-07-15 20:28:58.551835] bdev_raid.c:1358:raid_bdev_fini_start: 
*DEBUG*: raid_bdev_fini_start 00:15:06.422 20:28:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 1381806 00:15:06.422 [2024-07-15 20:28:58.573235] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:06.422 20:28:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.ngCAKy9LaA 00:15:06.422 20:28:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:15:06.681 20:28:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:15:06.681 20:28:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.45 00:15:06.681 20:28:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:15:06.681 20:28:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:06.681 20:28:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:15:06.681 20:28:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.45 != \0\.\0\0 ]] 00:15:06.681 00:15:06.681 real 0m7.480s 00:15:06.681 user 0m12.034s 00:15:06.681 sys 0m1.256s 00:15:06.681 20:28:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:15:06.681 20:28:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:06.681 ************************************ 00:15:06.681 END TEST raid_read_error_test 00:15:06.681 ************************************ 00:15:06.681 20:28:58 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:15:06.681 20:28:58 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid0 3 write 00:15:06.681 20:28:58 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:15:06.681 20:28:58 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:15:06.681 20:28:58 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:06.681 ************************************ 
00:15:06.681 START TEST raid_write_error_test 00:15:06.681 ************************************ 00:15:06.681 20:28:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 3 write 00:15:06.681 20:28:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:15:06.681 20:28:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:15:06.681 20:28:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:15:06.681 20:28:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:15:06.681 20:28:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:06.681 20:28:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:15:06.681 20:28:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:06.681 20:28:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:06.681 20:28:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:15:06.681 20:28:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:06.681 20:28:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:06.681 20:28:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:15:06.681 20:28:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:06.681 20:28:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:06.681 20:28:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:15:06.681 20:28:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:15:06.681 20:28:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local 
raid_bdev_name=raid_bdev1 00:15:06.681 20:28:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:15:06.681 20:28:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:15:06.681 20:28:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:15:06.681 20:28:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:15:06.681 20:28:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:15:06.681 20:28:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:15:06.681 20:28:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:15:06.682 20:28:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:15:06.682 20:28:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.G5lb3KZ07G 00:15:06.682 20:28:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1382952 00:15:06.682 20:28:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1382952 /var/tmp/spdk-raid.sock 00:15:06.682 20:28:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:15:06.682 20:28:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 1382952 ']' 00:15:06.682 20:28:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:06.682 20:28:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:15:06.682 20:28:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
00:15:06.682 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:06.682 20:28:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:15:06.682 20:28:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:06.682 [2024-07-15 20:28:58.972519] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 00:15:06.682 [2024-07-15 20:28:58.972572] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1382952 ] 00:15:06.941 [2024-07-15 20:28:59.080181] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:06.941 [2024-07-15 20:28:59.181263] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:06.941 [2024-07-15 20:28:59.242897] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:06.941 [2024-07-15 20:28:59.242951] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:07.877 20:28:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:15:07.877 20:28:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:15:07.877 20:28:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:07.877 20:28:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:15:07.877 BaseBdev1_malloc 00:15:07.877 20:29:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:15:08.136 true 00:15:08.136 20:29:00 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:15:08.394 [2024-07-15 20:29:00.651569] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:15:08.394 [2024-07-15 20:29:00.651617] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:08.394 [2024-07-15 20:29:00.651637] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc110d0 00:15:08.394 [2024-07-15 20:29:00.651650] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:08.394 [2024-07-15 20:29:00.653381] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:08.394 [2024-07-15 20:29:00.653408] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:15:08.394 BaseBdev1 00:15:08.394 20:29:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:08.394 20:29:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:15:08.652 BaseBdev2_malloc 00:15:08.652 20:29:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:15:08.910 true 00:15:08.910 20:29:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:15:09.169 [2024-07-15 20:29:01.386069] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:15:09.169 [2024-07-15 20:29:01.386113] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:09.169 [2024-07-15 20:29:01.386135] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc15910 00:15:09.169 [2024-07-15 20:29:01.386147] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:09.169 [2024-07-15 20:29:01.387619] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:09.169 [2024-07-15 20:29:01.387647] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:15:09.169 BaseBdev2 00:15:09.169 20:29:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:09.169 20:29:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:15:09.427 BaseBdev3_malloc 00:15:09.427 20:29:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:15:09.686 true 00:15:09.686 20:29:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:15:09.945 [2024-07-15 20:29:02.128556] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:15:09.945 [2024-07-15 20:29:02.128599] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:09.945 [2024-07-15 20:29:02.128619] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc17bd0 00:15:09.945 [2024-07-15 20:29:02.128632] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:09.945 [2024-07-15 20:29:02.130059] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 
00:15:09.945 [2024-07-15 20:29:02.130087] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:15:09.945 BaseBdev3 00:15:09.945 20:29:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:15:10.203 [2024-07-15 20:29:02.377244] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:10.203 [2024-07-15 20:29:02.378428] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:10.203 [2024-07-15 20:29:02.378495] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:10.203 [2024-07-15 20:29:02.378699] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xc19280 00:15:10.203 [2024-07-15 20:29:02.378711] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:15:10.203 [2024-07-15 20:29:02.378890] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc18e20 00:15:10.203 [2024-07-15 20:29:02.379038] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xc19280 00:15:10.203 [2024-07-15 20:29:02.379049] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xc19280 00:15:10.203 [2024-07-15 20:29:02.379145] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:10.203 20:29:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:15:10.203 20:29:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:10.203 20:29:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:10.203 20:29:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local 
raid_level=raid0 00:15:10.203 20:29:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:10.203 20:29:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:10.203 20:29:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:10.203 20:29:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:10.203 20:29:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:10.203 20:29:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:10.203 20:29:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:10.203 20:29:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:10.462 20:29:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:10.462 "name": "raid_bdev1", 00:15:10.462 "uuid": "fe139eb5-9938-4690-85de-49de2897f5cb", 00:15:10.462 "strip_size_kb": 64, 00:15:10.462 "state": "online", 00:15:10.462 "raid_level": "raid0", 00:15:10.462 "superblock": true, 00:15:10.462 "num_base_bdevs": 3, 00:15:10.462 "num_base_bdevs_discovered": 3, 00:15:10.462 "num_base_bdevs_operational": 3, 00:15:10.462 "base_bdevs_list": [ 00:15:10.462 { 00:15:10.462 "name": "BaseBdev1", 00:15:10.462 "uuid": "3b47e3dd-5e62-5486-957b-ac365b245b24", 00:15:10.462 "is_configured": true, 00:15:10.462 "data_offset": 2048, 00:15:10.462 "data_size": 63488 00:15:10.462 }, 00:15:10.462 { 00:15:10.462 "name": "BaseBdev2", 00:15:10.462 "uuid": "5e6169b1-23c2-5d2a-8ba3-039ab7b66b1a", 00:15:10.462 "is_configured": true, 00:15:10.462 "data_offset": 2048, 00:15:10.462 "data_size": 63488 00:15:10.462 }, 00:15:10.462 { 00:15:10.462 "name": "BaseBdev3", 00:15:10.462 
"uuid": "a63e487f-1ec2-5e25-9f75-d213965a599e", 00:15:10.462 "is_configured": true, 00:15:10.462 "data_offset": 2048, 00:15:10.462 "data_size": 63488 00:15:10.462 } 00:15:10.462 ] 00:15:10.462 }' 00:15:10.462 20:29:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:10.462 20:29:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:11.029 20:29:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:15:11.029 20:29:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:15:11.029 [2024-07-15 20:29:03.348103] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa675b0 00:15:11.966 20:29:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:15:12.225 20:29:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:15:12.225 20:29:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:15:12.225 20:29:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:15:12.225 20:29:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:15:12.225 20:29:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:12.225 20:29:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:12.225 20:29:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:12.225 20:29:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:12.225 20:29:04 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:12.225 20:29:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:12.225 20:29:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:12.225 20:29:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:12.225 20:29:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:12.225 20:29:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:12.225 20:29:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:12.484 20:29:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:12.484 "name": "raid_bdev1", 00:15:12.484 "uuid": "fe139eb5-9938-4690-85de-49de2897f5cb", 00:15:12.484 "strip_size_kb": 64, 00:15:12.484 "state": "online", 00:15:12.484 "raid_level": "raid0", 00:15:12.484 "superblock": true, 00:15:12.484 "num_base_bdevs": 3, 00:15:12.484 "num_base_bdevs_discovered": 3, 00:15:12.484 "num_base_bdevs_operational": 3, 00:15:12.484 "base_bdevs_list": [ 00:15:12.484 { 00:15:12.484 "name": "BaseBdev1", 00:15:12.484 "uuid": "3b47e3dd-5e62-5486-957b-ac365b245b24", 00:15:12.484 "is_configured": true, 00:15:12.484 "data_offset": 2048, 00:15:12.484 "data_size": 63488 00:15:12.484 }, 00:15:12.484 { 00:15:12.484 "name": "BaseBdev2", 00:15:12.484 "uuid": "5e6169b1-23c2-5d2a-8ba3-039ab7b66b1a", 00:15:12.484 "is_configured": true, 00:15:12.484 "data_offset": 2048, 00:15:12.484 "data_size": 63488 00:15:12.484 }, 00:15:12.484 { 00:15:12.484 "name": "BaseBdev3", 00:15:12.484 "uuid": "a63e487f-1ec2-5e25-9f75-d213965a599e", 00:15:12.484 "is_configured": true, 00:15:12.484 "data_offset": 2048, 00:15:12.484 "data_size": 
63488 00:15:12.484 } 00:15:12.484 ] 00:15:12.484 }' 00:15:12.484 20:29:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:12.484 20:29:04 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:13.052 20:29:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:15:13.312 [2024-07-15 20:29:05.500822] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:13.312 [2024-07-15 20:29:05.500869] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:13.312 [2024-07-15 20:29:05.504059] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:13.312 [2024-07-15 20:29:05.504097] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:13.312 [2024-07-15 20:29:05.504134] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:13.312 [2024-07-15 20:29:05.504146] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xc19280 name raid_bdev1, state offline 00:15:13.312 0 00:15:13.312 20:29:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1382952 00:15:13.312 20:29:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 1382952 ']' 00:15:13.312 20:29:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 1382952 00:15:13.312 20:29:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:15:13.312 20:29:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:15:13.312 20:29:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1382952 00:15:13.312 20:29:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # 
process_name=reactor_0 00:15:13.312 20:29:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:15:13.312 20:29:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1382952' 00:15:13.312 killing process with pid 1382952 00:15:13.312 20:29:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 1382952 00:15:13.312 [2024-07-15 20:29:05.582773] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:13.312 20:29:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 1382952 00:15:13.312 [2024-07-15 20:29:05.604141] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:13.571 20:29:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.G5lb3KZ07G 00:15:13.571 20:29:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:15:13.571 20:29:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:15:13.571 20:29:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.47 00:15:13.571 20:29:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:15:13.571 20:29:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:13.571 20:29:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:15:13.571 20:29:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.47 != \0\.\0\0 ]] 00:15:13.571 00:15:13.571 real 0m6.942s 00:15:13.571 user 0m11.044s 00:15:13.571 sys 0m1.175s 00:15:13.571 20:29:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:15:13.571 20:29:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:13.571 ************************************ 00:15:13.571 END TEST raid_write_error_test 00:15:13.571 
************************************ 00:15:13.571 20:29:05 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:15:13.571 20:29:05 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:15:13.571 20:29:05 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test concat 3 false 00:15:13.571 20:29:05 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:15:13.571 20:29:05 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:15:13.571 20:29:05 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:13.572 ************************************ 00:15:13.572 START TEST raid_state_function_test 00:15:13.572 ************************************ 00:15:13.572 20:29:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 3 false 00:15:13.572 20:29:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:15:13.572 20:29:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:15:13.572 20:29:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:15:13.572 20:29:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:15:13.572 20:29:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:15:13.572 20:29:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:13.572 20:29:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:15:13.572 20:29:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:13.572 20:29:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:13.572 20:29:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:15:13.572 20:29:05 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:13.572 20:29:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:13.572 20:29:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:15:13.572 20:29:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:13.572 20:29:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:13.572 20:29:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:15:13.572 20:29:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:15:13.572 20:29:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:15:13.572 20:29:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:15:13.572 20:29:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:15:13.572 20:29:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:15:13.572 20:29:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:15:13.572 20:29:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:15:13.572 20:29:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:15:13.572 20:29:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:15:13.572 20:29:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:15:13.572 20:29:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=1383934 00:15:13.572 20:29:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1383934' 00:15:13.572 Process raid pid: 1383934 00:15:13.572 20:29:05 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:15:13.572 20:29:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 1383934 /var/tmp/spdk-raid.sock 00:15:13.572 20:29:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 1383934 ']' 00:15:13.572 20:29:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:13.572 20:29:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:15:13.572 20:29:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:13.572 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:13.572 20:29:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:15:13.572 20:29:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:13.849 [2024-07-15 20:29:05.996666] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
00:15:13.849 [2024-07-15 20:29:05.996731] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:15:13.849 [2024-07-15 20:29:06.115695] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:13.849 [2024-07-15 20:29:06.221040] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:14.107 [2024-07-15 20:29:06.274768] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:14.107 [2024-07-15 20:29:06.274799] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:14.107 20:29:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:15:14.107 20:29:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:15:14.107 20:29:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:14.366 [2024-07-15 20:29:06.676923] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:14.366 [2024-07-15 20:29:06.676975] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:14.366 [2024-07-15 20:29:06.676986] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:14.366 [2024-07-15 20:29:06.676998] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:14.366 [2024-07-15 20:29:06.677006] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:14.366 [2024-07-15 20:29:06.677021] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:14.366 
20:29:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:14.366 20:29:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:14.366 20:29:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:14.366 20:29:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:14.366 20:29:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:14.366 20:29:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:14.366 20:29:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:14.366 20:29:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:14.366 20:29:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:14.366 20:29:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:14.366 20:29:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:14.366 20:29:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:14.625 20:29:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:14.625 "name": "Existed_Raid", 00:15:14.625 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:14.625 "strip_size_kb": 64, 00:15:14.625 "state": "configuring", 00:15:14.625 "raid_level": "concat", 00:15:14.625 "superblock": false, 00:15:14.625 "num_base_bdevs": 3, 00:15:14.625 "num_base_bdevs_discovered": 0, 00:15:14.625 "num_base_bdevs_operational": 3, 00:15:14.625 "base_bdevs_list": [ 00:15:14.625 { 
00:15:14.625 "name": "BaseBdev1", 00:15:14.625 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:14.625 "is_configured": false, 00:15:14.625 "data_offset": 0, 00:15:14.625 "data_size": 0 00:15:14.625 }, 00:15:14.625 { 00:15:14.625 "name": "BaseBdev2", 00:15:14.625 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:14.625 "is_configured": false, 00:15:14.625 "data_offset": 0, 00:15:14.625 "data_size": 0 00:15:14.625 }, 00:15:14.625 { 00:15:14.625 "name": "BaseBdev3", 00:15:14.625 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:14.625 "is_configured": false, 00:15:14.625 "data_offset": 0, 00:15:14.625 "data_size": 0 00:15:14.625 } 00:15:14.625 ] 00:15:14.625 }' 00:15:14.625 20:29:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:14.625 20:29:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:15.190 20:29:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:15.448 [2024-07-15 20:29:07.763657] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:15.448 [2024-07-15 20:29:07.763688] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf41a80 name Existed_Raid, state configuring 00:15:15.448 20:29:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:15.705 [2024-07-15 20:29:08.012348] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:15.705 [2024-07-15 20:29:08.012384] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:15.705 [2024-07-15 20:29:08.012394] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with 
name: BaseBdev2 00:15:15.705 [2024-07-15 20:29:08.012406] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:15.705 [2024-07-15 20:29:08.012415] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:15.705 [2024-07-15 20:29:08.012426] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:15.705 20:29:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:15:15.964 [2024-07-15 20:29:08.266897] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:15.964 BaseBdev1 00:15:15.964 20:29:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:15:15.964 20:29:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:15:15.964 20:29:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:15.964 20:29:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:15:15.964 20:29:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:15.964 20:29:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:15.964 20:29:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:16.221 20:29:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:15:16.479 [ 00:15:16.479 { 00:15:16.479 "name": "BaseBdev1", 00:15:16.479 "aliases": [ 00:15:16.479 
"15bde8f0-6000-42b5-91b6-ef51e1b7f31b" 00:15:16.479 ], 00:15:16.479 "product_name": "Malloc disk", 00:15:16.479 "block_size": 512, 00:15:16.479 "num_blocks": 65536, 00:15:16.479 "uuid": "15bde8f0-6000-42b5-91b6-ef51e1b7f31b", 00:15:16.479 "assigned_rate_limits": { 00:15:16.479 "rw_ios_per_sec": 0, 00:15:16.479 "rw_mbytes_per_sec": 0, 00:15:16.479 "r_mbytes_per_sec": 0, 00:15:16.479 "w_mbytes_per_sec": 0 00:15:16.479 }, 00:15:16.479 "claimed": true, 00:15:16.479 "claim_type": "exclusive_write", 00:15:16.479 "zoned": false, 00:15:16.479 "supported_io_types": { 00:15:16.479 "read": true, 00:15:16.479 "write": true, 00:15:16.479 "unmap": true, 00:15:16.479 "flush": true, 00:15:16.479 "reset": true, 00:15:16.479 "nvme_admin": false, 00:15:16.479 "nvme_io": false, 00:15:16.479 "nvme_io_md": false, 00:15:16.479 "write_zeroes": true, 00:15:16.479 "zcopy": true, 00:15:16.479 "get_zone_info": false, 00:15:16.479 "zone_management": false, 00:15:16.479 "zone_append": false, 00:15:16.479 "compare": false, 00:15:16.479 "compare_and_write": false, 00:15:16.479 "abort": true, 00:15:16.479 "seek_hole": false, 00:15:16.479 "seek_data": false, 00:15:16.479 "copy": true, 00:15:16.479 "nvme_iov_md": false 00:15:16.479 }, 00:15:16.479 "memory_domains": [ 00:15:16.479 { 00:15:16.479 "dma_device_id": "system", 00:15:16.479 "dma_device_type": 1 00:15:16.479 }, 00:15:16.479 { 00:15:16.479 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:16.479 "dma_device_type": 2 00:15:16.479 } 00:15:16.479 ], 00:15:16.479 "driver_specific": {} 00:15:16.479 } 00:15:16.479 ] 00:15:16.479 20:29:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:15:16.479 20:29:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:16.479 20:29:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:16.479 20:29:08 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:16.479 20:29:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:16.479 20:29:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:16.479 20:29:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:16.479 20:29:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:16.479 20:29:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:16.479 20:29:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:16.479 20:29:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:16.479 20:29:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:16.479 20:29:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:16.737 20:29:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:16.737 "name": "Existed_Raid", 00:15:16.737 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:16.737 "strip_size_kb": 64, 00:15:16.737 "state": "configuring", 00:15:16.737 "raid_level": "concat", 00:15:16.737 "superblock": false, 00:15:16.737 "num_base_bdevs": 3, 00:15:16.737 "num_base_bdevs_discovered": 1, 00:15:16.737 "num_base_bdevs_operational": 3, 00:15:16.737 "base_bdevs_list": [ 00:15:16.737 { 00:15:16.737 "name": "BaseBdev1", 00:15:16.737 "uuid": "15bde8f0-6000-42b5-91b6-ef51e1b7f31b", 00:15:16.737 "is_configured": true, 00:15:16.737 "data_offset": 0, 00:15:16.737 "data_size": 65536 00:15:16.737 }, 00:15:16.737 { 00:15:16.737 "name": "BaseBdev2", 00:15:16.737 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:15:16.737 "is_configured": false, 00:15:16.737 "data_offset": 0, 00:15:16.737 "data_size": 0 00:15:16.737 }, 00:15:16.737 { 00:15:16.737 "name": "BaseBdev3", 00:15:16.737 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:16.737 "is_configured": false, 00:15:16.737 "data_offset": 0, 00:15:16.737 "data_size": 0 00:15:16.737 } 00:15:16.737 ] 00:15:16.737 }' 00:15:16.737 20:29:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:16.737 20:29:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:17.304 20:29:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:17.563 [2024-07-15 20:29:09.863152] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:17.563 [2024-07-15 20:29:09.863195] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf41310 name Existed_Raid, state configuring 00:15:17.563 20:29:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:17.821 [2024-07-15 20:29:10.107849] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:17.821 [2024-07-15 20:29:10.109355] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:17.821 [2024-07-15 20:29:10.109388] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:17.821 [2024-07-15 20:29:10.109398] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:17.821 [2024-07-15 20:29:10.109410] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 
00:15:17.821 20:29:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:15:17.821 20:29:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:17.821 20:29:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:17.821 20:29:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:17.821 20:29:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:17.821 20:29:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:17.821 20:29:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:17.821 20:29:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:17.821 20:29:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:17.821 20:29:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:17.821 20:29:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:17.821 20:29:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:17.821 20:29:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:17.822 20:29:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:18.081 20:29:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:18.081 "name": "Existed_Raid", 00:15:18.081 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:18.081 "strip_size_kb": 64, 00:15:18.081 "state": "configuring", 00:15:18.081 
"raid_level": "concat", 00:15:18.081 "superblock": false, 00:15:18.081 "num_base_bdevs": 3, 00:15:18.081 "num_base_bdevs_discovered": 1, 00:15:18.081 "num_base_bdevs_operational": 3, 00:15:18.081 "base_bdevs_list": [ 00:15:18.081 { 00:15:18.081 "name": "BaseBdev1", 00:15:18.081 "uuid": "15bde8f0-6000-42b5-91b6-ef51e1b7f31b", 00:15:18.081 "is_configured": true, 00:15:18.081 "data_offset": 0, 00:15:18.081 "data_size": 65536 00:15:18.081 }, 00:15:18.081 { 00:15:18.081 "name": "BaseBdev2", 00:15:18.081 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:18.081 "is_configured": false, 00:15:18.081 "data_offset": 0, 00:15:18.081 "data_size": 0 00:15:18.081 }, 00:15:18.081 { 00:15:18.081 "name": "BaseBdev3", 00:15:18.081 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:18.081 "is_configured": false, 00:15:18.081 "data_offset": 0, 00:15:18.081 "data_size": 0 00:15:18.081 } 00:15:18.081 ] 00:15:18.081 }' 00:15:18.081 20:29:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:18.081 20:29:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:18.647 20:29:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:15:18.905 [2024-07-15 20:29:11.219402] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:18.905 BaseBdev2 00:15:18.905 20:29:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:15:18.905 20:29:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:15:18.905 20:29:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:18.905 20:29:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:15:18.905 20:29:11 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:18.905 20:29:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:18.905 20:29:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:19.196 20:29:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:15:19.455 [ 00:15:19.455 { 00:15:19.455 "name": "BaseBdev2", 00:15:19.455 "aliases": [ 00:15:19.455 "67898541-117c-449b-98f5-224d68e9016a" 00:15:19.455 ], 00:15:19.455 "product_name": "Malloc disk", 00:15:19.455 "block_size": 512, 00:15:19.455 "num_blocks": 65536, 00:15:19.455 "uuid": "67898541-117c-449b-98f5-224d68e9016a", 00:15:19.455 "assigned_rate_limits": { 00:15:19.455 "rw_ios_per_sec": 0, 00:15:19.455 "rw_mbytes_per_sec": 0, 00:15:19.455 "r_mbytes_per_sec": 0, 00:15:19.455 "w_mbytes_per_sec": 0 00:15:19.455 }, 00:15:19.455 "claimed": true, 00:15:19.455 "claim_type": "exclusive_write", 00:15:19.455 "zoned": false, 00:15:19.455 "supported_io_types": { 00:15:19.455 "read": true, 00:15:19.455 "write": true, 00:15:19.455 "unmap": true, 00:15:19.455 "flush": true, 00:15:19.455 "reset": true, 00:15:19.455 "nvme_admin": false, 00:15:19.455 "nvme_io": false, 00:15:19.455 "nvme_io_md": false, 00:15:19.455 "write_zeroes": true, 00:15:19.455 "zcopy": true, 00:15:19.455 "get_zone_info": false, 00:15:19.455 "zone_management": false, 00:15:19.455 "zone_append": false, 00:15:19.455 "compare": false, 00:15:19.455 "compare_and_write": false, 00:15:19.455 "abort": true, 00:15:19.455 "seek_hole": false, 00:15:19.455 "seek_data": false, 00:15:19.455 "copy": true, 00:15:19.455 "nvme_iov_md": false 00:15:19.455 }, 00:15:19.455 "memory_domains": [ 00:15:19.455 { 00:15:19.455 "dma_device_id": "system", 
00:15:19.455 "dma_device_type": 1 00:15:19.455 }, 00:15:19.455 { 00:15:19.455 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:19.455 "dma_device_type": 2 00:15:19.455 } 00:15:19.455 ], 00:15:19.455 "driver_specific": {} 00:15:19.455 } 00:15:19.455 ] 00:15:19.455 20:29:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:15:19.455 20:29:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:15:19.455 20:29:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:19.455 20:29:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:19.455 20:29:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:19.455 20:29:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:19.455 20:29:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:19.455 20:29:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:19.455 20:29:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:19.455 20:29:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:19.455 20:29:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:19.455 20:29:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:19.455 20:29:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:19.455 20:29:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:19.455 20:29:11 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:19.713 20:29:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:19.713 "name": "Existed_Raid", 00:15:19.713 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:19.713 "strip_size_kb": 64, 00:15:19.713 "state": "configuring", 00:15:19.713 "raid_level": "concat", 00:15:19.713 "superblock": false, 00:15:19.713 "num_base_bdevs": 3, 00:15:19.713 "num_base_bdevs_discovered": 2, 00:15:19.713 "num_base_bdevs_operational": 3, 00:15:19.713 "base_bdevs_list": [ 00:15:19.713 { 00:15:19.713 "name": "BaseBdev1", 00:15:19.713 "uuid": "15bde8f0-6000-42b5-91b6-ef51e1b7f31b", 00:15:19.713 "is_configured": true, 00:15:19.713 "data_offset": 0, 00:15:19.713 "data_size": 65536 00:15:19.713 }, 00:15:19.713 { 00:15:19.713 "name": "BaseBdev2", 00:15:19.713 "uuid": "67898541-117c-449b-98f5-224d68e9016a", 00:15:19.713 "is_configured": true, 00:15:19.713 "data_offset": 0, 00:15:19.713 "data_size": 65536 00:15:19.713 }, 00:15:19.713 { 00:15:19.713 "name": "BaseBdev3", 00:15:19.713 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:19.713 "is_configured": false, 00:15:19.713 "data_offset": 0, 00:15:19.713 "data_size": 0 00:15:19.713 } 00:15:19.714 ] 00:15:19.714 }' 00:15:19.714 20:29:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:19.714 20:29:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:20.281 20:29:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:15:20.539 [2024-07-15 20:29:12.790993] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:20.539 [2024-07-15 20:29:12.791029] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xf42400 00:15:20.539 [2024-07-15 20:29:12.791038] 
bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:15:20.539 [2024-07-15 20:29:12.791281] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf41ef0 00:15:20.539 [2024-07-15 20:29:12.791399] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xf42400 00:15:20.539 [2024-07-15 20:29:12.791409] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xf42400 00:15:20.539 [2024-07-15 20:29:12.791568] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:20.539 BaseBdev3 00:15:20.539 20:29:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:15:20.539 20:29:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:15:20.539 20:29:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:20.539 20:29:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:15:20.539 20:29:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:20.540 20:29:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:20.540 20:29:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:20.798 20:29:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:15:21.061 [ 00:15:21.061 { 00:15:21.061 "name": "BaseBdev3", 00:15:21.061 "aliases": [ 00:15:21.061 "44bcbd02-a697-43c5-aab6-a0b15a0d7ae8" 00:15:21.061 ], 00:15:21.061 "product_name": "Malloc disk", 00:15:21.061 "block_size": 512, 00:15:21.061 "num_blocks": 65536, 00:15:21.061 "uuid": 
"44bcbd02-a697-43c5-aab6-a0b15a0d7ae8", 00:15:21.061 "assigned_rate_limits": { 00:15:21.061 "rw_ios_per_sec": 0, 00:15:21.061 "rw_mbytes_per_sec": 0, 00:15:21.061 "r_mbytes_per_sec": 0, 00:15:21.061 "w_mbytes_per_sec": 0 00:15:21.061 }, 00:15:21.061 "claimed": true, 00:15:21.061 "claim_type": "exclusive_write", 00:15:21.061 "zoned": false, 00:15:21.061 "supported_io_types": { 00:15:21.061 "read": true, 00:15:21.061 "write": true, 00:15:21.061 "unmap": true, 00:15:21.061 "flush": true, 00:15:21.061 "reset": true, 00:15:21.061 "nvme_admin": false, 00:15:21.061 "nvme_io": false, 00:15:21.061 "nvme_io_md": false, 00:15:21.061 "write_zeroes": true, 00:15:21.061 "zcopy": true, 00:15:21.061 "get_zone_info": false, 00:15:21.061 "zone_management": false, 00:15:21.061 "zone_append": false, 00:15:21.061 "compare": false, 00:15:21.061 "compare_and_write": false, 00:15:21.061 "abort": true, 00:15:21.061 "seek_hole": false, 00:15:21.061 "seek_data": false, 00:15:21.061 "copy": true, 00:15:21.061 "nvme_iov_md": false 00:15:21.061 }, 00:15:21.061 "memory_domains": [ 00:15:21.061 { 00:15:21.061 "dma_device_id": "system", 00:15:21.061 "dma_device_type": 1 00:15:21.061 }, 00:15:21.061 { 00:15:21.061 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:21.061 "dma_device_type": 2 00:15:21.061 } 00:15:21.061 ], 00:15:21.061 "driver_specific": {} 00:15:21.061 } 00:15:21.061 ] 00:15:21.061 20:29:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:15:21.061 20:29:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:15:21.061 20:29:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:21.061 20:29:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:15:21.061 20:29:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:21.061 20:29:13 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:21.061 20:29:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:21.061 20:29:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:21.061 20:29:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:21.061 20:29:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:21.061 20:29:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:21.061 20:29:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:21.061 20:29:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:21.061 20:29:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:21.061 20:29:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:21.321 20:29:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:21.321 "name": "Existed_Raid", 00:15:21.321 "uuid": "e5c407eb-0cc4-44af-b715-8e486a405eb0", 00:15:21.321 "strip_size_kb": 64, 00:15:21.321 "state": "online", 00:15:21.321 "raid_level": "concat", 00:15:21.321 "superblock": false, 00:15:21.321 "num_base_bdevs": 3, 00:15:21.321 "num_base_bdevs_discovered": 3, 00:15:21.321 "num_base_bdevs_operational": 3, 00:15:21.321 "base_bdevs_list": [ 00:15:21.321 { 00:15:21.321 "name": "BaseBdev1", 00:15:21.321 "uuid": "15bde8f0-6000-42b5-91b6-ef51e1b7f31b", 00:15:21.321 "is_configured": true, 00:15:21.321 "data_offset": 0, 00:15:21.321 "data_size": 65536 00:15:21.321 }, 00:15:21.321 { 00:15:21.321 "name": "BaseBdev2", 00:15:21.321 "uuid": 
"67898541-117c-449b-98f5-224d68e9016a", 00:15:21.321 "is_configured": true, 00:15:21.321 "data_offset": 0, 00:15:21.321 "data_size": 65536 00:15:21.321 }, 00:15:21.321 { 00:15:21.321 "name": "BaseBdev3", 00:15:21.321 "uuid": "44bcbd02-a697-43c5-aab6-a0b15a0d7ae8", 00:15:21.321 "is_configured": true, 00:15:21.321 "data_offset": 0, 00:15:21.321 "data_size": 65536 00:15:21.321 } 00:15:21.321 ] 00:15:21.321 }' 00:15:21.321 20:29:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:21.321 20:29:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:21.888 20:29:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:15:21.888 20:29:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:15:21.888 20:29:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:21.888 20:29:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:21.888 20:29:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:21.888 20:29:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:15:21.888 20:29:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:21.888 20:29:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:15:22.180 [2024-07-15 20:29:14.359465] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:22.180 20:29:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:22.180 "name": "Existed_Raid", 00:15:22.180 "aliases": [ 00:15:22.181 "e5c407eb-0cc4-44af-b715-8e486a405eb0" 00:15:22.181 ], 00:15:22.181 "product_name": "Raid Volume", 
00:15:22.181 "block_size": 512, 00:15:22.181 "num_blocks": 196608, 00:15:22.181 "uuid": "e5c407eb-0cc4-44af-b715-8e486a405eb0", 00:15:22.181 "assigned_rate_limits": { 00:15:22.181 "rw_ios_per_sec": 0, 00:15:22.181 "rw_mbytes_per_sec": 0, 00:15:22.181 "r_mbytes_per_sec": 0, 00:15:22.181 "w_mbytes_per_sec": 0 00:15:22.181 }, 00:15:22.181 "claimed": false, 00:15:22.181 "zoned": false, 00:15:22.181 "supported_io_types": { 00:15:22.181 "read": true, 00:15:22.181 "write": true, 00:15:22.181 "unmap": true, 00:15:22.181 "flush": true, 00:15:22.181 "reset": true, 00:15:22.181 "nvme_admin": false, 00:15:22.181 "nvme_io": false, 00:15:22.181 "nvme_io_md": false, 00:15:22.181 "write_zeroes": true, 00:15:22.181 "zcopy": false, 00:15:22.181 "get_zone_info": false, 00:15:22.181 "zone_management": false, 00:15:22.181 "zone_append": false, 00:15:22.181 "compare": false, 00:15:22.181 "compare_and_write": false, 00:15:22.181 "abort": false, 00:15:22.181 "seek_hole": false, 00:15:22.181 "seek_data": false, 00:15:22.181 "copy": false, 00:15:22.181 "nvme_iov_md": false 00:15:22.181 }, 00:15:22.181 "memory_domains": [ 00:15:22.181 { 00:15:22.181 "dma_device_id": "system", 00:15:22.181 "dma_device_type": 1 00:15:22.181 }, 00:15:22.181 { 00:15:22.181 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:22.181 "dma_device_type": 2 00:15:22.181 }, 00:15:22.181 { 00:15:22.181 "dma_device_id": "system", 00:15:22.181 "dma_device_type": 1 00:15:22.181 }, 00:15:22.181 { 00:15:22.181 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:22.181 "dma_device_type": 2 00:15:22.181 }, 00:15:22.181 { 00:15:22.181 "dma_device_id": "system", 00:15:22.181 "dma_device_type": 1 00:15:22.181 }, 00:15:22.181 { 00:15:22.181 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:22.181 "dma_device_type": 2 00:15:22.181 } 00:15:22.181 ], 00:15:22.181 "driver_specific": { 00:15:22.181 "raid": { 00:15:22.181 "uuid": "e5c407eb-0cc4-44af-b715-8e486a405eb0", 00:15:22.181 "strip_size_kb": 64, 00:15:22.181 "state": "online", 00:15:22.181 
"raid_level": "concat", 00:15:22.181 "superblock": false, 00:15:22.181 "num_base_bdevs": 3, 00:15:22.181 "num_base_bdevs_discovered": 3, 00:15:22.181 "num_base_bdevs_operational": 3, 00:15:22.181 "base_bdevs_list": [ 00:15:22.181 { 00:15:22.181 "name": "BaseBdev1", 00:15:22.181 "uuid": "15bde8f0-6000-42b5-91b6-ef51e1b7f31b", 00:15:22.181 "is_configured": true, 00:15:22.181 "data_offset": 0, 00:15:22.181 "data_size": 65536 00:15:22.181 }, 00:15:22.181 { 00:15:22.181 "name": "BaseBdev2", 00:15:22.181 "uuid": "67898541-117c-449b-98f5-224d68e9016a", 00:15:22.181 "is_configured": true, 00:15:22.181 "data_offset": 0, 00:15:22.181 "data_size": 65536 00:15:22.181 }, 00:15:22.181 { 00:15:22.181 "name": "BaseBdev3", 00:15:22.181 "uuid": "44bcbd02-a697-43c5-aab6-a0b15a0d7ae8", 00:15:22.181 "is_configured": true, 00:15:22.181 "data_offset": 0, 00:15:22.181 "data_size": 65536 00:15:22.181 } 00:15:22.181 ] 00:15:22.181 } 00:15:22.181 } 00:15:22.181 }' 00:15:22.181 20:29:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:22.181 20:29:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:15:22.181 BaseBdev2 00:15:22.181 BaseBdev3' 00:15:22.181 20:29:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:22.181 20:29:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:15:22.181 20:29:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:22.440 20:29:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:22.440 "name": "BaseBdev1", 00:15:22.440 "aliases": [ 00:15:22.440 "15bde8f0-6000-42b5-91b6-ef51e1b7f31b" 00:15:22.440 ], 00:15:22.440 "product_name": "Malloc disk", 00:15:22.440 
"block_size": 512, 00:15:22.440 "num_blocks": 65536, 00:15:22.440 "uuid": "15bde8f0-6000-42b5-91b6-ef51e1b7f31b", 00:15:22.440 "assigned_rate_limits": { 00:15:22.440 "rw_ios_per_sec": 0, 00:15:22.440 "rw_mbytes_per_sec": 0, 00:15:22.440 "r_mbytes_per_sec": 0, 00:15:22.440 "w_mbytes_per_sec": 0 00:15:22.440 }, 00:15:22.440 "claimed": true, 00:15:22.440 "claim_type": "exclusive_write", 00:15:22.440 "zoned": false, 00:15:22.440 "supported_io_types": { 00:15:22.441 "read": true, 00:15:22.441 "write": true, 00:15:22.441 "unmap": true, 00:15:22.441 "flush": true, 00:15:22.441 "reset": true, 00:15:22.441 "nvme_admin": false, 00:15:22.441 "nvme_io": false, 00:15:22.441 "nvme_io_md": false, 00:15:22.441 "write_zeroes": true, 00:15:22.441 "zcopy": true, 00:15:22.441 "get_zone_info": false, 00:15:22.441 "zone_management": false, 00:15:22.441 "zone_append": false, 00:15:22.441 "compare": false, 00:15:22.441 "compare_and_write": false, 00:15:22.441 "abort": true, 00:15:22.441 "seek_hole": false, 00:15:22.441 "seek_data": false, 00:15:22.441 "copy": true, 00:15:22.441 "nvme_iov_md": false 00:15:22.441 }, 00:15:22.441 "memory_domains": [ 00:15:22.441 { 00:15:22.441 "dma_device_id": "system", 00:15:22.441 "dma_device_type": 1 00:15:22.441 }, 00:15:22.441 { 00:15:22.441 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:22.441 "dma_device_type": 2 00:15:22.441 } 00:15:22.441 ], 00:15:22.441 "driver_specific": {} 00:15:22.441 }' 00:15:22.441 20:29:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:22.441 20:29:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:22.441 20:29:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:22.441 20:29:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:22.441 20:29:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:22.700 20:29:14 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:22.700 20:29:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:22.700 20:29:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:22.700 20:29:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:22.700 20:29:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:22.700 20:29:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:22.700 20:29:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:22.700 20:29:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:22.700 20:29:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:15:22.700 20:29:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:22.960 20:29:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:22.960 "name": "BaseBdev2", 00:15:22.960 "aliases": [ 00:15:22.960 "67898541-117c-449b-98f5-224d68e9016a" 00:15:22.960 ], 00:15:22.960 "product_name": "Malloc disk", 00:15:22.960 "block_size": 512, 00:15:22.960 "num_blocks": 65536, 00:15:22.960 "uuid": "67898541-117c-449b-98f5-224d68e9016a", 00:15:22.960 "assigned_rate_limits": { 00:15:22.960 "rw_ios_per_sec": 0, 00:15:22.960 "rw_mbytes_per_sec": 0, 00:15:22.960 "r_mbytes_per_sec": 0, 00:15:22.960 "w_mbytes_per_sec": 0 00:15:22.960 }, 00:15:22.960 "claimed": true, 00:15:22.960 "claim_type": "exclusive_write", 00:15:22.960 "zoned": false, 00:15:22.960 "supported_io_types": { 00:15:22.960 "read": true, 00:15:22.960 "write": true, 00:15:22.960 "unmap": true, 00:15:22.960 "flush": true, 00:15:22.960 "reset": true, 00:15:22.960 "nvme_admin": 
false, 00:15:22.960 "nvme_io": false, 00:15:22.960 "nvme_io_md": false, 00:15:22.960 "write_zeroes": true, 00:15:22.960 "zcopy": true, 00:15:22.960 "get_zone_info": false, 00:15:22.960 "zone_management": false, 00:15:22.960 "zone_append": false, 00:15:22.960 "compare": false, 00:15:22.960 "compare_and_write": false, 00:15:22.960 "abort": true, 00:15:22.960 "seek_hole": false, 00:15:22.960 "seek_data": false, 00:15:22.960 "copy": true, 00:15:22.960 "nvme_iov_md": false 00:15:22.960 }, 00:15:22.960 "memory_domains": [ 00:15:22.960 { 00:15:22.960 "dma_device_id": "system", 00:15:22.960 "dma_device_type": 1 00:15:22.960 }, 00:15:22.960 { 00:15:22.960 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:22.960 "dma_device_type": 2 00:15:22.960 } 00:15:22.960 ], 00:15:22.960 "driver_specific": {} 00:15:22.960 }' 00:15:22.960 20:29:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:22.960 20:29:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:23.218 20:29:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:23.218 20:29:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:23.218 20:29:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:23.218 20:29:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:23.218 20:29:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:23.218 20:29:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:23.218 20:29:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:23.218 20:29:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:23.477 20:29:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:23.477 20:29:15 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:23.477 20:29:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:23.478 20:29:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:15:23.478 20:29:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:23.737 20:29:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:23.737 "name": "BaseBdev3", 00:15:23.737 "aliases": [ 00:15:23.737 "44bcbd02-a697-43c5-aab6-a0b15a0d7ae8" 00:15:23.737 ], 00:15:23.737 "product_name": "Malloc disk", 00:15:23.737 "block_size": 512, 00:15:23.737 "num_blocks": 65536, 00:15:23.737 "uuid": "44bcbd02-a697-43c5-aab6-a0b15a0d7ae8", 00:15:23.737 "assigned_rate_limits": { 00:15:23.737 "rw_ios_per_sec": 0, 00:15:23.737 "rw_mbytes_per_sec": 0, 00:15:23.737 "r_mbytes_per_sec": 0, 00:15:23.737 "w_mbytes_per_sec": 0 00:15:23.737 }, 00:15:23.737 "claimed": true, 00:15:23.737 "claim_type": "exclusive_write", 00:15:23.737 "zoned": false, 00:15:23.737 "supported_io_types": { 00:15:23.737 "read": true, 00:15:23.737 "write": true, 00:15:23.737 "unmap": true, 00:15:23.737 "flush": true, 00:15:23.737 "reset": true, 00:15:23.737 "nvme_admin": false, 00:15:23.737 "nvme_io": false, 00:15:23.737 "nvme_io_md": false, 00:15:23.737 "write_zeroes": true, 00:15:23.737 "zcopy": true, 00:15:23.737 "get_zone_info": false, 00:15:23.737 "zone_management": false, 00:15:23.737 "zone_append": false, 00:15:23.737 "compare": false, 00:15:23.737 "compare_and_write": false, 00:15:23.737 "abort": true, 00:15:23.737 "seek_hole": false, 00:15:23.737 "seek_data": false, 00:15:23.737 "copy": true, 00:15:23.737 "nvme_iov_md": false 00:15:23.737 }, 00:15:23.737 "memory_domains": [ 00:15:23.737 { 00:15:23.737 "dma_device_id": "system", 00:15:23.737 "dma_device_type": 1 00:15:23.737 
}, 00:15:23.737 { 00:15:23.737 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:23.737 "dma_device_type": 2 00:15:23.737 } 00:15:23.737 ], 00:15:23.737 "driver_specific": {} 00:15:23.737 }' 00:15:23.737 20:29:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:23.737 20:29:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:23.737 20:29:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:23.737 20:29:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:23.996 20:29:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:23.996 20:29:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:23.996 20:29:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:23.996 20:29:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:23.996 20:29:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:23.996 20:29:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:23.996 20:29:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:23.996 20:29:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:23.996 20:29:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:15:24.255 [2024-07-15 20:29:16.573120] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:15:24.255 [2024-07-15 20:29:16.573149] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:24.255 [2024-07-15 20:29:16.573188] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:24.255 
20:29:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:15:24.255 20:29:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:15:24.255 20:29:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:24.255 20:29:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:15:24.255 20:29:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:15:24.255 20:29:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 2 00:15:24.255 20:29:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:24.255 20:29:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:15:24.255 20:29:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:24.255 20:29:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:24.255 20:29:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:15:24.255 20:29:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:24.255 20:29:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:24.255 20:29:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:24.255 20:29:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:24.255 20:29:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:24.255 20:29:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 
00:15:24.513 20:29:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:24.513 "name": "Existed_Raid", 00:15:24.513 "uuid": "e5c407eb-0cc4-44af-b715-8e486a405eb0", 00:15:24.513 "strip_size_kb": 64, 00:15:24.513 "state": "offline", 00:15:24.513 "raid_level": "concat", 00:15:24.513 "superblock": false, 00:15:24.513 "num_base_bdevs": 3, 00:15:24.513 "num_base_bdevs_discovered": 2, 00:15:24.513 "num_base_bdevs_operational": 2, 00:15:24.513 "base_bdevs_list": [ 00:15:24.513 { 00:15:24.513 "name": null, 00:15:24.513 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:24.513 "is_configured": false, 00:15:24.513 "data_offset": 0, 00:15:24.513 "data_size": 65536 00:15:24.513 }, 00:15:24.513 { 00:15:24.513 "name": "BaseBdev2", 00:15:24.513 "uuid": "67898541-117c-449b-98f5-224d68e9016a", 00:15:24.513 "is_configured": true, 00:15:24.513 "data_offset": 0, 00:15:24.513 "data_size": 65536 00:15:24.513 }, 00:15:24.513 { 00:15:24.513 "name": "BaseBdev3", 00:15:24.513 "uuid": "44bcbd02-a697-43c5-aab6-a0b15a0d7ae8", 00:15:24.513 "is_configured": true, 00:15:24.513 "data_offset": 0, 00:15:24.513 "data_size": 65536 00:15:24.513 } 00:15:24.513 ] 00:15:24.513 }' 00:15:24.513 20:29:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:24.513 20:29:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:25.496 20:29:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:15:25.496 20:29:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:25.496 20:29:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:25.496 20:29:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:15:26.064 20:29:18 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:15:26.064 20:29:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:26.064 20:29:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:15:26.630 [2024-07-15 20:29:18.735887] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:15:26.630 20:29:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:15:26.630 20:29:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:26.630 20:29:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:26.630 20:29:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:15:26.889 20:29:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:15:26.889 20:29:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:26.889 20:29:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:15:27.148 [2024-07-15 20:29:19.510330] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:15:27.148 [2024-07-15 20:29:19.510373] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf42400 name Existed_Raid, state offline 00:15:27.407 20:29:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:15:27.407 20:29:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:27.407 20:29:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:27.407 20:29:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:15:27.667 20:29:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:15:27.667 20:29:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:15:27.667 20:29:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:15:27.667 20:29:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:15:27.667 20:29:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:27.667 20:29:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:15:27.926 BaseBdev2 00:15:28.185 20:29:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:15:28.185 20:29:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:15:28.185 20:29:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:28.185 20:29:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:15:28.185 20:29:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:28.185 20:29:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:28.185 20:29:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:28.753 20:29:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:15:28.753 [ 00:15:28.753 { 00:15:28.753 "name": "BaseBdev2", 00:15:28.753 "aliases": [ 00:15:28.753 "818e1541-d282-4ede-97b5-7cf13d9dab95" 00:15:28.753 ], 00:15:28.753 "product_name": "Malloc disk", 00:15:28.753 "block_size": 512, 00:15:28.753 "num_blocks": 65536, 00:15:28.753 "uuid": "818e1541-d282-4ede-97b5-7cf13d9dab95", 00:15:28.753 "assigned_rate_limits": { 00:15:28.753 "rw_ios_per_sec": 0, 00:15:28.753 "rw_mbytes_per_sec": 0, 00:15:28.753 "r_mbytes_per_sec": 0, 00:15:28.753 "w_mbytes_per_sec": 0 00:15:28.753 }, 00:15:28.753 "claimed": false, 00:15:28.753 "zoned": false, 00:15:28.753 "supported_io_types": { 00:15:28.753 "read": true, 00:15:28.753 "write": true, 00:15:28.753 "unmap": true, 00:15:28.753 "flush": true, 00:15:28.753 "reset": true, 00:15:28.753 "nvme_admin": false, 00:15:28.753 "nvme_io": false, 00:15:28.753 "nvme_io_md": false, 00:15:28.753 "write_zeroes": true, 00:15:28.753 "zcopy": true, 00:15:28.753 "get_zone_info": false, 00:15:28.753 "zone_management": false, 00:15:28.753 "zone_append": false, 00:15:28.753 "compare": false, 00:15:28.753 "compare_and_write": false, 00:15:28.753 "abort": true, 00:15:28.753 "seek_hole": false, 00:15:28.753 "seek_data": false, 00:15:28.753 "copy": true, 00:15:28.753 "nvme_iov_md": false 00:15:28.753 }, 00:15:28.753 "memory_domains": [ 00:15:28.753 { 00:15:28.753 "dma_device_id": "system", 00:15:28.753 "dma_device_type": 1 00:15:28.753 }, 00:15:28.753 { 00:15:28.753 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:28.753 "dma_device_type": 2 00:15:28.753 } 00:15:28.753 ], 00:15:28.753 "driver_specific": {} 00:15:28.753 } 00:15:28.753 ] 00:15:28.753 20:29:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:15:28.753 20:29:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:15:28.753 20:29:21 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:28.753 20:29:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:15:29.320 BaseBdev3 00:15:29.320 20:29:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:15:29.320 20:29:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:15:29.320 20:29:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:29.321 20:29:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:15:29.321 20:29:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:29.321 20:29:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:29.321 20:29:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:29.889 20:29:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:15:30.470 [ 00:15:30.470 { 00:15:30.470 "name": "BaseBdev3", 00:15:30.470 "aliases": [ 00:15:30.470 "7299f911-5d96-4268-9c70-c473d9f2d0f9" 00:15:30.470 ], 00:15:30.470 "product_name": "Malloc disk", 00:15:30.470 "block_size": 512, 00:15:30.470 "num_blocks": 65536, 00:15:30.470 "uuid": "7299f911-5d96-4268-9c70-c473d9f2d0f9", 00:15:30.470 "assigned_rate_limits": { 00:15:30.470 "rw_ios_per_sec": 0, 00:15:30.470 "rw_mbytes_per_sec": 0, 00:15:30.470 "r_mbytes_per_sec": 0, 00:15:30.470 "w_mbytes_per_sec": 0 00:15:30.470 }, 00:15:30.470 "claimed": false, 00:15:30.470 "zoned": false, 00:15:30.470 
"supported_io_types": { 00:15:30.470 "read": true, 00:15:30.470 "write": true, 00:15:30.470 "unmap": true, 00:15:30.470 "flush": true, 00:15:30.470 "reset": true, 00:15:30.470 "nvme_admin": false, 00:15:30.470 "nvme_io": false, 00:15:30.470 "nvme_io_md": false, 00:15:30.470 "write_zeroes": true, 00:15:30.470 "zcopy": true, 00:15:30.470 "get_zone_info": false, 00:15:30.470 "zone_management": false, 00:15:30.470 "zone_append": false, 00:15:30.470 "compare": false, 00:15:30.470 "compare_and_write": false, 00:15:30.470 "abort": true, 00:15:30.470 "seek_hole": false, 00:15:30.470 "seek_data": false, 00:15:30.470 "copy": true, 00:15:30.470 "nvme_iov_md": false 00:15:30.470 }, 00:15:30.470 "memory_domains": [ 00:15:30.470 { 00:15:30.470 "dma_device_id": "system", 00:15:30.470 "dma_device_type": 1 00:15:30.470 }, 00:15:30.470 { 00:15:30.470 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:30.470 "dma_device_type": 2 00:15:30.470 } 00:15:30.470 ], 00:15:30.470 "driver_specific": {} 00:15:30.470 } 00:15:30.470 ] 00:15:30.470 20:29:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:15:30.470 20:29:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:15:30.470 20:29:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:30.470 20:29:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:30.729 [2024-07-15 20:29:23.107371] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:30.729 [2024-07-15 20:29:23.107411] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:30.729 [2024-07-15 20:29:23.107431] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:30.988 
[2024-07-15 20:29:23.108756] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:30.988 20:29:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:30.988 20:29:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:30.988 20:29:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:30.988 20:29:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:30.988 20:29:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:30.988 20:29:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:30.988 20:29:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:30.988 20:29:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:30.988 20:29:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:30.988 20:29:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:30.988 20:29:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:30.988 20:29:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:31.247 20:29:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:31.247 "name": "Existed_Raid", 00:15:31.247 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:31.247 "strip_size_kb": 64, 00:15:31.247 "state": "configuring", 00:15:31.247 "raid_level": "concat", 00:15:31.247 "superblock": false, 00:15:31.247 "num_base_bdevs": 3, 00:15:31.247 
"num_base_bdevs_discovered": 2, 00:15:31.247 "num_base_bdevs_operational": 3, 00:15:31.247 "base_bdevs_list": [ 00:15:31.247 { 00:15:31.247 "name": "BaseBdev1", 00:15:31.247 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:31.247 "is_configured": false, 00:15:31.247 "data_offset": 0, 00:15:31.247 "data_size": 0 00:15:31.247 }, 00:15:31.247 { 00:15:31.247 "name": "BaseBdev2", 00:15:31.247 "uuid": "818e1541-d282-4ede-97b5-7cf13d9dab95", 00:15:31.247 "is_configured": true, 00:15:31.247 "data_offset": 0, 00:15:31.247 "data_size": 65536 00:15:31.247 }, 00:15:31.247 { 00:15:31.247 "name": "BaseBdev3", 00:15:31.248 "uuid": "7299f911-5d96-4268-9c70-c473d9f2d0f9", 00:15:31.248 "is_configured": true, 00:15:31.248 "data_offset": 0, 00:15:31.248 "data_size": 65536 00:15:31.248 } 00:15:31.248 ] 00:15:31.248 }' 00:15:31.248 20:29:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:31.248 20:29:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:31.814 20:29:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:15:32.073 [2024-07-15 20:29:24.222304] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:15:32.073 20:29:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:32.073 20:29:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:32.073 20:29:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:32.073 20:29:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:32.073 20:29:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:32.073 20:29:24 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:32.073 20:29:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:32.073 20:29:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:32.073 20:29:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:32.073 20:29:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:32.073 20:29:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:32.073 20:29:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:32.331 20:29:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:32.331 "name": "Existed_Raid", 00:15:32.331 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:32.331 "strip_size_kb": 64, 00:15:32.331 "state": "configuring", 00:15:32.331 "raid_level": "concat", 00:15:32.331 "superblock": false, 00:15:32.331 "num_base_bdevs": 3, 00:15:32.331 "num_base_bdevs_discovered": 1, 00:15:32.331 "num_base_bdevs_operational": 3, 00:15:32.331 "base_bdevs_list": [ 00:15:32.331 { 00:15:32.331 "name": "BaseBdev1", 00:15:32.331 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:32.331 "is_configured": false, 00:15:32.331 "data_offset": 0, 00:15:32.331 "data_size": 0 00:15:32.331 }, 00:15:32.331 { 00:15:32.331 "name": null, 00:15:32.331 "uuid": "818e1541-d282-4ede-97b5-7cf13d9dab95", 00:15:32.331 "is_configured": false, 00:15:32.331 "data_offset": 0, 00:15:32.331 "data_size": 65536 00:15:32.331 }, 00:15:32.331 { 00:15:32.331 "name": "BaseBdev3", 00:15:32.331 "uuid": "7299f911-5d96-4268-9c70-c473d9f2d0f9", 00:15:32.331 "is_configured": true, 00:15:32.331 "data_offset": 0, 
00:15:32.331 "data_size": 65536 00:15:32.331 } 00:15:32.331 ] 00:15:32.331 }' 00:15:32.331 20:29:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:32.331 20:29:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:32.898 20:29:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:15:32.898 20:29:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:33.157 20:29:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:15:33.157 20:29:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:15:33.722 [2024-07-15 20:29:25.843180] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:33.722 BaseBdev1 00:15:33.722 20:29:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:15:33.722 20:29:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:15:33.722 20:29:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:33.722 20:29:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:15:33.722 20:29:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:33.722 20:29:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:33.723 20:29:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:34.289 20:29:26 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:15:34.556 [ 00:15:34.557 { 00:15:34.557 "name": "BaseBdev1", 00:15:34.557 "aliases": [ 00:15:34.557 "b7504448-9181-4e00-911e-f255156649e2" 00:15:34.557 ], 00:15:34.557 "product_name": "Malloc disk", 00:15:34.557 "block_size": 512, 00:15:34.557 "num_blocks": 65536, 00:15:34.557 "uuid": "b7504448-9181-4e00-911e-f255156649e2", 00:15:34.557 "assigned_rate_limits": { 00:15:34.557 "rw_ios_per_sec": 0, 00:15:34.557 "rw_mbytes_per_sec": 0, 00:15:34.557 "r_mbytes_per_sec": 0, 00:15:34.557 "w_mbytes_per_sec": 0 00:15:34.557 }, 00:15:34.557 "claimed": true, 00:15:34.557 "claim_type": "exclusive_write", 00:15:34.557 "zoned": false, 00:15:34.557 "supported_io_types": { 00:15:34.557 "read": true, 00:15:34.557 "write": true, 00:15:34.557 "unmap": true, 00:15:34.557 "flush": true, 00:15:34.557 "reset": true, 00:15:34.557 "nvme_admin": false, 00:15:34.557 "nvme_io": false, 00:15:34.557 "nvme_io_md": false, 00:15:34.557 "write_zeroes": true, 00:15:34.557 "zcopy": true, 00:15:34.557 "get_zone_info": false, 00:15:34.557 "zone_management": false, 00:15:34.557 "zone_append": false, 00:15:34.557 "compare": false, 00:15:34.557 "compare_and_write": false, 00:15:34.557 "abort": true, 00:15:34.557 "seek_hole": false, 00:15:34.557 "seek_data": false, 00:15:34.557 "copy": true, 00:15:34.557 "nvme_iov_md": false 00:15:34.557 }, 00:15:34.557 "memory_domains": [ 00:15:34.557 { 00:15:34.557 "dma_device_id": "system", 00:15:34.557 "dma_device_type": 1 00:15:34.557 }, 00:15:34.557 { 00:15:34.557 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:34.557 "dma_device_type": 2 00:15:34.557 } 00:15:34.557 ], 00:15:34.557 "driver_specific": {} 00:15:34.557 } 00:15:34.557 ] 00:15:34.557 20:29:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:15:34.557 20:29:26 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:34.557 20:29:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:34.557 20:29:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:34.557 20:29:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:34.557 20:29:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:34.557 20:29:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:34.557 20:29:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:34.557 20:29:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:34.557 20:29:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:34.557 20:29:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:34.557 20:29:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:34.557 20:29:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:34.819 20:29:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:34.819 "name": "Existed_Raid", 00:15:34.819 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:34.819 "strip_size_kb": 64, 00:15:34.819 "state": "configuring", 00:15:34.819 "raid_level": "concat", 00:15:34.819 "superblock": false, 00:15:34.819 "num_base_bdevs": 3, 00:15:34.819 "num_base_bdevs_discovered": 2, 00:15:34.819 "num_base_bdevs_operational": 3, 00:15:34.819 "base_bdevs_list": [ 00:15:34.819 { 
00:15:34.819 "name": "BaseBdev1", 00:15:34.819 "uuid": "b7504448-9181-4e00-911e-f255156649e2", 00:15:34.819 "is_configured": true, 00:15:34.819 "data_offset": 0, 00:15:34.819 "data_size": 65536 00:15:34.819 }, 00:15:34.819 { 00:15:34.819 "name": null, 00:15:34.819 "uuid": "818e1541-d282-4ede-97b5-7cf13d9dab95", 00:15:34.819 "is_configured": false, 00:15:34.819 "data_offset": 0, 00:15:34.819 "data_size": 65536 00:15:34.819 }, 00:15:34.819 { 00:15:34.819 "name": "BaseBdev3", 00:15:34.819 "uuid": "7299f911-5d96-4268-9c70-c473d9f2d0f9", 00:15:34.819 "is_configured": true, 00:15:34.819 "data_offset": 0, 00:15:34.819 "data_size": 65536 00:15:34.819 } 00:15:34.819 ] 00:15:34.819 }' 00:15:34.819 20:29:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:34.819 20:29:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:35.754 20:29:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:35.754 20:29:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:15:36.014 20:29:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:15:36.014 20:29:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:15:36.580 [2024-07-15 20:29:28.762983] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:15:36.580 20:29:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:36.580 20:29:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:36.580 20:29:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 
-- # local expected_state=configuring 00:15:36.580 20:29:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:36.580 20:29:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:36.580 20:29:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:36.580 20:29:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:36.580 20:29:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:36.580 20:29:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:36.580 20:29:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:36.580 20:29:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:36.580 20:29:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:36.838 20:29:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:36.838 "name": "Existed_Raid", 00:15:36.838 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:36.838 "strip_size_kb": 64, 00:15:36.838 "state": "configuring", 00:15:36.838 "raid_level": "concat", 00:15:36.838 "superblock": false, 00:15:36.838 "num_base_bdevs": 3, 00:15:36.838 "num_base_bdevs_discovered": 1, 00:15:36.838 "num_base_bdevs_operational": 3, 00:15:36.838 "base_bdevs_list": [ 00:15:36.838 { 00:15:36.838 "name": "BaseBdev1", 00:15:36.838 "uuid": "b7504448-9181-4e00-911e-f255156649e2", 00:15:36.838 "is_configured": true, 00:15:36.838 "data_offset": 0, 00:15:36.838 "data_size": 65536 00:15:36.838 }, 00:15:36.838 { 00:15:36.838 "name": null, 00:15:36.838 "uuid": "818e1541-d282-4ede-97b5-7cf13d9dab95", 00:15:36.838 
"is_configured": false, 00:15:36.838 "data_offset": 0, 00:15:36.838 "data_size": 65536 00:15:36.838 }, 00:15:36.838 { 00:15:36.838 "name": null, 00:15:36.838 "uuid": "7299f911-5d96-4268-9c70-c473d9f2d0f9", 00:15:36.838 "is_configured": false, 00:15:36.838 "data_offset": 0, 00:15:36.838 "data_size": 65536 00:15:36.838 } 00:15:36.838 ] 00:15:36.838 }' 00:15:36.838 20:29:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:36.838 20:29:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:37.405 20:29:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:37.405 20:29:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:15:37.664 20:29:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:15:37.664 20:29:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:15:37.922 [2024-07-15 20:29:30.122608] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:37.922 20:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:37.922 20:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:37.922 20:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:37.922 20:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:37.922 20:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:37.922 20:29:30 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:37.922 20:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:37.922 20:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:37.922 20:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:37.922 20:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:37.922 20:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:37.922 20:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:38.181 20:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:38.181 "name": "Existed_Raid", 00:15:38.181 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:38.181 "strip_size_kb": 64, 00:15:38.181 "state": "configuring", 00:15:38.181 "raid_level": "concat", 00:15:38.181 "superblock": false, 00:15:38.181 "num_base_bdevs": 3, 00:15:38.181 "num_base_bdevs_discovered": 2, 00:15:38.181 "num_base_bdevs_operational": 3, 00:15:38.181 "base_bdevs_list": [ 00:15:38.181 { 00:15:38.181 "name": "BaseBdev1", 00:15:38.181 "uuid": "b7504448-9181-4e00-911e-f255156649e2", 00:15:38.181 "is_configured": true, 00:15:38.181 "data_offset": 0, 00:15:38.181 "data_size": 65536 00:15:38.181 }, 00:15:38.181 { 00:15:38.181 "name": null, 00:15:38.181 "uuid": "818e1541-d282-4ede-97b5-7cf13d9dab95", 00:15:38.181 "is_configured": false, 00:15:38.181 "data_offset": 0, 00:15:38.181 "data_size": 65536 00:15:38.181 }, 00:15:38.181 { 00:15:38.181 "name": "BaseBdev3", 00:15:38.181 "uuid": "7299f911-5d96-4268-9c70-c473d9f2d0f9", 00:15:38.181 "is_configured": true, 00:15:38.181 "data_offset": 0, 
00:15:38.181 "data_size": 65536 00:15:38.181 } 00:15:38.181 ] 00:15:38.181 }' 00:15:38.181 20:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:38.181 20:29:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:38.748 20:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:38.748 20:29:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:15:39.007 20:29:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:15:39.007 20:29:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:15:39.301 [2024-07-15 20:29:31.478191] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:15:39.301 20:29:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:39.301 20:29:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:39.301 20:29:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:39.301 20:29:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:39.301 20:29:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:39.301 20:29:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:39.301 20:29:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:39.301 20:29:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:39.301 
20:29:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:39.301 20:29:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:39.301 20:29:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:39.301 20:29:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:39.560 20:29:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:39.560 "name": "Existed_Raid", 00:15:39.560 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:39.560 "strip_size_kb": 64, 00:15:39.560 "state": "configuring", 00:15:39.560 "raid_level": "concat", 00:15:39.560 "superblock": false, 00:15:39.560 "num_base_bdevs": 3, 00:15:39.560 "num_base_bdevs_discovered": 1, 00:15:39.560 "num_base_bdevs_operational": 3, 00:15:39.560 "base_bdevs_list": [ 00:15:39.560 { 00:15:39.560 "name": null, 00:15:39.560 "uuid": "b7504448-9181-4e00-911e-f255156649e2", 00:15:39.560 "is_configured": false, 00:15:39.560 "data_offset": 0, 00:15:39.560 "data_size": 65536 00:15:39.560 }, 00:15:39.560 { 00:15:39.560 "name": null, 00:15:39.560 "uuid": "818e1541-d282-4ede-97b5-7cf13d9dab95", 00:15:39.560 "is_configured": false, 00:15:39.560 "data_offset": 0, 00:15:39.560 "data_size": 65536 00:15:39.560 }, 00:15:39.560 { 00:15:39.560 "name": "BaseBdev3", 00:15:39.560 "uuid": "7299f911-5d96-4268-9c70-c473d9f2d0f9", 00:15:39.560 "is_configured": true, 00:15:39.560 "data_offset": 0, 00:15:39.560 "data_size": 65536 00:15:39.560 } 00:15:39.560 ] 00:15:39.560 }' 00:15:39.560 20:29:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:39.560 20:29:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:40.497 20:29:32 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:40.497 20:29:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:15:40.755 20:29:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:15:40.755 20:29:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:15:41.321 [2024-07-15 20:29:33.402270] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:41.321 20:29:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:41.321 20:29:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:41.321 20:29:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:41.321 20:29:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:41.321 20:29:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:41.321 20:29:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:41.321 20:29:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:41.321 20:29:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:41.321 20:29:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:41.321 20:29:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:41.321 20:29:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:41.321 20:29:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:41.321 20:29:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:41.322 "name": "Existed_Raid", 00:15:41.322 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:41.322 "strip_size_kb": 64, 00:15:41.322 "state": "configuring", 00:15:41.322 "raid_level": "concat", 00:15:41.322 "superblock": false, 00:15:41.322 "num_base_bdevs": 3, 00:15:41.322 "num_base_bdevs_discovered": 2, 00:15:41.322 "num_base_bdevs_operational": 3, 00:15:41.322 "base_bdevs_list": [ 00:15:41.322 { 00:15:41.322 "name": null, 00:15:41.322 "uuid": "b7504448-9181-4e00-911e-f255156649e2", 00:15:41.322 "is_configured": false, 00:15:41.322 "data_offset": 0, 00:15:41.322 "data_size": 65536 00:15:41.322 }, 00:15:41.322 { 00:15:41.322 "name": "BaseBdev2", 00:15:41.322 "uuid": "818e1541-d282-4ede-97b5-7cf13d9dab95", 00:15:41.322 "is_configured": true, 00:15:41.322 "data_offset": 0, 00:15:41.322 "data_size": 65536 00:15:41.322 }, 00:15:41.322 { 00:15:41.322 "name": "BaseBdev3", 00:15:41.322 "uuid": "7299f911-5d96-4268-9c70-c473d9f2d0f9", 00:15:41.322 "is_configured": true, 00:15:41.322 "data_offset": 0, 00:15:41.322 "data_size": 65536 00:15:41.322 } 00:15:41.322 ] 00:15:41.322 }' 00:15:41.322 20:29:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:41.322 20:29:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:42.257 20:29:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:42.257 20:29:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:15:42.516 
20:29:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:15:42.516 20:29:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:42.516 20:29:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:15:43.084 20:29:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u b7504448-9181-4e00-911e-f255156649e2 00:15:43.651 [2024-07-15 20:29:35.849261] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:15:43.651 [2024-07-15 20:29:35.849300] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xf40450 00:15:43.651 [2024-07-15 20:29:35.849309] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:15:43.651 [2024-07-15 20:29:35.849500] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf41ed0 00:15:43.651 [2024-07-15 20:29:35.849616] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xf40450 00:15:43.651 [2024-07-15 20:29:35.849626] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xf40450 00:15:43.651 [2024-07-15 20:29:35.849786] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:43.651 NewBaseBdev 00:15:43.651 20:29:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:15:43.651 20:29:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:15:43.651 20:29:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:43.651 20:29:35 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@899 -- # local i 00:15:43.651 20:29:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:43.651 20:29:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:43.651 20:29:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:44.219 20:29:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:15:44.786 [ 00:15:44.786 { 00:15:44.786 "name": "NewBaseBdev", 00:15:44.786 "aliases": [ 00:15:44.786 "b7504448-9181-4e00-911e-f255156649e2" 00:15:44.786 ], 00:15:44.786 "product_name": "Malloc disk", 00:15:44.786 "block_size": 512, 00:15:44.786 "num_blocks": 65536, 00:15:44.786 "uuid": "b7504448-9181-4e00-911e-f255156649e2", 00:15:44.786 "assigned_rate_limits": { 00:15:44.786 "rw_ios_per_sec": 0, 00:15:44.786 "rw_mbytes_per_sec": 0, 00:15:44.786 "r_mbytes_per_sec": 0, 00:15:44.786 "w_mbytes_per_sec": 0 00:15:44.786 }, 00:15:44.786 "claimed": true, 00:15:44.786 "claim_type": "exclusive_write", 00:15:44.786 "zoned": false, 00:15:44.786 "supported_io_types": { 00:15:44.786 "read": true, 00:15:44.786 "write": true, 00:15:44.786 "unmap": true, 00:15:44.786 "flush": true, 00:15:44.786 "reset": true, 00:15:44.786 "nvme_admin": false, 00:15:44.786 "nvme_io": false, 00:15:44.786 "nvme_io_md": false, 00:15:44.786 "write_zeroes": true, 00:15:44.786 "zcopy": true, 00:15:44.786 "get_zone_info": false, 00:15:44.786 "zone_management": false, 00:15:44.786 "zone_append": false, 00:15:44.786 "compare": false, 00:15:44.786 "compare_and_write": false, 00:15:44.786 "abort": true, 00:15:44.786 "seek_hole": false, 00:15:44.786 "seek_data": false, 00:15:44.786 "copy": true, 00:15:44.786 "nvme_iov_md": 
false 00:15:44.786 }, 00:15:44.786 "memory_domains": [ 00:15:44.786 { 00:15:44.786 "dma_device_id": "system", 00:15:44.786 "dma_device_type": 1 00:15:44.786 }, 00:15:44.786 { 00:15:44.786 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:44.786 "dma_device_type": 2 00:15:44.786 } 00:15:44.786 ], 00:15:44.786 "driver_specific": {} 00:15:44.786 } 00:15:44.786 ] 00:15:44.786 20:29:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:15:44.786 20:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:15:44.786 20:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:44.786 20:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:44.786 20:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:44.786 20:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:44.786 20:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:44.786 20:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:44.786 20:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:44.786 20:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:44.786 20:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:44.786 20:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:44.786 20:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:44.786 20:29:37 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:44.786 "name": "Existed_Raid", 00:15:44.786 "uuid": "e1148327-3a66-40ee-ba95-5ec8b7053b12", 00:15:44.786 "strip_size_kb": 64, 00:15:44.786 "state": "online", 00:15:44.786 "raid_level": "concat", 00:15:44.786 "superblock": false, 00:15:44.786 "num_base_bdevs": 3, 00:15:44.786 "num_base_bdevs_discovered": 3, 00:15:44.786 "num_base_bdevs_operational": 3, 00:15:44.786 "base_bdevs_list": [ 00:15:44.786 { 00:15:44.786 "name": "NewBaseBdev", 00:15:44.786 "uuid": "b7504448-9181-4e00-911e-f255156649e2", 00:15:44.786 "is_configured": true, 00:15:44.786 "data_offset": 0, 00:15:44.786 "data_size": 65536 00:15:44.786 }, 00:15:44.786 { 00:15:44.786 "name": "BaseBdev2", 00:15:44.786 "uuid": "818e1541-d282-4ede-97b5-7cf13d9dab95", 00:15:44.786 "is_configured": true, 00:15:44.786 "data_offset": 0, 00:15:44.786 "data_size": 65536 00:15:44.786 }, 00:15:44.786 { 00:15:44.786 "name": "BaseBdev3", 00:15:44.786 "uuid": "7299f911-5d96-4268-9c70-c473d9f2d0f9", 00:15:44.786 "is_configured": true, 00:15:44.786 "data_offset": 0, 00:15:44.786 "data_size": 65536 00:15:44.786 } 00:15:44.786 ] 00:15:44.786 }' 00:15:44.786 20:29:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:44.786 20:29:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:45.352 20:29:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:15:45.352 20:29:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:15:45.352 20:29:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:45.352 20:29:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:45.352 20:29:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:45.352 20:29:37 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:15:45.352 20:29:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:45.352 20:29:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:15:45.610 [2024-07-15 20:29:37.919062] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:45.610 20:29:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:45.610 "name": "Existed_Raid", 00:15:45.610 "aliases": [ 00:15:45.610 "e1148327-3a66-40ee-ba95-5ec8b7053b12" 00:15:45.610 ], 00:15:45.610 "product_name": "Raid Volume", 00:15:45.610 "block_size": 512, 00:15:45.610 "num_blocks": 196608, 00:15:45.610 "uuid": "e1148327-3a66-40ee-ba95-5ec8b7053b12", 00:15:45.610 "assigned_rate_limits": { 00:15:45.610 "rw_ios_per_sec": 0, 00:15:45.610 "rw_mbytes_per_sec": 0, 00:15:45.610 "r_mbytes_per_sec": 0, 00:15:45.610 "w_mbytes_per_sec": 0 00:15:45.610 }, 00:15:45.610 "claimed": false, 00:15:45.610 "zoned": false, 00:15:45.610 "supported_io_types": { 00:15:45.610 "read": true, 00:15:45.610 "write": true, 00:15:45.610 "unmap": true, 00:15:45.610 "flush": true, 00:15:45.610 "reset": true, 00:15:45.610 "nvme_admin": false, 00:15:45.610 "nvme_io": false, 00:15:45.610 "nvme_io_md": false, 00:15:45.610 "write_zeroes": true, 00:15:45.610 "zcopy": false, 00:15:45.610 "get_zone_info": false, 00:15:45.610 "zone_management": false, 00:15:45.610 "zone_append": false, 00:15:45.610 "compare": false, 00:15:45.610 "compare_and_write": false, 00:15:45.610 "abort": false, 00:15:45.610 "seek_hole": false, 00:15:45.610 "seek_data": false, 00:15:45.610 "copy": false, 00:15:45.610 "nvme_iov_md": false 00:15:45.610 }, 00:15:45.610 "memory_domains": [ 00:15:45.610 { 00:15:45.610 "dma_device_id": "system", 00:15:45.610 "dma_device_type": 1 00:15:45.610 }, 
00:15:45.610 { 00:15:45.610 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:45.610 "dma_device_type": 2 00:15:45.610 }, 00:15:45.610 { 00:15:45.610 "dma_device_id": "system", 00:15:45.610 "dma_device_type": 1 00:15:45.610 }, 00:15:45.610 { 00:15:45.610 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:45.610 "dma_device_type": 2 00:15:45.610 }, 00:15:45.610 { 00:15:45.610 "dma_device_id": "system", 00:15:45.610 "dma_device_type": 1 00:15:45.610 }, 00:15:45.610 { 00:15:45.610 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:45.610 "dma_device_type": 2 00:15:45.610 } 00:15:45.610 ], 00:15:45.610 "driver_specific": { 00:15:45.610 "raid": { 00:15:45.610 "uuid": "e1148327-3a66-40ee-ba95-5ec8b7053b12", 00:15:45.610 "strip_size_kb": 64, 00:15:45.610 "state": "online", 00:15:45.610 "raid_level": "concat", 00:15:45.610 "superblock": false, 00:15:45.610 "num_base_bdevs": 3, 00:15:45.610 "num_base_bdevs_discovered": 3, 00:15:45.610 "num_base_bdevs_operational": 3, 00:15:45.610 "base_bdevs_list": [ 00:15:45.610 { 00:15:45.610 "name": "NewBaseBdev", 00:15:45.610 "uuid": "b7504448-9181-4e00-911e-f255156649e2", 00:15:45.610 "is_configured": true, 00:15:45.610 "data_offset": 0, 00:15:45.610 "data_size": 65536 00:15:45.610 }, 00:15:45.610 { 00:15:45.610 "name": "BaseBdev2", 00:15:45.610 "uuid": "818e1541-d282-4ede-97b5-7cf13d9dab95", 00:15:45.610 "is_configured": true, 00:15:45.610 "data_offset": 0, 00:15:45.610 "data_size": 65536 00:15:45.610 }, 00:15:45.610 { 00:15:45.610 "name": "BaseBdev3", 00:15:45.610 "uuid": "7299f911-5d96-4268-9c70-c473d9f2d0f9", 00:15:45.610 "is_configured": true, 00:15:45.610 "data_offset": 0, 00:15:45.610 "data_size": 65536 00:15:45.610 } 00:15:45.610 ] 00:15:45.610 } 00:15:45.610 } 00:15:45.610 }' 00:15:45.610 20:29:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:45.868 20:29:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # 
base_bdev_names='NewBaseBdev 00:15:45.868 BaseBdev2 00:15:45.868 BaseBdev3' 00:15:45.868 20:29:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:45.868 20:29:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:15:45.868 20:29:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:45.868 20:29:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:45.868 "name": "NewBaseBdev", 00:15:45.868 "aliases": [ 00:15:45.868 "b7504448-9181-4e00-911e-f255156649e2" 00:15:45.868 ], 00:15:45.868 "product_name": "Malloc disk", 00:15:45.868 "block_size": 512, 00:15:45.868 "num_blocks": 65536, 00:15:45.868 "uuid": "b7504448-9181-4e00-911e-f255156649e2", 00:15:45.868 "assigned_rate_limits": { 00:15:45.868 "rw_ios_per_sec": 0, 00:15:45.868 "rw_mbytes_per_sec": 0, 00:15:45.868 "r_mbytes_per_sec": 0, 00:15:45.868 "w_mbytes_per_sec": 0 00:15:45.868 }, 00:15:45.868 "claimed": true, 00:15:45.868 "claim_type": "exclusive_write", 00:15:45.868 "zoned": false, 00:15:45.868 "supported_io_types": { 00:15:45.868 "read": true, 00:15:45.868 "write": true, 00:15:45.868 "unmap": true, 00:15:45.868 "flush": true, 00:15:45.868 "reset": true, 00:15:45.868 "nvme_admin": false, 00:15:45.868 "nvme_io": false, 00:15:45.868 "nvme_io_md": false, 00:15:45.868 "write_zeroes": true, 00:15:45.868 "zcopy": true, 00:15:45.868 "get_zone_info": false, 00:15:45.868 "zone_management": false, 00:15:45.868 "zone_append": false, 00:15:45.868 "compare": false, 00:15:45.868 "compare_and_write": false, 00:15:45.868 "abort": true, 00:15:45.868 "seek_hole": false, 00:15:45.868 "seek_data": false, 00:15:45.868 "copy": true, 00:15:45.868 "nvme_iov_md": false 00:15:45.868 }, 00:15:45.868 "memory_domains": [ 00:15:45.868 { 00:15:45.868 "dma_device_id": "system", 00:15:45.868 
"dma_device_type": 1 00:15:45.868 }, 00:15:45.868 { 00:15:45.868 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:45.868 "dma_device_type": 2 00:15:45.868 } 00:15:45.868 ], 00:15:45.868 "driver_specific": {} 00:15:45.868 }' 00:15:45.868 20:29:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:46.124 20:29:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:46.124 20:29:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:46.124 20:29:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:46.124 20:29:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:46.124 20:29:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:46.124 20:29:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:46.124 20:29:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:46.381 20:29:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:46.381 20:29:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:46.381 20:29:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:46.381 20:29:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:46.381 20:29:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:46.381 20:29:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:15:46.381 20:29:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:46.652 20:29:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:46.652 "name": 
"BaseBdev2", 00:15:46.652 "aliases": [ 00:15:46.652 "818e1541-d282-4ede-97b5-7cf13d9dab95" 00:15:46.652 ], 00:15:46.652 "product_name": "Malloc disk", 00:15:46.652 "block_size": 512, 00:15:46.652 "num_blocks": 65536, 00:15:46.652 "uuid": "818e1541-d282-4ede-97b5-7cf13d9dab95", 00:15:46.652 "assigned_rate_limits": { 00:15:46.652 "rw_ios_per_sec": 0, 00:15:46.652 "rw_mbytes_per_sec": 0, 00:15:46.652 "r_mbytes_per_sec": 0, 00:15:46.652 "w_mbytes_per_sec": 0 00:15:46.652 }, 00:15:46.652 "claimed": true, 00:15:46.652 "claim_type": "exclusive_write", 00:15:46.652 "zoned": false, 00:15:46.652 "supported_io_types": { 00:15:46.652 "read": true, 00:15:46.652 "write": true, 00:15:46.652 "unmap": true, 00:15:46.652 "flush": true, 00:15:46.652 "reset": true, 00:15:46.652 "nvme_admin": false, 00:15:46.652 "nvme_io": false, 00:15:46.652 "nvme_io_md": false, 00:15:46.652 "write_zeroes": true, 00:15:46.652 "zcopy": true, 00:15:46.652 "get_zone_info": false, 00:15:46.652 "zone_management": false, 00:15:46.652 "zone_append": false, 00:15:46.652 "compare": false, 00:15:46.652 "compare_and_write": false, 00:15:46.652 "abort": true, 00:15:46.652 "seek_hole": false, 00:15:46.652 "seek_data": false, 00:15:46.652 "copy": true, 00:15:46.652 "nvme_iov_md": false 00:15:46.652 }, 00:15:46.652 "memory_domains": [ 00:15:46.652 { 00:15:46.652 "dma_device_id": "system", 00:15:46.652 "dma_device_type": 1 00:15:46.652 }, 00:15:46.652 { 00:15:46.652 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:46.652 "dma_device_type": 2 00:15:46.652 } 00:15:46.652 ], 00:15:46.652 "driver_specific": {} 00:15:46.652 }' 00:15:46.652 20:29:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:46.652 20:29:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:46.652 20:29:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:46.652 20:29:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq 
.md_size 00:15:46.652 20:29:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:46.652 20:29:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:46.652 20:29:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:46.910 20:29:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:46.910 20:29:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:46.910 20:29:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:46.910 20:29:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:46.910 20:29:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:46.910 20:29:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:46.910 20:29:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:15:46.910 20:29:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:47.167 20:29:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:47.167 "name": "BaseBdev3", 00:15:47.167 "aliases": [ 00:15:47.167 "7299f911-5d96-4268-9c70-c473d9f2d0f9" 00:15:47.167 ], 00:15:47.167 "product_name": "Malloc disk", 00:15:47.167 "block_size": 512, 00:15:47.167 "num_blocks": 65536, 00:15:47.167 "uuid": "7299f911-5d96-4268-9c70-c473d9f2d0f9", 00:15:47.167 "assigned_rate_limits": { 00:15:47.167 "rw_ios_per_sec": 0, 00:15:47.167 "rw_mbytes_per_sec": 0, 00:15:47.167 "r_mbytes_per_sec": 0, 00:15:47.167 "w_mbytes_per_sec": 0 00:15:47.167 }, 00:15:47.167 "claimed": true, 00:15:47.167 "claim_type": "exclusive_write", 00:15:47.167 "zoned": false, 00:15:47.167 "supported_io_types": { 
00:15:47.167 "read": true, 00:15:47.167 "write": true, 00:15:47.167 "unmap": true, 00:15:47.167 "flush": true, 00:15:47.167 "reset": true, 00:15:47.167 "nvme_admin": false, 00:15:47.167 "nvme_io": false, 00:15:47.167 "nvme_io_md": false, 00:15:47.167 "write_zeroes": true, 00:15:47.167 "zcopy": true, 00:15:47.167 "get_zone_info": false, 00:15:47.167 "zone_management": false, 00:15:47.167 "zone_append": false, 00:15:47.167 "compare": false, 00:15:47.167 "compare_and_write": false, 00:15:47.167 "abort": true, 00:15:47.167 "seek_hole": false, 00:15:47.167 "seek_data": false, 00:15:47.167 "copy": true, 00:15:47.167 "nvme_iov_md": false 00:15:47.167 }, 00:15:47.167 "memory_domains": [ 00:15:47.167 { 00:15:47.167 "dma_device_id": "system", 00:15:47.167 "dma_device_type": 1 00:15:47.167 }, 00:15:47.167 { 00:15:47.167 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:47.167 "dma_device_type": 2 00:15:47.167 } 00:15:47.167 ], 00:15:47.167 "driver_specific": {} 00:15:47.167 }' 00:15:47.167 20:29:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:47.167 20:29:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:47.167 20:29:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:47.167 20:29:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:47.425 20:29:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:47.425 20:29:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:47.425 20:29:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:47.425 20:29:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:47.425 20:29:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:47.425 20:29:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:15:47.425 20:29:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:47.425 20:29:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:47.425 20:29:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:47.682 [2024-07-15 20:29:40.000347] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:47.682 [2024-07-15 20:29:40.000374] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:47.682 [2024-07-15 20:29:40.000424] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:47.682 [2024-07-15 20:29:40.000474] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:47.682 [2024-07-15 20:29:40.000487] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf40450 name Existed_Raid, state offline 00:15:47.682 20:29:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 1383934 00:15:47.682 20:29:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 1383934 ']' 00:15:47.682 20:29:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 1383934 00:15:47.682 20:29:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:15:47.682 20:29:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:15:47.682 20:29:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1383934 00:15:47.940 20:29:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:15:47.940 20:29:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 
= sudo ']' 00:15:47.940 20:29:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1383934' 00:15:47.940 killing process with pid 1383934 00:15:47.940 20:29:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 1383934 00:15:47.940 [2024-07-15 20:29:40.070346] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:47.940 20:29:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 1383934 00:15:47.940 [2024-07-15 20:29:40.100618] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:47.940 20:29:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:15:47.940 00:15:47.940 real 0m34.390s 00:15:47.940 user 1m3.625s 00:15:47.940 sys 0m5.969s 00:15:47.940 20:29:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:15:47.940 20:29:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:47.940 ************************************ 00:15:47.940 END TEST raid_state_function_test 00:15:47.940 ************************************ 00:15:48.197 20:29:40 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:15:48.197 20:29:40 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test concat 3 true 00:15:48.197 20:29:40 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:15:48.197 20:29:40 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:15:48.197 20:29:40 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:48.197 ************************************ 00:15:48.197 START TEST raid_state_function_test_sb 00:15:48.197 ************************************ 00:15:48.197 20:29:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 3 true 00:15:48.197 20:29:40 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:15:48.197 20:29:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:15:48.197 20:29:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:15:48.197 20:29:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:15:48.197 20:29:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:15:48.197 20:29:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:48.197 20:29:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:15:48.197 20:29:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:48.197 20:29:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:48.197 20:29:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:15:48.197 20:29:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:48.197 20:29:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:48.197 20:29:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:15:48.197 20:29:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:48.197 20:29:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:48.197 20:29:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:15:48.197 20:29:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:15:48.197 20:29:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:15:48.197 20:29:40 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@226 -- # local strip_size 00:15:48.197 20:29:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:15:48.197 20:29:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:15:48.197 20:29:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:15:48.197 20:29:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:15:48.197 20:29:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:15:48.197 20:29:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:15:48.197 20:29:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:15:48.197 20:29:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=1388917 00:15:48.197 20:29:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:15:48.197 20:29:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1388917' 00:15:48.197 Process raid pid: 1388917 00:15:48.197 20:29:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 1388917 /var/tmp/spdk-raid.sock 00:15:48.197 20:29:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 1388917 ']' 00:15:48.197 20:29:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:48.197 20:29:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:15:48.197 20:29:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain 
socket /var/tmp/spdk-raid.sock...' 00:15:48.197 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:48.197 20:29:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:15:48.197 20:29:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:48.197 [2024-07-15 20:29:40.471238] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 00:15:48.197 [2024-07-15 20:29:40.471323] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:15:48.455 [2024-07-15 20:29:40.617231] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:48.455 [2024-07-15 20:29:40.726588] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:48.455 [2024-07-15 20:29:40.781511] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:48.455 [2024-07-15 20:29:40.781539] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:48.713 20:29:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:15:48.713 20:29:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:15:48.713 20:29:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:48.972 [2024-07-15 20:29:41.167591] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:48.972 [2024-07-15 20:29:41.167629] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:48.972 [2024-07-15 20:29:41.167639] 
bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:48.972 [2024-07-15 20:29:41.167651] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:48.972 [2024-07-15 20:29:41.167660] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:48.972 [2024-07-15 20:29:41.167671] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:48.972 20:29:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:48.972 20:29:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:48.972 20:29:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:48.972 20:29:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:48.972 20:29:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:48.972 20:29:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:48.972 20:29:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:48.972 20:29:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:48.972 20:29:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:48.972 20:29:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:48.972 20:29:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:48.972 20:29:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == 
"Existed_Raid")' 00:15:49.231 20:29:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:49.231 "name": "Existed_Raid", 00:15:49.231 "uuid": "44e4ad63-ac9a-4aaf-88a3-12fbee19c976", 00:15:49.231 "strip_size_kb": 64, 00:15:49.231 "state": "configuring", 00:15:49.231 "raid_level": "concat", 00:15:49.231 "superblock": true, 00:15:49.231 "num_base_bdevs": 3, 00:15:49.231 "num_base_bdevs_discovered": 0, 00:15:49.231 "num_base_bdevs_operational": 3, 00:15:49.231 "base_bdevs_list": [ 00:15:49.231 { 00:15:49.231 "name": "BaseBdev1", 00:15:49.231 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:49.231 "is_configured": false, 00:15:49.231 "data_offset": 0, 00:15:49.231 "data_size": 0 00:15:49.231 }, 00:15:49.231 { 00:15:49.231 "name": "BaseBdev2", 00:15:49.231 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:49.231 "is_configured": false, 00:15:49.231 "data_offset": 0, 00:15:49.231 "data_size": 0 00:15:49.231 }, 00:15:49.231 { 00:15:49.231 "name": "BaseBdev3", 00:15:49.231 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:49.231 "is_configured": false, 00:15:49.231 "data_offset": 0, 00:15:49.231 "data_size": 0 00:15:49.231 } 00:15:49.231 ] 00:15:49.231 }' 00:15:49.231 20:29:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:49.231 20:29:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:49.798 20:29:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:50.058 [2024-07-15 20:29:42.250297] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:50.058 [2024-07-15 20:29:42.250328] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x15a0a80 name Existed_Raid, state configuring 00:15:50.058 20:29:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:50.318 [2024-07-15 20:29:42.490976] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:50.318 [2024-07-15 20:29:42.491016] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:50.318 [2024-07-15 20:29:42.491026] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:50.318 [2024-07-15 20:29:42.491038] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:50.318 [2024-07-15 20:29:42.491047] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:50.318 [2024-07-15 20:29:42.491057] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:50.318 20:29:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:15:50.576 [2024-07-15 20:29:42.734638] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:50.576 BaseBdev1 00:15:50.576 20:29:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:15:50.576 20:29:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:15:50.576 20:29:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:50.576 20:29:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:50.576 20:29:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:50.576 20:29:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # 
bdev_timeout=2000 00:15:50.576 20:29:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:50.835 20:29:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:15:51.094 [ 00:15:51.094 { 00:15:51.094 "name": "BaseBdev1", 00:15:51.094 "aliases": [ 00:15:51.094 "cc85c023-219d-4607-94d3-3d393efb5810" 00:15:51.094 ], 00:15:51.094 "product_name": "Malloc disk", 00:15:51.094 "block_size": 512, 00:15:51.094 "num_blocks": 65536, 00:15:51.094 "uuid": "cc85c023-219d-4607-94d3-3d393efb5810", 00:15:51.094 "assigned_rate_limits": { 00:15:51.094 "rw_ios_per_sec": 0, 00:15:51.094 "rw_mbytes_per_sec": 0, 00:15:51.094 "r_mbytes_per_sec": 0, 00:15:51.094 "w_mbytes_per_sec": 0 00:15:51.094 }, 00:15:51.094 "claimed": true, 00:15:51.094 "claim_type": "exclusive_write", 00:15:51.094 "zoned": false, 00:15:51.094 "supported_io_types": { 00:15:51.094 "read": true, 00:15:51.094 "write": true, 00:15:51.094 "unmap": true, 00:15:51.094 "flush": true, 00:15:51.094 "reset": true, 00:15:51.094 "nvme_admin": false, 00:15:51.094 "nvme_io": false, 00:15:51.094 "nvme_io_md": false, 00:15:51.094 "write_zeroes": true, 00:15:51.094 "zcopy": true, 00:15:51.094 "get_zone_info": false, 00:15:51.094 "zone_management": false, 00:15:51.094 "zone_append": false, 00:15:51.094 "compare": false, 00:15:51.094 "compare_and_write": false, 00:15:51.094 "abort": true, 00:15:51.094 "seek_hole": false, 00:15:51.094 "seek_data": false, 00:15:51.094 "copy": true, 00:15:51.094 "nvme_iov_md": false 00:15:51.094 }, 00:15:51.094 "memory_domains": [ 00:15:51.094 { 00:15:51.094 "dma_device_id": "system", 00:15:51.094 "dma_device_type": 1 00:15:51.094 }, 00:15:51.094 { 00:15:51.094 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:51.094 
"dma_device_type": 2 00:15:51.094 } 00:15:51.094 ], 00:15:51.094 "driver_specific": {} 00:15:51.094 } 00:15:51.094 ] 00:15:51.094 20:29:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:51.094 20:29:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:51.094 20:29:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:51.094 20:29:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:51.094 20:29:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:51.094 20:29:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:51.094 20:29:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:51.094 20:29:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:51.094 20:29:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:51.094 20:29:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:51.094 20:29:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:51.094 20:29:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:51.094 20:29:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:51.351 20:29:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:51.351 "name": "Existed_Raid", 00:15:51.351 "uuid": "1a20f46b-d35a-43ea-baf9-f7982d76f7e2", 00:15:51.351 "strip_size_kb": 64, 
00:15:51.351 "state": "configuring", 00:15:51.352 "raid_level": "concat", 00:15:51.352 "superblock": true, 00:15:51.352 "num_base_bdevs": 3, 00:15:51.352 "num_base_bdevs_discovered": 1, 00:15:51.352 "num_base_bdevs_operational": 3, 00:15:51.352 "base_bdevs_list": [ 00:15:51.352 { 00:15:51.352 "name": "BaseBdev1", 00:15:51.352 "uuid": "cc85c023-219d-4607-94d3-3d393efb5810", 00:15:51.352 "is_configured": true, 00:15:51.352 "data_offset": 2048, 00:15:51.352 "data_size": 63488 00:15:51.352 }, 00:15:51.352 { 00:15:51.352 "name": "BaseBdev2", 00:15:51.352 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:51.352 "is_configured": false, 00:15:51.352 "data_offset": 0, 00:15:51.352 "data_size": 0 00:15:51.352 }, 00:15:51.352 { 00:15:51.352 "name": "BaseBdev3", 00:15:51.352 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:51.352 "is_configured": false, 00:15:51.352 "data_offset": 0, 00:15:51.352 "data_size": 0 00:15:51.352 } 00:15:51.352 ] 00:15:51.352 }' 00:15:51.352 20:29:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:51.352 20:29:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:51.917 20:29:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:51.917 [2024-07-15 20:29:44.282755] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:51.917 [2024-07-15 20:29:44.282795] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x15a0310 name Existed_Raid, state configuring 00:15:52.176 20:29:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:52.176 [2024-07-15 20:29:44.531462] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:52.176 [2024-07-15 20:29:44.532907] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:52.176 [2024-07-15 20:29:44.532949] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:52.176 [2024-07-15 20:29:44.532963] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:52.176 [2024-07-15 20:29:44.532975] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:52.176 20:29:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:15:52.434 20:29:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:52.434 20:29:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:52.434 20:29:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:52.434 20:29:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:52.434 20:29:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:52.434 20:29:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:52.434 20:29:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:52.434 20:29:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:52.434 20:29:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:52.434 20:29:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:52.434 20:29:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 
-- # local tmp 00:15:52.434 20:29:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:52.434 20:29:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:52.434 20:29:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:52.434 "name": "Existed_Raid", 00:15:52.434 "uuid": "2a2d3eef-077c-47b7-927e-31196415454a", 00:15:52.434 "strip_size_kb": 64, 00:15:52.434 "state": "configuring", 00:15:52.434 "raid_level": "concat", 00:15:52.434 "superblock": true, 00:15:52.434 "num_base_bdevs": 3, 00:15:52.434 "num_base_bdevs_discovered": 1, 00:15:52.434 "num_base_bdevs_operational": 3, 00:15:52.434 "base_bdevs_list": [ 00:15:52.434 { 00:15:52.434 "name": "BaseBdev1", 00:15:52.434 "uuid": "cc85c023-219d-4607-94d3-3d393efb5810", 00:15:52.434 "is_configured": true, 00:15:52.434 "data_offset": 2048, 00:15:52.434 "data_size": 63488 00:15:52.434 }, 00:15:52.434 { 00:15:52.434 "name": "BaseBdev2", 00:15:52.434 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:52.434 "is_configured": false, 00:15:52.434 "data_offset": 0, 00:15:52.434 "data_size": 0 00:15:52.434 }, 00:15:52.434 { 00:15:52.434 "name": "BaseBdev3", 00:15:52.434 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:52.434 "is_configured": false, 00:15:52.434 "data_offset": 0, 00:15:52.434 "data_size": 0 00:15:52.434 } 00:15:52.434 ] 00:15:52.434 }' 00:15:52.434 20:29:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:52.434 20:29:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:53.367 20:29:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 
00:15:53.367 [2024-07-15 20:29:45.629715] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:53.367 BaseBdev2 00:15:53.367 20:29:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:15:53.367 20:29:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:15:53.367 20:29:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:53.367 20:29:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:53.367 20:29:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:53.367 20:29:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:53.367 20:29:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:53.625 20:29:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:15:53.883 [ 00:15:53.883 { 00:15:53.883 "name": "BaseBdev2", 00:15:53.883 "aliases": [ 00:15:53.883 "29b6c009-9e9c-43b4-b637-e9102dbdfb97" 00:15:53.883 ], 00:15:53.883 "product_name": "Malloc disk", 00:15:53.883 "block_size": 512, 00:15:53.883 "num_blocks": 65536, 00:15:53.883 "uuid": "29b6c009-9e9c-43b4-b637-e9102dbdfb97", 00:15:53.883 "assigned_rate_limits": { 00:15:53.883 "rw_ios_per_sec": 0, 00:15:53.883 "rw_mbytes_per_sec": 0, 00:15:53.883 "r_mbytes_per_sec": 0, 00:15:53.883 "w_mbytes_per_sec": 0 00:15:53.883 }, 00:15:53.883 "claimed": true, 00:15:53.883 "claim_type": "exclusive_write", 00:15:53.883 "zoned": false, 00:15:53.883 "supported_io_types": { 00:15:53.883 "read": true, 00:15:53.883 "write": true, 
00:15:53.883 "unmap": true, 00:15:53.883 "flush": true, 00:15:53.883 "reset": true, 00:15:53.883 "nvme_admin": false, 00:15:53.883 "nvme_io": false, 00:15:53.883 "nvme_io_md": false, 00:15:53.883 "write_zeroes": true, 00:15:53.883 "zcopy": true, 00:15:53.883 "get_zone_info": false, 00:15:53.883 "zone_management": false, 00:15:53.883 "zone_append": false, 00:15:53.883 "compare": false, 00:15:53.883 "compare_and_write": false, 00:15:53.883 "abort": true, 00:15:53.883 "seek_hole": false, 00:15:53.883 "seek_data": false, 00:15:53.883 "copy": true, 00:15:53.883 "nvme_iov_md": false 00:15:53.883 }, 00:15:53.883 "memory_domains": [ 00:15:53.883 { 00:15:53.883 "dma_device_id": "system", 00:15:53.883 "dma_device_type": 1 00:15:53.883 }, 00:15:53.883 { 00:15:53.883 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:53.883 "dma_device_type": 2 00:15:53.883 } 00:15:53.883 ], 00:15:53.883 "driver_specific": {} 00:15:53.883 } 00:15:53.883 ] 00:15:53.883 20:29:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:53.883 20:29:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:15:53.883 20:29:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:53.883 20:29:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:53.883 20:29:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:53.883 20:29:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:53.883 20:29:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:53.883 20:29:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:53.883 20:29:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=3 00:15:53.883 20:29:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:53.883 20:29:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:53.883 20:29:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:53.883 20:29:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:53.883 20:29:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:53.883 20:29:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:54.141 20:29:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:54.141 "name": "Existed_Raid", 00:15:54.141 "uuid": "2a2d3eef-077c-47b7-927e-31196415454a", 00:15:54.141 "strip_size_kb": 64, 00:15:54.141 "state": "configuring", 00:15:54.141 "raid_level": "concat", 00:15:54.141 "superblock": true, 00:15:54.141 "num_base_bdevs": 3, 00:15:54.141 "num_base_bdevs_discovered": 2, 00:15:54.141 "num_base_bdevs_operational": 3, 00:15:54.141 "base_bdevs_list": [ 00:15:54.141 { 00:15:54.141 "name": "BaseBdev1", 00:15:54.141 "uuid": "cc85c023-219d-4607-94d3-3d393efb5810", 00:15:54.141 "is_configured": true, 00:15:54.141 "data_offset": 2048, 00:15:54.141 "data_size": 63488 00:15:54.141 }, 00:15:54.141 { 00:15:54.141 "name": "BaseBdev2", 00:15:54.141 "uuid": "29b6c009-9e9c-43b4-b637-e9102dbdfb97", 00:15:54.141 "is_configured": true, 00:15:54.141 "data_offset": 2048, 00:15:54.141 "data_size": 63488 00:15:54.141 }, 00:15:54.141 { 00:15:54.141 "name": "BaseBdev3", 00:15:54.141 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:54.141 "is_configured": false, 00:15:54.141 "data_offset": 0, 00:15:54.141 "data_size": 0 00:15:54.141 } 
00:15:54.141 ] 00:15:54.141 }' 00:15:54.141 20:29:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:54.141 20:29:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:54.709 20:29:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:15:54.968 [2024-07-15 20:29:47.209377] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:54.968 [2024-07-15 20:29:47.209536] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x15a1400 00:15:54.968 [2024-07-15 20:29:47.209550] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:15:54.968 [2024-07-15 20:29:47.209722] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x15a0ef0 00:15:54.968 [2024-07-15 20:29:47.209838] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x15a1400 00:15:54.968 [2024-07-15 20:29:47.209848] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x15a1400 00:15:54.968 [2024-07-15 20:29:47.209945] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:54.968 BaseBdev3 00:15:54.968 20:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:15:54.968 20:29:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:15:54.968 20:29:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:54.968 20:29:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:54.968 20:29:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:54.968 20:29:47 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:54.968 20:29:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:55.226 20:29:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:15:55.485 [ 00:15:55.485 { 00:15:55.485 "name": "BaseBdev3", 00:15:55.485 "aliases": [ 00:15:55.485 "3ebe5f16-e2c9-4448-b518-338c72f19d69" 00:15:55.485 ], 00:15:55.485 "product_name": "Malloc disk", 00:15:55.485 "block_size": 512, 00:15:55.485 "num_blocks": 65536, 00:15:55.485 "uuid": "3ebe5f16-e2c9-4448-b518-338c72f19d69", 00:15:55.485 "assigned_rate_limits": { 00:15:55.485 "rw_ios_per_sec": 0, 00:15:55.485 "rw_mbytes_per_sec": 0, 00:15:55.485 "r_mbytes_per_sec": 0, 00:15:55.485 "w_mbytes_per_sec": 0 00:15:55.485 }, 00:15:55.485 "claimed": true, 00:15:55.485 "claim_type": "exclusive_write", 00:15:55.485 "zoned": false, 00:15:55.485 "supported_io_types": { 00:15:55.485 "read": true, 00:15:55.485 "write": true, 00:15:55.485 "unmap": true, 00:15:55.485 "flush": true, 00:15:55.485 "reset": true, 00:15:55.485 "nvme_admin": false, 00:15:55.485 "nvme_io": false, 00:15:55.485 "nvme_io_md": false, 00:15:55.485 "write_zeroes": true, 00:15:55.485 "zcopy": true, 00:15:55.485 "get_zone_info": false, 00:15:55.485 "zone_management": false, 00:15:55.485 "zone_append": false, 00:15:55.485 "compare": false, 00:15:55.485 "compare_and_write": false, 00:15:55.485 "abort": true, 00:15:55.485 "seek_hole": false, 00:15:55.485 "seek_data": false, 00:15:55.485 "copy": true, 00:15:55.485 "nvme_iov_md": false 00:15:55.485 }, 00:15:55.485 "memory_domains": [ 00:15:55.485 { 00:15:55.485 "dma_device_id": "system", 00:15:55.485 "dma_device_type": 1 00:15:55.485 }, 00:15:55.485 { 00:15:55.485 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:15:55.485 "dma_device_type": 2 00:15:55.485 } 00:15:55.485 ], 00:15:55.485 "driver_specific": {} 00:15:55.485 } 00:15:55.485 ] 00:15:55.485 20:29:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:55.485 20:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:15:55.485 20:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:55.485 20:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:15:55.485 20:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:55.485 20:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:55.485 20:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:55.485 20:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:55.485 20:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:55.485 20:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:55.485 20:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:55.485 20:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:55.485 20:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:55.485 20:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:55.485 20:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 
00:15:55.743 20:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:55.743 "name": "Existed_Raid", 00:15:55.743 "uuid": "2a2d3eef-077c-47b7-927e-31196415454a", 00:15:55.743 "strip_size_kb": 64, 00:15:55.743 "state": "online", 00:15:55.743 "raid_level": "concat", 00:15:55.743 "superblock": true, 00:15:55.743 "num_base_bdevs": 3, 00:15:55.743 "num_base_bdevs_discovered": 3, 00:15:55.743 "num_base_bdevs_operational": 3, 00:15:55.743 "base_bdevs_list": [ 00:15:55.743 { 00:15:55.743 "name": "BaseBdev1", 00:15:55.743 "uuid": "cc85c023-219d-4607-94d3-3d393efb5810", 00:15:55.743 "is_configured": true, 00:15:55.743 "data_offset": 2048, 00:15:55.743 "data_size": 63488 00:15:55.743 }, 00:15:55.743 { 00:15:55.743 "name": "BaseBdev2", 00:15:55.743 "uuid": "29b6c009-9e9c-43b4-b637-e9102dbdfb97", 00:15:55.743 "is_configured": true, 00:15:55.743 "data_offset": 2048, 00:15:55.743 "data_size": 63488 00:15:55.743 }, 00:15:55.743 { 00:15:55.743 "name": "BaseBdev3", 00:15:55.743 "uuid": "3ebe5f16-e2c9-4448-b518-338c72f19d69", 00:15:55.743 "is_configured": true, 00:15:55.743 "data_offset": 2048, 00:15:55.743 "data_size": 63488 00:15:55.743 } 00:15:55.743 ] 00:15:55.743 }' 00:15:55.743 20:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:55.743 20:29:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:56.329 20:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:15:56.329 20:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:15:56.329 20:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:56.329 20:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:56.329 20:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local 
base_bdev_names 00:15:56.329 20:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:15:56.329 20:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:15:56.329 20:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:56.329 [2024-07-15 20:29:48.645604] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:56.329 20:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:56.329 "name": "Existed_Raid", 00:15:56.329 "aliases": [ 00:15:56.329 "2a2d3eef-077c-47b7-927e-31196415454a" 00:15:56.329 ], 00:15:56.329 "product_name": "Raid Volume", 00:15:56.329 "block_size": 512, 00:15:56.329 "num_blocks": 190464, 00:15:56.329 "uuid": "2a2d3eef-077c-47b7-927e-31196415454a", 00:15:56.329 "assigned_rate_limits": { 00:15:56.329 "rw_ios_per_sec": 0, 00:15:56.329 "rw_mbytes_per_sec": 0, 00:15:56.329 "r_mbytes_per_sec": 0, 00:15:56.329 "w_mbytes_per_sec": 0 00:15:56.329 }, 00:15:56.329 "claimed": false, 00:15:56.329 "zoned": false, 00:15:56.329 "supported_io_types": { 00:15:56.329 "read": true, 00:15:56.329 "write": true, 00:15:56.329 "unmap": true, 00:15:56.329 "flush": true, 00:15:56.329 "reset": true, 00:15:56.329 "nvme_admin": false, 00:15:56.329 "nvme_io": false, 00:15:56.329 "nvme_io_md": false, 00:15:56.329 "write_zeroes": true, 00:15:56.329 "zcopy": false, 00:15:56.329 "get_zone_info": false, 00:15:56.329 "zone_management": false, 00:15:56.329 "zone_append": false, 00:15:56.329 "compare": false, 00:15:56.329 "compare_and_write": false, 00:15:56.329 "abort": false, 00:15:56.329 "seek_hole": false, 00:15:56.329 "seek_data": false, 00:15:56.329 "copy": false, 00:15:56.329 "nvme_iov_md": false 00:15:56.329 }, 00:15:56.329 "memory_domains": [ 00:15:56.329 { 00:15:56.329 "dma_device_id": "system", 
00:15:56.329 "dma_device_type": 1 00:15:56.329 }, 00:15:56.329 { 00:15:56.329 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:56.329 "dma_device_type": 2 00:15:56.329 }, 00:15:56.329 { 00:15:56.329 "dma_device_id": "system", 00:15:56.329 "dma_device_type": 1 00:15:56.329 }, 00:15:56.329 { 00:15:56.329 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:56.329 "dma_device_type": 2 00:15:56.329 }, 00:15:56.329 { 00:15:56.329 "dma_device_id": "system", 00:15:56.329 "dma_device_type": 1 00:15:56.329 }, 00:15:56.329 { 00:15:56.329 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:56.329 "dma_device_type": 2 00:15:56.329 } 00:15:56.329 ], 00:15:56.329 "driver_specific": { 00:15:56.329 "raid": { 00:15:56.329 "uuid": "2a2d3eef-077c-47b7-927e-31196415454a", 00:15:56.329 "strip_size_kb": 64, 00:15:56.329 "state": "online", 00:15:56.329 "raid_level": "concat", 00:15:56.329 "superblock": true, 00:15:56.329 "num_base_bdevs": 3, 00:15:56.329 "num_base_bdevs_discovered": 3, 00:15:56.329 "num_base_bdevs_operational": 3, 00:15:56.329 "base_bdevs_list": [ 00:15:56.329 { 00:15:56.329 "name": "BaseBdev1", 00:15:56.329 "uuid": "cc85c023-219d-4607-94d3-3d393efb5810", 00:15:56.329 "is_configured": true, 00:15:56.329 "data_offset": 2048, 00:15:56.329 "data_size": 63488 00:15:56.329 }, 00:15:56.329 { 00:15:56.329 "name": "BaseBdev2", 00:15:56.329 "uuid": "29b6c009-9e9c-43b4-b637-e9102dbdfb97", 00:15:56.329 "is_configured": true, 00:15:56.329 "data_offset": 2048, 00:15:56.329 "data_size": 63488 00:15:56.329 }, 00:15:56.329 { 00:15:56.329 "name": "BaseBdev3", 00:15:56.329 "uuid": "3ebe5f16-e2c9-4448-b518-338c72f19d69", 00:15:56.329 "is_configured": true, 00:15:56.329 "data_offset": 2048, 00:15:56.329 "data_size": 63488 00:15:56.329 } 00:15:56.329 ] 00:15:56.329 } 00:15:56.329 } 00:15:56.329 }' 00:15:56.329 20:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:56.611 20:29:48 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:15:56.611 BaseBdev2 00:15:56.611 BaseBdev3' 00:15:56.611 20:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:56.611 20:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:15:56.611 20:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:56.611 20:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:56.611 "name": "BaseBdev1", 00:15:56.611 "aliases": [ 00:15:56.611 "cc85c023-219d-4607-94d3-3d393efb5810" 00:15:56.611 ], 00:15:56.611 "product_name": "Malloc disk", 00:15:56.611 "block_size": 512, 00:15:56.611 "num_blocks": 65536, 00:15:56.611 "uuid": "cc85c023-219d-4607-94d3-3d393efb5810", 00:15:56.611 "assigned_rate_limits": { 00:15:56.611 "rw_ios_per_sec": 0, 00:15:56.611 "rw_mbytes_per_sec": 0, 00:15:56.611 "r_mbytes_per_sec": 0, 00:15:56.611 "w_mbytes_per_sec": 0 00:15:56.611 }, 00:15:56.611 "claimed": true, 00:15:56.611 "claim_type": "exclusive_write", 00:15:56.611 "zoned": false, 00:15:56.611 "supported_io_types": { 00:15:56.611 "read": true, 00:15:56.611 "write": true, 00:15:56.611 "unmap": true, 00:15:56.611 "flush": true, 00:15:56.611 "reset": true, 00:15:56.612 "nvme_admin": false, 00:15:56.612 "nvme_io": false, 00:15:56.612 "nvme_io_md": false, 00:15:56.612 "write_zeroes": true, 00:15:56.612 "zcopy": true, 00:15:56.612 "get_zone_info": false, 00:15:56.612 "zone_management": false, 00:15:56.612 "zone_append": false, 00:15:56.612 "compare": false, 00:15:56.612 "compare_and_write": false, 00:15:56.612 "abort": true, 00:15:56.612 "seek_hole": false, 00:15:56.612 "seek_data": false, 00:15:56.612 "copy": true, 00:15:56.612 "nvme_iov_md": false 00:15:56.612 }, 00:15:56.612 "memory_domains": 
[ 00:15:56.612 { 00:15:56.612 "dma_device_id": "system", 00:15:56.612 "dma_device_type": 1 00:15:56.612 }, 00:15:56.612 { 00:15:56.612 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:56.612 "dma_device_type": 2 00:15:56.612 } 00:15:56.612 ], 00:15:56.612 "driver_specific": {} 00:15:56.612 }' 00:15:56.612 20:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:56.612 20:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:56.612 20:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:56.612 20:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:56.870 20:29:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:56.870 20:29:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:56.870 20:29:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:56.870 20:29:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:56.870 20:29:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:56.870 20:29:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:56.870 20:29:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:57.130 20:29:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:57.130 20:29:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:57.130 20:29:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:57.130 20:29:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 
00:15:57.389 20:29:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:57.389 "name": "BaseBdev2", 00:15:57.389 "aliases": [ 00:15:57.389 "29b6c009-9e9c-43b4-b637-e9102dbdfb97" 00:15:57.389 ], 00:15:57.389 "product_name": "Malloc disk", 00:15:57.389 "block_size": 512, 00:15:57.389 "num_blocks": 65536, 00:15:57.389 "uuid": "29b6c009-9e9c-43b4-b637-e9102dbdfb97", 00:15:57.389 "assigned_rate_limits": { 00:15:57.389 "rw_ios_per_sec": 0, 00:15:57.389 "rw_mbytes_per_sec": 0, 00:15:57.389 "r_mbytes_per_sec": 0, 00:15:57.389 "w_mbytes_per_sec": 0 00:15:57.389 }, 00:15:57.389 "claimed": true, 00:15:57.389 "claim_type": "exclusive_write", 00:15:57.389 "zoned": false, 00:15:57.389 "supported_io_types": { 00:15:57.389 "read": true, 00:15:57.389 "write": true, 00:15:57.389 "unmap": true, 00:15:57.389 "flush": true, 00:15:57.389 "reset": true, 00:15:57.389 "nvme_admin": false, 00:15:57.389 "nvme_io": false, 00:15:57.389 "nvme_io_md": false, 00:15:57.389 "write_zeroes": true, 00:15:57.389 "zcopy": true, 00:15:57.389 "get_zone_info": false, 00:15:57.389 "zone_management": false, 00:15:57.389 "zone_append": false, 00:15:57.389 "compare": false, 00:15:57.389 "compare_and_write": false, 00:15:57.389 "abort": true, 00:15:57.389 "seek_hole": false, 00:15:57.389 "seek_data": false, 00:15:57.389 "copy": true, 00:15:57.389 "nvme_iov_md": false 00:15:57.389 }, 00:15:57.389 "memory_domains": [ 00:15:57.389 { 00:15:57.389 "dma_device_id": "system", 00:15:57.389 "dma_device_type": 1 00:15:57.389 }, 00:15:57.389 { 00:15:57.389 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:57.389 "dma_device_type": 2 00:15:57.389 } 00:15:57.389 ], 00:15:57.389 "driver_specific": {} 00:15:57.389 }' 00:15:57.389 20:29:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:57.389 20:29:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:57.389 20:29:49 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:57.389 20:29:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:57.389 20:29:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:57.389 20:29:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:57.389 20:29:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:57.389 20:29:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:57.649 20:29:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:57.649 20:29:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:57.649 20:29:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:57.649 20:29:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:57.649 20:29:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:57.649 20:29:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:15:57.649 20:29:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:57.908 20:29:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:57.908 "name": "BaseBdev3", 00:15:57.908 "aliases": [ 00:15:57.908 "3ebe5f16-e2c9-4448-b518-338c72f19d69" 00:15:57.908 ], 00:15:57.908 "product_name": "Malloc disk", 00:15:57.908 "block_size": 512, 00:15:57.908 "num_blocks": 65536, 00:15:57.908 "uuid": "3ebe5f16-e2c9-4448-b518-338c72f19d69", 00:15:57.908 "assigned_rate_limits": { 00:15:57.908 "rw_ios_per_sec": 0, 00:15:57.908 "rw_mbytes_per_sec": 0, 00:15:57.908 "r_mbytes_per_sec": 0, 00:15:57.908 
"w_mbytes_per_sec": 0 00:15:57.908 }, 00:15:57.908 "claimed": true, 00:15:57.908 "claim_type": "exclusive_write", 00:15:57.908 "zoned": false, 00:15:57.908 "supported_io_types": { 00:15:57.908 "read": true, 00:15:57.908 "write": true, 00:15:57.908 "unmap": true, 00:15:57.908 "flush": true, 00:15:57.908 "reset": true, 00:15:57.908 "nvme_admin": false, 00:15:57.908 "nvme_io": false, 00:15:57.908 "nvme_io_md": false, 00:15:57.908 "write_zeroes": true, 00:15:57.908 "zcopy": true, 00:15:57.908 "get_zone_info": false, 00:15:57.908 "zone_management": false, 00:15:57.908 "zone_append": false, 00:15:57.908 "compare": false, 00:15:57.908 "compare_and_write": false, 00:15:57.908 "abort": true, 00:15:57.908 "seek_hole": false, 00:15:57.908 "seek_data": false, 00:15:57.908 "copy": true, 00:15:57.908 "nvme_iov_md": false 00:15:57.908 }, 00:15:57.908 "memory_domains": [ 00:15:57.908 { 00:15:57.908 "dma_device_id": "system", 00:15:57.908 "dma_device_type": 1 00:15:57.908 }, 00:15:57.908 { 00:15:57.908 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:57.908 "dma_device_type": 2 00:15:57.908 } 00:15:57.908 ], 00:15:57.908 "driver_specific": {} 00:15:57.908 }' 00:15:57.908 20:29:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:57.908 20:29:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:57.908 20:29:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:57.908 20:29:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:57.908 20:29:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:58.167 20:29:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:58.167 20:29:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:58.167 20:29:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq 
.md_interleave 00:15:58.167 20:29:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:58.167 20:29:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:58.167 20:29:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:58.167 20:29:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:58.167 20:29:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:15:58.426 [2024-07-15 20:29:50.734910] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:15:58.426 [2024-07-15 20:29:50.734942] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:58.426 [2024-07-15 20:29:50.734982] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:58.426 20:29:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:15:58.426 20:29:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:15:58.426 20:29:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:58.426 20:29:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:15:58.426 20:29:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:15:58.426 20:29:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 2 00:15:58.426 20:29:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:58.426 20:29:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:15:58.426 20:29:50 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:58.426 20:29:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:58.426 20:29:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:15:58.426 20:29:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:58.426 20:29:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:58.426 20:29:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:58.426 20:29:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:58.426 20:29:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:58.426 20:29:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:58.685 20:29:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:58.685 "name": "Existed_Raid", 00:15:58.685 "uuid": "2a2d3eef-077c-47b7-927e-31196415454a", 00:15:58.685 "strip_size_kb": 64, 00:15:58.685 "state": "offline", 00:15:58.685 "raid_level": "concat", 00:15:58.685 "superblock": true, 00:15:58.685 "num_base_bdevs": 3, 00:15:58.685 "num_base_bdevs_discovered": 2, 00:15:58.685 "num_base_bdevs_operational": 2, 00:15:58.685 "base_bdevs_list": [ 00:15:58.685 { 00:15:58.685 "name": null, 00:15:58.685 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:58.685 "is_configured": false, 00:15:58.685 "data_offset": 2048, 00:15:58.685 "data_size": 63488 00:15:58.685 }, 00:15:58.685 { 00:15:58.685 "name": "BaseBdev2", 00:15:58.685 "uuid": "29b6c009-9e9c-43b4-b637-e9102dbdfb97", 00:15:58.685 "is_configured": true, 00:15:58.685 "data_offset": 2048, 00:15:58.685 "data_size": 
63488 00:15:58.685 }, 00:15:58.685 { 00:15:58.685 "name": "BaseBdev3", 00:15:58.685 "uuid": "3ebe5f16-e2c9-4448-b518-338c72f19d69", 00:15:58.685 "is_configured": true, 00:15:58.685 "data_offset": 2048, 00:15:58.685 "data_size": 63488 00:15:58.685 } 00:15:58.685 ] 00:15:58.685 }' 00:15:58.685 20:29:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:58.685 20:29:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:59.253 20:29:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:15:59.253 20:29:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:59.253 20:29:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:59.253 20:29:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:15:59.512 20:29:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:15:59.512 20:29:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:59.513 20:29:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:15:59.771 [2024-07-15 20:29:52.067439] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:15:59.771 20:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:15:59.771 20:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:59.771 20:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:15:59.771 20:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:00.030 20:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:00.030 20:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:00.030 20:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:16:00.291 [2024-07-15 20:29:52.559705] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:00.291 [2024-07-15 20:29:52.559747] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x15a1400 name Existed_Raid, state offline 00:16:00.291 20:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:00.291 20:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:00.291 20:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:00.291 20:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:16:00.550 20:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:16:00.550 20:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:16:00.550 20:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:16:00.550 20:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:16:00.550 20:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:00.550 20:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:00.810 BaseBdev2 00:16:00.810 20:29:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:16:00.810 20:29:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:16:00.810 20:29:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:00.810 20:29:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:00.810 20:29:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:00.810 20:29:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:00.810 20:29:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:01.068 20:29:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:01.326 [ 00:16:01.326 { 00:16:01.326 "name": "BaseBdev2", 00:16:01.326 "aliases": [ 00:16:01.326 "ce4ca2b9-775a-46ac-ad97-5e3980c4194c" 00:16:01.326 ], 00:16:01.326 "product_name": "Malloc disk", 00:16:01.326 "block_size": 512, 00:16:01.326 "num_blocks": 65536, 00:16:01.326 "uuid": "ce4ca2b9-775a-46ac-ad97-5e3980c4194c", 00:16:01.326 "assigned_rate_limits": { 00:16:01.326 "rw_ios_per_sec": 0, 00:16:01.326 "rw_mbytes_per_sec": 0, 00:16:01.326 "r_mbytes_per_sec": 0, 00:16:01.326 "w_mbytes_per_sec": 0 00:16:01.326 }, 00:16:01.326 "claimed": false, 00:16:01.326 "zoned": false, 00:16:01.326 "supported_io_types": { 00:16:01.326 "read": true, 00:16:01.326 "write": true, 00:16:01.326 "unmap": true, 00:16:01.326 "flush": 
true, 00:16:01.326 "reset": true, 00:16:01.326 "nvme_admin": false, 00:16:01.326 "nvme_io": false, 00:16:01.326 "nvme_io_md": false, 00:16:01.326 "write_zeroes": true, 00:16:01.326 "zcopy": true, 00:16:01.326 "get_zone_info": false, 00:16:01.326 "zone_management": false, 00:16:01.326 "zone_append": false, 00:16:01.326 "compare": false, 00:16:01.326 "compare_and_write": false, 00:16:01.326 "abort": true, 00:16:01.326 "seek_hole": false, 00:16:01.326 "seek_data": false, 00:16:01.326 "copy": true, 00:16:01.326 "nvme_iov_md": false 00:16:01.326 }, 00:16:01.326 "memory_domains": [ 00:16:01.326 { 00:16:01.326 "dma_device_id": "system", 00:16:01.326 "dma_device_type": 1 00:16:01.326 }, 00:16:01.326 { 00:16:01.326 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:01.326 "dma_device_type": 2 00:16:01.326 } 00:16:01.326 ], 00:16:01.326 "driver_specific": {} 00:16:01.326 } 00:16:01.326 ] 00:16:01.326 20:29:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:01.326 20:29:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:01.326 20:29:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:01.326 20:29:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:16:01.585 BaseBdev3 00:16:01.585 20:29:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:16:01.585 20:29:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:16:01.585 20:29:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:01.585 20:29:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:01.585 20:29:53 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:01.585 20:29:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:01.585 20:29:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:01.845 20:29:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:16:02.104 [ 00:16:02.104 { 00:16:02.104 "name": "BaseBdev3", 00:16:02.104 "aliases": [ 00:16:02.104 "4a1b6099-8b27-488b-8258-c841733c1940" 00:16:02.104 ], 00:16:02.104 "product_name": "Malloc disk", 00:16:02.104 "block_size": 512, 00:16:02.104 "num_blocks": 65536, 00:16:02.104 "uuid": "4a1b6099-8b27-488b-8258-c841733c1940", 00:16:02.104 "assigned_rate_limits": { 00:16:02.104 "rw_ios_per_sec": 0, 00:16:02.104 "rw_mbytes_per_sec": 0, 00:16:02.104 "r_mbytes_per_sec": 0, 00:16:02.104 "w_mbytes_per_sec": 0 00:16:02.104 }, 00:16:02.104 "claimed": false, 00:16:02.104 "zoned": false, 00:16:02.104 "supported_io_types": { 00:16:02.104 "read": true, 00:16:02.104 "write": true, 00:16:02.104 "unmap": true, 00:16:02.104 "flush": true, 00:16:02.104 "reset": true, 00:16:02.104 "nvme_admin": false, 00:16:02.104 "nvme_io": false, 00:16:02.104 "nvme_io_md": false, 00:16:02.104 "write_zeroes": true, 00:16:02.104 "zcopy": true, 00:16:02.104 "get_zone_info": false, 00:16:02.104 "zone_management": false, 00:16:02.104 "zone_append": false, 00:16:02.104 "compare": false, 00:16:02.104 "compare_and_write": false, 00:16:02.104 "abort": true, 00:16:02.104 "seek_hole": false, 00:16:02.104 "seek_data": false, 00:16:02.104 "copy": true, 00:16:02.104 "nvme_iov_md": false 00:16:02.104 }, 00:16:02.104 "memory_domains": [ 00:16:02.104 { 00:16:02.104 "dma_device_id": "system", 00:16:02.104 "dma_device_type": 1 
00:16:02.104 }, 00:16:02.104 { 00:16:02.104 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:02.104 "dma_device_type": 2 00:16:02.104 } 00:16:02.104 ], 00:16:02.104 "driver_specific": {} 00:16:02.104 } 00:16:02.104 ] 00:16:02.104 20:29:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:02.104 20:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:02.104 20:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:02.104 20:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:02.364 [2024-07-15 20:29:54.533527] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:02.364 [2024-07-15 20:29:54.533565] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:02.364 [2024-07-15 20:29:54.533583] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:02.364 [2024-07-15 20:29:54.534889] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:02.364 20:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:16:02.364 20:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:02.364 20:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:02.364 20:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:02.364 20:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:02.364 20:29:54 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:02.364 20:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:02.364 20:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:02.364 20:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:02.364 20:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:02.364 20:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:02.364 20:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:02.623 20:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:02.623 "name": "Existed_Raid", 00:16:02.623 "uuid": "0c9cbe0d-6a69-41a2-bc78-86d291df5fc0", 00:16:02.623 "strip_size_kb": 64, 00:16:02.623 "state": "configuring", 00:16:02.623 "raid_level": "concat", 00:16:02.623 "superblock": true, 00:16:02.623 "num_base_bdevs": 3, 00:16:02.623 "num_base_bdevs_discovered": 2, 00:16:02.623 "num_base_bdevs_operational": 3, 00:16:02.623 "base_bdevs_list": [ 00:16:02.623 { 00:16:02.623 "name": "BaseBdev1", 00:16:02.623 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:02.623 "is_configured": false, 00:16:02.623 "data_offset": 0, 00:16:02.623 "data_size": 0 00:16:02.623 }, 00:16:02.623 { 00:16:02.623 "name": "BaseBdev2", 00:16:02.623 "uuid": "ce4ca2b9-775a-46ac-ad97-5e3980c4194c", 00:16:02.623 "is_configured": true, 00:16:02.623 "data_offset": 2048, 00:16:02.623 "data_size": 63488 00:16:02.623 }, 00:16:02.623 { 00:16:02.623 "name": "BaseBdev3", 00:16:02.623 "uuid": "4a1b6099-8b27-488b-8258-c841733c1940", 00:16:02.623 "is_configured": true, 00:16:02.623 "data_offset": 2048, 00:16:02.623 
"data_size": 63488 00:16:02.623 } 00:16:02.623 ] 00:16:02.623 }' 00:16:02.623 20:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:02.623 20:29:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:03.191 20:29:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:16:03.191 [2024-07-15 20:29:55.540173] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:03.191 20:29:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:16:03.191 20:29:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:03.191 20:29:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:03.191 20:29:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:03.191 20:29:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:03.191 20:29:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:03.191 20:29:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:03.191 20:29:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:03.191 20:29:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:03.191 20:29:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:03.191 20:29:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:16:03.449 20:29:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:03.449 20:29:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:03.450 "name": "Existed_Raid", 00:16:03.450 "uuid": "0c9cbe0d-6a69-41a2-bc78-86d291df5fc0", 00:16:03.450 "strip_size_kb": 64, 00:16:03.450 "state": "configuring", 00:16:03.450 "raid_level": "concat", 00:16:03.450 "superblock": true, 00:16:03.450 "num_base_bdevs": 3, 00:16:03.450 "num_base_bdevs_discovered": 1, 00:16:03.450 "num_base_bdevs_operational": 3, 00:16:03.450 "base_bdevs_list": [ 00:16:03.450 { 00:16:03.450 "name": "BaseBdev1", 00:16:03.450 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:03.450 "is_configured": false, 00:16:03.450 "data_offset": 0, 00:16:03.450 "data_size": 0 00:16:03.450 }, 00:16:03.450 { 00:16:03.450 "name": null, 00:16:03.450 "uuid": "ce4ca2b9-775a-46ac-ad97-5e3980c4194c", 00:16:03.450 "is_configured": false, 00:16:03.450 "data_offset": 2048, 00:16:03.450 "data_size": 63488 00:16:03.450 }, 00:16:03.450 { 00:16:03.450 "name": "BaseBdev3", 00:16:03.450 "uuid": "4a1b6099-8b27-488b-8258-c841733c1940", 00:16:03.450 "is_configured": true, 00:16:03.450 "data_offset": 2048, 00:16:03.450 "data_size": 63488 00:16:03.450 } 00:16:03.450 ] 00:16:03.450 }' 00:16:03.450 20:29:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:03.450 20:29:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:04.387 20:29:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:04.387 20:29:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:04.387 20:29:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == 
\f\a\l\s\e ]] 00:16:04.387 20:29:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:04.645 [2024-07-15 20:29:56.887356] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:04.645 BaseBdev1 00:16:04.645 20:29:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:16:04.645 20:29:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:16:04.645 20:29:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:04.645 20:29:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:04.645 20:29:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:04.645 20:29:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:04.645 20:29:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:04.903 20:29:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:05.161 [ 00:16:05.161 { 00:16:05.161 "name": "BaseBdev1", 00:16:05.161 "aliases": [ 00:16:05.161 "8cbaad02-8b60-48b5-bf3b-27ef693685ba" 00:16:05.161 ], 00:16:05.161 "product_name": "Malloc disk", 00:16:05.161 "block_size": 512, 00:16:05.162 "num_blocks": 65536, 00:16:05.162 "uuid": "8cbaad02-8b60-48b5-bf3b-27ef693685ba", 00:16:05.162 "assigned_rate_limits": { 00:16:05.162 "rw_ios_per_sec": 0, 00:16:05.162 "rw_mbytes_per_sec": 0, 00:16:05.162 "r_mbytes_per_sec": 0, 00:16:05.162 
"w_mbytes_per_sec": 0 00:16:05.162 }, 00:16:05.162 "claimed": true, 00:16:05.162 "claim_type": "exclusive_write", 00:16:05.162 "zoned": false, 00:16:05.162 "supported_io_types": { 00:16:05.162 "read": true, 00:16:05.162 "write": true, 00:16:05.162 "unmap": true, 00:16:05.162 "flush": true, 00:16:05.162 "reset": true, 00:16:05.162 "nvme_admin": false, 00:16:05.162 "nvme_io": false, 00:16:05.162 "nvme_io_md": false, 00:16:05.162 "write_zeroes": true, 00:16:05.162 "zcopy": true, 00:16:05.162 "get_zone_info": false, 00:16:05.162 "zone_management": false, 00:16:05.162 "zone_append": false, 00:16:05.162 "compare": false, 00:16:05.162 "compare_and_write": false, 00:16:05.162 "abort": true, 00:16:05.162 "seek_hole": false, 00:16:05.162 "seek_data": false, 00:16:05.162 "copy": true, 00:16:05.162 "nvme_iov_md": false 00:16:05.162 }, 00:16:05.162 "memory_domains": [ 00:16:05.162 { 00:16:05.162 "dma_device_id": "system", 00:16:05.162 "dma_device_type": 1 00:16:05.162 }, 00:16:05.162 { 00:16:05.162 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:05.162 "dma_device_type": 2 00:16:05.162 } 00:16:05.162 ], 00:16:05.162 "driver_specific": {} 00:16:05.162 } 00:16:05.162 ] 00:16:05.162 20:29:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:05.162 20:29:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:16:05.162 20:29:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:05.162 20:29:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:05.162 20:29:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:05.162 20:29:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:05.162 20:29:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=3 00:16:05.162 20:29:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:05.162 20:29:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:05.162 20:29:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:05.162 20:29:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:05.162 20:29:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:05.162 20:29:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:05.429 20:29:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:05.429 "name": "Existed_Raid", 00:16:05.429 "uuid": "0c9cbe0d-6a69-41a2-bc78-86d291df5fc0", 00:16:05.429 "strip_size_kb": 64, 00:16:05.429 "state": "configuring", 00:16:05.429 "raid_level": "concat", 00:16:05.429 "superblock": true, 00:16:05.429 "num_base_bdevs": 3, 00:16:05.429 "num_base_bdevs_discovered": 2, 00:16:05.429 "num_base_bdevs_operational": 3, 00:16:05.429 "base_bdevs_list": [ 00:16:05.429 { 00:16:05.429 "name": "BaseBdev1", 00:16:05.429 "uuid": "8cbaad02-8b60-48b5-bf3b-27ef693685ba", 00:16:05.429 "is_configured": true, 00:16:05.429 "data_offset": 2048, 00:16:05.429 "data_size": 63488 00:16:05.429 }, 00:16:05.429 { 00:16:05.429 "name": null, 00:16:05.429 "uuid": "ce4ca2b9-775a-46ac-ad97-5e3980c4194c", 00:16:05.429 "is_configured": false, 00:16:05.429 "data_offset": 2048, 00:16:05.429 "data_size": 63488 00:16:05.429 }, 00:16:05.429 { 00:16:05.429 "name": "BaseBdev3", 00:16:05.429 "uuid": "4a1b6099-8b27-488b-8258-c841733c1940", 00:16:05.429 "is_configured": true, 00:16:05.429 "data_offset": 2048, 00:16:05.429 "data_size": 63488 00:16:05.429 } 
00:16:05.429 ] 00:16:05.429 }' 00:16:05.429 20:29:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:05.429 20:29:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:06.000 20:29:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:06.000 20:29:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:16:06.259 20:29:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:16:06.259 20:29:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:16:06.519 [2024-07-15 20:29:58.672223] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:06.519 20:29:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:16:06.519 20:29:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:06.519 20:29:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:06.519 20:29:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:06.519 20:29:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:06.519 20:29:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:06.519 20:29:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:06.519 20:29:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:06.519 
20:29:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:06.519 20:29:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:06.519 20:29:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:06.519 20:29:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:06.778 20:29:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:06.778 "name": "Existed_Raid", 00:16:06.778 "uuid": "0c9cbe0d-6a69-41a2-bc78-86d291df5fc0", 00:16:06.778 "strip_size_kb": 64, 00:16:06.778 "state": "configuring", 00:16:06.778 "raid_level": "concat", 00:16:06.778 "superblock": true, 00:16:06.778 "num_base_bdevs": 3, 00:16:06.778 "num_base_bdevs_discovered": 1, 00:16:06.778 "num_base_bdevs_operational": 3, 00:16:06.778 "base_bdevs_list": [ 00:16:06.778 { 00:16:06.778 "name": "BaseBdev1", 00:16:06.778 "uuid": "8cbaad02-8b60-48b5-bf3b-27ef693685ba", 00:16:06.778 "is_configured": true, 00:16:06.778 "data_offset": 2048, 00:16:06.778 "data_size": 63488 00:16:06.778 }, 00:16:06.778 { 00:16:06.778 "name": null, 00:16:06.778 "uuid": "ce4ca2b9-775a-46ac-ad97-5e3980c4194c", 00:16:06.778 "is_configured": false, 00:16:06.778 "data_offset": 2048, 00:16:06.778 "data_size": 63488 00:16:06.778 }, 00:16:06.778 { 00:16:06.778 "name": null, 00:16:06.778 "uuid": "4a1b6099-8b27-488b-8258-c841733c1940", 00:16:06.778 "is_configured": false, 00:16:06.778 "data_offset": 2048, 00:16:06.778 "data_size": 63488 00:16:06.778 } 00:16:06.778 ] 00:16:06.778 }' 00:16:06.778 20:29:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:06.778 20:29:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:07.346 20:29:59 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:07.346 20:29:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:07.605 20:29:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:16:07.605 20:29:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:16:07.864 [2024-07-15 20:30:00.003781] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:07.864 20:30:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:16:07.864 20:30:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:07.864 20:30:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:07.864 20:30:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:07.864 20:30:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:07.864 20:30:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:07.864 20:30:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:07.864 20:30:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:07.864 20:30:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:07.864 20:30:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:07.864 20:30:00 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:07.864 20:30:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:08.123 20:30:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:08.123 "name": "Existed_Raid", 00:16:08.123 "uuid": "0c9cbe0d-6a69-41a2-bc78-86d291df5fc0", 00:16:08.123 "strip_size_kb": 64, 00:16:08.123 "state": "configuring", 00:16:08.123 "raid_level": "concat", 00:16:08.123 "superblock": true, 00:16:08.123 "num_base_bdevs": 3, 00:16:08.123 "num_base_bdevs_discovered": 2, 00:16:08.123 "num_base_bdevs_operational": 3, 00:16:08.123 "base_bdevs_list": [ 00:16:08.123 { 00:16:08.123 "name": "BaseBdev1", 00:16:08.123 "uuid": "8cbaad02-8b60-48b5-bf3b-27ef693685ba", 00:16:08.123 "is_configured": true, 00:16:08.123 "data_offset": 2048, 00:16:08.123 "data_size": 63488 00:16:08.123 }, 00:16:08.123 { 00:16:08.123 "name": null, 00:16:08.123 "uuid": "ce4ca2b9-775a-46ac-ad97-5e3980c4194c", 00:16:08.123 "is_configured": false, 00:16:08.123 "data_offset": 2048, 00:16:08.123 "data_size": 63488 00:16:08.123 }, 00:16:08.123 { 00:16:08.123 "name": "BaseBdev3", 00:16:08.123 "uuid": "4a1b6099-8b27-488b-8258-c841733c1940", 00:16:08.123 "is_configured": true, 00:16:08.123 "data_offset": 2048, 00:16:08.123 "data_size": 63488 00:16:08.123 } 00:16:08.123 ] 00:16:08.123 }' 00:16:08.123 20:30:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:08.123 20:30:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:08.690 20:30:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:08.690 20:30:00 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:08.949 20:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:16:08.949 20:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:09.208 [2024-07-15 20:30:01.351403] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:09.208 20:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:16:09.208 20:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:09.208 20:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:09.208 20:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:09.208 20:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:09.208 20:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:09.208 20:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:09.208 20:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:09.208 20:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:09.208 20:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:09.208 20:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:09.208 20:30:01 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:09.466 20:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:09.466 "name": "Existed_Raid", 00:16:09.466 "uuid": "0c9cbe0d-6a69-41a2-bc78-86d291df5fc0", 00:16:09.466 "strip_size_kb": 64, 00:16:09.466 "state": "configuring", 00:16:09.466 "raid_level": "concat", 00:16:09.466 "superblock": true, 00:16:09.466 "num_base_bdevs": 3, 00:16:09.466 "num_base_bdevs_discovered": 1, 00:16:09.466 "num_base_bdevs_operational": 3, 00:16:09.466 "base_bdevs_list": [ 00:16:09.466 { 00:16:09.466 "name": null, 00:16:09.466 "uuid": "8cbaad02-8b60-48b5-bf3b-27ef693685ba", 00:16:09.467 "is_configured": false, 00:16:09.467 "data_offset": 2048, 00:16:09.467 "data_size": 63488 00:16:09.467 }, 00:16:09.467 { 00:16:09.467 "name": null, 00:16:09.467 "uuid": "ce4ca2b9-775a-46ac-ad97-5e3980c4194c", 00:16:09.467 "is_configured": false, 00:16:09.467 "data_offset": 2048, 00:16:09.467 "data_size": 63488 00:16:09.467 }, 00:16:09.467 { 00:16:09.467 "name": "BaseBdev3", 00:16:09.467 "uuid": "4a1b6099-8b27-488b-8258-c841733c1940", 00:16:09.467 "is_configured": true, 00:16:09.467 "data_offset": 2048, 00:16:09.467 "data_size": 63488 00:16:09.467 } 00:16:09.467 ] 00:16:09.467 }' 00:16:09.467 20:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:09.467 20:30:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:10.034 20:30:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:10.034 20:30:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:16:10.603 20:30:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:16:10.603 20:30:02 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:16:10.862 [2024-07-15 20:30:03.084230] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:10.862 20:30:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:16:10.862 20:30:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:10.862 20:30:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:10.862 20:30:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:10.862 20:30:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:10.862 20:30:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:10.862 20:30:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:10.862 20:30:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:10.862 20:30:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:10.862 20:30:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:10.862 20:30:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:10.862 20:30:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:11.427 20:30:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:11.427 "name": 
"Existed_Raid", 00:16:11.427 "uuid": "0c9cbe0d-6a69-41a2-bc78-86d291df5fc0", 00:16:11.427 "strip_size_kb": 64, 00:16:11.427 "state": "configuring", 00:16:11.427 "raid_level": "concat", 00:16:11.427 "superblock": true, 00:16:11.427 "num_base_bdevs": 3, 00:16:11.427 "num_base_bdevs_discovered": 2, 00:16:11.427 "num_base_bdevs_operational": 3, 00:16:11.427 "base_bdevs_list": [ 00:16:11.427 { 00:16:11.427 "name": null, 00:16:11.427 "uuid": "8cbaad02-8b60-48b5-bf3b-27ef693685ba", 00:16:11.427 "is_configured": false, 00:16:11.427 "data_offset": 2048, 00:16:11.427 "data_size": 63488 00:16:11.427 }, 00:16:11.427 { 00:16:11.427 "name": "BaseBdev2", 00:16:11.427 "uuid": "ce4ca2b9-775a-46ac-ad97-5e3980c4194c", 00:16:11.427 "is_configured": true, 00:16:11.427 "data_offset": 2048, 00:16:11.427 "data_size": 63488 00:16:11.427 }, 00:16:11.427 { 00:16:11.427 "name": "BaseBdev3", 00:16:11.427 "uuid": "4a1b6099-8b27-488b-8258-c841733c1940", 00:16:11.427 "is_configured": true, 00:16:11.427 "data_offset": 2048, 00:16:11.427 "data_size": 63488 00:16:11.427 } 00:16:11.427 ] 00:16:11.427 }' 00:16:11.427 20:30:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:11.427 20:30:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:11.993 20:30:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:11.993 20:30:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:12.252 20:30:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:16:12.252 20:30:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:16:12.252 20:30:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:12.546 20:30:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 8cbaad02-8b60-48b5-bf3b-27ef693685ba 00:16:12.805 [2024-07-15 20:30:05.037946] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:16:12.805 [2024-07-15 20:30:05.038107] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x159ff50 00:16:12.805 [2024-07-15 20:30:05.038121] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:16:12.805 [2024-07-15 20:30:05.038294] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x12a6940 00:16:12.805 [2024-07-15 20:30:05.038407] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x159ff50 00:16:12.805 [2024-07-15 20:30:05.038417] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x159ff50 00:16:12.805 [2024-07-15 20:30:05.038508] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:12.805 NewBaseBdev 00:16:12.805 20:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:16:12.805 20:30:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:16:12.805 20:30:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:12.805 20:30:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:12.805 20:30:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:12.805 20:30:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:12.805 20:30:05 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:13.116 20:30:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:16:13.389 [ 00:16:13.389 { 00:16:13.389 "name": "NewBaseBdev", 00:16:13.389 "aliases": [ 00:16:13.389 "8cbaad02-8b60-48b5-bf3b-27ef693685ba" 00:16:13.389 ], 00:16:13.389 "product_name": "Malloc disk", 00:16:13.389 "block_size": 512, 00:16:13.389 "num_blocks": 65536, 00:16:13.389 "uuid": "8cbaad02-8b60-48b5-bf3b-27ef693685ba", 00:16:13.389 "assigned_rate_limits": { 00:16:13.389 "rw_ios_per_sec": 0, 00:16:13.389 "rw_mbytes_per_sec": 0, 00:16:13.389 "r_mbytes_per_sec": 0, 00:16:13.389 "w_mbytes_per_sec": 0 00:16:13.389 }, 00:16:13.389 "claimed": true, 00:16:13.389 "claim_type": "exclusive_write", 00:16:13.389 "zoned": false, 00:16:13.389 "supported_io_types": { 00:16:13.389 "read": true, 00:16:13.389 "write": true, 00:16:13.389 "unmap": true, 00:16:13.389 "flush": true, 00:16:13.389 "reset": true, 00:16:13.389 "nvme_admin": false, 00:16:13.389 "nvme_io": false, 00:16:13.389 "nvme_io_md": false, 00:16:13.389 "write_zeroes": true, 00:16:13.389 "zcopy": true, 00:16:13.389 "get_zone_info": false, 00:16:13.389 "zone_management": false, 00:16:13.389 "zone_append": false, 00:16:13.389 "compare": false, 00:16:13.389 "compare_and_write": false, 00:16:13.389 "abort": true, 00:16:13.389 "seek_hole": false, 00:16:13.389 "seek_data": false, 00:16:13.389 "copy": true, 00:16:13.389 "nvme_iov_md": false 00:16:13.389 }, 00:16:13.389 "memory_domains": [ 00:16:13.389 { 00:16:13.389 "dma_device_id": "system", 00:16:13.389 "dma_device_type": 1 00:16:13.389 }, 00:16:13.389 { 00:16:13.389 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:13.389 "dma_device_type": 2 00:16:13.389 } 
00:16:13.389 ], 00:16:13.389 "driver_specific": {} 00:16:13.389 } 00:16:13.389 ] 00:16:13.389 20:30:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:13.389 20:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:16:13.389 20:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:13.389 20:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:13.389 20:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:13.389 20:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:13.389 20:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:13.389 20:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:13.389 20:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:13.389 20:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:13.389 20:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:13.389 20:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:13.389 20:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:13.647 20:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:13.647 "name": "Existed_Raid", 00:16:13.647 "uuid": "0c9cbe0d-6a69-41a2-bc78-86d291df5fc0", 00:16:13.647 "strip_size_kb": 64, 00:16:13.647 "state": "online", 00:16:13.647 
"raid_level": "concat", 00:16:13.647 "superblock": true, 00:16:13.647 "num_base_bdevs": 3, 00:16:13.647 "num_base_bdevs_discovered": 3, 00:16:13.647 "num_base_bdevs_operational": 3, 00:16:13.647 "base_bdevs_list": [ 00:16:13.647 { 00:16:13.647 "name": "NewBaseBdev", 00:16:13.648 "uuid": "8cbaad02-8b60-48b5-bf3b-27ef693685ba", 00:16:13.648 "is_configured": true, 00:16:13.648 "data_offset": 2048, 00:16:13.648 "data_size": 63488 00:16:13.648 }, 00:16:13.648 { 00:16:13.648 "name": "BaseBdev2", 00:16:13.648 "uuid": "ce4ca2b9-775a-46ac-ad97-5e3980c4194c", 00:16:13.648 "is_configured": true, 00:16:13.648 "data_offset": 2048, 00:16:13.648 "data_size": 63488 00:16:13.648 }, 00:16:13.648 { 00:16:13.648 "name": "BaseBdev3", 00:16:13.648 "uuid": "4a1b6099-8b27-488b-8258-c841733c1940", 00:16:13.648 "is_configured": true, 00:16:13.648 "data_offset": 2048, 00:16:13.648 "data_size": 63488 00:16:13.648 } 00:16:13.648 ] 00:16:13.648 }' 00:16:13.648 20:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:13.648 20:30:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:14.214 20:30:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:16:14.214 20:30:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:16:14.214 20:30:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:14.214 20:30:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:14.214 20:30:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:14.214 20:30:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:16:14.214 20:30:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:14.214 20:30:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:14.474 [2024-07-15 20:30:06.630470] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:14.474 20:30:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:14.474 "name": "Existed_Raid", 00:16:14.474 "aliases": [ 00:16:14.474 "0c9cbe0d-6a69-41a2-bc78-86d291df5fc0" 00:16:14.474 ], 00:16:14.474 "product_name": "Raid Volume", 00:16:14.474 "block_size": 512, 00:16:14.474 "num_blocks": 190464, 00:16:14.474 "uuid": "0c9cbe0d-6a69-41a2-bc78-86d291df5fc0", 00:16:14.474 "assigned_rate_limits": { 00:16:14.474 "rw_ios_per_sec": 0, 00:16:14.474 "rw_mbytes_per_sec": 0, 00:16:14.474 "r_mbytes_per_sec": 0, 00:16:14.474 "w_mbytes_per_sec": 0 00:16:14.474 }, 00:16:14.474 "claimed": false, 00:16:14.474 "zoned": false, 00:16:14.474 "supported_io_types": { 00:16:14.474 "read": true, 00:16:14.474 "write": true, 00:16:14.474 "unmap": true, 00:16:14.474 "flush": true, 00:16:14.474 "reset": true, 00:16:14.474 "nvme_admin": false, 00:16:14.474 "nvme_io": false, 00:16:14.474 "nvme_io_md": false, 00:16:14.474 "write_zeroes": true, 00:16:14.474 "zcopy": false, 00:16:14.474 "get_zone_info": false, 00:16:14.474 "zone_management": false, 00:16:14.474 "zone_append": false, 00:16:14.474 "compare": false, 00:16:14.474 "compare_and_write": false, 00:16:14.474 "abort": false, 00:16:14.474 "seek_hole": false, 00:16:14.474 "seek_data": false, 00:16:14.474 "copy": false, 00:16:14.474 "nvme_iov_md": false 00:16:14.474 }, 00:16:14.474 "memory_domains": [ 00:16:14.474 { 00:16:14.474 "dma_device_id": "system", 00:16:14.474 "dma_device_type": 1 00:16:14.474 }, 00:16:14.474 { 00:16:14.474 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:14.474 "dma_device_type": 2 00:16:14.474 }, 00:16:14.474 { 00:16:14.474 "dma_device_id": "system", 00:16:14.474 "dma_device_type": 1 00:16:14.474 
}, 00:16:14.474 { 00:16:14.474 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:14.474 "dma_device_type": 2 00:16:14.474 }, 00:16:14.474 { 00:16:14.474 "dma_device_id": "system", 00:16:14.474 "dma_device_type": 1 00:16:14.474 }, 00:16:14.474 { 00:16:14.474 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:14.474 "dma_device_type": 2 00:16:14.474 } 00:16:14.474 ], 00:16:14.474 "driver_specific": { 00:16:14.474 "raid": { 00:16:14.474 "uuid": "0c9cbe0d-6a69-41a2-bc78-86d291df5fc0", 00:16:14.474 "strip_size_kb": 64, 00:16:14.474 "state": "online", 00:16:14.474 "raid_level": "concat", 00:16:14.474 "superblock": true, 00:16:14.474 "num_base_bdevs": 3, 00:16:14.474 "num_base_bdevs_discovered": 3, 00:16:14.474 "num_base_bdevs_operational": 3, 00:16:14.474 "base_bdevs_list": [ 00:16:14.474 { 00:16:14.474 "name": "NewBaseBdev", 00:16:14.474 "uuid": "8cbaad02-8b60-48b5-bf3b-27ef693685ba", 00:16:14.474 "is_configured": true, 00:16:14.474 "data_offset": 2048, 00:16:14.474 "data_size": 63488 00:16:14.474 }, 00:16:14.474 { 00:16:14.474 "name": "BaseBdev2", 00:16:14.474 "uuid": "ce4ca2b9-775a-46ac-ad97-5e3980c4194c", 00:16:14.474 "is_configured": true, 00:16:14.474 "data_offset": 2048, 00:16:14.474 "data_size": 63488 00:16:14.474 }, 00:16:14.474 { 00:16:14.474 "name": "BaseBdev3", 00:16:14.474 "uuid": "4a1b6099-8b27-488b-8258-c841733c1940", 00:16:14.474 "is_configured": true, 00:16:14.474 "data_offset": 2048, 00:16:14.474 "data_size": 63488 00:16:14.474 } 00:16:14.474 ] 00:16:14.474 } 00:16:14.474 } 00:16:14.474 }' 00:16:14.474 20:30:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:14.474 20:30:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:16:14.474 BaseBdev2 00:16:14.474 BaseBdev3' 00:16:14.474 20:30:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:14.474 
20:30:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:16:14.474 20:30:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:14.734 20:30:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:14.734 "name": "NewBaseBdev", 00:16:14.734 "aliases": [ 00:16:14.734 "8cbaad02-8b60-48b5-bf3b-27ef693685ba" 00:16:14.734 ], 00:16:14.734 "product_name": "Malloc disk", 00:16:14.734 "block_size": 512, 00:16:14.734 "num_blocks": 65536, 00:16:14.734 "uuid": "8cbaad02-8b60-48b5-bf3b-27ef693685ba", 00:16:14.734 "assigned_rate_limits": { 00:16:14.734 "rw_ios_per_sec": 0, 00:16:14.734 "rw_mbytes_per_sec": 0, 00:16:14.734 "r_mbytes_per_sec": 0, 00:16:14.734 "w_mbytes_per_sec": 0 00:16:14.734 }, 00:16:14.734 "claimed": true, 00:16:14.734 "claim_type": "exclusive_write", 00:16:14.734 "zoned": false, 00:16:14.734 "supported_io_types": { 00:16:14.734 "read": true, 00:16:14.734 "write": true, 00:16:14.734 "unmap": true, 00:16:14.734 "flush": true, 00:16:14.734 "reset": true, 00:16:14.734 "nvme_admin": false, 00:16:14.734 "nvme_io": false, 00:16:14.734 "nvme_io_md": false, 00:16:14.734 "write_zeroes": true, 00:16:14.734 "zcopy": true, 00:16:14.734 "get_zone_info": false, 00:16:14.734 "zone_management": false, 00:16:14.734 "zone_append": false, 00:16:14.734 "compare": false, 00:16:14.734 "compare_and_write": false, 00:16:14.734 "abort": true, 00:16:14.734 "seek_hole": false, 00:16:14.734 "seek_data": false, 00:16:14.734 "copy": true, 00:16:14.734 "nvme_iov_md": false 00:16:14.734 }, 00:16:14.734 "memory_domains": [ 00:16:14.734 { 00:16:14.734 "dma_device_id": "system", 00:16:14.734 "dma_device_type": 1 00:16:14.734 }, 00:16:14.734 { 00:16:14.734 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:14.734 "dma_device_type": 2 00:16:14.734 } 00:16:14.734 ], 00:16:14.734 
"driver_specific": {} 00:16:14.734 }' 00:16:14.734 20:30:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:14.734 20:30:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:14.993 20:30:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:14.993 20:30:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:14.993 20:30:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:14.993 20:30:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:14.993 20:30:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:14.993 20:30:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:14.993 20:30:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:14.993 20:30:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:14.993 20:30:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:15.252 20:30:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:15.252 20:30:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:15.252 20:30:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:15.252 20:30:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:15.510 20:30:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:15.510 "name": "BaseBdev2", 00:16:15.510 "aliases": [ 00:16:15.510 "ce4ca2b9-775a-46ac-ad97-5e3980c4194c" 00:16:15.510 ], 00:16:15.510 "product_name": 
"Malloc disk", 00:16:15.510 "block_size": 512, 00:16:15.510 "num_blocks": 65536, 00:16:15.510 "uuid": "ce4ca2b9-775a-46ac-ad97-5e3980c4194c", 00:16:15.510 "assigned_rate_limits": { 00:16:15.510 "rw_ios_per_sec": 0, 00:16:15.510 "rw_mbytes_per_sec": 0, 00:16:15.511 "r_mbytes_per_sec": 0, 00:16:15.511 "w_mbytes_per_sec": 0 00:16:15.511 }, 00:16:15.511 "claimed": true, 00:16:15.511 "claim_type": "exclusive_write", 00:16:15.511 "zoned": false, 00:16:15.511 "supported_io_types": { 00:16:15.511 "read": true, 00:16:15.511 "write": true, 00:16:15.511 "unmap": true, 00:16:15.511 "flush": true, 00:16:15.511 "reset": true, 00:16:15.511 "nvme_admin": false, 00:16:15.511 "nvme_io": false, 00:16:15.511 "nvme_io_md": false, 00:16:15.511 "write_zeroes": true, 00:16:15.511 "zcopy": true, 00:16:15.511 "get_zone_info": false, 00:16:15.511 "zone_management": false, 00:16:15.511 "zone_append": false, 00:16:15.511 "compare": false, 00:16:15.511 "compare_and_write": false, 00:16:15.511 "abort": true, 00:16:15.511 "seek_hole": false, 00:16:15.511 "seek_data": false, 00:16:15.511 "copy": true, 00:16:15.511 "nvme_iov_md": false 00:16:15.511 }, 00:16:15.511 "memory_domains": [ 00:16:15.511 { 00:16:15.511 "dma_device_id": "system", 00:16:15.511 "dma_device_type": 1 00:16:15.511 }, 00:16:15.511 { 00:16:15.511 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:15.511 "dma_device_type": 2 00:16:15.511 } 00:16:15.511 ], 00:16:15.511 "driver_specific": {} 00:16:15.511 }' 00:16:15.511 20:30:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:15.511 20:30:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:15.511 20:30:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:15.511 20:30:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:15.511 20:30:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:15.511 
20:30:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:15.511 20:30:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:15.511 20:30:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:15.770 20:30:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:15.770 20:30:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:15.770 20:30:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:15.770 20:30:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:15.770 20:30:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:15.770 20:30:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:15.770 20:30:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:16.030 20:30:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:16.030 "name": "BaseBdev3", 00:16:16.030 "aliases": [ 00:16:16.030 "4a1b6099-8b27-488b-8258-c841733c1940" 00:16:16.030 ], 00:16:16.030 "product_name": "Malloc disk", 00:16:16.030 "block_size": 512, 00:16:16.030 "num_blocks": 65536, 00:16:16.030 "uuid": "4a1b6099-8b27-488b-8258-c841733c1940", 00:16:16.030 "assigned_rate_limits": { 00:16:16.030 "rw_ios_per_sec": 0, 00:16:16.030 "rw_mbytes_per_sec": 0, 00:16:16.030 "r_mbytes_per_sec": 0, 00:16:16.030 "w_mbytes_per_sec": 0 00:16:16.030 }, 00:16:16.030 "claimed": true, 00:16:16.030 "claim_type": "exclusive_write", 00:16:16.030 "zoned": false, 00:16:16.030 "supported_io_types": { 00:16:16.030 "read": true, 00:16:16.030 "write": true, 00:16:16.030 "unmap": true, 
00:16:16.030 "flush": true, 00:16:16.030 "reset": true, 00:16:16.030 "nvme_admin": false, 00:16:16.030 "nvme_io": false, 00:16:16.030 "nvme_io_md": false, 00:16:16.030 "write_zeroes": true, 00:16:16.030 "zcopy": true, 00:16:16.030 "get_zone_info": false, 00:16:16.030 "zone_management": false, 00:16:16.030 "zone_append": false, 00:16:16.030 "compare": false, 00:16:16.030 "compare_and_write": false, 00:16:16.030 "abort": true, 00:16:16.030 "seek_hole": false, 00:16:16.030 "seek_data": false, 00:16:16.030 "copy": true, 00:16:16.030 "nvme_iov_md": false 00:16:16.030 }, 00:16:16.030 "memory_domains": [ 00:16:16.030 { 00:16:16.030 "dma_device_id": "system", 00:16:16.030 "dma_device_type": 1 00:16:16.030 }, 00:16:16.030 { 00:16:16.030 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:16.030 "dma_device_type": 2 00:16:16.030 } 00:16:16.030 ], 00:16:16.030 "driver_specific": {} 00:16:16.030 }' 00:16:16.030 20:30:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:16.030 20:30:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:16.030 20:30:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:16.030 20:30:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:16.030 20:30:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:16.290 20:30:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:16.291 20:30:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:16.291 20:30:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:16.291 20:30:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:16.291 20:30:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:16.291 20:30:08 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:16.291 20:30:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:16.291 20:30:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:16.550 [2024-07-15 20:30:08.844074] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:16.550 [2024-07-15 20:30:08.844100] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:16.550 [2024-07-15 20:30:08.844149] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:16.550 [2024-07-15 20:30:08.844200] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:16.550 [2024-07-15 20:30:08.844212] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x159ff50 name Existed_Raid, state offline 00:16:16.550 20:30:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 1388917 00:16:16.550 20:30:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 1388917 ']' 00:16:16.550 20:30:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 1388917 00:16:16.550 20:30:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:16:16.550 20:30:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:16:16.550 20:30:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1388917 00:16:16.550 20:30:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:16:16.550 20:30:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' 
reactor_0 = sudo ']' 00:16:16.550 20:30:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1388917' 00:16:16.550 killing process with pid 1388917 00:16:16.550 20:30:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 1388917 00:16:16.550 [2024-07-15 20:30:08.928331] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:16.550 20:30:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 1388917 00:16:16.810 [2024-07-15 20:30:08.956145] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:16.810 20:30:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:16:16.810 00:16:16.810 real 0m28.770s 00:16:16.810 user 0m53.315s 00:16:16.810 sys 0m5.200s 00:16:16.810 20:30:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:16:16.810 20:30:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:16.810 ************************************ 00:16:16.810 END TEST raid_state_function_test_sb 00:16:16.810 ************************************ 00:16:17.069 20:30:09 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:16:17.069 20:30:09 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test concat 3 00:16:17.069 20:30:09 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:16:17.069 20:30:09 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:16:17.069 20:30:09 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:17.069 ************************************ 00:16:17.069 START TEST raid_superblock_test 00:16:17.069 ************************************ 00:16:17.069 20:30:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test concat 3 00:16:17.069 20:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # 
local raid_level=concat 00:16:17.069 20:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=3 00:16:17.069 20:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:16:17.069 20:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:16:17.069 20:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:16:17.069 20:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:16:17.069 20:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:16:17.069 20:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:16:17.069 20:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:16:17.069 20:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:16:17.069 20:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:16:17.069 20:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:16:17.069 20:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:16:17.069 20:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' concat '!=' raid1 ']' 00:16:17.069 20:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:16:17.069 20:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:16:17.069 20:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=1393708 00:16:17.069 20:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 1393708 /var/tmp/spdk-raid.sock 00:16:17.069 20:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r 
/var/tmp/spdk-raid.sock -L bdev_raid 00:16:17.069 20:30:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 1393708 ']' 00:16:17.069 20:30:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:17.069 20:30:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:16:17.069 20:30:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:17.069 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:17.069 20:30:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:16:17.069 20:30:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:17.069 [2024-07-15 20:30:09.327455] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
00:16:17.070 [2024-07-15 20:30:09.327535] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1393708 ] 00:16:17.328 [2024-07-15 20:30:09.461012] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:17.329 [2024-07-15 20:30:09.562857] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:16:17.329 [2024-07-15 20:30:09.625460] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:17.329 [2024-07-15 20:30:09.625505] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:17.895 20:30:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:16:17.895 20:30:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:16:17.895 20:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:16:17.895 20:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:16:17.895 20:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:16:17.895 20:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:16:17.895 20:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:16:17.895 20:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:16:17.895 20:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:16:17.895 20:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:16:17.895 20:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:16:18.153 malloc1 00:16:18.153 20:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:16:18.412 [2024-07-15 20:30:10.740155] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:16:18.412 [2024-07-15 20:30:10.740204] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:18.412 [2024-07-15 20:30:10.740226] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x107b570 00:16:18.412 [2024-07-15 20:30:10.740238] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:18.412 [2024-07-15 20:30:10.741834] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:18.412 [2024-07-15 20:30:10.741865] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:16:18.412 pt1 00:16:18.412 20:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:16:18.412 20:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:16:18.412 20:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:16:18.412 20:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:16:18.412 20:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:16:18.412 20:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:16:18.412 20:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:16:18.412 20:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:16:18.412 20:30:10 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:16:18.671 malloc2 00:16:18.671 20:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:16:18.930 [2024-07-15 20:30:11.242203] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:16:18.930 [2024-07-15 20:30:11.242248] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:18.930 [2024-07-15 20:30:11.242273] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x107c970 00:16:18.930 [2024-07-15 20:30:11.242285] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:18.930 [2024-07-15 20:30:11.243820] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:18.930 [2024-07-15 20:30:11.243850] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:16:18.930 pt2 00:16:18.930 20:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:16:18.930 20:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:16:18.930 20:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:16:18.930 20:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:16:18.930 20:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:16:18.930 20:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:16:18.930 20:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:16:18.930 20:30:11 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:16:18.930 20:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:16:19.190 malloc3 00:16:19.190 20:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:16:19.449 [2024-07-15 20:30:11.744125] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:16:19.449 [2024-07-15 20:30:11.744172] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:19.449 [2024-07-15 20:30:11.744190] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1213340 00:16:19.449 [2024-07-15 20:30:11.744202] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:19.449 [2024-07-15 20:30:11.745620] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:19.449 [2024-07-15 20:30:11.745649] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:16:19.449 pt3 00:16:19.449 20:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:16:19.449 20:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:16:19.449 20:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:16:19.708 [2024-07-15 20:30:12.004838] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:16:19.708 [2024-07-15 20:30:12.006091] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 
00:16:19.708 [2024-07-15 20:30:12.006147] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:16:19.708 [2024-07-15 20:30:12.006297] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1073ea0 00:16:19.708 [2024-07-15 20:30:12.006309] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:16:19.708 [2024-07-15 20:30:12.006509] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x107b240 00:16:19.708 [2024-07-15 20:30:12.006651] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1073ea0 00:16:19.708 [2024-07-15 20:30:12.006661] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1073ea0 00:16:19.708 [2024-07-15 20:30:12.006754] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:19.708 20:30:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:16:19.708 20:30:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:19.708 20:30:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:19.708 20:30:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:19.708 20:30:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:19.708 20:30:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:19.708 20:30:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:19.708 20:30:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:19.708 20:30:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:19.708 20:30:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:19.708 
20:30:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:19.708 20:30:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:19.967 20:30:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:19.967 "name": "raid_bdev1", 00:16:19.967 "uuid": "3fdffc1d-d837-415a-879d-6e47b01bb8e2", 00:16:19.967 "strip_size_kb": 64, 00:16:19.967 "state": "online", 00:16:19.967 "raid_level": "concat", 00:16:19.967 "superblock": true, 00:16:19.967 "num_base_bdevs": 3, 00:16:19.967 "num_base_bdevs_discovered": 3, 00:16:19.967 "num_base_bdevs_operational": 3, 00:16:19.967 "base_bdevs_list": [ 00:16:19.967 { 00:16:19.967 "name": "pt1", 00:16:19.967 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:19.967 "is_configured": true, 00:16:19.967 "data_offset": 2048, 00:16:19.967 "data_size": 63488 00:16:19.967 }, 00:16:19.967 { 00:16:19.967 "name": "pt2", 00:16:19.967 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:19.967 "is_configured": true, 00:16:19.967 "data_offset": 2048, 00:16:19.967 "data_size": 63488 00:16:19.967 }, 00:16:19.967 { 00:16:19.967 "name": "pt3", 00:16:19.967 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:19.967 "is_configured": true, 00:16:19.967 "data_offset": 2048, 00:16:19.967 "data_size": 63488 00:16:19.967 } 00:16:19.967 ] 00:16:19.967 }' 00:16:19.967 20:30:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:19.967 20:30:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:20.535 20:30:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:16:20.535 20:30:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:16:20.535 20:30:12 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:20.535 20:30:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:20.535 20:30:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:20.535 20:30:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:16:20.535 20:30:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:16:20.535 20:30:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:20.795 [2024-07-15 20:30:13.035840] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:20.795 20:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:20.795 "name": "raid_bdev1", 00:16:20.795 "aliases": [ 00:16:20.795 "3fdffc1d-d837-415a-879d-6e47b01bb8e2" 00:16:20.795 ], 00:16:20.795 "product_name": "Raid Volume", 00:16:20.795 "block_size": 512, 00:16:20.795 "num_blocks": 190464, 00:16:20.795 "uuid": "3fdffc1d-d837-415a-879d-6e47b01bb8e2", 00:16:20.795 "assigned_rate_limits": { 00:16:20.795 "rw_ios_per_sec": 0, 00:16:20.795 "rw_mbytes_per_sec": 0, 00:16:20.795 "r_mbytes_per_sec": 0, 00:16:20.795 "w_mbytes_per_sec": 0 00:16:20.795 }, 00:16:20.795 "claimed": false, 00:16:20.795 "zoned": false, 00:16:20.795 "supported_io_types": { 00:16:20.795 "read": true, 00:16:20.795 "write": true, 00:16:20.795 "unmap": true, 00:16:20.795 "flush": true, 00:16:20.795 "reset": true, 00:16:20.795 "nvme_admin": false, 00:16:20.795 "nvme_io": false, 00:16:20.795 "nvme_io_md": false, 00:16:20.795 "write_zeroes": true, 00:16:20.795 "zcopy": false, 00:16:20.795 "get_zone_info": false, 00:16:20.795 "zone_management": false, 00:16:20.795 "zone_append": false, 00:16:20.795 "compare": false, 00:16:20.795 "compare_and_write": false, 00:16:20.795 "abort": false, 00:16:20.795 
"seek_hole": false, 00:16:20.795 "seek_data": false, 00:16:20.795 "copy": false, 00:16:20.795 "nvme_iov_md": false 00:16:20.795 }, 00:16:20.795 "memory_domains": [ 00:16:20.795 { 00:16:20.795 "dma_device_id": "system", 00:16:20.795 "dma_device_type": 1 00:16:20.795 }, 00:16:20.795 { 00:16:20.795 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:20.795 "dma_device_type": 2 00:16:20.795 }, 00:16:20.795 { 00:16:20.795 "dma_device_id": "system", 00:16:20.795 "dma_device_type": 1 00:16:20.795 }, 00:16:20.795 { 00:16:20.795 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:20.795 "dma_device_type": 2 00:16:20.795 }, 00:16:20.795 { 00:16:20.795 "dma_device_id": "system", 00:16:20.795 "dma_device_type": 1 00:16:20.795 }, 00:16:20.795 { 00:16:20.795 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:20.795 "dma_device_type": 2 00:16:20.795 } 00:16:20.795 ], 00:16:20.795 "driver_specific": { 00:16:20.795 "raid": { 00:16:20.795 "uuid": "3fdffc1d-d837-415a-879d-6e47b01bb8e2", 00:16:20.795 "strip_size_kb": 64, 00:16:20.795 "state": "online", 00:16:20.795 "raid_level": "concat", 00:16:20.795 "superblock": true, 00:16:20.795 "num_base_bdevs": 3, 00:16:20.795 "num_base_bdevs_discovered": 3, 00:16:20.795 "num_base_bdevs_operational": 3, 00:16:20.795 "base_bdevs_list": [ 00:16:20.795 { 00:16:20.795 "name": "pt1", 00:16:20.795 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:20.795 "is_configured": true, 00:16:20.795 "data_offset": 2048, 00:16:20.795 "data_size": 63488 00:16:20.795 }, 00:16:20.795 { 00:16:20.795 "name": "pt2", 00:16:20.795 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:20.795 "is_configured": true, 00:16:20.795 "data_offset": 2048, 00:16:20.795 "data_size": 63488 00:16:20.795 }, 00:16:20.795 { 00:16:20.795 "name": "pt3", 00:16:20.795 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:20.795 "is_configured": true, 00:16:20.795 "data_offset": 2048, 00:16:20.795 "data_size": 63488 00:16:20.795 } 00:16:20.795 ] 00:16:20.795 } 00:16:20.795 } 00:16:20.795 }' 
00:16:20.795 20:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:20.795 20:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:16:20.795 pt2 00:16:20.795 pt3' 00:16:20.795 20:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:20.795 20:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:16:20.795 20:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:21.375 20:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:21.375 "name": "pt1", 00:16:21.375 "aliases": [ 00:16:21.375 "00000000-0000-0000-0000-000000000001" 00:16:21.375 ], 00:16:21.375 "product_name": "passthru", 00:16:21.375 "block_size": 512, 00:16:21.375 "num_blocks": 65536, 00:16:21.375 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:21.375 "assigned_rate_limits": { 00:16:21.375 "rw_ios_per_sec": 0, 00:16:21.375 "rw_mbytes_per_sec": 0, 00:16:21.375 "r_mbytes_per_sec": 0, 00:16:21.375 "w_mbytes_per_sec": 0 00:16:21.375 }, 00:16:21.375 "claimed": true, 00:16:21.375 "claim_type": "exclusive_write", 00:16:21.376 "zoned": false, 00:16:21.376 "supported_io_types": { 00:16:21.376 "read": true, 00:16:21.376 "write": true, 00:16:21.376 "unmap": true, 00:16:21.376 "flush": true, 00:16:21.376 "reset": true, 00:16:21.376 "nvme_admin": false, 00:16:21.376 "nvme_io": false, 00:16:21.376 "nvme_io_md": false, 00:16:21.376 "write_zeroes": true, 00:16:21.376 "zcopy": true, 00:16:21.376 "get_zone_info": false, 00:16:21.376 "zone_management": false, 00:16:21.376 "zone_append": false, 00:16:21.376 "compare": false, 00:16:21.376 "compare_and_write": false, 00:16:21.376 "abort": true, 00:16:21.376 "seek_hole": false, 00:16:21.376 
"seek_data": false, 00:16:21.376 "copy": true, 00:16:21.376 "nvme_iov_md": false 00:16:21.376 }, 00:16:21.376 "memory_domains": [ 00:16:21.376 { 00:16:21.376 "dma_device_id": "system", 00:16:21.376 "dma_device_type": 1 00:16:21.376 }, 00:16:21.376 { 00:16:21.376 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:21.376 "dma_device_type": 2 00:16:21.376 } 00:16:21.376 ], 00:16:21.376 "driver_specific": { 00:16:21.376 "passthru": { 00:16:21.376 "name": "pt1", 00:16:21.376 "base_bdev_name": "malloc1" 00:16:21.376 } 00:16:21.376 } 00:16:21.376 }' 00:16:21.376 20:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:21.376 20:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:21.376 20:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:21.376 20:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:21.634 20:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:21.634 20:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:21.634 20:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:21.634 20:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:21.634 20:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:21.634 20:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:21.634 20:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:21.634 20:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:21.634 20:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:21.634 20:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:21.634 20:30:13 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:16:21.893 20:30:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:21.893 "name": "pt2", 00:16:21.893 "aliases": [ 00:16:21.893 "00000000-0000-0000-0000-000000000002" 00:16:21.893 ], 00:16:21.893 "product_name": "passthru", 00:16:21.893 "block_size": 512, 00:16:21.893 "num_blocks": 65536, 00:16:21.893 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:21.893 "assigned_rate_limits": { 00:16:21.893 "rw_ios_per_sec": 0, 00:16:21.893 "rw_mbytes_per_sec": 0, 00:16:21.893 "r_mbytes_per_sec": 0, 00:16:21.893 "w_mbytes_per_sec": 0 00:16:21.893 }, 00:16:21.893 "claimed": true, 00:16:21.893 "claim_type": "exclusive_write", 00:16:21.893 "zoned": false, 00:16:21.893 "supported_io_types": { 00:16:21.893 "read": true, 00:16:21.893 "write": true, 00:16:21.893 "unmap": true, 00:16:21.893 "flush": true, 00:16:21.893 "reset": true, 00:16:21.893 "nvme_admin": false, 00:16:21.893 "nvme_io": false, 00:16:21.893 "nvme_io_md": false, 00:16:21.893 "write_zeroes": true, 00:16:21.893 "zcopy": true, 00:16:21.893 "get_zone_info": false, 00:16:21.893 "zone_management": false, 00:16:21.893 "zone_append": false, 00:16:21.893 "compare": false, 00:16:21.893 "compare_and_write": false, 00:16:21.893 "abort": true, 00:16:21.893 "seek_hole": false, 00:16:21.893 "seek_data": false, 00:16:21.893 "copy": true, 00:16:21.893 "nvme_iov_md": false 00:16:21.893 }, 00:16:21.893 "memory_domains": [ 00:16:21.893 { 00:16:21.893 "dma_device_id": "system", 00:16:21.893 "dma_device_type": 1 00:16:21.893 }, 00:16:21.893 { 00:16:21.893 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:21.893 "dma_device_type": 2 00:16:21.893 } 00:16:21.893 ], 00:16:21.893 "driver_specific": { 00:16:21.893 "passthru": { 00:16:21.893 "name": "pt2", 00:16:21.893 "base_bdev_name": "malloc2" 00:16:21.893 } 00:16:21.893 } 00:16:21.893 }' 00:16:21.893 20:30:14 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:21.893 20:30:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:21.893 20:30:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:21.893 20:30:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:22.152 20:30:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:22.152 20:30:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:22.152 20:30:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:22.152 20:30:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:22.152 20:30:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:22.152 20:30:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:22.152 20:30:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:22.152 20:30:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:22.152 20:30:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:22.152 20:30:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:16:22.152 20:30:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:22.411 20:30:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:22.411 "name": "pt3", 00:16:22.411 "aliases": [ 00:16:22.411 "00000000-0000-0000-0000-000000000003" 00:16:22.411 ], 00:16:22.411 "product_name": "passthru", 00:16:22.411 "block_size": 512, 00:16:22.411 "num_blocks": 65536, 00:16:22.411 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:22.411 "assigned_rate_limits": { 
00:16:22.411 "rw_ios_per_sec": 0, 00:16:22.411 "rw_mbytes_per_sec": 0, 00:16:22.411 "r_mbytes_per_sec": 0, 00:16:22.411 "w_mbytes_per_sec": 0 00:16:22.411 }, 00:16:22.411 "claimed": true, 00:16:22.411 "claim_type": "exclusive_write", 00:16:22.411 "zoned": false, 00:16:22.411 "supported_io_types": { 00:16:22.411 "read": true, 00:16:22.411 "write": true, 00:16:22.411 "unmap": true, 00:16:22.411 "flush": true, 00:16:22.411 "reset": true, 00:16:22.411 "nvme_admin": false, 00:16:22.411 "nvme_io": false, 00:16:22.411 "nvme_io_md": false, 00:16:22.411 "write_zeroes": true, 00:16:22.411 "zcopy": true, 00:16:22.411 "get_zone_info": false, 00:16:22.411 "zone_management": false, 00:16:22.411 "zone_append": false, 00:16:22.411 "compare": false, 00:16:22.411 "compare_and_write": false, 00:16:22.411 "abort": true, 00:16:22.411 "seek_hole": false, 00:16:22.411 "seek_data": false, 00:16:22.411 "copy": true, 00:16:22.411 "nvme_iov_md": false 00:16:22.411 }, 00:16:22.411 "memory_domains": [ 00:16:22.411 { 00:16:22.411 "dma_device_id": "system", 00:16:22.411 "dma_device_type": 1 00:16:22.411 }, 00:16:22.411 { 00:16:22.411 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:22.411 "dma_device_type": 2 00:16:22.411 } 00:16:22.411 ], 00:16:22.411 "driver_specific": { 00:16:22.411 "passthru": { 00:16:22.411 "name": "pt3", 00:16:22.411 "base_bdev_name": "malloc3" 00:16:22.411 } 00:16:22.411 } 00:16:22.411 }' 00:16:22.411 20:30:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:22.670 20:30:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:22.670 20:30:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:22.670 20:30:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:22.670 20:30:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:22.670 20:30:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 
00:16:22.670 20:30:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:22.670 20:30:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:22.929 20:30:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:22.929 20:30:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:22.929 20:30:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:22.929 20:30:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:22.929 20:30:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:16:22.929 20:30:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:16:23.188 [2024-07-15 20:30:15.349989] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:23.188 20:30:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=3fdffc1d-d837-415a-879d-6e47b01bb8e2 00:16:23.188 20:30:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 3fdffc1d-d837-415a-879d-6e47b01bb8e2 ']' 00:16:23.188 20:30:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:16:23.446 [2024-07-15 20:30:15.586341] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:16:23.446 [2024-07-15 20:30:15.586362] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:23.446 [2024-07-15 20:30:15.586412] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:23.446 [2024-07-15 20:30:15.586464] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free 
all in destruct 00:16:23.446 [2024-07-15 20:30:15.586476] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1073ea0 name raid_bdev1, state offline 00:16:23.446 20:30:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:23.446 20:30:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:16:23.703 20:30:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:16:23.703 20:30:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:16:23.703 20:30:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:16:23.703 20:30:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:16:23.703 20:30:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:16:23.703 20:30:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:16:23.961 20:30:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:16:23.961 20:30:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:16:24.218 20:30:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:16:24.218 20:30:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:16:24.476 20:30:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' 
false == true ']' 00:16:24.476 20:30:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:16:24.476 20:30:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:16:24.476 20:30:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:16:24.476 20:30:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:16:24.476 20:30:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:16:24.476 20:30:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:16:24.476 20:30:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:16:24.476 20:30:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:16:24.476 20:30:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:16:24.476 20:30:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:16:24.476 20:30:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:16:24.476 20:30:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:16:25.041 [2024-07-15 20:30:17.290803] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:16:25.041 [2024-07-15 20:30:17.292183] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:16:25.041 [2024-07-15 20:30:17.292226] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:16:25.041 [2024-07-15 20:30:17.292273] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:16:25.041 [2024-07-15 20:30:17.292312] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:16:25.041 [2024-07-15 20:30:17.292335] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:16:25.041 [2024-07-15 20:30:17.292353] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:16:25.041 [2024-07-15 20:30:17.292363] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x121eff0 name raid_bdev1, state configuring 00:16:25.041 request: 00:16:25.041 { 00:16:25.041 "name": "raid_bdev1", 00:16:25.041 "raid_level": "concat", 00:16:25.041 "base_bdevs": [ 00:16:25.041 "malloc1", 00:16:25.041 "malloc2", 00:16:25.041 "malloc3" 00:16:25.041 ], 00:16:25.041 "strip_size_kb": 64, 00:16:25.041 "superblock": false, 00:16:25.041 "method": "bdev_raid_create", 00:16:25.041 "req_id": 1 00:16:25.041 } 00:16:25.041 Got JSON-RPC error response 00:16:25.041 response: 00:16:25.041 { 00:16:25.041 "code": -17, 00:16:25.041 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:16:25.041 } 00:16:25.041 20:30:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:16:25.041 20:30:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 
00:16:25.041 20:30:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:16:25.041 20:30:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:16:25.041 20:30:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:25.041 20:30:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:16:25.300 20:30:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:16:25.300 20:30:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:16:25.300 20:30:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:16:25.869 [2024-07-15 20:30:18.052747] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:16:25.869 [2024-07-15 20:30:18.052804] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:25.869 [2024-07-15 20:30:18.052829] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x107b7a0 00:16:25.869 [2024-07-15 20:30:18.052842] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:25.869 [2024-07-15 20:30:18.054600] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:25.869 [2024-07-15 20:30:18.054633] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:16:25.869 [2024-07-15 20:30:18.054708] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:16:25.869 [2024-07-15 20:30:18.054737] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:16:25.869 pt1 00:16:25.869 20:30:18 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 3 00:16:25.869 20:30:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:25.869 20:30:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:25.869 20:30:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:25.869 20:30:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:25.869 20:30:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:25.869 20:30:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:25.869 20:30:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:25.869 20:30:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:25.869 20:30:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:25.869 20:30:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:25.869 20:30:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:26.128 20:30:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:26.128 "name": "raid_bdev1", 00:16:26.128 "uuid": "3fdffc1d-d837-415a-879d-6e47b01bb8e2", 00:16:26.128 "strip_size_kb": 64, 00:16:26.128 "state": "configuring", 00:16:26.128 "raid_level": "concat", 00:16:26.128 "superblock": true, 00:16:26.128 "num_base_bdevs": 3, 00:16:26.128 "num_base_bdevs_discovered": 1, 00:16:26.128 "num_base_bdevs_operational": 3, 00:16:26.128 "base_bdevs_list": [ 00:16:26.128 { 00:16:26.128 "name": "pt1", 00:16:26.128 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:26.128 
"is_configured": true, 00:16:26.128 "data_offset": 2048, 00:16:26.128 "data_size": 63488 00:16:26.128 }, 00:16:26.128 { 00:16:26.128 "name": null, 00:16:26.128 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:26.128 "is_configured": false, 00:16:26.128 "data_offset": 2048, 00:16:26.128 "data_size": 63488 00:16:26.128 }, 00:16:26.128 { 00:16:26.128 "name": null, 00:16:26.128 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:26.128 "is_configured": false, 00:16:26.128 "data_offset": 2048, 00:16:26.128 "data_size": 63488 00:16:26.128 } 00:16:26.128 ] 00:16:26.128 }' 00:16:26.129 20:30:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:26.129 20:30:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:26.697 20:30:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 3 -gt 2 ']' 00:16:26.697 20:30:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:16:26.956 [2024-07-15 20:30:19.091496] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:16:26.956 [2024-07-15 20:30:19.091551] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:26.956 [2024-07-15 20:30:19.091571] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1072c70 00:16:26.956 [2024-07-15 20:30:19.091584] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:26.956 [2024-07-15 20:30:19.091958] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:26.956 [2024-07-15 20:30:19.091980] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:16:26.956 [2024-07-15 20:30:19.092046] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:16:26.956 [2024-07-15 
20:30:19.092067] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:16:26.956 pt2 00:16:26.956 20:30:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:16:27.215 [2024-07-15 20:30:19.340174] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:16:27.215 20:30:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 3 00:16:27.215 20:30:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:27.215 20:30:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:27.215 20:30:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:27.215 20:30:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:27.215 20:30:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:27.215 20:30:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:27.215 20:30:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:27.215 20:30:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:27.215 20:30:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:27.215 20:30:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:27.215 20:30:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:27.473 20:30:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:27.473 "name": "raid_bdev1", 00:16:27.473 
"uuid": "3fdffc1d-d837-415a-879d-6e47b01bb8e2", 00:16:27.473 "strip_size_kb": 64, 00:16:27.473 "state": "configuring", 00:16:27.473 "raid_level": "concat", 00:16:27.473 "superblock": true, 00:16:27.473 "num_base_bdevs": 3, 00:16:27.473 "num_base_bdevs_discovered": 1, 00:16:27.473 "num_base_bdevs_operational": 3, 00:16:27.473 "base_bdevs_list": [ 00:16:27.473 { 00:16:27.473 "name": "pt1", 00:16:27.473 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:27.473 "is_configured": true, 00:16:27.473 "data_offset": 2048, 00:16:27.473 "data_size": 63488 00:16:27.473 }, 00:16:27.473 { 00:16:27.473 "name": null, 00:16:27.473 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:27.473 "is_configured": false, 00:16:27.473 "data_offset": 2048, 00:16:27.473 "data_size": 63488 00:16:27.473 }, 00:16:27.473 { 00:16:27.473 "name": null, 00:16:27.473 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:27.473 "is_configured": false, 00:16:27.473 "data_offset": 2048, 00:16:27.473 "data_size": 63488 00:16:27.473 } 00:16:27.473 ] 00:16:27.473 }' 00:16:27.473 20:30:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:27.473 20:30:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:28.039 20:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:16:28.039 20:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:16:28.039 20:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:16:28.297 [2024-07-15 20:30:20.475189] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:16:28.297 [2024-07-15 20:30:20.475243] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:28.297 [2024-07-15 20:30:20.475267] vbdev_passthru.c: 
680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x107ba10 00:16:28.297 [2024-07-15 20:30:20.475279] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:28.297 [2024-07-15 20:30:20.475634] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:28.297 [2024-07-15 20:30:20.475653] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:16:28.297 [2024-07-15 20:30:20.475718] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:16:28.297 [2024-07-15 20:30:20.475738] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:16:28.297 pt2 00:16:28.297 20:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:16:28.297 20:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:16:28.297 20:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:16:28.555 [2024-07-15 20:30:20.723835] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:16:28.555 [2024-07-15 20:30:20.723865] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:28.555 [2024-07-15 20:30:20.723881] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1215740 00:16:28.555 [2024-07-15 20:30:20.723894] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:28.555 [2024-07-15 20:30:20.724184] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:28.555 [2024-07-15 20:30:20.724202] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:16:28.555 [2024-07-15 20:30:20.724251] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:16:28.555 
[2024-07-15 20:30:20.724268] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:16:28.555 [2024-07-15 20:30:20.724374] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1215c00 00:16:28.555 [2024-07-15 20:30:20.724384] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:16:28.555 [2024-07-15 20:30:20.724548] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x107aa40 00:16:28.555 [2024-07-15 20:30:20.724668] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1215c00 00:16:28.555 [2024-07-15 20:30:20.724678] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1215c00 00:16:28.555 [2024-07-15 20:30:20.724769] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:28.555 pt3 00:16:28.555 20:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:16:28.555 20:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:16:28.555 20:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:16:28.555 20:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:28.555 20:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:28.555 20:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:28.555 20:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:28.555 20:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:28.555 20:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:28.555 20:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:28.555 
20:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:28.555 20:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:28.555 20:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:28.555 20:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:28.813 20:30:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:28.813 "name": "raid_bdev1", 00:16:28.813 "uuid": "3fdffc1d-d837-415a-879d-6e47b01bb8e2", 00:16:28.813 "strip_size_kb": 64, 00:16:28.813 "state": "online", 00:16:28.813 "raid_level": "concat", 00:16:28.813 "superblock": true, 00:16:28.813 "num_base_bdevs": 3, 00:16:28.813 "num_base_bdevs_discovered": 3, 00:16:28.813 "num_base_bdevs_operational": 3, 00:16:28.813 "base_bdevs_list": [ 00:16:28.813 { 00:16:28.813 "name": "pt1", 00:16:28.813 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:28.813 "is_configured": true, 00:16:28.813 "data_offset": 2048, 00:16:28.813 "data_size": 63488 00:16:28.813 }, 00:16:28.813 { 00:16:28.813 "name": "pt2", 00:16:28.813 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:28.814 "is_configured": true, 00:16:28.814 "data_offset": 2048, 00:16:28.814 "data_size": 63488 00:16:28.814 }, 00:16:28.814 { 00:16:28.814 "name": "pt3", 00:16:28.814 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:28.814 "is_configured": true, 00:16:28.814 "data_offset": 2048, 00:16:28.814 "data_size": 63488 00:16:28.814 } 00:16:28.814 ] 00:16:28.814 }' 00:16:28.814 20:30:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:28.814 20:30:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:29.379 20:30:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # 
verify_raid_bdev_properties raid_bdev1 00:16:29.379 20:30:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:16:29.379 20:30:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:29.379 20:30:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:29.379 20:30:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:29.379 20:30:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:16:29.379 20:30:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:16:29.379 20:30:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:29.379 [2024-07-15 20:30:21.750887] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:29.681 20:30:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:29.681 "name": "raid_bdev1", 00:16:29.681 "aliases": [ 00:16:29.681 "3fdffc1d-d837-415a-879d-6e47b01bb8e2" 00:16:29.681 ], 00:16:29.681 "product_name": "Raid Volume", 00:16:29.681 "block_size": 512, 00:16:29.681 "num_blocks": 190464, 00:16:29.681 "uuid": "3fdffc1d-d837-415a-879d-6e47b01bb8e2", 00:16:29.681 "assigned_rate_limits": { 00:16:29.681 "rw_ios_per_sec": 0, 00:16:29.682 "rw_mbytes_per_sec": 0, 00:16:29.682 "r_mbytes_per_sec": 0, 00:16:29.682 "w_mbytes_per_sec": 0 00:16:29.682 }, 00:16:29.682 "claimed": false, 00:16:29.682 "zoned": false, 00:16:29.682 "supported_io_types": { 00:16:29.682 "read": true, 00:16:29.682 "write": true, 00:16:29.682 "unmap": true, 00:16:29.682 "flush": true, 00:16:29.682 "reset": true, 00:16:29.682 "nvme_admin": false, 00:16:29.682 "nvme_io": false, 00:16:29.682 "nvme_io_md": false, 00:16:29.682 "write_zeroes": true, 00:16:29.682 "zcopy": false, 00:16:29.682 
"get_zone_info": false, 00:16:29.682 "zone_management": false, 00:16:29.682 "zone_append": false, 00:16:29.682 "compare": false, 00:16:29.682 "compare_and_write": false, 00:16:29.682 "abort": false, 00:16:29.682 "seek_hole": false, 00:16:29.682 "seek_data": false, 00:16:29.682 "copy": false, 00:16:29.682 "nvme_iov_md": false 00:16:29.682 }, 00:16:29.682 "memory_domains": [ 00:16:29.682 { 00:16:29.682 "dma_device_id": "system", 00:16:29.682 "dma_device_type": 1 00:16:29.682 }, 00:16:29.682 { 00:16:29.682 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:29.682 "dma_device_type": 2 00:16:29.682 }, 00:16:29.682 { 00:16:29.682 "dma_device_id": "system", 00:16:29.682 "dma_device_type": 1 00:16:29.682 }, 00:16:29.682 { 00:16:29.682 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:29.682 "dma_device_type": 2 00:16:29.682 }, 00:16:29.682 { 00:16:29.682 "dma_device_id": "system", 00:16:29.682 "dma_device_type": 1 00:16:29.682 }, 00:16:29.682 { 00:16:29.682 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:29.682 "dma_device_type": 2 00:16:29.682 } 00:16:29.682 ], 00:16:29.682 "driver_specific": { 00:16:29.682 "raid": { 00:16:29.682 "uuid": "3fdffc1d-d837-415a-879d-6e47b01bb8e2", 00:16:29.682 "strip_size_kb": 64, 00:16:29.682 "state": "online", 00:16:29.682 "raid_level": "concat", 00:16:29.682 "superblock": true, 00:16:29.682 "num_base_bdevs": 3, 00:16:29.682 "num_base_bdevs_discovered": 3, 00:16:29.682 "num_base_bdevs_operational": 3, 00:16:29.682 "base_bdevs_list": [ 00:16:29.682 { 00:16:29.682 "name": "pt1", 00:16:29.682 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:29.682 "is_configured": true, 00:16:29.682 "data_offset": 2048, 00:16:29.682 "data_size": 63488 00:16:29.682 }, 00:16:29.682 { 00:16:29.682 "name": "pt2", 00:16:29.682 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:29.682 "is_configured": true, 00:16:29.682 "data_offset": 2048, 00:16:29.682 "data_size": 63488 00:16:29.682 }, 00:16:29.682 { 00:16:29.682 "name": "pt3", 00:16:29.682 "uuid": 
"00000000-0000-0000-0000-000000000003", 00:16:29.682 "is_configured": true, 00:16:29.682 "data_offset": 2048, 00:16:29.682 "data_size": 63488 00:16:29.682 } 00:16:29.682 ] 00:16:29.682 } 00:16:29.682 } 00:16:29.682 }' 00:16:29.682 20:30:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:29.682 20:30:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:16:29.682 pt2 00:16:29.682 pt3' 00:16:29.682 20:30:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:29.682 20:30:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:16:29.682 20:30:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:29.941 20:30:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:29.941 "name": "pt1", 00:16:29.941 "aliases": [ 00:16:29.941 "00000000-0000-0000-0000-000000000001" 00:16:29.941 ], 00:16:29.941 "product_name": "passthru", 00:16:29.941 "block_size": 512, 00:16:29.941 "num_blocks": 65536, 00:16:29.941 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:29.941 "assigned_rate_limits": { 00:16:29.941 "rw_ios_per_sec": 0, 00:16:29.941 "rw_mbytes_per_sec": 0, 00:16:29.941 "r_mbytes_per_sec": 0, 00:16:29.941 "w_mbytes_per_sec": 0 00:16:29.941 }, 00:16:29.941 "claimed": true, 00:16:29.941 "claim_type": "exclusive_write", 00:16:29.941 "zoned": false, 00:16:29.941 "supported_io_types": { 00:16:29.941 "read": true, 00:16:29.941 "write": true, 00:16:29.941 "unmap": true, 00:16:29.941 "flush": true, 00:16:29.941 "reset": true, 00:16:29.941 "nvme_admin": false, 00:16:29.941 "nvme_io": false, 00:16:29.941 "nvme_io_md": false, 00:16:29.941 "write_zeroes": true, 00:16:29.941 "zcopy": true, 00:16:29.941 "get_zone_info": false, 
00:16:29.942 "zone_management": false, 00:16:29.942 "zone_append": false, 00:16:29.942 "compare": false, 00:16:29.942 "compare_and_write": false, 00:16:29.942 "abort": true, 00:16:29.942 "seek_hole": false, 00:16:29.942 "seek_data": false, 00:16:29.942 "copy": true, 00:16:29.942 "nvme_iov_md": false 00:16:29.942 }, 00:16:29.942 "memory_domains": [ 00:16:29.942 { 00:16:29.942 "dma_device_id": "system", 00:16:29.942 "dma_device_type": 1 00:16:29.942 }, 00:16:29.942 { 00:16:29.942 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:29.942 "dma_device_type": 2 00:16:29.942 } 00:16:29.942 ], 00:16:29.942 "driver_specific": { 00:16:29.942 "passthru": { 00:16:29.942 "name": "pt1", 00:16:29.942 "base_bdev_name": "malloc1" 00:16:29.942 } 00:16:29.942 } 00:16:29.942 }' 00:16:29.942 20:30:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:29.942 20:30:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:29.942 20:30:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:29.942 20:30:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:29.942 20:30:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:29.942 20:30:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:29.942 20:30:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:29.942 20:30:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:30.200 20:30:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:30.200 20:30:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:30.200 20:30:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:30.200 20:30:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:30.200 20:30:22 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:30.200 20:30:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:16:30.200 20:30:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:30.459 20:30:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:30.459 "name": "pt2", 00:16:30.459 "aliases": [ 00:16:30.459 "00000000-0000-0000-0000-000000000002" 00:16:30.459 ], 00:16:30.459 "product_name": "passthru", 00:16:30.459 "block_size": 512, 00:16:30.459 "num_blocks": 65536, 00:16:30.459 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:30.459 "assigned_rate_limits": { 00:16:30.459 "rw_ios_per_sec": 0, 00:16:30.459 "rw_mbytes_per_sec": 0, 00:16:30.459 "r_mbytes_per_sec": 0, 00:16:30.459 "w_mbytes_per_sec": 0 00:16:30.459 }, 00:16:30.459 "claimed": true, 00:16:30.459 "claim_type": "exclusive_write", 00:16:30.459 "zoned": false, 00:16:30.459 "supported_io_types": { 00:16:30.459 "read": true, 00:16:30.459 "write": true, 00:16:30.459 "unmap": true, 00:16:30.459 "flush": true, 00:16:30.459 "reset": true, 00:16:30.459 "nvme_admin": false, 00:16:30.459 "nvme_io": false, 00:16:30.459 "nvme_io_md": false, 00:16:30.459 "write_zeroes": true, 00:16:30.459 "zcopy": true, 00:16:30.459 "get_zone_info": false, 00:16:30.459 "zone_management": false, 00:16:30.459 "zone_append": false, 00:16:30.459 "compare": false, 00:16:30.459 "compare_and_write": false, 00:16:30.459 "abort": true, 00:16:30.459 "seek_hole": false, 00:16:30.459 "seek_data": false, 00:16:30.459 "copy": true, 00:16:30.459 "nvme_iov_md": false 00:16:30.459 }, 00:16:30.459 "memory_domains": [ 00:16:30.459 { 00:16:30.459 "dma_device_id": "system", 00:16:30.459 "dma_device_type": 1 00:16:30.459 }, 00:16:30.459 { 00:16:30.459 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:30.459 
"dma_device_type": 2 00:16:30.459 } 00:16:30.459 ], 00:16:30.459 "driver_specific": { 00:16:30.459 "passthru": { 00:16:30.459 "name": "pt2", 00:16:30.459 "base_bdev_name": "malloc2" 00:16:30.459 } 00:16:30.459 } 00:16:30.459 }' 00:16:30.459 20:30:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:30.459 20:30:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:30.459 20:30:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:30.459 20:30:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:30.459 20:30:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:30.459 20:30:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:30.718 20:30:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:30.718 20:30:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:30.718 20:30:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:30.718 20:30:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:30.718 20:30:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:30.718 20:30:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:30.718 20:30:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:30.718 20:30:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:16:30.718 20:30:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:30.976 20:30:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:30.976 "name": "pt3", 00:16:30.976 "aliases": [ 00:16:30.976 
"00000000-0000-0000-0000-000000000003" 00:16:30.976 ], 00:16:30.976 "product_name": "passthru", 00:16:30.976 "block_size": 512, 00:16:30.976 "num_blocks": 65536, 00:16:30.976 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:30.976 "assigned_rate_limits": { 00:16:30.976 "rw_ios_per_sec": 0, 00:16:30.976 "rw_mbytes_per_sec": 0, 00:16:30.976 "r_mbytes_per_sec": 0, 00:16:30.976 "w_mbytes_per_sec": 0 00:16:30.976 }, 00:16:30.976 "claimed": true, 00:16:30.976 "claim_type": "exclusive_write", 00:16:30.976 "zoned": false, 00:16:30.976 "supported_io_types": { 00:16:30.976 "read": true, 00:16:30.976 "write": true, 00:16:30.976 "unmap": true, 00:16:30.976 "flush": true, 00:16:30.976 "reset": true, 00:16:30.976 "nvme_admin": false, 00:16:30.976 "nvme_io": false, 00:16:30.976 "nvme_io_md": false, 00:16:30.976 "write_zeroes": true, 00:16:30.976 "zcopy": true, 00:16:30.976 "get_zone_info": false, 00:16:30.976 "zone_management": false, 00:16:30.976 "zone_append": false, 00:16:30.976 "compare": false, 00:16:30.976 "compare_and_write": false, 00:16:30.976 "abort": true, 00:16:30.976 "seek_hole": false, 00:16:30.976 "seek_data": false, 00:16:30.976 "copy": true, 00:16:30.976 "nvme_iov_md": false 00:16:30.976 }, 00:16:30.976 "memory_domains": [ 00:16:30.976 { 00:16:30.976 "dma_device_id": "system", 00:16:30.976 "dma_device_type": 1 00:16:30.976 }, 00:16:30.976 { 00:16:30.976 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:30.976 "dma_device_type": 2 00:16:30.976 } 00:16:30.976 ], 00:16:30.976 "driver_specific": { 00:16:30.976 "passthru": { 00:16:30.976 "name": "pt3", 00:16:30.976 "base_bdev_name": "malloc3" 00:16:30.976 } 00:16:30.976 } 00:16:30.976 }' 00:16:30.976 20:30:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:30.976 20:30:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:31.234 20:30:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:31.234 20:30:23 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:31.234 20:30:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:31.234 20:30:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:31.234 20:30:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:31.234 20:30:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:31.234 20:30:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:31.234 20:30:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:31.234 20:30:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:31.493 20:30:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:31.493 20:30:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:16:31.493 20:30:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:16:31.493 [2024-07-15 20:30:23.860486] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:31.752 20:30:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 3fdffc1d-d837-415a-879d-6e47b01bb8e2 '!=' 3fdffc1d-d837-415a-879d-6e47b01bb8e2 ']' 00:16:31.752 20:30:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy concat 00:16:31.752 20:30:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:16:31.752 20:30:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:16:31.752 20:30:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 1393708 00:16:31.752 20:30:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 1393708 ']' 00:16:31.752 20:30:23 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 1393708 00:16:31.752 20:30:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:16:31.752 20:30:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:16:31.752 20:30:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1393708 00:16:31.752 20:30:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:16:31.752 20:30:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:16:31.752 20:30:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1393708' 00:16:31.752 killing process with pid 1393708 00:16:31.752 20:30:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 1393708 00:16:31.752 [2024-07-15 20:30:23.930920] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:31.752 [2024-07-15 20:30:23.930983] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:31.752 [2024-07-15 20:30:23.931043] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:31.752 [2024-07-15 20:30:23.931056] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1215c00 name raid_bdev1, state offline 00:16:31.752 20:30:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 1393708 00:16:31.752 [2024-07-15 20:30:23.962258] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:32.012 20:30:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:16:32.012 00:16:32.012 real 0m14.927s 00:16:32.012 user 0m26.874s 00:16:32.012 sys 0m2.682s 00:16:32.012 20:30:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:16:32.012 20:30:24 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:32.012 ************************************ 00:16:32.012 END TEST raid_superblock_test 00:16:32.012 ************************************ 00:16:32.012 20:30:24 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:16:32.012 20:30:24 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test concat 3 read 00:16:32.012 20:30:24 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:16:32.012 20:30:24 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:16:32.012 20:30:24 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:32.012 ************************************ 00:16:32.012 START TEST raid_read_error_test 00:16:32.012 ************************************ 00:16:32.012 20:30:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 3 read 00:16:32.012 20:30:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:16:32.012 20:30:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:16:32.012 20:30:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:16:32.012 20:30:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:16:32.012 20:30:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:32.012 20:30:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:16:32.012 20:30:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:32.012 20:30:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:32.012 20:30:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:16:32.012 20:30:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:32.012 20:30:24 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:32.012 20:30:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:16:32.012 20:30:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:32.012 20:30:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:32.012 20:30:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:16:32.012 20:30:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:16:32.012 20:30:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:16:32.012 20:30:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:16:32.012 20:30:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:16:32.012 20:30:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:16:32.012 20:30:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:16:32.012 20:30:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:16:32.012 20:30:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:16:32.012 20:30:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:16:32.012 20:30:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:16:32.012 20:30:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.NNyu7gSWVr 00:16:32.012 20:30:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1395935 00:16:32.012 20:30:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1395935 /var/tmp/spdk-raid.sock 00:16:32.012 20:30:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:16:32.012 20:30:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 1395935 ']' 00:16:32.012 20:30:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:32.012 20:30:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:16:32.012 20:30:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:32.012 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:32.012 20:30:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:16:32.012 20:30:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:32.012 [2024-07-15 20:30:24.361883] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
00:16:32.012 [2024-07-15 20:30:24.361960] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1395935 ] 00:16:32.271 [2024-07-15 20:30:24.492546] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:32.271 [2024-07-15 20:30:24.601200] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:16:32.531 [2024-07-15 20:30:24.672364] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:32.531 [2024-07-15 20:30:24.672399] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:32.531 20:30:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:16:32.531 20:30:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:16:32.531 20:30:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:32.531 20:30:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:16:32.790 BaseBdev1_malloc 00:16:32.790 20:30:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:16:33.049 true 00:16:33.049 20:30:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:16:33.308 [2024-07-15 20:30:25.543588] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:16:33.308 [2024-07-15 20:30:25.543636] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev 
opened 00:16:33.308 [2024-07-15 20:30:25.543660] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f2e0d0 00:16:33.308 [2024-07-15 20:30:25.543672] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:33.308 [2024-07-15 20:30:25.545583] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:33.308 [2024-07-15 20:30:25.545613] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:16:33.308 BaseBdev1 00:16:33.308 20:30:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:33.308 20:30:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:16:33.568 BaseBdev2_malloc 00:16:33.568 20:30:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:16:33.827 true 00:16:33.827 20:30:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:16:34.086 [2024-07-15 20:30:26.287407] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:16:34.086 [2024-07-15 20:30:26.287449] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:34.086 [2024-07-15 20:30:26.287471] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f32910 00:16:34.086 [2024-07-15 20:30:26.287484] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:34.086 [2024-07-15 20:30:26.289043] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:34.086 [2024-07-15 20:30:26.289071] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:16:34.086 BaseBdev2 00:16:34.086 20:30:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:34.086 20:30:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:16:34.346 BaseBdev3_malloc 00:16:34.346 20:30:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:16:34.605 true 00:16:34.864 20:30:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:16:34.864 [2024-07-15 20:30:27.218433] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:16:34.864 [2024-07-15 20:30:27.218477] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:34.864 [2024-07-15 20:30:27.218500] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f34bd0 00:16:34.864 [2024-07-15 20:30:27.218513] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:34.864 [2024-07-15 20:30:27.220122] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:34.864 [2024-07-15 20:30:27.220150] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:16:34.864 BaseBdev3 00:16:34.864 20:30:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:16:35.124 [2024-07-15 20:30:27.463116] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:35.124 [2024-07-15 20:30:27.464498] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:35.124 [2024-07-15 20:30:27.464569] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:35.124 [2024-07-15 20:30:27.464784] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1f36280 00:16:35.124 [2024-07-15 20:30:27.464797] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:16:35.124 [2024-07-15 20:30:27.465009] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f35e20 00:16:35.124 [2024-07-15 20:30:27.465158] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1f36280 00:16:35.124 [2024-07-15 20:30:27.465168] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1f36280 00:16:35.124 [2024-07-15 20:30:27.465276] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:35.124 20:30:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:16:35.124 20:30:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:35.124 20:30:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:35.124 20:30:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:35.124 20:30:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:35.124 20:30:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:35.124 20:30:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:35.124 20:30:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
00:16:35.124 20:30:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:35.124 20:30:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:35.124 20:30:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:35.124 20:30:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:35.384 20:30:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:35.384 "name": "raid_bdev1", 00:16:35.384 "uuid": "bf7be98a-6f85-40a6-b9a1-7df4a9d6030b", 00:16:35.384 "strip_size_kb": 64, 00:16:35.384 "state": "online", 00:16:35.384 "raid_level": "concat", 00:16:35.384 "superblock": true, 00:16:35.384 "num_base_bdevs": 3, 00:16:35.384 "num_base_bdevs_discovered": 3, 00:16:35.384 "num_base_bdevs_operational": 3, 00:16:35.384 "base_bdevs_list": [ 00:16:35.384 { 00:16:35.384 "name": "BaseBdev1", 00:16:35.384 "uuid": "9d8c933d-d720-5c15-8244-3fddec117ae4", 00:16:35.384 "is_configured": true, 00:16:35.384 "data_offset": 2048, 00:16:35.384 "data_size": 63488 00:16:35.384 }, 00:16:35.384 { 00:16:35.384 "name": "BaseBdev2", 00:16:35.384 "uuid": "bc88e664-e8b8-5953-8bc0-645fdcba5879", 00:16:35.384 "is_configured": true, 00:16:35.384 "data_offset": 2048, 00:16:35.384 "data_size": 63488 00:16:35.384 }, 00:16:35.384 { 00:16:35.384 "name": "BaseBdev3", 00:16:35.384 "uuid": "62cfbe01-8d2e-5316-86ea-33747cd619bf", 00:16:35.384 "is_configured": true, 00:16:35.384 "data_offset": 2048, 00:16:35.384 "data_size": 63488 00:16:35.384 } 00:16:35.384 ] 00:16:35.384 }' 00:16:35.384 20:30:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:35.384 20:30:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:35.952 20:30:28 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@824 -- # sleep 1 00:16:35.952 20:30:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:16:36.211 [2024-07-15 20:30:28.405885] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d844d0 00:16:37.150 20:30:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:16:37.409 20:30:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:16:37.409 20:30:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:16:37.409 20:30:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:16:37.409 20:30:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:16:37.409 20:30:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:37.409 20:30:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:37.409 20:30:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:37.409 20:30:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:37.409 20:30:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:37.409 20:30:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:37.409 20:30:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:37.409 20:30:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:37.409 20:30:29 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@124 -- # local tmp 00:16:37.409 20:30:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:37.409 20:30:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:37.668 20:30:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:37.668 "name": "raid_bdev1", 00:16:37.668 "uuid": "bf7be98a-6f85-40a6-b9a1-7df4a9d6030b", 00:16:37.668 "strip_size_kb": 64, 00:16:37.668 "state": "online", 00:16:37.668 "raid_level": "concat", 00:16:37.668 "superblock": true, 00:16:37.668 "num_base_bdevs": 3, 00:16:37.668 "num_base_bdevs_discovered": 3, 00:16:37.668 "num_base_bdevs_operational": 3, 00:16:37.668 "base_bdevs_list": [ 00:16:37.668 { 00:16:37.668 "name": "BaseBdev1", 00:16:37.668 "uuid": "9d8c933d-d720-5c15-8244-3fddec117ae4", 00:16:37.668 "is_configured": true, 00:16:37.668 "data_offset": 2048, 00:16:37.668 "data_size": 63488 00:16:37.668 }, 00:16:37.668 { 00:16:37.668 "name": "BaseBdev2", 00:16:37.668 "uuid": "bc88e664-e8b8-5953-8bc0-645fdcba5879", 00:16:37.668 "is_configured": true, 00:16:37.668 "data_offset": 2048, 00:16:37.668 "data_size": 63488 00:16:37.668 }, 00:16:37.668 { 00:16:37.668 "name": "BaseBdev3", 00:16:37.668 "uuid": "62cfbe01-8d2e-5316-86ea-33747cd619bf", 00:16:37.668 "is_configured": true, 00:16:37.668 "data_offset": 2048, 00:16:37.668 "data_size": 63488 00:16:37.668 } 00:16:37.668 ] 00:16:37.668 }' 00:16:37.668 20:30:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:37.668 20:30:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:38.236 20:30:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:16:38.236 [2024-07-15 
20:30:30.558289] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:16:38.236 [2024-07-15 20:30:30.558332] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:38.236 [2024-07-15 20:30:30.561514] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:38.236 [2024-07-15 20:30:30.561552] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:38.236 [2024-07-15 20:30:30.561593] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:38.236 [2024-07-15 20:30:30.561605] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f36280 name raid_bdev1, state offline 00:16:38.236 0 00:16:38.236 20:30:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1395935 00:16:38.236 20:30:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 1395935 ']' 00:16:38.236 20:30:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 1395935 00:16:38.236 20:30:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:16:38.236 20:30:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:16:38.236 20:30:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1395935 00:16:38.496 20:30:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:16:38.496 20:30:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:16:38.496 20:30:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1395935' 00:16:38.496 killing process with pid 1395935 00:16:38.496 20:30:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 1395935 00:16:38.496 [2024-07-15 20:30:30.632016] 
bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:38.496 20:30:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 1395935 00:16:38.496 [2024-07-15 20:30:30.653382] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:38.756 20:30:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.NNyu7gSWVr 00:16:38.756 20:30:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:16:38.756 20:30:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:16:38.756 20:30:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.47 00:16:38.756 20:30:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:16:38.756 20:30:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:16:38.756 20:30:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:16:38.756 20:30:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.47 != \0\.\0\0 ]] 00:16:38.756 00:16:38.756 real 0m6.612s 00:16:38.756 user 0m10.788s 00:16:38.756 sys 0m1.257s 00:16:38.756 20:30:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:16:38.756 20:30:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:38.756 ************************************ 00:16:38.756 END TEST raid_read_error_test 00:16:38.756 ************************************ 00:16:38.756 20:30:30 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:16:38.756 20:30:30 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test concat 3 write 00:16:38.756 20:30:30 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:16:38.756 20:30:30 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:16:38.756 20:30:30 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:38.756 
************************************ 00:16:38.756 START TEST raid_write_error_test 00:16:38.756 ************************************ 00:16:38.756 20:30:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 3 write 00:16:38.756 20:30:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:16:38.756 20:30:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:16:38.756 20:30:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:16:38.756 20:30:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:16:38.756 20:30:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:38.756 20:30:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:16:38.756 20:30:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:38.756 20:30:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:38.756 20:30:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:16:38.756 20:30:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:38.756 20:30:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:38.756 20:30:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:16:38.756 20:30:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:38.756 20:30:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:38.756 20:30:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:16:38.756 20:30:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:16:38.756 20:30:30 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:16:38.756 20:30:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:16:38.756 20:30:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:16:38.756 20:30:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:16:38.756 20:30:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:16:38.756 20:30:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:16:38.756 20:30:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:16:38.756 20:30:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:16:38.756 20:30:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:16:38.756 20:30:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.jfEbD0jNfj 00:16:38.756 20:30:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1396898 00:16:38.756 20:30:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1396898 /var/tmp/spdk-raid.sock 00:16:38.756 20:30:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:16:38.756 20:30:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 1396898 ']' 00:16:38.756 20:30:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:38.756 20:30:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:16:38.756 20:30:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain 
socket /var/tmp/spdk-raid.sock...' 00:16:38.756 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:38.756 20:30:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:16:38.756 20:30:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:38.756 [2024-07-15 20:30:31.052275] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 00:16:38.756 [2024-07-15 20:30:31.052340] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1396898 ] 00:16:39.016 [2024-07-15 20:30:31.182803] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:39.016 [2024-07-15 20:30:31.290120] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:16:39.016 [2024-07-15 20:30:31.353343] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:39.016 [2024-07-15 20:30:31.353373] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:39.584 20:30:31 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:16:39.584 20:30:31 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:16:39.584 20:30:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:39.584 20:30:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:16:39.843 BaseBdev1_malloc 00:16:39.843 20:30:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:16:40.101 
true 00:16:40.102 20:30:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:16:40.361 [2024-07-15 20:30:32.578856] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:16:40.361 [2024-07-15 20:30:32.578903] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:40.361 [2024-07-15 20:30:32.578935] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1de90d0 00:16:40.361 [2024-07-15 20:30:32.578948] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:40.361 [2024-07-15 20:30:32.580812] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:40.361 [2024-07-15 20:30:32.580841] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:16:40.361 BaseBdev1 00:16:40.361 20:30:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:40.361 20:30:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:16:40.621 BaseBdev2_malloc 00:16:40.621 20:30:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:16:40.880 true 00:16:40.880 20:30:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:16:41.139 [2024-07-15 20:30:33.322639] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:16:41.139 [2024-07-15 20:30:33.322681] 
vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:41.139 [2024-07-15 20:30:33.322702] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ded910 00:16:41.139 [2024-07-15 20:30:33.322715] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:41.139 [2024-07-15 20:30:33.324271] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:41.139 [2024-07-15 20:30:33.324298] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:16:41.139 BaseBdev2 00:16:41.139 20:30:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:41.139 20:30:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:16:41.398 BaseBdev3_malloc 00:16:41.398 20:30:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:16:41.657 true 00:16:41.657 20:30:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:16:41.916 [2024-07-15 20:30:34.057306] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:16:41.916 [2024-07-15 20:30:34.057352] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:41.917 [2024-07-15 20:30:34.057375] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1defbd0 00:16:41.917 [2024-07-15 20:30:34.057387] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:41.917 [2024-07-15 20:30:34.059009] vbdev_passthru.c: 708:vbdev_passthru_register: 
*NOTICE*: pt_bdev registered 00:16:41.917 [2024-07-15 20:30:34.059038] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:16:41.917 BaseBdev3 00:16:41.917 20:30:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:16:42.176 [2024-07-15 20:30:34.301988] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:42.176 [2024-07-15 20:30:34.303370] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:42.176 [2024-07-15 20:30:34.303439] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:42.176 [2024-07-15 20:30:34.303650] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1df1280 00:16:42.176 [2024-07-15 20:30:34.303661] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:16:42.176 [2024-07-15 20:30:34.303864] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1df0e20 00:16:42.176 [2024-07-15 20:30:34.304023] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1df1280 00:16:42.176 [2024-07-15 20:30:34.304034] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1df1280 00:16:42.176 [2024-07-15 20:30:34.304141] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:42.176 20:30:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:16:42.176 20:30:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:42.176 20:30:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:42.176 20:30:34 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:42.176 20:30:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:42.176 20:30:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:42.176 20:30:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:42.176 20:30:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:42.176 20:30:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:42.176 20:30:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:42.176 20:30:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:42.176 20:30:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:42.435 20:30:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:42.435 "name": "raid_bdev1", 00:16:42.435 "uuid": "311426c3-ded4-40eb-88de-11ddf0901c14", 00:16:42.435 "strip_size_kb": 64, 00:16:42.435 "state": "online", 00:16:42.435 "raid_level": "concat", 00:16:42.435 "superblock": true, 00:16:42.435 "num_base_bdevs": 3, 00:16:42.435 "num_base_bdevs_discovered": 3, 00:16:42.435 "num_base_bdevs_operational": 3, 00:16:42.435 "base_bdevs_list": [ 00:16:42.435 { 00:16:42.435 "name": "BaseBdev1", 00:16:42.435 "uuid": "e7587d85-2824-5148-90dc-227fe32cb0ea", 00:16:42.435 "is_configured": true, 00:16:42.435 "data_offset": 2048, 00:16:42.435 "data_size": 63488 00:16:42.435 }, 00:16:42.435 { 00:16:42.435 "name": "BaseBdev2", 00:16:42.435 "uuid": "30462763-4c70-533f-b0f8-44904a8b8814", 00:16:42.435 "is_configured": true, 00:16:42.435 "data_offset": 2048, 00:16:42.435 "data_size": 63488 00:16:42.435 }, 00:16:42.435 { 00:16:42.435 
"name": "BaseBdev3", 00:16:42.435 "uuid": "0243a6e7-c9f0-5c17-a358-5ded412f8d52", 00:16:42.435 "is_configured": true, 00:16:42.435 "data_offset": 2048, 00:16:42.435 "data_size": 63488 00:16:42.435 } 00:16:42.435 ] 00:16:42.435 }' 00:16:42.435 20:30:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:42.435 20:30:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:43.003 20:30:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:16:43.003 20:30:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:16:43.003 [2024-07-15 20:30:35.260800] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c3f4d0 00:16:43.940 20:30:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:16:44.199 20:30:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:16:44.199 20:30:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:16:44.199 20:30:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:16:44.199 20:30:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:16:44.199 20:30:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:44.199 20:30:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:44.199 20:30:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:44.199 20:30:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 
00:16:44.199 20:30:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:44.199 20:30:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:44.199 20:30:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:44.199 20:30:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:44.199 20:30:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:44.199 20:30:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:44.199 20:30:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:44.459 20:30:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:44.459 "name": "raid_bdev1", 00:16:44.459 "uuid": "311426c3-ded4-40eb-88de-11ddf0901c14", 00:16:44.459 "strip_size_kb": 64, 00:16:44.459 "state": "online", 00:16:44.459 "raid_level": "concat", 00:16:44.459 "superblock": true, 00:16:44.459 "num_base_bdevs": 3, 00:16:44.459 "num_base_bdevs_discovered": 3, 00:16:44.459 "num_base_bdevs_operational": 3, 00:16:44.459 "base_bdevs_list": [ 00:16:44.459 { 00:16:44.459 "name": "BaseBdev1", 00:16:44.459 "uuid": "e7587d85-2824-5148-90dc-227fe32cb0ea", 00:16:44.459 "is_configured": true, 00:16:44.459 "data_offset": 2048, 00:16:44.459 "data_size": 63488 00:16:44.459 }, 00:16:44.459 { 00:16:44.459 "name": "BaseBdev2", 00:16:44.459 "uuid": "30462763-4c70-533f-b0f8-44904a8b8814", 00:16:44.459 "is_configured": true, 00:16:44.459 "data_offset": 2048, 00:16:44.459 "data_size": 63488 00:16:44.459 }, 00:16:44.459 { 00:16:44.459 "name": "BaseBdev3", 00:16:44.459 "uuid": "0243a6e7-c9f0-5c17-a358-5ded412f8d52", 00:16:44.459 "is_configured": true, 00:16:44.459 "data_offset": 2048, 
00:16:44.459 "data_size": 63488 00:16:44.459 } 00:16:44.459 ] 00:16:44.459 }' 00:16:44.459 20:30:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:44.459 20:30:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:45.028 20:30:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:16:45.289 [2024-07-15 20:30:37.507511] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:16:45.289 [2024-07-15 20:30:37.507553] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:45.289 [2024-07-15 20:30:37.510727] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:45.289 [2024-07-15 20:30:37.510762] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:45.289 [2024-07-15 20:30:37.510798] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:45.289 [2024-07-15 20:30:37.510809] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1df1280 name raid_bdev1, state offline 00:16:45.289 0 00:16:45.289 20:30:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1396898 00:16:45.289 20:30:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 1396898 ']' 00:16:45.289 20:30:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 1396898 00:16:45.289 20:30:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:16:45.289 20:30:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:16:45.289 20:30:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1396898 00:16:45.289 20:30:37 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@954 -- # process_name=reactor_0 00:16:45.289 20:30:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:16:45.289 20:30:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1396898' 00:16:45.289 killing process with pid 1396898 00:16:45.289 20:30:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 1396898 00:16:45.289 [2024-07-15 20:30:37.588561] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:45.289 20:30:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 1396898 00:16:45.289 [2024-07-15 20:30:37.609755] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:45.548 20:30:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.jfEbD0jNfj 00:16:45.548 20:30:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:16:45.548 20:30:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:16:45.548 20:30:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.45 00:16:45.548 20:30:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:16:45.548 20:30:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:16:45.548 20:30:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:16:45.548 20:30:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.45 != \0\.\0\0 ]] 00:16:45.548 00:16:45.548 real 0m6.872s 00:16:45.548 user 0m10.850s 00:16:45.548 sys 0m1.226s 00:16:45.548 20:30:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:16:45.548 20:30:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:45.548 ************************************ 00:16:45.548 END TEST raid_write_error_test 
00:16:45.548 ************************************ 00:16:45.548 20:30:37 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:16:45.548 20:30:37 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:16:45.548 20:30:37 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid1 3 false 00:16:45.548 20:30:37 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:16:45.548 20:30:37 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:16:45.548 20:30:37 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:45.808 ************************************ 00:16:45.808 START TEST raid_state_function_test 00:16:45.808 ************************************ 00:16:45.808 20:30:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 3 false 00:16:45.808 20:30:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:16:45.808 20:30:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:16:45.808 20:30:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:16:45.808 20:30:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:16:45.808 20:30:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:16:45.808 20:30:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:45.808 20:30:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:16:45.808 20:30:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:45.808 20:30:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:45.808 20:30:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:16:45.808 20:30:37 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:45.808 20:30:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:45.808 20:30:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:16:45.808 20:30:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:45.808 20:30:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:45.808 20:30:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:16:45.808 20:30:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:16:45.808 20:30:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:16:45.808 20:30:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:16:45.808 20:30:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:16:45.808 20:30:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:16:45.808 20:30:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:16:45.808 20:30:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:16:45.808 20:30:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:16:45.808 20:30:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:16:45.808 20:30:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=1397878 00:16:45.808 20:30:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1397878' 00:16:45.808 Process raid pid: 1397878 00:16:45.808 20:30:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:16:45.808 20:30:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 1397878 /var/tmp/spdk-raid.sock 00:16:45.808 20:30:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 1397878 ']' 00:16:45.808 20:30:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:45.808 20:30:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:16:45.808 20:30:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:45.808 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:45.808 20:30:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:16:45.808 20:30:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:45.808 [2024-07-15 20:30:38.012858] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
00:16:45.808 [2024-07-15 20:30:38.012938] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:16:45.808 [2024-07-15 20:30:38.136762] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:46.067 [2024-07-15 20:30:38.235954] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:16:46.067 [2024-07-15 20:30:38.292400] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:46.067 [2024-07-15 20:30:38.292430] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:46.670 20:30:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:16:46.670 20:30:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:16:46.670 20:30:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:46.928 [2024-07-15 20:30:39.136425] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:46.929 [2024-07-15 20:30:39.136468] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:46.929 [2024-07-15 20:30:39.136479] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:46.929 [2024-07-15 20:30:39.136491] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:46.929 [2024-07-15 20:30:39.136500] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:46.929 [2024-07-15 20:30:39.136511] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:46.929 20:30:39 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:46.929 20:30:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:46.929 20:30:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:46.929 20:30:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:46.929 20:30:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:46.929 20:30:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:46.929 20:30:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:46.929 20:30:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:46.929 20:30:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:46.929 20:30:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:46.929 20:30:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:46.929 20:30:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:47.188 20:30:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:47.188 "name": "Existed_Raid", 00:16:47.188 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:47.188 "strip_size_kb": 0, 00:16:47.188 "state": "configuring", 00:16:47.188 "raid_level": "raid1", 00:16:47.188 "superblock": false, 00:16:47.188 "num_base_bdevs": 3, 00:16:47.188 "num_base_bdevs_discovered": 0, 00:16:47.188 "num_base_bdevs_operational": 3, 00:16:47.188 "base_bdevs_list": [ 00:16:47.188 { 00:16:47.188 
"name": "BaseBdev1", 00:16:47.188 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:47.188 "is_configured": false, 00:16:47.188 "data_offset": 0, 00:16:47.188 "data_size": 0 00:16:47.188 }, 00:16:47.188 { 00:16:47.188 "name": "BaseBdev2", 00:16:47.188 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:47.188 "is_configured": false, 00:16:47.188 "data_offset": 0, 00:16:47.188 "data_size": 0 00:16:47.188 }, 00:16:47.188 { 00:16:47.188 "name": "BaseBdev3", 00:16:47.188 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:47.188 "is_configured": false, 00:16:47.188 "data_offset": 0, 00:16:47.188 "data_size": 0 00:16:47.188 } 00:16:47.188 ] 00:16:47.188 }' 00:16:47.188 20:30:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:47.188 20:30:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:47.755 20:30:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:48.013 [2024-07-15 20:30:40.219167] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:48.013 [2024-07-15 20:30:40.219197] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf00a80 name Existed_Raid, state configuring 00:16:48.013 20:30:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:48.271 [2024-07-15 20:30:40.467828] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:48.271 [2024-07-15 20:30:40.467856] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:48.271 [2024-07-15 20:30:40.467866] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 
00:16:48.271 [2024-07-15 20:30:40.467877] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:48.271 [2024-07-15 20:30:40.467886] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:48.271 [2024-07-15 20:30:40.467897] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:48.271 20:30:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:48.530 [2024-07-15 20:30:40.726390] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:48.530 BaseBdev1 00:16:48.530 20:30:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:16:48.530 20:30:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:16:48.530 20:30:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:48.530 20:30:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:48.530 20:30:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:48.531 20:30:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:48.531 20:30:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:48.789 20:30:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:49.047 [ 00:16:49.047 { 00:16:49.047 "name": "BaseBdev1", 00:16:49.047 "aliases": [ 00:16:49.047 "99ccc839-27cc-4c62-b6ad-609d592ef4cb" 
00:16:49.047 ], 00:16:49.047 "product_name": "Malloc disk", 00:16:49.047 "block_size": 512, 00:16:49.047 "num_blocks": 65536, 00:16:49.047 "uuid": "99ccc839-27cc-4c62-b6ad-609d592ef4cb", 00:16:49.047 "assigned_rate_limits": { 00:16:49.047 "rw_ios_per_sec": 0, 00:16:49.047 "rw_mbytes_per_sec": 0, 00:16:49.047 "r_mbytes_per_sec": 0, 00:16:49.047 "w_mbytes_per_sec": 0 00:16:49.047 }, 00:16:49.047 "claimed": true, 00:16:49.047 "claim_type": "exclusive_write", 00:16:49.047 "zoned": false, 00:16:49.047 "supported_io_types": { 00:16:49.047 "read": true, 00:16:49.047 "write": true, 00:16:49.047 "unmap": true, 00:16:49.047 "flush": true, 00:16:49.047 "reset": true, 00:16:49.047 "nvme_admin": false, 00:16:49.047 "nvme_io": false, 00:16:49.047 "nvme_io_md": false, 00:16:49.047 "write_zeroes": true, 00:16:49.047 "zcopy": true, 00:16:49.047 "get_zone_info": false, 00:16:49.047 "zone_management": false, 00:16:49.047 "zone_append": false, 00:16:49.047 "compare": false, 00:16:49.047 "compare_and_write": false, 00:16:49.047 "abort": true, 00:16:49.047 "seek_hole": false, 00:16:49.047 "seek_data": false, 00:16:49.047 "copy": true, 00:16:49.047 "nvme_iov_md": false 00:16:49.047 }, 00:16:49.047 "memory_domains": [ 00:16:49.047 { 00:16:49.047 "dma_device_id": "system", 00:16:49.047 "dma_device_type": 1 00:16:49.047 }, 00:16:49.047 { 00:16:49.047 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:49.047 "dma_device_type": 2 00:16:49.047 } 00:16:49.047 ], 00:16:49.047 "driver_specific": {} 00:16:49.047 } 00:16:49.047 ] 00:16:49.047 20:30:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:49.047 20:30:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:49.047 20:30:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:49.047 20:30:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local 
expected_state=configuring 00:16:49.047 20:30:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:49.047 20:30:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:49.047 20:30:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:49.047 20:30:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:49.047 20:30:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:49.047 20:30:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:49.047 20:30:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:49.047 20:30:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:49.047 20:30:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:49.305 20:30:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:49.305 "name": "Existed_Raid", 00:16:49.305 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:49.305 "strip_size_kb": 0, 00:16:49.305 "state": "configuring", 00:16:49.305 "raid_level": "raid1", 00:16:49.305 "superblock": false, 00:16:49.305 "num_base_bdevs": 3, 00:16:49.305 "num_base_bdevs_discovered": 1, 00:16:49.305 "num_base_bdevs_operational": 3, 00:16:49.305 "base_bdevs_list": [ 00:16:49.305 { 00:16:49.305 "name": "BaseBdev1", 00:16:49.305 "uuid": "99ccc839-27cc-4c62-b6ad-609d592ef4cb", 00:16:49.305 "is_configured": true, 00:16:49.305 "data_offset": 0, 00:16:49.305 "data_size": 65536 00:16:49.305 }, 00:16:49.305 { 00:16:49.305 "name": "BaseBdev2", 00:16:49.305 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:49.305 "is_configured": 
false, 00:16:49.305 "data_offset": 0, 00:16:49.305 "data_size": 0 00:16:49.305 }, 00:16:49.305 { 00:16:49.305 "name": "BaseBdev3", 00:16:49.305 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:49.305 "is_configured": false, 00:16:49.305 "data_offset": 0, 00:16:49.305 "data_size": 0 00:16:49.305 } 00:16:49.305 ] 00:16:49.305 }' 00:16:49.305 20:30:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:49.305 20:30:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:49.871 20:30:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:50.129 [2024-07-15 20:30:42.330668] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:50.129 [2024-07-15 20:30:42.330707] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf00310 name Existed_Raid, state configuring 00:16:50.129 20:30:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:50.386 [2024-07-15 20:30:42.575341] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:50.386 [2024-07-15 20:30:42.576823] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:50.386 [2024-07-15 20:30:42.576856] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:50.386 [2024-07-15 20:30:42.576867] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:50.386 [2024-07-15 20:30:42.576879] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:50.386 20:30:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 
-- # (( i = 1 )) 00:16:50.386 20:30:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:50.386 20:30:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:50.386 20:30:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:50.386 20:30:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:50.386 20:30:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:50.386 20:30:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:50.386 20:30:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:50.386 20:30:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:50.386 20:30:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:50.386 20:30:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:50.386 20:30:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:50.386 20:30:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:50.386 20:30:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:50.644 20:30:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:50.644 "name": "Existed_Raid", 00:16:50.644 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:50.644 "strip_size_kb": 0, 00:16:50.644 "state": "configuring", 00:16:50.644 "raid_level": "raid1", 00:16:50.644 "superblock": false, 00:16:50.644 "num_base_bdevs": 3, 
00:16:50.644 "num_base_bdevs_discovered": 1, 00:16:50.644 "num_base_bdevs_operational": 3, 00:16:50.644 "base_bdevs_list": [ 00:16:50.644 { 00:16:50.644 "name": "BaseBdev1", 00:16:50.644 "uuid": "99ccc839-27cc-4c62-b6ad-609d592ef4cb", 00:16:50.644 "is_configured": true, 00:16:50.644 "data_offset": 0, 00:16:50.644 "data_size": 65536 00:16:50.644 }, 00:16:50.644 { 00:16:50.644 "name": "BaseBdev2", 00:16:50.644 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:50.644 "is_configured": false, 00:16:50.644 "data_offset": 0, 00:16:50.644 "data_size": 0 00:16:50.644 }, 00:16:50.644 { 00:16:50.644 "name": "BaseBdev3", 00:16:50.644 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:50.644 "is_configured": false, 00:16:50.644 "data_offset": 0, 00:16:50.644 "data_size": 0 00:16:50.644 } 00:16:50.644 ] 00:16:50.644 }' 00:16:50.644 20:30:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:50.644 20:30:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:51.211 20:30:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:51.468 [2024-07-15 20:30:43.605538] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:51.468 BaseBdev2 00:16:51.468 20:30:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:16:51.468 20:30:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:16:51.468 20:30:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:51.468 20:30:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:51.468 20:30:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:51.468 20:30:43 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:51.468 20:30:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:51.727 20:30:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:51.727 [ 00:16:51.727 { 00:16:51.727 "name": "BaseBdev2", 00:16:51.727 "aliases": [ 00:16:51.727 "ebe24e09-57e9-4c64-b317-1c14938d7ed1" 00:16:51.727 ], 00:16:51.727 "product_name": "Malloc disk", 00:16:51.727 "block_size": 512, 00:16:51.727 "num_blocks": 65536, 00:16:51.727 "uuid": "ebe24e09-57e9-4c64-b317-1c14938d7ed1", 00:16:51.727 "assigned_rate_limits": { 00:16:51.727 "rw_ios_per_sec": 0, 00:16:51.727 "rw_mbytes_per_sec": 0, 00:16:51.727 "r_mbytes_per_sec": 0, 00:16:51.727 "w_mbytes_per_sec": 0 00:16:51.727 }, 00:16:51.727 "claimed": true, 00:16:51.727 "claim_type": "exclusive_write", 00:16:51.727 "zoned": false, 00:16:51.727 "supported_io_types": { 00:16:51.727 "read": true, 00:16:51.727 "write": true, 00:16:51.727 "unmap": true, 00:16:51.727 "flush": true, 00:16:51.727 "reset": true, 00:16:51.727 "nvme_admin": false, 00:16:51.727 "nvme_io": false, 00:16:51.727 "nvme_io_md": false, 00:16:51.727 "write_zeroes": true, 00:16:51.727 "zcopy": true, 00:16:51.727 "get_zone_info": false, 00:16:51.727 "zone_management": false, 00:16:51.727 "zone_append": false, 00:16:51.727 "compare": false, 00:16:51.727 "compare_and_write": false, 00:16:51.727 "abort": true, 00:16:51.727 "seek_hole": false, 00:16:51.727 "seek_data": false, 00:16:51.727 "copy": true, 00:16:51.727 "nvme_iov_md": false 00:16:51.727 }, 00:16:51.727 "memory_domains": [ 00:16:51.727 { 00:16:51.727 "dma_device_id": "system", 00:16:51.727 "dma_device_type": 1 00:16:51.727 }, 00:16:51.727 { 
00:16:51.727 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:51.727 "dma_device_type": 2 00:16:51.727 } 00:16:51.727 ], 00:16:51.727 "driver_specific": {} 00:16:51.727 } 00:16:51.727 ] 00:16:51.727 20:30:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:51.727 20:30:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:51.727 20:30:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:51.727 20:30:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:51.727 20:30:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:51.727 20:30:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:51.727 20:30:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:51.727 20:30:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:51.727 20:30:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:51.727 20:30:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:51.727 20:30:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:51.727 20:30:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:51.727 20:30:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:51.727 20:30:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:51.727 20:30:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 
00:16:51.986 20:30:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:51.986 "name": "Existed_Raid", 00:16:51.986 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:51.986 "strip_size_kb": 0, 00:16:51.986 "state": "configuring", 00:16:51.986 "raid_level": "raid1", 00:16:51.986 "superblock": false, 00:16:51.986 "num_base_bdevs": 3, 00:16:51.986 "num_base_bdevs_discovered": 2, 00:16:51.986 "num_base_bdevs_operational": 3, 00:16:51.986 "base_bdevs_list": [ 00:16:51.986 { 00:16:51.986 "name": "BaseBdev1", 00:16:51.986 "uuid": "99ccc839-27cc-4c62-b6ad-609d592ef4cb", 00:16:51.986 "is_configured": true, 00:16:51.986 "data_offset": 0, 00:16:51.986 "data_size": 65536 00:16:51.986 }, 00:16:51.986 { 00:16:51.986 "name": "BaseBdev2", 00:16:51.986 "uuid": "ebe24e09-57e9-4c64-b317-1c14938d7ed1", 00:16:51.986 "is_configured": true, 00:16:51.986 "data_offset": 0, 00:16:51.986 "data_size": 65536 00:16:51.986 }, 00:16:51.986 { 00:16:51.986 "name": "BaseBdev3", 00:16:51.986 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:51.986 "is_configured": false, 00:16:51.986 "data_offset": 0, 00:16:51.986 "data_size": 0 00:16:51.986 } 00:16:51.986 ] 00:16:51.986 }' 00:16:51.986 20:30:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:51.986 20:30:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:52.554 20:30:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:16:52.813 [2024-07-15 20:30:45.084897] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:52.813 [2024-07-15 20:30:45.084943] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xf01400 00:16:52.813 [2024-07-15 20:30:45.084952] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 
00:16:52.813 [2024-07-15 20:30:45.085200] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf00ef0 00:16:52.813 [2024-07-15 20:30:45.085324] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xf01400 00:16:52.813 [2024-07-15 20:30:45.085334] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xf01400 00:16:52.813 [2024-07-15 20:30:45.085492] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:52.813 BaseBdev3 00:16:52.813 20:30:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:16:52.813 20:30:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:16:52.813 20:30:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:52.813 20:30:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:52.813 20:30:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:52.813 20:30:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:52.813 20:30:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:53.072 20:30:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:16:53.072 [ 00:16:53.072 { 00:16:53.072 "name": "BaseBdev3", 00:16:53.072 "aliases": [ 00:16:53.072 "14d5b548-9fc6-4d70-ad7e-5523c09b27c1" 00:16:53.072 ], 00:16:53.072 "product_name": "Malloc disk", 00:16:53.072 "block_size": 512, 00:16:53.072 "num_blocks": 65536, 00:16:53.072 "uuid": "14d5b548-9fc6-4d70-ad7e-5523c09b27c1", 00:16:53.072 "assigned_rate_limits": { 
00:16:53.072 "rw_ios_per_sec": 0, 00:16:53.072 "rw_mbytes_per_sec": 0, 00:16:53.072 "r_mbytes_per_sec": 0, 00:16:53.072 "w_mbytes_per_sec": 0 00:16:53.072 }, 00:16:53.072 "claimed": true, 00:16:53.072 "claim_type": "exclusive_write", 00:16:53.072 "zoned": false, 00:16:53.072 "supported_io_types": { 00:16:53.072 "read": true, 00:16:53.072 "write": true, 00:16:53.072 "unmap": true, 00:16:53.072 "flush": true, 00:16:53.072 "reset": true, 00:16:53.072 "nvme_admin": false, 00:16:53.072 "nvme_io": false, 00:16:53.072 "nvme_io_md": false, 00:16:53.072 "write_zeroes": true, 00:16:53.072 "zcopy": true, 00:16:53.072 "get_zone_info": false, 00:16:53.072 "zone_management": false, 00:16:53.072 "zone_append": false, 00:16:53.072 "compare": false, 00:16:53.072 "compare_and_write": false, 00:16:53.072 "abort": true, 00:16:53.072 "seek_hole": false, 00:16:53.072 "seek_data": false, 00:16:53.072 "copy": true, 00:16:53.072 "nvme_iov_md": false 00:16:53.072 }, 00:16:53.072 "memory_domains": [ 00:16:53.072 { 00:16:53.072 "dma_device_id": "system", 00:16:53.072 "dma_device_type": 1 00:16:53.072 }, 00:16:53.072 { 00:16:53.072 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:53.072 "dma_device_type": 2 00:16:53.072 } 00:16:53.072 ], 00:16:53.072 "driver_specific": {} 00:16:53.072 } 00:16:53.072 ] 00:16:53.331 20:30:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:53.331 20:30:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:53.331 20:30:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:53.331 20:30:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:16:53.331 20:30:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:53.331 20:30:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:53.331 
20:30:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:53.331 20:30:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:53.331 20:30:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:53.331 20:30:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:53.331 20:30:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:53.331 20:30:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:53.331 20:30:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:53.331 20:30:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:53.331 20:30:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:53.591 20:30:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:53.591 "name": "Existed_Raid", 00:16:53.591 "uuid": "1ad73c19-85f9-44c3-b1f4-c69b672beac4", 00:16:53.591 "strip_size_kb": 0, 00:16:53.591 "state": "online", 00:16:53.591 "raid_level": "raid1", 00:16:53.591 "superblock": false, 00:16:53.591 "num_base_bdevs": 3, 00:16:53.591 "num_base_bdevs_discovered": 3, 00:16:53.591 "num_base_bdevs_operational": 3, 00:16:53.591 "base_bdevs_list": [ 00:16:53.591 { 00:16:53.591 "name": "BaseBdev1", 00:16:53.591 "uuid": "99ccc839-27cc-4c62-b6ad-609d592ef4cb", 00:16:53.591 "is_configured": true, 00:16:53.591 "data_offset": 0, 00:16:53.591 "data_size": 65536 00:16:53.591 }, 00:16:53.591 { 00:16:53.591 "name": "BaseBdev2", 00:16:53.591 "uuid": "ebe24e09-57e9-4c64-b317-1c14938d7ed1", 00:16:53.591 "is_configured": true, 00:16:53.591 "data_offset": 0, 
00:16:53.591 "data_size": 65536 00:16:53.591 }, 00:16:53.591 { 00:16:53.591 "name": "BaseBdev3", 00:16:53.591 "uuid": "14d5b548-9fc6-4d70-ad7e-5523c09b27c1", 00:16:53.591 "is_configured": true, 00:16:53.591 "data_offset": 0, 00:16:53.591 "data_size": 65536 00:16:53.591 } 00:16:53.591 ] 00:16:53.591 }' 00:16:53.591 20:30:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:53.591 20:30:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:54.159 20:30:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:16:54.159 20:30:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:16:54.159 20:30:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:54.159 20:30:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:54.159 20:30:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:54.159 20:30:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:16:54.159 20:30:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:54.159 20:30:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:54.159 [2024-07-15 20:30:46.525024] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:54.418 20:30:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:54.418 "name": "Existed_Raid", 00:16:54.418 "aliases": [ 00:16:54.418 "1ad73c19-85f9-44c3-b1f4-c69b672beac4" 00:16:54.418 ], 00:16:54.418 "product_name": "Raid Volume", 00:16:54.418 "block_size": 512, 00:16:54.418 "num_blocks": 65536, 00:16:54.418 "uuid": 
"1ad73c19-85f9-44c3-b1f4-c69b672beac4", 00:16:54.418 "assigned_rate_limits": { 00:16:54.418 "rw_ios_per_sec": 0, 00:16:54.418 "rw_mbytes_per_sec": 0, 00:16:54.418 "r_mbytes_per_sec": 0, 00:16:54.418 "w_mbytes_per_sec": 0 00:16:54.418 }, 00:16:54.418 "claimed": false, 00:16:54.418 "zoned": false, 00:16:54.418 "supported_io_types": { 00:16:54.418 "read": true, 00:16:54.418 "write": true, 00:16:54.418 "unmap": false, 00:16:54.418 "flush": false, 00:16:54.418 "reset": true, 00:16:54.418 "nvme_admin": false, 00:16:54.418 "nvme_io": false, 00:16:54.418 "nvme_io_md": false, 00:16:54.418 "write_zeroes": true, 00:16:54.418 "zcopy": false, 00:16:54.418 "get_zone_info": false, 00:16:54.418 "zone_management": false, 00:16:54.418 "zone_append": false, 00:16:54.418 "compare": false, 00:16:54.418 "compare_and_write": false, 00:16:54.418 "abort": false, 00:16:54.418 "seek_hole": false, 00:16:54.418 "seek_data": false, 00:16:54.418 "copy": false, 00:16:54.418 "nvme_iov_md": false 00:16:54.418 }, 00:16:54.418 "memory_domains": [ 00:16:54.418 { 00:16:54.418 "dma_device_id": "system", 00:16:54.418 "dma_device_type": 1 00:16:54.418 }, 00:16:54.418 { 00:16:54.418 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:54.418 "dma_device_type": 2 00:16:54.418 }, 00:16:54.418 { 00:16:54.418 "dma_device_id": "system", 00:16:54.418 "dma_device_type": 1 00:16:54.418 }, 00:16:54.418 { 00:16:54.418 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:54.418 "dma_device_type": 2 00:16:54.418 }, 00:16:54.418 { 00:16:54.418 "dma_device_id": "system", 00:16:54.418 "dma_device_type": 1 00:16:54.418 }, 00:16:54.418 { 00:16:54.418 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:54.418 "dma_device_type": 2 00:16:54.418 } 00:16:54.418 ], 00:16:54.418 "driver_specific": { 00:16:54.418 "raid": { 00:16:54.418 "uuid": "1ad73c19-85f9-44c3-b1f4-c69b672beac4", 00:16:54.418 "strip_size_kb": 0, 00:16:54.418 "state": "online", 00:16:54.418 "raid_level": "raid1", 00:16:54.418 "superblock": false, 00:16:54.418 
"num_base_bdevs": 3, 00:16:54.418 "num_base_bdevs_discovered": 3, 00:16:54.418 "num_base_bdevs_operational": 3, 00:16:54.418 "base_bdevs_list": [ 00:16:54.418 { 00:16:54.418 "name": "BaseBdev1", 00:16:54.418 "uuid": "99ccc839-27cc-4c62-b6ad-609d592ef4cb", 00:16:54.418 "is_configured": true, 00:16:54.418 "data_offset": 0, 00:16:54.418 "data_size": 65536 00:16:54.418 }, 00:16:54.418 { 00:16:54.418 "name": "BaseBdev2", 00:16:54.418 "uuid": "ebe24e09-57e9-4c64-b317-1c14938d7ed1", 00:16:54.418 "is_configured": true, 00:16:54.418 "data_offset": 0, 00:16:54.418 "data_size": 65536 00:16:54.418 }, 00:16:54.418 { 00:16:54.418 "name": "BaseBdev3", 00:16:54.418 "uuid": "14d5b548-9fc6-4d70-ad7e-5523c09b27c1", 00:16:54.418 "is_configured": true, 00:16:54.418 "data_offset": 0, 00:16:54.418 "data_size": 65536 00:16:54.418 } 00:16:54.418 ] 00:16:54.418 } 00:16:54.418 } 00:16:54.418 }' 00:16:54.418 20:30:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:54.418 20:30:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:16:54.418 BaseBdev2 00:16:54.418 BaseBdev3' 00:16:54.418 20:30:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:54.418 20:30:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:16:54.418 20:30:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:54.677 20:30:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:54.677 "name": "BaseBdev1", 00:16:54.677 "aliases": [ 00:16:54.677 "99ccc839-27cc-4c62-b6ad-609d592ef4cb" 00:16:54.677 ], 00:16:54.677 "product_name": "Malloc disk", 00:16:54.677 "block_size": 512, 00:16:54.677 "num_blocks": 65536, 00:16:54.677 "uuid": 
"99ccc839-27cc-4c62-b6ad-609d592ef4cb", 00:16:54.677 "assigned_rate_limits": { 00:16:54.677 "rw_ios_per_sec": 0, 00:16:54.677 "rw_mbytes_per_sec": 0, 00:16:54.677 "r_mbytes_per_sec": 0, 00:16:54.677 "w_mbytes_per_sec": 0 00:16:54.677 }, 00:16:54.677 "claimed": true, 00:16:54.677 "claim_type": "exclusive_write", 00:16:54.677 "zoned": false, 00:16:54.677 "supported_io_types": { 00:16:54.677 "read": true, 00:16:54.677 "write": true, 00:16:54.677 "unmap": true, 00:16:54.677 "flush": true, 00:16:54.677 "reset": true, 00:16:54.677 "nvme_admin": false, 00:16:54.677 "nvme_io": false, 00:16:54.677 "nvme_io_md": false, 00:16:54.677 "write_zeroes": true, 00:16:54.677 "zcopy": true, 00:16:54.677 "get_zone_info": false, 00:16:54.677 "zone_management": false, 00:16:54.677 "zone_append": false, 00:16:54.677 "compare": false, 00:16:54.677 "compare_and_write": false, 00:16:54.677 "abort": true, 00:16:54.677 "seek_hole": false, 00:16:54.677 "seek_data": false, 00:16:54.677 "copy": true, 00:16:54.677 "nvme_iov_md": false 00:16:54.677 }, 00:16:54.677 "memory_domains": [ 00:16:54.677 { 00:16:54.677 "dma_device_id": "system", 00:16:54.677 "dma_device_type": 1 00:16:54.677 }, 00:16:54.677 { 00:16:54.677 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:54.677 "dma_device_type": 2 00:16:54.677 } 00:16:54.677 ], 00:16:54.677 "driver_specific": {} 00:16:54.677 }' 00:16:54.677 20:30:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:54.677 20:30:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:54.677 20:30:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:54.677 20:30:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:54.677 20:30:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:54.936 20:30:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:54.936 20:30:47 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:54.936 20:30:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:54.936 20:30:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:54.936 20:30:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:54.936 20:30:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:55.195 20:30:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:55.195 20:30:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:55.195 20:30:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:55.195 20:30:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:55.454 20:30:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:55.454 "name": "BaseBdev2", 00:16:55.454 "aliases": [ 00:16:55.454 "ebe24e09-57e9-4c64-b317-1c14938d7ed1" 00:16:55.454 ], 00:16:55.454 "product_name": "Malloc disk", 00:16:55.454 "block_size": 512, 00:16:55.454 "num_blocks": 65536, 00:16:55.454 "uuid": "ebe24e09-57e9-4c64-b317-1c14938d7ed1", 00:16:55.454 "assigned_rate_limits": { 00:16:55.454 "rw_ios_per_sec": 0, 00:16:55.454 "rw_mbytes_per_sec": 0, 00:16:55.454 "r_mbytes_per_sec": 0, 00:16:55.454 "w_mbytes_per_sec": 0 00:16:55.454 }, 00:16:55.454 "claimed": true, 00:16:55.454 "claim_type": "exclusive_write", 00:16:55.454 "zoned": false, 00:16:55.454 "supported_io_types": { 00:16:55.454 "read": true, 00:16:55.454 "write": true, 00:16:55.454 "unmap": true, 00:16:55.454 "flush": true, 00:16:55.454 "reset": true, 00:16:55.454 "nvme_admin": false, 00:16:55.454 "nvme_io": false, 00:16:55.454 "nvme_io_md": false, 
00:16:55.454 "write_zeroes": true, 00:16:55.454 "zcopy": true, 00:16:55.454 "get_zone_info": false, 00:16:55.454 "zone_management": false, 00:16:55.454 "zone_append": false, 00:16:55.454 "compare": false, 00:16:55.454 "compare_and_write": false, 00:16:55.454 "abort": true, 00:16:55.454 "seek_hole": false, 00:16:55.454 "seek_data": false, 00:16:55.454 "copy": true, 00:16:55.454 "nvme_iov_md": false 00:16:55.454 }, 00:16:55.454 "memory_domains": [ 00:16:55.454 { 00:16:55.454 "dma_device_id": "system", 00:16:55.454 "dma_device_type": 1 00:16:55.454 }, 00:16:55.454 { 00:16:55.454 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:55.454 "dma_device_type": 2 00:16:55.454 } 00:16:55.454 ], 00:16:55.454 "driver_specific": {} 00:16:55.454 }' 00:16:55.454 20:30:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:55.454 20:30:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:55.454 20:30:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:55.454 20:30:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:55.454 20:30:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:55.454 20:30:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:55.454 20:30:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:55.454 20:30:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:55.713 20:30:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:55.713 20:30:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:55.713 20:30:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:55.713 20:30:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:55.713 20:30:47 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:55.713 20:30:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:55.713 20:30:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:56.281 20:30:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:56.281 "name": "BaseBdev3", 00:16:56.281 "aliases": [ 00:16:56.281 "14d5b548-9fc6-4d70-ad7e-5523c09b27c1" 00:16:56.281 ], 00:16:56.281 "product_name": "Malloc disk", 00:16:56.281 "block_size": 512, 00:16:56.281 "num_blocks": 65536, 00:16:56.281 "uuid": "14d5b548-9fc6-4d70-ad7e-5523c09b27c1", 00:16:56.281 "assigned_rate_limits": { 00:16:56.281 "rw_ios_per_sec": 0, 00:16:56.281 "rw_mbytes_per_sec": 0, 00:16:56.281 "r_mbytes_per_sec": 0, 00:16:56.281 "w_mbytes_per_sec": 0 00:16:56.281 }, 00:16:56.281 "claimed": true, 00:16:56.281 "claim_type": "exclusive_write", 00:16:56.281 "zoned": false, 00:16:56.281 "supported_io_types": { 00:16:56.281 "read": true, 00:16:56.281 "write": true, 00:16:56.281 "unmap": true, 00:16:56.281 "flush": true, 00:16:56.281 "reset": true, 00:16:56.281 "nvme_admin": false, 00:16:56.281 "nvme_io": false, 00:16:56.281 "nvme_io_md": false, 00:16:56.281 "write_zeroes": true, 00:16:56.281 "zcopy": true, 00:16:56.281 "get_zone_info": false, 00:16:56.281 "zone_management": false, 00:16:56.281 "zone_append": false, 00:16:56.281 "compare": false, 00:16:56.281 "compare_and_write": false, 00:16:56.281 "abort": true, 00:16:56.281 "seek_hole": false, 00:16:56.281 "seek_data": false, 00:16:56.281 "copy": true, 00:16:56.281 "nvme_iov_md": false 00:16:56.281 }, 00:16:56.281 "memory_domains": [ 00:16:56.281 { 00:16:56.281 "dma_device_id": "system", 00:16:56.281 "dma_device_type": 1 00:16:56.281 }, 00:16:56.281 { 00:16:56.281 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:16:56.281 "dma_device_type": 2 00:16:56.281 } 00:16:56.281 ], 00:16:56.281 "driver_specific": {} 00:16:56.281 }' 00:16:56.281 20:30:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:56.281 20:30:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:56.281 20:30:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:56.281 20:30:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:56.281 20:30:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:56.281 20:30:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:56.281 20:30:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:56.539 20:30:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:56.539 20:30:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:56.539 20:30:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:56.539 20:30:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:56.539 20:30:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:56.539 20:30:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:56.798 [2024-07-15 20:30:49.087695] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:56.798 20:30:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:16:56.798 20:30:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:16:56.798 20:30:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 
in 00:16:56.798 20:30:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:16:56.798 20:30:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:16:56.798 20:30:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:16:56.798 20:30:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:56.798 20:30:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:56.798 20:30:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:56.798 20:30:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:56.798 20:30:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:16:56.798 20:30:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:56.798 20:30:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:56.798 20:30:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:56.798 20:30:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:56.798 20:30:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:56.798 20:30:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:57.364 20:30:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:57.364 "name": "Existed_Raid", 00:16:57.364 "uuid": "1ad73c19-85f9-44c3-b1f4-c69b672beac4", 00:16:57.364 "strip_size_kb": 0, 00:16:57.364 "state": "online", 00:16:57.364 "raid_level": "raid1", 
00:16:57.364 "superblock": false, 00:16:57.364 "num_base_bdevs": 3, 00:16:57.364 "num_base_bdevs_discovered": 2, 00:16:57.364 "num_base_bdevs_operational": 2, 00:16:57.364 "base_bdevs_list": [ 00:16:57.364 { 00:16:57.364 "name": null, 00:16:57.364 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:57.364 "is_configured": false, 00:16:57.364 "data_offset": 0, 00:16:57.364 "data_size": 65536 00:16:57.364 }, 00:16:57.364 { 00:16:57.364 "name": "BaseBdev2", 00:16:57.364 "uuid": "ebe24e09-57e9-4c64-b317-1c14938d7ed1", 00:16:57.364 "is_configured": true, 00:16:57.364 "data_offset": 0, 00:16:57.364 "data_size": 65536 00:16:57.364 }, 00:16:57.364 { 00:16:57.364 "name": "BaseBdev3", 00:16:57.364 "uuid": "14d5b548-9fc6-4d70-ad7e-5523c09b27c1", 00:16:57.364 "is_configured": true, 00:16:57.364 "data_offset": 0, 00:16:57.364 "data_size": 65536 00:16:57.364 } 00:16:57.364 ] 00:16:57.364 }' 00:16:57.364 20:30:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:57.364 20:30:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:57.931 20:30:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:16:57.931 20:30:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:57.931 20:30:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:57.931 20:30:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:58.190 20:30:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:58.190 20:30:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:58.190 20:30:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:16:58.448 [2024-07-15 20:30:50.705018] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:58.448 20:30:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:58.448 20:30:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:58.448 20:30:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:58.448 20:30:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:58.707 20:30:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:58.707 20:30:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:58.707 20:30:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:16:58.965 [2024-07-15 20:30:51.210092] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:58.965 [2024-07-15 20:30:51.210174] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:58.965 [2024-07-15 20:30:51.220985] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:58.965 [2024-07-15 20:30:51.221020] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:58.965 [2024-07-15 20:30:51.221032] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf01400 name Existed_Raid, state offline 00:16:58.965 20:30:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:58.965 20:30:51 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:58.965 20:30:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:58.965 20:30:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:16:59.223 20:30:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:16:59.223 20:30:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:16:59.223 20:30:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:16:59.223 20:30:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:16:59.223 20:30:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:59.223 20:30:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:59.481 BaseBdev2 00:16:59.481 20:30:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:16:59.481 20:30:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:16:59.481 20:30:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:59.481 20:30:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:59.481 20:30:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:59.481 20:30:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:59.481 20:30:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:59.740 20:30:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:59.998 [ 00:16:59.998 { 00:16:59.998 "name": "BaseBdev2", 00:16:59.998 "aliases": [ 00:16:59.998 "1062b97e-ad42-493e-bc49-17db94a0a71a" 00:16:59.998 ], 00:16:59.998 "product_name": "Malloc disk", 00:16:59.998 "block_size": 512, 00:16:59.998 "num_blocks": 65536, 00:16:59.998 "uuid": "1062b97e-ad42-493e-bc49-17db94a0a71a", 00:16:59.998 "assigned_rate_limits": { 00:16:59.998 "rw_ios_per_sec": 0, 00:16:59.998 "rw_mbytes_per_sec": 0, 00:16:59.998 "r_mbytes_per_sec": 0, 00:16:59.998 "w_mbytes_per_sec": 0 00:16:59.998 }, 00:16:59.998 "claimed": false, 00:16:59.998 "zoned": false, 00:16:59.998 "supported_io_types": { 00:16:59.998 "read": true, 00:16:59.998 "write": true, 00:16:59.998 "unmap": true, 00:16:59.998 "flush": true, 00:16:59.998 "reset": true, 00:16:59.998 "nvme_admin": false, 00:16:59.998 "nvme_io": false, 00:16:59.998 "nvme_io_md": false, 00:16:59.998 "write_zeroes": true, 00:16:59.998 "zcopy": true, 00:16:59.998 "get_zone_info": false, 00:16:59.998 "zone_management": false, 00:16:59.998 "zone_append": false, 00:16:59.998 "compare": false, 00:16:59.998 "compare_and_write": false, 00:16:59.998 "abort": true, 00:16:59.998 "seek_hole": false, 00:16:59.998 "seek_data": false, 00:16:59.998 "copy": true, 00:16:59.998 "nvme_iov_md": false 00:16:59.998 }, 00:16:59.998 "memory_domains": [ 00:16:59.998 { 00:16:59.998 "dma_device_id": "system", 00:16:59.998 "dma_device_type": 1 00:16:59.998 }, 00:16:59.998 { 00:16:59.998 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:59.998 "dma_device_type": 2 00:16:59.998 } 00:16:59.998 ], 00:16:59.998 "driver_specific": {} 00:16:59.998 } 00:16:59.998 ] 00:16:59.998 20:30:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:59.998 
20:30:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:59.998 20:30:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:59.998 20:30:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:17:00.256 BaseBdev3 00:17:00.256 20:30:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:17:00.256 20:30:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:17:00.256 20:30:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:00.256 20:30:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:00.256 20:30:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:00.256 20:30:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:00.256 20:30:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:00.515 20:30:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:17:00.774 [ 00:17:00.774 { 00:17:00.774 "name": "BaseBdev3", 00:17:00.774 "aliases": [ 00:17:00.774 "4d67e184-8efb-45e3-a7ca-ce6c0aa0d451" 00:17:00.774 ], 00:17:00.774 "product_name": "Malloc disk", 00:17:00.774 "block_size": 512, 00:17:00.774 "num_blocks": 65536, 00:17:00.774 "uuid": "4d67e184-8efb-45e3-a7ca-ce6c0aa0d451", 00:17:00.774 "assigned_rate_limits": { 00:17:00.774 "rw_ios_per_sec": 0, 00:17:00.774 "rw_mbytes_per_sec": 0, 00:17:00.774 
"r_mbytes_per_sec": 0, 00:17:00.774 "w_mbytes_per_sec": 0 00:17:00.774 }, 00:17:00.774 "claimed": false, 00:17:00.774 "zoned": false, 00:17:00.774 "supported_io_types": { 00:17:00.774 "read": true, 00:17:00.774 "write": true, 00:17:00.774 "unmap": true, 00:17:00.774 "flush": true, 00:17:00.774 "reset": true, 00:17:00.774 "nvme_admin": false, 00:17:00.774 "nvme_io": false, 00:17:00.774 "nvme_io_md": false, 00:17:00.774 "write_zeroes": true, 00:17:00.774 "zcopy": true, 00:17:00.774 "get_zone_info": false, 00:17:00.774 "zone_management": false, 00:17:00.774 "zone_append": false, 00:17:00.774 "compare": false, 00:17:00.774 "compare_and_write": false, 00:17:00.774 "abort": true, 00:17:00.774 "seek_hole": false, 00:17:00.774 "seek_data": false, 00:17:00.774 "copy": true, 00:17:00.774 "nvme_iov_md": false 00:17:00.774 }, 00:17:00.774 "memory_domains": [ 00:17:00.774 { 00:17:00.774 "dma_device_id": "system", 00:17:00.774 "dma_device_type": 1 00:17:00.774 }, 00:17:00.774 { 00:17:00.774 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:00.774 "dma_device_type": 2 00:17:00.774 } 00:17:00.774 ], 00:17:00.774 "driver_specific": {} 00:17:00.774 } 00:17:00.774 ] 00:17:00.774 20:30:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:00.774 20:30:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:17:00.774 20:30:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:00.774 20:30:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:17:01.031 [2024-07-15 20:30:53.179784] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:01.031 [2024-07-15 20:30:53.179825] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 
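The property checks running through this log (`bdev_raid.sh@200`-`@208`) all follow one pattern: dump a bdev with `bdev_get_bdevs` and pull fields out with jq, e.g. the `@201` selector that collects the names of configured base bdevs. A minimal self-contained sketch of that filter, run against a trimmed copy of the raid-bdev JSON captured above (jq assumed to be installed; the socket/RPC step is omitted):

```shell
# Trimmed copy of the Existed_Raid JSON from the log above; only the
# fields the @201 filter touches are kept.
raid_bdev_info='{
  "name": "Existed_Raid",
  "driver_specific": {
    "raid": {
      "base_bdevs_list": [
        { "name": "BaseBdev1", "is_configured": true },
        { "name": "BaseBdev2", "is_configured": true },
        { "name": null, "is_configured": false }
      ]
    }
  }
}'

# Same filter as bdev_raid.sh@201: keep only configured entries and
# emit their names, one per line.
echo "$raid_bdev_info" \
  | jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name'
```

With the sample input this prints `BaseBdev1` and `BaseBdev2`; the unconfigured slot (a removed base bdev, as seen after the `bdev_malloc_delete` calls in this log) is filtered out, which is exactly how the test decides how many base bdevs remain operational.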
00:17:01.031 [2024-07-15 20:30:53.179844] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:01.031 [2024-07-15 20:30:53.181171] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:01.031 20:30:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:01.031 20:30:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:01.031 20:30:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:01.031 20:30:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:01.031 20:30:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:01.031 20:30:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:01.031 20:30:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:01.031 20:30:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:01.031 20:30:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:01.031 20:30:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:01.031 20:30:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:01.031 20:30:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:01.289 20:30:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:01.289 "name": "Existed_Raid", 00:17:01.289 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:01.289 "strip_size_kb": 0, 00:17:01.289 "state": 
"configuring", 00:17:01.289 "raid_level": "raid1", 00:17:01.289 "superblock": false, 00:17:01.289 "num_base_bdevs": 3, 00:17:01.289 "num_base_bdevs_discovered": 2, 00:17:01.289 "num_base_bdevs_operational": 3, 00:17:01.289 "base_bdevs_list": [ 00:17:01.289 { 00:17:01.289 "name": "BaseBdev1", 00:17:01.289 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:01.289 "is_configured": false, 00:17:01.289 "data_offset": 0, 00:17:01.289 "data_size": 0 00:17:01.289 }, 00:17:01.289 { 00:17:01.289 "name": "BaseBdev2", 00:17:01.289 "uuid": "1062b97e-ad42-493e-bc49-17db94a0a71a", 00:17:01.289 "is_configured": true, 00:17:01.289 "data_offset": 0, 00:17:01.289 "data_size": 65536 00:17:01.289 }, 00:17:01.289 { 00:17:01.289 "name": "BaseBdev3", 00:17:01.289 "uuid": "4d67e184-8efb-45e3-a7ca-ce6c0aa0d451", 00:17:01.289 "is_configured": true, 00:17:01.289 "data_offset": 0, 00:17:01.289 "data_size": 65536 00:17:01.289 } 00:17:01.289 ] 00:17:01.289 }' 00:17:01.289 20:30:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:01.289 20:30:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:01.854 20:30:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:17:02.113 [2024-07-15 20:30:54.262694] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:17:02.113 20:30:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:02.113 20:30:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:02.113 20:30:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:02.113 20:30:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:02.113 20:30:54 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:02.113 20:30:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:02.113 20:30:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:02.113 20:30:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:02.113 20:30:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:02.113 20:30:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:02.113 20:30:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:02.113 20:30:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:02.371 20:30:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:02.371 "name": "Existed_Raid", 00:17:02.371 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:02.371 "strip_size_kb": 0, 00:17:02.371 "state": "configuring", 00:17:02.371 "raid_level": "raid1", 00:17:02.371 "superblock": false, 00:17:02.371 "num_base_bdevs": 3, 00:17:02.371 "num_base_bdevs_discovered": 1, 00:17:02.371 "num_base_bdevs_operational": 3, 00:17:02.371 "base_bdevs_list": [ 00:17:02.371 { 00:17:02.371 "name": "BaseBdev1", 00:17:02.371 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:02.371 "is_configured": false, 00:17:02.371 "data_offset": 0, 00:17:02.371 "data_size": 0 00:17:02.371 }, 00:17:02.371 { 00:17:02.371 "name": null, 00:17:02.371 "uuid": "1062b97e-ad42-493e-bc49-17db94a0a71a", 00:17:02.371 "is_configured": false, 00:17:02.371 "data_offset": 0, 00:17:02.371 "data_size": 65536 00:17:02.371 }, 00:17:02.371 { 00:17:02.371 "name": "BaseBdev3", 00:17:02.371 "uuid": 
"4d67e184-8efb-45e3-a7ca-ce6c0aa0d451", 00:17:02.371 "is_configured": true, 00:17:02.371 "data_offset": 0, 00:17:02.371 "data_size": 65536 00:17:02.371 } 00:17:02.371 ] 00:17:02.371 }' 00:17:02.371 20:30:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:02.371 20:30:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:02.984 20:30:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:02.984 20:30:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:17:03.244 20:30:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:17:03.244 20:30:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:17:03.244 [2024-07-15 20:30:55.537467] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:03.244 BaseBdev1 00:17:03.244 20:30:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:17:03.244 20:30:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:17:03.244 20:30:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:03.244 20:30:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:03.244 20:30:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:03.244 20:30:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:03.244 20:30:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:03.503 20:30:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:17:03.761 [ 00:17:03.761 { 00:17:03.761 "name": "BaseBdev1", 00:17:03.761 "aliases": [ 00:17:03.761 "b09c22e1-8472-4df5-8816-cb7a0f1ba86f" 00:17:03.761 ], 00:17:03.761 "product_name": "Malloc disk", 00:17:03.761 "block_size": 512, 00:17:03.761 "num_blocks": 65536, 00:17:03.761 "uuid": "b09c22e1-8472-4df5-8816-cb7a0f1ba86f", 00:17:03.761 "assigned_rate_limits": { 00:17:03.761 "rw_ios_per_sec": 0, 00:17:03.761 "rw_mbytes_per_sec": 0, 00:17:03.761 "r_mbytes_per_sec": 0, 00:17:03.761 "w_mbytes_per_sec": 0 00:17:03.761 }, 00:17:03.761 "claimed": true, 00:17:03.761 "claim_type": "exclusive_write", 00:17:03.761 "zoned": false, 00:17:03.761 "supported_io_types": { 00:17:03.761 "read": true, 00:17:03.761 "write": true, 00:17:03.761 "unmap": true, 00:17:03.761 "flush": true, 00:17:03.761 "reset": true, 00:17:03.761 "nvme_admin": false, 00:17:03.761 "nvme_io": false, 00:17:03.761 "nvme_io_md": false, 00:17:03.761 "write_zeroes": true, 00:17:03.761 "zcopy": true, 00:17:03.761 "get_zone_info": false, 00:17:03.761 "zone_management": false, 00:17:03.761 "zone_append": false, 00:17:03.761 "compare": false, 00:17:03.761 "compare_and_write": false, 00:17:03.761 "abort": true, 00:17:03.761 "seek_hole": false, 00:17:03.761 "seek_data": false, 00:17:03.761 "copy": true, 00:17:03.761 "nvme_iov_md": false 00:17:03.761 }, 00:17:03.761 "memory_domains": [ 00:17:03.761 { 00:17:03.761 "dma_device_id": "system", 00:17:03.761 "dma_device_type": 1 00:17:03.761 }, 00:17:03.761 { 00:17:03.761 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:03.761 "dma_device_type": 2 00:17:03.761 } 00:17:03.761 ], 00:17:03.761 "driver_specific": {} 00:17:03.761 } 00:17:03.761 ] 
00:17:03.761 20:30:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:03.761 20:30:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:03.761 20:30:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:03.761 20:30:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:03.761 20:30:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:03.761 20:30:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:03.761 20:30:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:03.761 20:30:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:03.761 20:30:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:03.761 20:30:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:03.761 20:30:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:03.761 20:30:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:03.761 20:30:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:04.020 20:30:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:04.020 "name": "Existed_Raid", 00:17:04.020 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:04.020 "strip_size_kb": 0, 00:17:04.020 "state": "configuring", 00:17:04.020 "raid_level": "raid1", 00:17:04.020 "superblock": false, 00:17:04.020 "num_base_bdevs": 3, 00:17:04.020 
"num_base_bdevs_discovered": 2, 00:17:04.020 "num_base_bdevs_operational": 3, 00:17:04.020 "base_bdevs_list": [ 00:17:04.020 { 00:17:04.020 "name": "BaseBdev1", 00:17:04.020 "uuid": "b09c22e1-8472-4df5-8816-cb7a0f1ba86f", 00:17:04.020 "is_configured": true, 00:17:04.020 "data_offset": 0, 00:17:04.020 "data_size": 65536 00:17:04.020 }, 00:17:04.020 { 00:17:04.020 "name": null, 00:17:04.020 "uuid": "1062b97e-ad42-493e-bc49-17db94a0a71a", 00:17:04.020 "is_configured": false, 00:17:04.020 "data_offset": 0, 00:17:04.020 "data_size": 65536 00:17:04.020 }, 00:17:04.020 { 00:17:04.020 "name": "BaseBdev3", 00:17:04.020 "uuid": "4d67e184-8efb-45e3-a7ca-ce6c0aa0d451", 00:17:04.020 "is_configured": true, 00:17:04.020 "data_offset": 0, 00:17:04.020 "data_size": 65536 00:17:04.020 } 00:17:04.020 ] 00:17:04.020 }' 00:17:04.020 20:30:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:04.020 20:30:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:04.586 20:30:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:17:04.586 20:30:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:04.845 20:30:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:17:04.845 20:30:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:17:05.104 [2024-07-15 20:30:57.366497] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:17:05.104 20:30:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:05.104 20:30:57 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:05.104 20:30:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:05.104 20:30:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:05.104 20:30:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:05.104 20:30:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:05.104 20:30:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:05.104 20:30:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:05.104 20:30:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:05.104 20:30:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:05.104 20:30:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:05.104 20:30:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:05.363 20:30:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:05.363 "name": "Existed_Raid", 00:17:05.363 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:05.363 "strip_size_kb": 0, 00:17:05.363 "state": "configuring", 00:17:05.363 "raid_level": "raid1", 00:17:05.363 "superblock": false, 00:17:05.363 "num_base_bdevs": 3, 00:17:05.363 "num_base_bdevs_discovered": 1, 00:17:05.363 "num_base_bdevs_operational": 3, 00:17:05.363 "base_bdevs_list": [ 00:17:05.363 { 00:17:05.363 "name": "BaseBdev1", 00:17:05.363 "uuid": "b09c22e1-8472-4df5-8816-cb7a0f1ba86f", 00:17:05.363 "is_configured": true, 00:17:05.363 "data_offset": 0, 00:17:05.363 "data_size": 65536 
00:17:05.363 }, 00:17:05.363 { 00:17:05.363 "name": null, 00:17:05.363 "uuid": "1062b97e-ad42-493e-bc49-17db94a0a71a", 00:17:05.363 "is_configured": false, 00:17:05.363 "data_offset": 0, 00:17:05.363 "data_size": 65536 00:17:05.363 }, 00:17:05.363 { 00:17:05.363 "name": null, 00:17:05.363 "uuid": "4d67e184-8efb-45e3-a7ca-ce6c0aa0d451", 00:17:05.363 "is_configured": false, 00:17:05.363 "data_offset": 0, 00:17:05.363 "data_size": 65536 00:17:05.363 } 00:17:05.363 ] 00:17:05.363 }' 00:17:05.363 20:30:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:05.363 20:30:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:05.956 20:30:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:05.956 20:30:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:17:06.215 20:30:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:17:06.215 20:30:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:17:06.473 [2024-07-15 20:30:58.714075] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:06.473 20:30:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:06.473 20:30:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:06.473 20:30:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:06.473 20:30:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:06.473 20:30:58 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:06.473 20:30:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:06.473 20:30:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:06.473 20:30:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:06.473 20:30:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:06.473 20:30:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:06.473 20:30:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:06.473 20:30:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:06.731 20:30:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:06.731 "name": "Existed_Raid", 00:17:06.731 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:06.731 "strip_size_kb": 0, 00:17:06.731 "state": "configuring", 00:17:06.731 "raid_level": "raid1", 00:17:06.731 "superblock": false, 00:17:06.731 "num_base_bdevs": 3, 00:17:06.731 "num_base_bdevs_discovered": 2, 00:17:06.731 "num_base_bdevs_operational": 3, 00:17:06.731 "base_bdevs_list": [ 00:17:06.731 { 00:17:06.731 "name": "BaseBdev1", 00:17:06.731 "uuid": "b09c22e1-8472-4df5-8816-cb7a0f1ba86f", 00:17:06.731 "is_configured": true, 00:17:06.731 "data_offset": 0, 00:17:06.731 "data_size": 65536 00:17:06.731 }, 00:17:06.732 { 00:17:06.732 "name": null, 00:17:06.732 "uuid": "1062b97e-ad42-493e-bc49-17db94a0a71a", 00:17:06.732 "is_configured": false, 00:17:06.732 "data_offset": 0, 00:17:06.732 "data_size": 65536 00:17:06.732 }, 00:17:06.732 { 00:17:06.732 "name": "BaseBdev3", 00:17:06.732 "uuid": 
"4d67e184-8efb-45e3-a7ca-ce6c0aa0d451", 00:17:06.732 "is_configured": true, 00:17:06.732 "data_offset": 0, 00:17:06.732 "data_size": 65536 00:17:06.732 } 00:17:06.732 ] 00:17:06.732 }' 00:17:06.732 20:30:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:06.732 20:30:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:07.298 20:30:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:17:07.298 20:30:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:07.556 20:30:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:17:07.556 20:30:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:17:07.815 [2024-07-15 20:31:00.057655] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:07.815 20:31:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:07.815 20:31:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:07.815 20:31:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:07.815 20:31:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:07.815 20:31:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:07.815 20:31:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:07.815 20:31:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:07.815 20:31:00 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:07.815 20:31:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:07.815 20:31:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:07.815 20:31:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:07.815 20:31:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:08.073 20:31:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:08.073 "name": "Existed_Raid", 00:17:08.073 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:08.073 "strip_size_kb": 0, 00:17:08.073 "state": "configuring", 00:17:08.073 "raid_level": "raid1", 00:17:08.073 "superblock": false, 00:17:08.073 "num_base_bdevs": 3, 00:17:08.073 "num_base_bdevs_discovered": 1, 00:17:08.073 "num_base_bdevs_operational": 3, 00:17:08.073 "base_bdevs_list": [ 00:17:08.073 { 00:17:08.073 "name": null, 00:17:08.073 "uuid": "b09c22e1-8472-4df5-8816-cb7a0f1ba86f", 00:17:08.073 "is_configured": false, 00:17:08.073 "data_offset": 0, 00:17:08.073 "data_size": 65536 00:17:08.073 }, 00:17:08.073 { 00:17:08.073 "name": null, 00:17:08.073 "uuid": "1062b97e-ad42-493e-bc49-17db94a0a71a", 00:17:08.073 "is_configured": false, 00:17:08.073 "data_offset": 0, 00:17:08.073 "data_size": 65536 00:17:08.073 }, 00:17:08.073 { 00:17:08.073 "name": "BaseBdev3", 00:17:08.073 "uuid": "4d67e184-8efb-45e3-a7ca-ce6c0aa0d451", 00:17:08.073 "is_configured": true, 00:17:08.073 "data_offset": 0, 00:17:08.073 "data_size": 65536 00:17:08.073 } 00:17:08.073 ] 00:17:08.073 }' 00:17:08.073 20:31:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:08.073 20:31:00 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@10 -- # set +x 00:17:08.639 20:31:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:08.639 20:31:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:17:08.897 20:31:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:17:08.897 20:31:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:17:09.155 [2024-07-15 20:31:01.419649] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:09.155 20:31:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:09.155 20:31:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:09.155 20:31:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:09.155 20:31:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:09.155 20:31:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:09.155 20:31:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:09.155 20:31:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:09.155 20:31:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:09.155 20:31:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:09.155 20:31:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 
00:17:09.155 20:31:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:09.155 20:31:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:09.414 20:31:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:09.414 "name": "Existed_Raid", 00:17:09.414 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:09.414 "strip_size_kb": 0, 00:17:09.414 "state": "configuring", 00:17:09.414 "raid_level": "raid1", 00:17:09.414 "superblock": false, 00:17:09.414 "num_base_bdevs": 3, 00:17:09.414 "num_base_bdevs_discovered": 2, 00:17:09.414 "num_base_bdevs_operational": 3, 00:17:09.414 "base_bdevs_list": [ 00:17:09.414 { 00:17:09.414 "name": null, 00:17:09.414 "uuid": "b09c22e1-8472-4df5-8816-cb7a0f1ba86f", 00:17:09.414 "is_configured": false, 00:17:09.414 "data_offset": 0, 00:17:09.414 "data_size": 65536 00:17:09.414 }, 00:17:09.414 { 00:17:09.414 "name": "BaseBdev2", 00:17:09.414 "uuid": "1062b97e-ad42-493e-bc49-17db94a0a71a", 00:17:09.414 "is_configured": true, 00:17:09.414 "data_offset": 0, 00:17:09.414 "data_size": 65536 00:17:09.414 }, 00:17:09.414 { 00:17:09.414 "name": "BaseBdev3", 00:17:09.414 "uuid": "4d67e184-8efb-45e3-a7ca-ce6c0aa0d451", 00:17:09.414 "is_configured": true, 00:17:09.414 "data_offset": 0, 00:17:09.414 "data_size": 65536 00:17:09.414 } 00:17:09.414 ] 00:17:09.414 }' 00:17:09.414 20:31:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:09.414 20:31:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:09.982 20:31:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:09.982 20:31:02 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:17:10.241 20:31:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:17:10.241 20:31:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:10.241 20:31:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:17:10.500 20:31:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u b09c22e1-8472-4df5-8816-cb7a0f1ba86f 00:17:10.759 [2024-07-15 20:31:03.060528] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:17:10.759 [2024-07-15 20:31:03.060568] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xf04e40 00:17:10.759 [2024-07-15 20:31:03.060576] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:17:10.759 [2024-07-15 20:31:03.060765] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf01e60 00:17:10.759 [2024-07-15 20:31:03.060886] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xf04e40 00:17:10.759 [2024-07-15 20:31:03.060896] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xf04e40 00:17:10.759 [2024-07-15 20:31:03.061069] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:10.759 NewBaseBdev 00:17:10.759 20:31:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:17:10.759 20:31:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:17:10.759 20:31:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # 
local bdev_timeout= 00:17:10.759 20:31:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:10.759 20:31:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:10.759 20:31:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:10.759 20:31:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:11.019 20:31:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:17:11.277 [ 00:17:11.277 { 00:17:11.277 "name": "NewBaseBdev", 00:17:11.277 "aliases": [ 00:17:11.277 "b09c22e1-8472-4df5-8816-cb7a0f1ba86f" 00:17:11.277 ], 00:17:11.277 "product_name": "Malloc disk", 00:17:11.277 "block_size": 512, 00:17:11.277 "num_blocks": 65536, 00:17:11.277 "uuid": "b09c22e1-8472-4df5-8816-cb7a0f1ba86f", 00:17:11.277 "assigned_rate_limits": { 00:17:11.277 "rw_ios_per_sec": 0, 00:17:11.277 "rw_mbytes_per_sec": 0, 00:17:11.277 "r_mbytes_per_sec": 0, 00:17:11.277 "w_mbytes_per_sec": 0 00:17:11.277 }, 00:17:11.277 "claimed": true, 00:17:11.277 "claim_type": "exclusive_write", 00:17:11.277 "zoned": false, 00:17:11.277 "supported_io_types": { 00:17:11.277 "read": true, 00:17:11.277 "write": true, 00:17:11.277 "unmap": true, 00:17:11.277 "flush": true, 00:17:11.277 "reset": true, 00:17:11.277 "nvme_admin": false, 00:17:11.277 "nvme_io": false, 00:17:11.277 "nvme_io_md": false, 00:17:11.277 "write_zeroes": true, 00:17:11.277 "zcopy": true, 00:17:11.277 "get_zone_info": false, 00:17:11.277 "zone_management": false, 00:17:11.277 "zone_append": false, 00:17:11.277 "compare": false, 00:17:11.277 "compare_and_write": false, 00:17:11.277 "abort": true, 00:17:11.277 "seek_hole": false, 
00:17:11.277 "seek_data": false, 00:17:11.277 "copy": true, 00:17:11.277 "nvme_iov_md": false 00:17:11.277 }, 00:17:11.277 "memory_domains": [ 00:17:11.278 { 00:17:11.278 "dma_device_id": "system", 00:17:11.278 "dma_device_type": 1 00:17:11.278 }, 00:17:11.278 { 00:17:11.278 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:11.278 "dma_device_type": 2 00:17:11.278 } 00:17:11.278 ], 00:17:11.278 "driver_specific": {} 00:17:11.278 } 00:17:11.278 ] 00:17:11.278 20:31:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:11.278 20:31:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:17:11.278 20:31:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:11.278 20:31:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:11.278 20:31:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:11.278 20:31:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:11.278 20:31:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:11.278 20:31:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:11.278 20:31:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:11.278 20:31:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:11.278 20:31:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:11.278 20:31:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:11.278 20:31:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | 
select(.name == "Existed_Raid")' 00:17:11.536 20:31:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:11.536 "name": "Existed_Raid", 00:17:11.536 "uuid": "c7412e61-0fa1-4e45-b521-02eec8bd6707", 00:17:11.536 "strip_size_kb": 0, 00:17:11.536 "state": "online", 00:17:11.536 "raid_level": "raid1", 00:17:11.536 "superblock": false, 00:17:11.536 "num_base_bdevs": 3, 00:17:11.536 "num_base_bdevs_discovered": 3, 00:17:11.536 "num_base_bdevs_operational": 3, 00:17:11.536 "base_bdevs_list": [ 00:17:11.536 { 00:17:11.536 "name": "NewBaseBdev", 00:17:11.536 "uuid": "b09c22e1-8472-4df5-8816-cb7a0f1ba86f", 00:17:11.536 "is_configured": true, 00:17:11.536 "data_offset": 0, 00:17:11.536 "data_size": 65536 00:17:11.536 }, 00:17:11.536 { 00:17:11.536 "name": "BaseBdev2", 00:17:11.536 "uuid": "1062b97e-ad42-493e-bc49-17db94a0a71a", 00:17:11.536 "is_configured": true, 00:17:11.536 "data_offset": 0, 00:17:11.536 "data_size": 65536 00:17:11.536 }, 00:17:11.536 { 00:17:11.536 "name": "BaseBdev3", 00:17:11.536 "uuid": "4d67e184-8efb-45e3-a7ca-ce6c0aa0d451", 00:17:11.536 "is_configured": true, 00:17:11.536 "data_offset": 0, 00:17:11.536 "data_size": 65536 00:17:11.536 } 00:17:11.536 ] 00:17:11.536 }' 00:17:11.536 20:31:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:11.536 20:31:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:12.105 20:31:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:17:12.105 20:31:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:17:12.105 20:31:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:12.105 20:31:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:12.105 20:31:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local 
base_bdev_names 00:17:12.105 20:31:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:17:12.105 20:31:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:17:12.105 20:31:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:12.363 [2024-07-15 20:31:04.693157] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:12.363 20:31:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:12.363 "name": "Existed_Raid", 00:17:12.363 "aliases": [ 00:17:12.363 "c7412e61-0fa1-4e45-b521-02eec8bd6707" 00:17:12.363 ], 00:17:12.363 "product_name": "Raid Volume", 00:17:12.363 "block_size": 512, 00:17:12.363 "num_blocks": 65536, 00:17:12.363 "uuid": "c7412e61-0fa1-4e45-b521-02eec8bd6707", 00:17:12.363 "assigned_rate_limits": { 00:17:12.363 "rw_ios_per_sec": 0, 00:17:12.363 "rw_mbytes_per_sec": 0, 00:17:12.363 "r_mbytes_per_sec": 0, 00:17:12.363 "w_mbytes_per_sec": 0 00:17:12.363 }, 00:17:12.363 "claimed": false, 00:17:12.363 "zoned": false, 00:17:12.363 "supported_io_types": { 00:17:12.363 "read": true, 00:17:12.363 "write": true, 00:17:12.363 "unmap": false, 00:17:12.363 "flush": false, 00:17:12.363 "reset": true, 00:17:12.363 "nvme_admin": false, 00:17:12.363 "nvme_io": false, 00:17:12.363 "nvme_io_md": false, 00:17:12.363 "write_zeroes": true, 00:17:12.363 "zcopy": false, 00:17:12.363 "get_zone_info": false, 00:17:12.363 "zone_management": false, 00:17:12.363 "zone_append": false, 00:17:12.363 "compare": false, 00:17:12.363 "compare_and_write": false, 00:17:12.363 "abort": false, 00:17:12.363 "seek_hole": false, 00:17:12.363 "seek_data": false, 00:17:12.363 "copy": false, 00:17:12.363 "nvme_iov_md": false 00:17:12.363 }, 00:17:12.363 "memory_domains": [ 00:17:12.363 { 00:17:12.363 "dma_device_id": "system", 
00:17:12.363 "dma_device_type": 1 00:17:12.363 }, 00:17:12.363 { 00:17:12.363 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:12.363 "dma_device_type": 2 00:17:12.363 }, 00:17:12.363 { 00:17:12.363 "dma_device_id": "system", 00:17:12.363 "dma_device_type": 1 00:17:12.363 }, 00:17:12.363 { 00:17:12.363 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:12.363 "dma_device_type": 2 00:17:12.363 }, 00:17:12.363 { 00:17:12.363 "dma_device_id": "system", 00:17:12.363 "dma_device_type": 1 00:17:12.363 }, 00:17:12.363 { 00:17:12.363 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:12.363 "dma_device_type": 2 00:17:12.363 } 00:17:12.363 ], 00:17:12.363 "driver_specific": { 00:17:12.363 "raid": { 00:17:12.363 "uuid": "c7412e61-0fa1-4e45-b521-02eec8bd6707", 00:17:12.363 "strip_size_kb": 0, 00:17:12.363 "state": "online", 00:17:12.363 "raid_level": "raid1", 00:17:12.363 "superblock": false, 00:17:12.363 "num_base_bdevs": 3, 00:17:12.363 "num_base_bdevs_discovered": 3, 00:17:12.363 "num_base_bdevs_operational": 3, 00:17:12.363 "base_bdevs_list": [ 00:17:12.363 { 00:17:12.363 "name": "NewBaseBdev", 00:17:12.363 "uuid": "b09c22e1-8472-4df5-8816-cb7a0f1ba86f", 00:17:12.363 "is_configured": true, 00:17:12.363 "data_offset": 0, 00:17:12.363 "data_size": 65536 00:17:12.363 }, 00:17:12.363 { 00:17:12.363 "name": "BaseBdev2", 00:17:12.363 "uuid": "1062b97e-ad42-493e-bc49-17db94a0a71a", 00:17:12.363 "is_configured": true, 00:17:12.363 "data_offset": 0, 00:17:12.363 "data_size": 65536 00:17:12.363 }, 00:17:12.363 { 00:17:12.363 "name": "BaseBdev3", 00:17:12.363 "uuid": "4d67e184-8efb-45e3-a7ca-ce6c0aa0d451", 00:17:12.363 "is_configured": true, 00:17:12.363 "data_offset": 0, 00:17:12.363 "data_size": 65536 00:17:12.363 } 00:17:12.363 ] 00:17:12.363 } 00:17:12.363 } 00:17:12.363 }' 00:17:12.363 20:31:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:12.622 20:31:04 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:17:12.622 BaseBdev2 00:17:12.622 BaseBdev3' 00:17:12.622 20:31:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:12.623 20:31:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:17:12.623 20:31:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:12.881 20:31:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:12.881 "name": "NewBaseBdev", 00:17:12.881 "aliases": [ 00:17:12.881 "b09c22e1-8472-4df5-8816-cb7a0f1ba86f" 00:17:12.881 ], 00:17:12.881 "product_name": "Malloc disk", 00:17:12.881 "block_size": 512, 00:17:12.881 "num_blocks": 65536, 00:17:12.881 "uuid": "b09c22e1-8472-4df5-8816-cb7a0f1ba86f", 00:17:12.881 "assigned_rate_limits": { 00:17:12.881 "rw_ios_per_sec": 0, 00:17:12.881 "rw_mbytes_per_sec": 0, 00:17:12.881 "r_mbytes_per_sec": 0, 00:17:12.881 "w_mbytes_per_sec": 0 00:17:12.881 }, 00:17:12.881 "claimed": true, 00:17:12.881 "claim_type": "exclusive_write", 00:17:12.881 "zoned": false, 00:17:12.881 "supported_io_types": { 00:17:12.881 "read": true, 00:17:12.881 "write": true, 00:17:12.881 "unmap": true, 00:17:12.881 "flush": true, 00:17:12.881 "reset": true, 00:17:12.881 "nvme_admin": false, 00:17:12.881 "nvme_io": false, 00:17:12.881 "nvme_io_md": false, 00:17:12.881 "write_zeroes": true, 00:17:12.881 "zcopy": true, 00:17:12.881 "get_zone_info": false, 00:17:12.881 "zone_management": false, 00:17:12.881 "zone_append": false, 00:17:12.881 "compare": false, 00:17:12.881 "compare_and_write": false, 00:17:12.881 "abort": true, 00:17:12.881 "seek_hole": false, 00:17:12.881 "seek_data": false, 00:17:12.881 "copy": true, 00:17:12.881 "nvme_iov_md": false 00:17:12.881 }, 00:17:12.881 "memory_domains": [ 
00:17:12.881 { 00:17:12.881 "dma_device_id": "system", 00:17:12.881 "dma_device_type": 1 00:17:12.881 }, 00:17:12.881 { 00:17:12.881 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:12.881 "dma_device_type": 2 00:17:12.881 } 00:17:12.881 ], 00:17:12.881 "driver_specific": {} 00:17:12.881 }' 00:17:12.881 20:31:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:12.881 20:31:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:12.881 20:31:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:12.881 20:31:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:12.881 20:31:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:12.881 20:31:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:12.881 20:31:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:12.881 20:31:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:13.139 20:31:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:13.139 20:31:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:13.139 20:31:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:13.139 20:31:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:13.139 20:31:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:13.139 20:31:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:17:13.139 20:31:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:13.398 20:31:05 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:13.398 "name": "BaseBdev2", 00:17:13.398 "aliases": [ 00:17:13.398 "1062b97e-ad42-493e-bc49-17db94a0a71a" 00:17:13.398 ], 00:17:13.398 "product_name": "Malloc disk", 00:17:13.398 "block_size": 512, 00:17:13.398 "num_blocks": 65536, 00:17:13.398 "uuid": "1062b97e-ad42-493e-bc49-17db94a0a71a", 00:17:13.398 "assigned_rate_limits": { 00:17:13.398 "rw_ios_per_sec": 0, 00:17:13.398 "rw_mbytes_per_sec": 0, 00:17:13.398 "r_mbytes_per_sec": 0, 00:17:13.398 "w_mbytes_per_sec": 0 00:17:13.398 }, 00:17:13.398 "claimed": true, 00:17:13.398 "claim_type": "exclusive_write", 00:17:13.398 "zoned": false, 00:17:13.398 "supported_io_types": { 00:17:13.398 "read": true, 00:17:13.398 "write": true, 00:17:13.398 "unmap": true, 00:17:13.398 "flush": true, 00:17:13.398 "reset": true, 00:17:13.398 "nvme_admin": false, 00:17:13.398 "nvme_io": false, 00:17:13.398 "nvme_io_md": false, 00:17:13.398 "write_zeroes": true, 00:17:13.398 "zcopy": true, 00:17:13.398 "get_zone_info": false, 00:17:13.398 "zone_management": false, 00:17:13.398 "zone_append": false, 00:17:13.398 "compare": false, 00:17:13.398 "compare_and_write": false, 00:17:13.398 "abort": true, 00:17:13.398 "seek_hole": false, 00:17:13.398 "seek_data": false, 00:17:13.398 "copy": true, 00:17:13.398 "nvme_iov_md": false 00:17:13.398 }, 00:17:13.398 "memory_domains": [ 00:17:13.398 { 00:17:13.398 "dma_device_id": "system", 00:17:13.398 "dma_device_type": 1 00:17:13.398 }, 00:17:13.398 { 00:17:13.398 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:13.399 "dma_device_type": 2 00:17:13.399 } 00:17:13.399 ], 00:17:13.399 "driver_specific": {} 00:17:13.399 }' 00:17:13.399 20:31:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:13.399 20:31:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:13.657 20:31:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:13.657 20:31:05 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:13.657 20:31:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:13.657 20:31:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:13.657 20:31:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:13.657 20:31:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:13.657 20:31:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:13.657 20:31:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:13.916 20:31:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:13.916 20:31:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:13.916 20:31:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:13.916 20:31:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:17:13.916 20:31:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:14.175 20:31:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:14.175 "name": "BaseBdev3", 00:17:14.175 "aliases": [ 00:17:14.175 "4d67e184-8efb-45e3-a7ca-ce6c0aa0d451" 00:17:14.175 ], 00:17:14.175 "product_name": "Malloc disk", 00:17:14.175 "block_size": 512, 00:17:14.175 "num_blocks": 65536, 00:17:14.175 "uuid": "4d67e184-8efb-45e3-a7ca-ce6c0aa0d451", 00:17:14.175 "assigned_rate_limits": { 00:17:14.175 "rw_ios_per_sec": 0, 00:17:14.175 "rw_mbytes_per_sec": 0, 00:17:14.175 "r_mbytes_per_sec": 0, 00:17:14.175 "w_mbytes_per_sec": 0 00:17:14.175 }, 00:17:14.175 "claimed": true, 00:17:14.175 "claim_type": "exclusive_write", 
00:17:14.175 "zoned": false, 00:17:14.175 "supported_io_types": { 00:17:14.175 "read": true, 00:17:14.175 "write": true, 00:17:14.175 "unmap": true, 00:17:14.175 "flush": true, 00:17:14.175 "reset": true, 00:17:14.175 "nvme_admin": false, 00:17:14.175 "nvme_io": false, 00:17:14.175 "nvme_io_md": false, 00:17:14.175 "write_zeroes": true, 00:17:14.175 "zcopy": true, 00:17:14.175 "get_zone_info": false, 00:17:14.175 "zone_management": false, 00:17:14.175 "zone_append": false, 00:17:14.175 "compare": false, 00:17:14.175 "compare_and_write": false, 00:17:14.175 "abort": true, 00:17:14.175 "seek_hole": false, 00:17:14.175 "seek_data": false, 00:17:14.175 "copy": true, 00:17:14.175 "nvme_iov_md": false 00:17:14.175 }, 00:17:14.175 "memory_domains": [ 00:17:14.175 { 00:17:14.175 "dma_device_id": "system", 00:17:14.175 "dma_device_type": 1 00:17:14.175 }, 00:17:14.175 { 00:17:14.175 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:14.175 "dma_device_type": 2 00:17:14.175 } 00:17:14.175 ], 00:17:14.175 "driver_specific": {} 00:17:14.175 }' 00:17:14.175 20:31:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:14.175 20:31:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:14.175 20:31:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:14.175 20:31:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:14.175 20:31:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:14.175 20:31:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:14.176 20:31:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:14.176 20:31:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:14.434 20:31:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:14.434 20:31:06 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:14.434 20:31:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:14.434 20:31:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:14.434 20:31:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:14.693 [2024-07-15 20:31:06.930804] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:14.693 [2024-07-15 20:31:06.930831] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:14.693 [2024-07-15 20:31:06.930882] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:14.693 [2024-07-15 20:31:06.931161] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:14.693 [2024-07-15 20:31:06.931174] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf04e40 name Existed_Raid, state offline 00:17:14.693 20:31:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 1397878 00:17:14.693 20:31:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 1397878 ']' 00:17:14.693 20:31:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 1397878 00:17:14.693 20:31:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:17:14.693 20:31:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:17:14.693 20:31:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1397878 00:17:14.693 20:31:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:17:14.693 20:31:06 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:17:14.693 20:31:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1397878' 00:17:14.693 killing process with pid 1397878 00:17:14.693 20:31:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 1397878 00:17:14.693 [2024-07-15 20:31:06.997654] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:14.693 20:31:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 1397878 00:17:14.693 [2024-07-15 20:31:07.023517] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:14.953 20:31:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:17:14.953 00:17:14.953 real 0m29.280s 00:17:14.953 user 0m53.783s 00:17:14.953 sys 0m5.190s 00:17:14.953 20:31:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:17:14.953 20:31:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:14.953 ************************************ 00:17:14.953 END TEST raid_state_function_test 00:17:14.953 ************************************ 00:17:14.953 20:31:07 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:17:14.953 20:31:07 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 3 true 00:17:14.953 20:31:07 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:17:14.953 20:31:07 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:17:14.953 20:31:07 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:14.953 ************************************ 00:17:14.953 START TEST raid_state_function_test_sb 00:17:14.953 ************************************ 00:17:14.953 20:31:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test 
raid1 3 true 00:17:14.953 20:31:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:17:14.953 20:31:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:17:14.953 20:31:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:17:14.953 20:31:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:17:14.953 20:31:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:17:14.953 20:31:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:14.953 20:31:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:17:14.953 20:31:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:14.953 20:31:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:14.953 20:31:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:17:14.953 20:31:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:14.953 20:31:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:14.953 20:31:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:17:14.953 20:31:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:14.953 20:31:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:14.953 20:31:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:17:14.953 20:31:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:17:14.953 20:31:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local 
raid_bdev_name=Existed_Raid 00:17:14.953 20:31:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:17:14.953 20:31:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:17:14.953 20:31:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:17:14.953 20:31:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:17:14.953 20:31:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:17:14.953 20:31:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:17:14.953 20:31:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:17:14.953 20:31:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=1402327 00:17:14.953 20:31:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1402327' 00:17:14.953 Process raid pid: 1402327 00:17:14.953 20:31:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 1402327 /var/tmp/spdk-raid.sock 00:17:14.953 20:31:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 1402327 ']' 00:17:14.953 20:31:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:14.953 20:31:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:17:14.953 20:31:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:17:14.953 20:31:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
00:17:14.953 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:14.953 20:31:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:17:14.953 20:31:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:15.212 [2024-07-15 20:31:07.371281] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 00:17:15.212 [2024-07-15 20:31:07.371347] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:17:15.212 [2024-07-15 20:31:07.498642] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:15.471 [2024-07-15 20:31:07.601329] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:17:15.471 [2024-07-15 20:31:07.665150] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:15.471 [2024-07-15 20:31:07.665186] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:16.409 20:31:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:17:16.409 20:31:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:17:16.409 20:31:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:17:16.977 [2024-07-15 20:31:09.050130] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:16.977 [2024-07-15 20:31:09.050193] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:16.977 [2024-07-15 20:31:09.050205] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to 
find bdev with name: BaseBdev2 00:17:16.977 [2024-07-15 20:31:09.050217] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:16.977 [2024-07-15 20:31:09.050226] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:16.977 [2024-07-15 20:31:09.050238] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:16.977 20:31:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:16.978 20:31:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:16.978 20:31:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:16.978 20:31:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:16.978 20:31:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:16.978 20:31:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:16.978 20:31:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:16.978 20:31:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:16.978 20:31:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:16.978 20:31:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:16.978 20:31:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:16.978 20:31:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:17.237 20:31:09 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:17.237 "name": "Existed_Raid", 00:17:17.237 "uuid": "f8756d43-4c63-413f-a9c0-aa03ddaa0968", 00:17:17.237 "strip_size_kb": 0, 00:17:17.237 "state": "configuring", 00:17:17.237 "raid_level": "raid1", 00:17:17.237 "superblock": true, 00:17:17.237 "num_base_bdevs": 3, 00:17:17.237 "num_base_bdevs_discovered": 0, 00:17:17.237 "num_base_bdevs_operational": 3, 00:17:17.237 "base_bdevs_list": [ 00:17:17.237 { 00:17:17.237 "name": "BaseBdev1", 00:17:17.237 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:17.237 "is_configured": false, 00:17:17.237 "data_offset": 0, 00:17:17.237 "data_size": 0 00:17:17.237 }, 00:17:17.237 { 00:17:17.237 "name": "BaseBdev2", 00:17:17.237 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:17.237 "is_configured": false, 00:17:17.237 "data_offset": 0, 00:17:17.237 "data_size": 0 00:17:17.237 }, 00:17:17.237 { 00:17:17.237 "name": "BaseBdev3", 00:17:17.237 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:17.237 "is_configured": false, 00:17:17.237 "data_offset": 0, 00:17:17.237 "data_size": 0 00:17:17.237 } 00:17:17.237 ] 00:17:17.237 }' 00:17:17.237 20:31:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:17.238 20:31:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:18.175 20:31:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:18.175 [2024-07-15 20:31:10.425597] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:18.175 [2024-07-15 20:31:10.425637] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x21f6a80 name Existed_Raid, state configuring 00:17:18.175 20:31:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:17:18.745 [2024-07-15 20:31:10.946988] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:18.745 [2024-07-15 20:31:10.947028] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:18.745 [2024-07-15 20:31:10.947039] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:18.745 [2024-07-15 20:31:10.947056] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:18.745 [2024-07-15 20:31:10.947065] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:18.745 [2024-07-15 20:31:10.947076] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:18.745 20:31:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:17:19.315 [2024-07-15 20:31:11.483848] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:19.315 BaseBdev1 00:17:19.315 20:31:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:17:19.315 20:31:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:17:19.315 20:31:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:19.315 20:31:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:17:19.315 20:31:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:19.315 20:31:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # 
bdev_timeout=2000 00:17:19.315 20:31:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:19.898 20:31:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:17:19.898 [ 00:17:19.898 { 00:17:19.898 "name": "BaseBdev1", 00:17:19.898 "aliases": [ 00:17:19.898 "6056e12f-c17a-4953-97a4-2e3d6a2313c6" 00:17:19.898 ], 00:17:19.898 "product_name": "Malloc disk", 00:17:19.898 "block_size": 512, 00:17:19.898 "num_blocks": 65536, 00:17:19.898 "uuid": "6056e12f-c17a-4953-97a4-2e3d6a2313c6", 00:17:19.898 "assigned_rate_limits": { 00:17:19.898 "rw_ios_per_sec": 0, 00:17:19.898 "rw_mbytes_per_sec": 0, 00:17:19.898 "r_mbytes_per_sec": 0, 00:17:19.898 "w_mbytes_per_sec": 0 00:17:19.898 }, 00:17:19.898 "claimed": true, 00:17:19.898 "claim_type": "exclusive_write", 00:17:19.898 "zoned": false, 00:17:19.898 "supported_io_types": { 00:17:19.898 "read": true, 00:17:19.898 "write": true, 00:17:19.898 "unmap": true, 00:17:19.898 "flush": true, 00:17:19.898 "reset": true, 00:17:19.898 "nvme_admin": false, 00:17:19.898 "nvme_io": false, 00:17:19.898 "nvme_io_md": false, 00:17:19.898 "write_zeroes": true, 00:17:19.898 "zcopy": true, 00:17:19.898 "get_zone_info": false, 00:17:19.898 "zone_management": false, 00:17:19.898 "zone_append": false, 00:17:19.898 "compare": false, 00:17:19.898 "compare_and_write": false, 00:17:19.898 "abort": true, 00:17:19.898 "seek_hole": false, 00:17:19.898 "seek_data": false, 00:17:19.898 "copy": true, 00:17:19.898 "nvme_iov_md": false 00:17:19.898 }, 00:17:19.898 "memory_domains": [ 00:17:19.898 { 00:17:19.898 "dma_device_id": "system", 00:17:19.898 "dma_device_type": 1 00:17:19.898 }, 00:17:19.898 { 00:17:19.898 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:19.898 
"dma_device_type": 2 00:17:19.898 } 00:17:19.898 ], 00:17:19.898 "driver_specific": {} 00:17:19.898 } 00:17:19.898 ] 00:17:20.185 20:31:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:17:20.185 20:31:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:20.185 20:31:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:20.185 20:31:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:20.185 20:31:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:20.185 20:31:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:20.185 20:31:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:20.185 20:31:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:20.185 20:31:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:20.185 20:31:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:20.185 20:31:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:20.185 20:31:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:20.185 20:31:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:20.185 20:31:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:20.185 "name": "Existed_Raid", 00:17:20.185 "uuid": "ad7e180d-85e3-4c31-8711-bf183690ff45", 00:17:20.185 "strip_size_kb": 0, 
00:17:20.185 "state": "configuring", 00:17:20.185 "raid_level": "raid1", 00:17:20.185 "superblock": true, 00:17:20.185 "num_base_bdevs": 3, 00:17:20.185 "num_base_bdevs_discovered": 1, 00:17:20.185 "num_base_bdevs_operational": 3, 00:17:20.185 "base_bdevs_list": [ 00:17:20.185 { 00:17:20.185 "name": "BaseBdev1", 00:17:20.185 "uuid": "6056e12f-c17a-4953-97a4-2e3d6a2313c6", 00:17:20.185 "is_configured": true, 00:17:20.185 "data_offset": 2048, 00:17:20.185 "data_size": 63488 00:17:20.185 }, 00:17:20.185 { 00:17:20.185 "name": "BaseBdev2", 00:17:20.185 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:20.185 "is_configured": false, 00:17:20.185 "data_offset": 0, 00:17:20.185 "data_size": 0 00:17:20.185 }, 00:17:20.185 { 00:17:20.185 "name": "BaseBdev3", 00:17:20.185 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:20.185 "is_configured": false, 00:17:20.185 "data_offset": 0, 00:17:20.185 "data_size": 0 00:17:20.185 } 00:17:20.185 ] 00:17:20.185 }' 00:17:20.185 20:31:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:20.185 20:31:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:20.753 20:31:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:21.321 [2024-07-15 20:31:13.557361] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:21.321 [2024-07-15 20:31:13.557412] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x21f6310 name Existed_Raid, state configuring 00:17:21.321 20:31:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:17:21.580 [2024-07-15 20:31:13.814076] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:21.580 [2024-07-15 20:31:13.815563] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:21.580 [2024-07-15 20:31:13.815601] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:21.580 [2024-07-15 20:31:13.815612] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:21.580 [2024-07-15 20:31:13.815624] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:21.580 20:31:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:17:21.580 20:31:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:21.580 20:31:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:21.580 20:31:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:21.580 20:31:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:21.580 20:31:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:21.580 20:31:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:21.580 20:31:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:21.580 20:31:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:21.580 20:31:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:21.580 20:31:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:21.580 20:31:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # 
local tmp 00:17:21.580 20:31:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:21.580 20:31:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:22.147 20:31:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:22.147 "name": "Existed_Raid", 00:17:22.147 "uuid": "466f38f6-7cb1-470a-b372-fe5ed8b0dab2", 00:17:22.147 "strip_size_kb": 0, 00:17:22.147 "state": "configuring", 00:17:22.147 "raid_level": "raid1", 00:17:22.147 "superblock": true, 00:17:22.147 "num_base_bdevs": 3, 00:17:22.147 "num_base_bdevs_discovered": 1, 00:17:22.147 "num_base_bdevs_operational": 3, 00:17:22.147 "base_bdevs_list": [ 00:17:22.147 { 00:17:22.147 "name": "BaseBdev1", 00:17:22.147 "uuid": "6056e12f-c17a-4953-97a4-2e3d6a2313c6", 00:17:22.147 "is_configured": true, 00:17:22.147 "data_offset": 2048, 00:17:22.147 "data_size": 63488 00:17:22.147 }, 00:17:22.147 { 00:17:22.147 "name": "BaseBdev2", 00:17:22.147 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:22.147 "is_configured": false, 00:17:22.147 "data_offset": 0, 00:17:22.147 "data_size": 0 00:17:22.147 }, 00:17:22.147 { 00:17:22.147 "name": "BaseBdev3", 00:17:22.147 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:22.147 "is_configured": false, 00:17:22.147 "data_offset": 0, 00:17:22.147 "data_size": 0 00:17:22.147 } 00:17:22.147 ] 00:17:22.147 }' 00:17:22.147 20:31:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:22.147 20:31:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:22.716 20:31:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:17:23.284 
[2024-07-15 20:31:15.395232] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:23.284 BaseBdev2 00:17:23.284 20:31:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:17:23.284 20:31:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:17:23.284 20:31:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:23.284 20:31:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:17:23.284 20:31:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:23.284 20:31:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:23.284 20:31:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:23.852 20:31:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:17:24.112 [ 00:17:24.112 { 00:17:24.112 "name": "BaseBdev2", 00:17:24.112 "aliases": [ 00:17:24.112 "fefce5cd-24ce-419d-b995-4ddbef26a763" 00:17:24.112 ], 00:17:24.112 "product_name": "Malloc disk", 00:17:24.112 "block_size": 512, 00:17:24.112 "num_blocks": 65536, 00:17:24.112 "uuid": "fefce5cd-24ce-419d-b995-4ddbef26a763", 00:17:24.112 "assigned_rate_limits": { 00:17:24.112 "rw_ios_per_sec": 0, 00:17:24.112 "rw_mbytes_per_sec": 0, 00:17:24.112 "r_mbytes_per_sec": 0, 00:17:24.112 "w_mbytes_per_sec": 0 00:17:24.112 }, 00:17:24.112 "claimed": true, 00:17:24.112 "claim_type": "exclusive_write", 00:17:24.112 "zoned": false, 00:17:24.112 "supported_io_types": { 00:17:24.112 "read": true, 00:17:24.112 "write": true, 00:17:24.112 "unmap": 
true, 00:17:24.112 "flush": true, 00:17:24.112 "reset": true, 00:17:24.112 "nvme_admin": false, 00:17:24.112 "nvme_io": false, 00:17:24.112 "nvme_io_md": false, 00:17:24.112 "write_zeroes": true, 00:17:24.112 "zcopy": true, 00:17:24.112 "get_zone_info": false, 00:17:24.112 "zone_management": false, 00:17:24.112 "zone_append": false, 00:17:24.112 "compare": false, 00:17:24.112 "compare_and_write": false, 00:17:24.112 "abort": true, 00:17:24.112 "seek_hole": false, 00:17:24.112 "seek_data": false, 00:17:24.112 "copy": true, 00:17:24.112 "nvme_iov_md": false 00:17:24.112 }, 00:17:24.112 "memory_domains": [ 00:17:24.112 { 00:17:24.112 "dma_device_id": "system", 00:17:24.112 "dma_device_type": 1 00:17:24.112 }, 00:17:24.112 { 00:17:24.112 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:24.112 "dma_device_type": 2 00:17:24.112 } 00:17:24.112 ], 00:17:24.112 "driver_specific": {} 00:17:24.112 } 00:17:24.112 ] 00:17:24.112 20:31:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:17:24.112 20:31:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:24.112 20:31:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:24.112 20:31:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:24.112 20:31:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:24.112 20:31:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:24.112 20:31:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:24.112 20:31:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:24.112 20:31:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:24.112 
20:31:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:24.112 20:31:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:24.112 20:31:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:24.112 20:31:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:24.112 20:31:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:24.112 20:31:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:24.681 20:31:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:24.681 "name": "Existed_Raid", 00:17:24.681 "uuid": "466f38f6-7cb1-470a-b372-fe5ed8b0dab2", 00:17:24.681 "strip_size_kb": 0, 00:17:24.681 "state": "configuring", 00:17:24.681 "raid_level": "raid1", 00:17:24.681 "superblock": true, 00:17:24.681 "num_base_bdevs": 3, 00:17:24.681 "num_base_bdevs_discovered": 2, 00:17:24.681 "num_base_bdevs_operational": 3, 00:17:24.681 "base_bdevs_list": [ 00:17:24.681 { 00:17:24.681 "name": "BaseBdev1", 00:17:24.681 "uuid": "6056e12f-c17a-4953-97a4-2e3d6a2313c6", 00:17:24.681 "is_configured": true, 00:17:24.681 "data_offset": 2048, 00:17:24.681 "data_size": 63488 00:17:24.681 }, 00:17:24.681 { 00:17:24.681 "name": "BaseBdev2", 00:17:24.681 "uuid": "fefce5cd-24ce-419d-b995-4ddbef26a763", 00:17:24.681 "is_configured": true, 00:17:24.681 "data_offset": 2048, 00:17:24.681 "data_size": 63488 00:17:24.681 }, 00:17:24.681 { 00:17:24.681 "name": "BaseBdev3", 00:17:24.681 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:24.681 "is_configured": false, 00:17:24.681 "data_offset": 0, 00:17:24.681 "data_size": 0 00:17:24.681 } 00:17:24.681 ] 00:17:24.681 }' 00:17:24.681 
20:31:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:24.681 20:31:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:25.618 20:31:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:17:25.618 [2024-07-15 20:31:17.927245] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:25.618 [2024-07-15 20:31:17.927429] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x21f7400 00:17:25.618 [2024-07-15 20:31:17.927445] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:25.618 [2024-07-15 20:31:17.927630] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x21f6ef0 00:17:25.618 [2024-07-15 20:31:17.927761] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x21f7400 00:17:25.618 [2024-07-15 20:31:17.927771] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x21f7400 00:17:25.618 [2024-07-15 20:31:17.927870] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:25.618 BaseBdev3 00:17:25.618 20:31:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:17:25.618 20:31:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:17:25.618 20:31:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:25.618 20:31:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:17:25.618 20:31:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:25.619 20:31:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # 
bdev_timeout=2000 00:17:25.619 20:31:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:26.187 20:31:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:17:26.754 [ 00:17:26.755 { 00:17:26.755 "name": "BaseBdev3", 00:17:26.755 "aliases": [ 00:17:26.755 "7c358ba1-cd92-4e35-a6e0-fa4efed29874" 00:17:26.755 ], 00:17:26.755 "product_name": "Malloc disk", 00:17:26.755 "block_size": 512, 00:17:26.755 "num_blocks": 65536, 00:17:26.755 "uuid": "7c358ba1-cd92-4e35-a6e0-fa4efed29874", 00:17:26.755 "assigned_rate_limits": { 00:17:26.755 "rw_ios_per_sec": 0, 00:17:26.755 "rw_mbytes_per_sec": 0, 00:17:26.755 "r_mbytes_per_sec": 0, 00:17:26.755 "w_mbytes_per_sec": 0 00:17:26.755 }, 00:17:26.755 "claimed": true, 00:17:26.755 "claim_type": "exclusive_write", 00:17:26.755 "zoned": false, 00:17:26.755 "supported_io_types": { 00:17:26.755 "read": true, 00:17:26.755 "write": true, 00:17:26.755 "unmap": true, 00:17:26.755 "flush": true, 00:17:26.755 "reset": true, 00:17:26.755 "nvme_admin": false, 00:17:26.755 "nvme_io": false, 00:17:26.755 "nvme_io_md": false, 00:17:26.755 "write_zeroes": true, 00:17:26.755 "zcopy": true, 00:17:26.755 "get_zone_info": false, 00:17:26.755 "zone_management": false, 00:17:26.755 "zone_append": false, 00:17:26.755 "compare": false, 00:17:26.755 "compare_and_write": false, 00:17:26.755 "abort": true, 00:17:26.755 "seek_hole": false, 00:17:26.755 "seek_data": false, 00:17:26.755 "copy": true, 00:17:26.755 "nvme_iov_md": false 00:17:26.755 }, 00:17:26.755 "memory_domains": [ 00:17:26.755 { 00:17:26.755 "dma_device_id": "system", 00:17:26.755 "dma_device_type": 1 00:17:26.755 }, 00:17:26.755 { 00:17:26.755 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:26.755 
"dma_device_type": 2 00:17:26.755 } 00:17:26.755 ], 00:17:26.755 "driver_specific": {} 00:17:26.755 } 00:17:26.755 ] 00:17:26.755 20:31:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:17:26.755 20:31:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:26.755 20:31:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:26.755 20:31:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:17:26.755 20:31:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:26.755 20:31:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:26.755 20:31:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:26.755 20:31:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:26.755 20:31:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:26.755 20:31:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:26.755 20:31:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:26.755 20:31:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:26.755 20:31:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:26.755 20:31:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:26.755 20:31:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:27.013 20:31:19 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:27.013 "name": "Existed_Raid", 00:17:27.013 "uuid": "466f38f6-7cb1-470a-b372-fe5ed8b0dab2", 00:17:27.013 "strip_size_kb": 0, 00:17:27.013 "state": "online", 00:17:27.013 "raid_level": "raid1", 00:17:27.013 "superblock": true, 00:17:27.013 "num_base_bdevs": 3, 00:17:27.013 "num_base_bdevs_discovered": 3, 00:17:27.013 "num_base_bdevs_operational": 3, 00:17:27.013 "base_bdevs_list": [ 00:17:27.013 { 00:17:27.013 "name": "BaseBdev1", 00:17:27.013 "uuid": "6056e12f-c17a-4953-97a4-2e3d6a2313c6", 00:17:27.013 "is_configured": true, 00:17:27.013 "data_offset": 2048, 00:17:27.013 "data_size": 63488 00:17:27.013 }, 00:17:27.013 { 00:17:27.013 "name": "BaseBdev2", 00:17:27.013 "uuid": "fefce5cd-24ce-419d-b995-4ddbef26a763", 00:17:27.013 "is_configured": true, 00:17:27.013 "data_offset": 2048, 00:17:27.013 "data_size": 63488 00:17:27.013 }, 00:17:27.013 { 00:17:27.013 "name": "BaseBdev3", 00:17:27.013 "uuid": "7c358ba1-cd92-4e35-a6e0-fa4efed29874", 00:17:27.013 "is_configured": true, 00:17:27.013 "data_offset": 2048, 00:17:27.013 "data_size": 63488 00:17:27.013 } 00:17:27.013 ] 00:17:27.013 }' 00:17:27.013 20:31:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:27.013 20:31:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:27.579 20:31:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:17:27.579 20:31:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:17:27.579 20:31:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:27.579 20:31:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:27.579 20:31:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 
00:17:27.579 20:31:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:17:27.579 20:31:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:27.579 20:31:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:17:27.837 [2024-07-15 20:31:20.109342] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:27.837 20:31:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:27.837 "name": "Existed_Raid", 00:17:27.837 "aliases": [ 00:17:27.837 "466f38f6-7cb1-470a-b372-fe5ed8b0dab2" 00:17:27.837 ], 00:17:27.837 "product_name": "Raid Volume", 00:17:27.837 "block_size": 512, 00:17:27.837 "num_blocks": 63488, 00:17:27.837 "uuid": "466f38f6-7cb1-470a-b372-fe5ed8b0dab2", 00:17:27.837 "assigned_rate_limits": { 00:17:27.837 "rw_ios_per_sec": 0, 00:17:27.837 "rw_mbytes_per_sec": 0, 00:17:27.837 "r_mbytes_per_sec": 0, 00:17:27.837 "w_mbytes_per_sec": 0 00:17:27.837 }, 00:17:27.837 "claimed": false, 00:17:27.837 "zoned": false, 00:17:27.837 "supported_io_types": { 00:17:27.837 "read": true, 00:17:27.837 "write": true, 00:17:27.837 "unmap": false, 00:17:27.837 "flush": false, 00:17:27.837 "reset": true, 00:17:27.837 "nvme_admin": false, 00:17:27.837 "nvme_io": false, 00:17:27.837 "nvme_io_md": false, 00:17:27.837 "write_zeroes": true, 00:17:27.837 "zcopy": false, 00:17:27.837 "get_zone_info": false, 00:17:27.837 "zone_management": false, 00:17:27.837 "zone_append": false, 00:17:27.837 "compare": false, 00:17:27.837 "compare_and_write": false, 00:17:27.837 "abort": false, 00:17:27.837 "seek_hole": false, 00:17:27.837 "seek_data": false, 00:17:27.837 "copy": false, 00:17:27.837 "nvme_iov_md": false 00:17:27.837 }, 00:17:27.837 "memory_domains": [ 00:17:27.837 { 00:17:27.837 "dma_device_id": "system", 00:17:27.837 
"dma_device_type": 1 00:17:27.837 }, 00:17:27.837 { 00:17:27.837 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:27.837 "dma_device_type": 2 00:17:27.837 }, 00:17:27.837 { 00:17:27.838 "dma_device_id": "system", 00:17:27.838 "dma_device_type": 1 00:17:27.838 }, 00:17:27.838 { 00:17:27.838 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:27.838 "dma_device_type": 2 00:17:27.838 }, 00:17:27.838 { 00:17:27.838 "dma_device_id": "system", 00:17:27.838 "dma_device_type": 1 00:17:27.838 }, 00:17:27.838 { 00:17:27.838 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:27.838 "dma_device_type": 2 00:17:27.838 } 00:17:27.838 ], 00:17:27.838 "driver_specific": { 00:17:27.838 "raid": { 00:17:27.838 "uuid": "466f38f6-7cb1-470a-b372-fe5ed8b0dab2", 00:17:27.838 "strip_size_kb": 0, 00:17:27.838 "state": "online", 00:17:27.838 "raid_level": "raid1", 00:17:27.838 "superblock": true, 00:17:27.838 "num_base_bdevs": 3, 00:17:27.838 "num_base_bdevs_discovered": 3, 00:17:27.838 "num_base_bdevs_operational": 3, 00:17:27.838 "base_bdevs_list": [ 00:17:27.838 { 00:17:27.838 "name": "BaseBdev1", 00:17:27.838 "uuid": "6056e12f-c17a-4953-97a4-2e3d6a2313c6", 00:17:27.838 "is_configured": true, 00:17:27.838 "data_offset": 2048, 00:17:27.838 "data_size": 63488 00:17:27.838 }, 00:17:27.838 { 00:17:27.838 "name": "BaseBdev2", 00:17:27.838 "uuid": "fefce5cd-24ce-419d-b995-4ddbef26a763", 00:17:27.838 "is_configured": true, 00:17:27.838 "data_offset": 2048, 00:17:27.838 "data_size": 63488 00:17:27.838 }, 00:17:27.838 { 00:17:27.838 "name": "BaseBdev3", 00:17:27.838 "uuid": "7c358ba1-cd92-4e35-a6e0-fa4efed29874", 00:17:27.838 "is_configured": true, 00:17:27.838 "data_offset": 2048, 00:17:27.838 "data_size": 63488 00:17:27.838 } 00:17:27.838 ] 00:17:27.838 } 00:17:27.838 } 00:17:27.838 }' 00:17:27.838 20:31:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:27.838 20:31:20 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:17:27.838 BaseBdev2 00:17:27.838 BaseBdev3' 00:17:27.838 20:31:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:27.838 20:31:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:27.838 20:31:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:17:28.096 20:31:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:28.096 "name": "BaseBdev1", 00:17:28.096 "aliases": [ 00:17:28.096 "6056e12f-c17a-4953-97a4-2e3d6a2313c6" 00:17:28.096 ], 00:17:28.096 "product_name": "Malloc disk", 00:17:28.096 "block_size": 512, 00:17:28.096 "num_blocks": 65536, 00:17:28.096 "uuid": "6056e12f-c17a-4953-97a4-2e3d6a2313c6", 00:17:28.096 "assigned_rate_limits": { 00:17:28.096 "rw_ios_per_sec": 0, 00:17:28.096 "rw_mbytes_per_sec": 0, 00:17:28.096 "r_mbytes_per_sec": 0, 00:17:28.096 "w_mbytes_per_sec": 0 00:17:28.096 }, 00:17:28.096 "claimed": true, 00:17:28.096 "claim_type": "exclusive_write", 00:17:28.096 "zoned": false, 00:17:28.096 "supported_io_types": { 00:17:28.096 "read": true, 00:17:28.096 "write": true, 00:17:28.096 "unmap": true, 00:17:28.096 "flush": true, 00:17:28.096 "reset": true, 00:17:28.096 "nvme_admin": false, 00:17:28.096 "nvme_io": false, 00:17:28.096 "nvme_io_md": false, 00:17:28.096 "write_zeroes": true, 00:17:28.096 "zcopy": true, 00:17:28.096 "get_zone_info": false, 00:17:28.096 "zone_management": false, 00:17:28.096 "zone_append": false, 00:17:28.096 "compare": false, 00:17:28.096 "compare_and_write": false, 00:17:28.096 "abort": true, 00:17:28.096 "seek_hole": false, 00:17:28.096 "seek_data": false, 00:17:28.096 "copy": true, 00:17:28.096 "nvme_iov_md": false 00:17:28.096 }, 00:17:28.096 "memory_domains": 
[ 00:17:28.096 { 00:17:28.096 "dma_device_id": "system", 00:17:28.096 "dma_device_type": 1 00:17:28.096 }, 00:17:28.096 { 00:17:28.096 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:28.096 "dma_device_type": 2 00:17:28.096 } 00:17:28.096 ], 00:17:28.096 "driver_specific": {} 00:17:28.096 }' 00:17:28.096 20:31:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:28.096 20:31:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:28.354 20:31:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:28.354 20:31:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:28.354 20:31:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:28.354 20:31:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:28.354 20:31:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:28.354 20:31:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:28.354 20:31:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:28.354 20:31:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:28.611 20:31:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:28.611 20:31:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:28.611 20:31:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:28.611 20:31:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:17:28.611 20:31:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 
00:17:28.868 20:31:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:28.868 "name": "BaseBdev2", 00:17:28.868 "aliases": [ 00:17:28.868 "fefce5cd-24ce-419d-b995-4ddbef26a763" 00:17:28.868 ], 00:17:28.868 "product_name": "Malloc disk", 00:17:28.868 "block_size": 512, 00:17:28.868 "num_blocks": 65536, 00:17:28.868 "uuid": "fefce5cd-24ce-419d-b995-4ddbef26a763", 00:17:28.868 "assigned_rate_limits": { 00:17:28.868 "rw_ios_per_sec": 0, 00:17:28.868 "rw_mbytes_per_sec": 0, 00:17:28.868 "r_mbytes_per_sec": 0, 00:17:28.868 "w_mbytes_per_sec": 0 00:17:28.868 }, 00:17:28.868 "claimed": true, 00:17:28.868 "claim_type": "exclusive_write", 00:17:28.868 "zoned": false, 00:17:28.868 "supported_io_types": { 00:17:28.868 "read": true, 00:17:28.868 "write": true, 00:17:28.868 "unmap": true, 00:17:28.868 "flush": true, 00:17:28.868 "reset": true, 00:17:28.868 "nvme_admin": false, 00:17:28.868 "nvme_io": false, 00:17:28.868 "nvme_io_md": false, 00:17:28.868 "write_zeroes": true, 00:17:28.868 "zcopy": true, 00:17:28.868 "get_zone_info": false, 00:17:28.868 "zone_management": false, 00:17:28.869 "zone_append": false, 00:17:28.869 "compare": false, 00:17:28.869 "compare_and_write": false, 00:17:28.869 "abort": true, 00:17:28.869 "seek_hole": false, 00:17:28.869 "seek_data": false, 00:17:28.869 "copy": true, 00:17:28.869 "nvme_iov_md": false 00:17:28.869 }, 00:17:28.869 "memory_domains": [ 00:17:28.869 { 00:17:28.869 "dma_device_id": "system", 00:17:28.869 "dma_device_type": 1 00:17:28.869 }, 00:17:28.869 { 00:17:28.869 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:28.869 "dma_device_type": 2 00:17:28.869 } 00:17:28.869 ], 00:17:28.869 "driver_specific": {} 00:17:28.869 }' 00:17:28.869 20:31:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:28.869 20:31:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:28.869 20:31:21 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:28.869 20:31:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:28.869 20:31:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:28.869 20:31:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:28.869 20:31:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:29.127 20:31:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:29.127 20:31:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:29.127 20:31:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:29.127 20:31:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:29.127 20:31:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:29.127 20:31:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:29.127 20:31:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:17:29.127 20:31:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:29.386 20:31:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:29.386 "name": "BaseBdev3", 00:17:29.386 "aliases": [ 00:17:29.386 "7c358ba1-cd92-4e35-a6e0-fa4efed29874" 00:17:29.386 ], 00:17:29.386 "product_name": "Malloc disk", 00:17:29.386 "block_size": 512, 00:17:29.386 "num_blocks": 65536, 00:17:29.386 "uuid": "7c358ba1-cd92-4e35-a6e0-fa4efed29874", 00:17:29.386 "assigned_rate_limits": { 00:17:29.386 "rw_ios_per_sec": 0, 00:17:29.386 "rw_mbytes_per_sec": 0, 00:17:29.386 "r_mbytes_per_sec": 0, 00:17:29.386 
"w_mbytes_per_sec": 0 00:17:29.386 }, 00:17:29.386 "claimed": true, 00:17:29.386 "claim_type": "exclusive_write", 00:17:29.386 "zoned": false, 00:17:29.386 "supported_io_types": { 00:17:29.386 "read": true, 00:17:29.386 "write": true, 00:17:29.387 "unmap": true, 00:17:29.387 "flush": true, 00:17:29.387 "reset": true, 00:17:29.387 "nvme_admin": false, 00:17:29.387 "nvme_io": false, 00:17:29.387 "nvme_io_md": false, 00:17:29.387 "write_zeroes": true, 00:17:29.387 "zcopy": true, 00:17:29.387 "get_zone_info": false, 00:17:29.387 "zone_management": false, 00:17:29.387 "zone_append": false, 00:17:29.387 "compare": false, 00:17:29.387 "compare_and_write": false, 00:17:29.387 "abort": true, 00:17:29.387 "seek_hole": false, 00:17:29.387 "seek_data": false, 00:17:29.387 "copy": true, 00:17:29.387 "nvme_iov_md": false 00:17:29.387 }, 00:17:29.387 "memory_domains": [ 00:17:29.387 { 00:17:29.387 "dma_device_id": "system", 00:17:29.387 "dma_device_type": 1 00:17:29.387 }, 00:17:29.387 { 00:17:29.387 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:29.387 "dma_device_type": 2 00:17:29.387 } 00:17:29.387 ], 00:17:29.387 "driver_specific": {} 00:17:29.387 }' 00:17:29.387 20:31:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:29.387 20:31:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:29.387 20:31:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:29.387 20:31:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:29.387 20:31:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:29.646 20:31:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:29.646 20:31:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:29.646 20:31:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq 
.md_interleave 00:17:29.646 20:31:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:29.646 20:31:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:29.646 20:31:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:29.646 20:31:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:29.646 20:31:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:17:30.215 [2024-07-15 20:31:22.503429] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:30.215 20:31:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:17:30.215 20:31:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:17:30.215 20:31:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:30.215 20:31:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:17:30.215 20:31:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:17:30.215 20:31:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:17:30.215 20:31:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:30.215 20:31:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:30.215 20:31:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:30.215 20:31:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:30.215 20:31:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=2 00:17:30.215 20:31:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:30.215 20:31:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:30.215 20:31:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:30.215 20:31:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:30.216 20:31:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:30.216 20:31:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:30.785 20:31:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:30.785 "name": "Existed_Raid", 00:17:30.785 "uuid": "466f38f6-7cb1-470a-b372-fe5ed8b0dab2", 00:17:30.785 "strip_size_kb": 0, 00:17:30.785 "state": "online", 00:17:30.785 "raid_level": "raid1", 00:17:30.785 "superblock": true, 00:17:30.785 "num_base_bdevs": 3, 00:17:30.785 "num_base_bdevs_discovered": 2, 00:17:30.785 "num_base_bdevs_operational": 2, 00:17:30.785 "base_bdevs_list": [ 00:17:30.785 { 00:17:30.785 "name": null, 00:17:30.785 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:30.785 "is_configured": false, 00:17:30.785 "data_offset": 2048, 00:17:30.785 "data_size": 63488 00:17:30.785 }, 00:17:30.785 { 00:17:30.785 "name": "BaseBdev2", 00:17:30.785 "uuid": "fefce5cd-24ce-419d-b995-4ddbef26a763", 00:17:30.785 "is_configured": true, 00:17:30.785 "data_offset": 2048, 00:17:30.785 "data_size": 63488 00:17:30.785 }, 00:17:30.785 { 00:17:30.785 "name": "BaseBdev3", 00:17:30.785 "uuid": "7c358ba1-cd92-4e35-a6e0-fa4efed29874", 00:17:30.785 "is_configured": true, 00:17:30.785 "data_offset": 2048, 00:17:30.785 "data_size": 63488 00:17:30.785 } 
00:17:30.785 ] 00:17:30.785 }' 00:17:30.785 20:31:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:30.785 20:31:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:31.359 20:31:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:17:31.359 20:31:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:31.359 20:31:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:31.359 20:31:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:31.617 20:31:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:31.617 20:31:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:31.617 20:31:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:17:32.185 [2024-07-15 20:31:24.440908] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:17:32.185 20:31:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:17:32.185 20:31:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:32.185 20:31:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:32.185 20:31:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:32.443 20:31:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:32.443 20:31:24 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:32.443 20:31:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:17:32.702 [2024-07-15 20:31:24.980065] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:17:32.702 [2024-07-15 20:31:24.980162] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:32.702 [2024-07-15 20:31:24.999604] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:32.702 [2024-07-15 20:31:24.999638] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:32.702 [2024-07-15 20:31:24.999650] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x21f7400 name Existed_Raid, state offline 00:17:32.703 20:31:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:17:32.703 20:31:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:32.703 20:31:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:32.703 20:31:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:17:32.962 20:31:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:17:32.962 20:31:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:17:32.962 20:31:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:17:32.962 20:31:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:17:32.962 20:31:25 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:32.962 20:31:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:17:33.222 BaseBdev2 00:17:33.222 20:31:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:17:33.222 20:31:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:17:33.222 20:31:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:33.222 20:31:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:17:33.222 20:31:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:33.222 20:31:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:33.222 20:31:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:33.481 20:31:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:17:33.739 [ 00:17:33.739 { 00:17:33.739 "name": "BaseBdev2", 00:17:33.739 "aliases": [ 00:17:33.739 "1e6de215-0239-4278-9869-2e05b0e3a893" 00:17:33.739 ], 00:17:33.739 "product_name": "Malloc disk", 00:17:33.739 "block_size": 512, 00:17:33.739 "num_blocks": 65536, 00:17:33.739 "uuid": "1e6de215-0239-4278-9869-2e05b0e3a893", 00:17:33.739 "assigned_rate_limits": { 00:17:33.739 "rw_ios_per_sec": 0, 00:17:33.739 "rw_mbytes_per_sec": 0, 00:17:33.739 "r_mbytes_per_sec": 0, 00:17:33.740 "w_mbytes_per_sec": 0 00:17:33.740 }, 00:17:33.740 "claimed": false, 00:17:33.740 "zoned": false, 
00:17:33.740 "supported_io_types": { 00:17:33.740 "read": true, 00:17:33.740 "write": true, 00:17:33.740 "unmap": true, 00:17:33.740 "flush": true, 00:17:33.740 "reset": true, 00:17:33.740 "nvme_admin": false, 00:17:33.740 "nvme_io": false, 00:17:33.740 "nvme_io_md": false, 00:17:33.740 "write_zeroes": true, 00:17:33.740 "zcopy": true, 00:17:33.740 "get_zone_info": false, 00:17:33.740 "zone_management": false, 00:17:33.740 "zone_append": false, 00:17:33.740 "compare": false, 00:17:33.740 "compare_and_write": false, 00:17:33.740 "abort": true, 00:17:33.740 "seek_hole": false, 00:17:33.740 "seek_data": false, 00:17:33.740 "copy": true, 00:17:33.740 "nvme_iov_md": false 00:17:33.740 }, 00:17:33.740 "memory_domains": [ 00:17:33.740 { 00:17:33.740 "dma_device_id": "system", 00:17:33.740 "dma_device_type": 1 00:17:33.740 }, 00:17:33.740 { 00:17:33.740 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:33.740 "dma_device_type": 2 00:17:33.740 } 00:17:33.740 ], 00:17:33.740 "driver_specific": {} 00:17:33.740 } 00:17:33.740 ] 00:17:33.740 20:31:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:17:33.740 20:31:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:17:33.740 20:31:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:33.740 20:31:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:17:33.998 BaseBdev3 00:17:33.998 20:31:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:17:33.998 20:31:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:17:33.998 20:31:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:33.998 20:31:26 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:17:33.998 20:31:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:33.998 20:31:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:33.998 20:31:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:34.257 20:31:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:17:34.515 [ 00:17:34.515 { 00:17:34.515 "name": "BaseBdev3", 00:17:34.515 "aliases": [ 00:17:34.515 "3c69dcab-93d1-4f21-a97f-e04b02cc1abc" 00:17:34.515 ], 00:17:34.515 "product_name": "Malloc disk", 00:17:34.515 "block_size": 512, 00:17:34.515 "num_blocks": 65536, 00:17:34.515 "uuid": "3c69dcab-93d1-4f21-a97f-e04b02cc1abc", 00:17:34.515 "assigned_rate_limits": { 00:17:34.515 "rw_ios_per_sec": 0, 00:17:34.515 "rw_mbytes_per_sec": 0, 00:17:34.515 "r_mbytes_per_sec": 0, 00:17:34.515 "w_mbytes_per_sec": 0 00:17:34.515 }, 00:17:34.515 "claimed": false, 00:17:34.515 "zoned": false, 00:17:34.515 "supported_io_types": { 00:17:34.515 "read": true, 00:17:34.515 "write": true, 00:17:34.515 "unmap": true, 00:17:34.515 "flush": true, 00:17:34.515 "reset": true, 00:17:34.515 "nvme_admin": false, 00:17:34.515 "nvme_io": false, 00:17:34.515 "nvme_io_md": false, 00:17:34.515 "write_zeroes": true, 00:17:34.515 "zcopy": true, 00:17:34.515 "get_zone_info": false, 00:17:34.515 "zone_management": false, 00:17:34.515 "zone_append": false, 00:17:34.515 "compare": false, 00:17:34.515 "compare_and_write": false, 00:17:34.515 "abort": true, 00:17:34.515 "seek_hole": false, 00:17:34.515 "seek_data": false, 00:17:34.515 "copy": true, 00:17:34.515 "nvme_iov_md": 
false 00:17:34.515 }, 00:17:34.515 "memory_domains": [ 00:17:34.515 { 00:17:34.515 "dma_device_id": "system", 00:17:34.515 "dma_device_type": 1 00:17:34.515 }, 00:17:34.515 { 00:17:34.515 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:34.515 "dma_device_type": 2 00:17:34.515 } 00:17:34.515 ], 00:17:34.515 "driver_specific": {} 00:17:34.515 } 00:17:34.515 ] 00:17:34.515 20:31:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:17:34.515 20:31:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:17:34.515 20:31:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:34.515 20:31:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:17:34.774 [2024-07-15 20:31:26.971064] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:34.774 [2024-07-15 20:31:26.971111] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:34.774 [2024-07-15 20:31:26.971129] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:34.774 [2024-07-15 20:31:26.972491] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:34.774 20:31:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:34.774 20:31:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:34.774 20:31:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:34.774 20:31:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:34.774 20:31:26 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:34.774 20:31:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:34.774 20:31:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:34.774 20:31:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:34.774 20:31:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:34.774 20:31:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:34.774 20:31:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:34.774 20:31:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:35.033 20:31:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:35.033 "name": "Existed_Raid", 00:17:35.033 "uuid": "2808f6ba-d1fc-4e69-af1f-e1c75e802373", 00:17:35.033 "strip_size_kb": 0, 00:17:35.033 "state": "configuring", 00:17:35.033 "raid_level": "raid1", 00:17:35.033 "superblock": true, 00:17:35.033 "num_base_bdevs": 3, 00:17:35.033 "num_base_bdevs_discovered": 2, 00:17:35.034 "num_base_bdevs_operational": 3, 00:17:35.034 "base_bdevs_list": [ 00:17:35.034 { 00:17:35.034 "name": "BaseBdev1", 00:17:35.034 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:35.034 "is_configured": false, 00:17:35.034 "data_offset": 0, 00:17:35.034 "data_size": 0 00:17:35.034 }, 00:17:35.034 { 00:17:35.034 "name": "BaseBdev2", 00:17:35.034 "uuid": "1e6de215-0239-4278-9869-2e05b0e3a893", 00:17:35.034 "is_configured": true, 00:17:35.034 "data_offset": 2048, 00:17:35.034 "data_size": 63488 00:17:35.034 }, 00:17:35.034 { 00:17:35.034 "name": "BaseBdev3", 
00:17:35.034 "uuid": "3c69dcab-93d1-4f21-a97f-e04b02cc1abc", 00:17:35.034 "is_configured": true, 00:17:35.034 "data_offset": 2048, 00:17:35.034 "data_size": 63488 00:17:35.034 } 00:17:35.034 ] 00:17:35.034 }' 00:17:35.034 20:31:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:35.034 20:31:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:35.600 20:31:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:17:35.859 [2024-07-15 20:31:28.045897] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:17:35.859 20:31:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:35.859 20:31:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:35.859 20:31:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:35.859 20:31:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:35.859 20:31:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:35.859 20:31:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:35.859 20:31:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:35.859 20:31:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:35.859 20:31:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:35.859 20:31:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:35.859 20:31:28 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:35.859 20:31:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:36.119 20:31:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:36.119 "name": "Existed_Raid", 00:17:36.119 "uuid": "2808f6ba-d1fc-4e69-af1f-e1c75e802373", 00:17:36.119 "strip_size_kb": 0, 00:17:36.119 "state": "configuring", 00:17:36.119 "raid_level": "raid1", 00:17:36.119 "superblock": true, 00:17:36.119 "num_base_bdevs": 3, 00:17:36.119 "num_base_bdevs_discovered": 1, 00:17:36.119 "num_base_bdevs_operational": 3, 00:17:36.119 "base_bdevs_list": [ 00:17:36.119 { 00:17:36.119 "name": "BaseBdev1", 00:17:36.119 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:36.119 "is_configured": false, 00:17:36.119 "data_offset": 0, 00:17:36.119 "data_size": 0 00:17:36.119 }, 00:17:36.119 { 00:17:36.119 "name": null, 00:17:36.119 "uuid": "1e6de215-0239-4278-9869-2e05b0e3a893", 00:17:36.119 "is_configured": false, 00:17:36.119 "data_offset": 2048, 00:17:36.119 "data_size": 63488 00:17:36.119 }, 00:17:36.119 { 00:17:36.119 "name": "BaseBdev3", 00:17:36.119 "uuid": "3c69dcab-93d1-4f21-a97f-e04b02cc1abc", 00:17:36.119 "is_configured": true, 00:17:36.119 "data_offset": 2048, 00:17:36.119 "data_size": 63488 00:17:36.119 } 00:17:36.119 ] 00:17:36.119 }' 00:17:36.119 20:31:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:36.119 20:31:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:36.686 20:31:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:36.686 20:31:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq 
'.[0].base_bdevs_list[1].is_configured' 00:17:36.945 20:31:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:17:36.945 20:31:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:17:37.242 [2024-07-15 20:31:29.442558] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:37.242 BaseBdev1 00:17:37.242 20:31:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:17:37.242 20:31:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:17:37.242 20:31:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:37.242 20:31:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:17:37.242 20:31:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:37.242 20:31:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:37.242 20:31:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:37.513 20:31:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:17:37.772 [ 00:17:37.772 { 00:17:37.772 "name": "BaseBdev1", 00:17:37.772 "aliases": [ 00:17:37.772 "c901f625-2404-4cbf-9b68-3fbd77968ed9" 00:17:37.772 ], 00:17:37.772 "product_name": "Malloc disk", 00:17:37.772 "block_size": 512, 00:17:37.772 "num_blocks": 65536, 00:17:37.772 "uuid": "c901f625-2404-4cbf-9b68-3fbd77968ed9", 00:17:37.772 
"assigned_rate_limits": { 00:17:37.772 "rw_ios_per_sec": 0, 00:17:37.772 "rw_mbytes_per_sec": 0, 00:17:37.772 "r_mbytes_per_sec": 0, 00:17:37.772 "w_mbytes_per_sec": 0 00:17:37.772 }, 00:17:37.772 "claimed": true, 00:17:37.772 "claim_type": "exclusive_write", 00:17:37.772 "zoned": false, 00:17:37.772 "supported_io_types": { 00:17:37.772 "read": true, 00:17:37.772 "write": true, 00:17:37.772 "unmap": true, 00:17:37.772 "flush": true, 00:17:37.772 "reset": true, 00:17:37.772 "nvme_admin": false, 00:17:37.772 "nvme_io": false, 00:17:37.772 "nvme_io_md": false, 00:17:37.772 "write_zeroes": true, 00:17:37.772 "zcopy": true, 00:17:37.772 "get_zone_info": false, 00:17:37.772 "zone_management": false, 00:17:37.772 "zone_append": false, 00:17:37.772 "compare": false, 00:17:37.772 "compare_and_write": false, 00:17:37.772 "abort": true, 00:17:37.772 "seek_hole": false, 00:17:37.772 "seek_data": false, 00:17:37.772 "copy": true, 00:17:37.773 "nvme_iov_md": false 00:17:37.773 }, 00:17:37.773 "memory_domains": [ 00:17:37.773 { 00:17:37.773 "dma_device_id": "system", 00:17:37.773 "dma_device_type": 1 00:17:37.773 }, 00:17:37.773 { 00:17:37.773 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:37.773 "dma_device_type": 2 00:17:37.773 } 00:17:37.773 ], 00:17:37.773 "driver_specific": {} 00:17:37.773 } 00:17:37.773 ] 00:17:37.773 20:31:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:17:37.773 20:31:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:37.773 20:31:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:37.773 20:31:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:37.773 20:31:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:37.773 20:31:29 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:37.773 20:31:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:37.773 20:31:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:37.773 20:31:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:37.773 20:31:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:37.773 20:31:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:37.773 20:31:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:37.773 20:31:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:38.033 20:31:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:38.033 "name": "Existed_Raid", 00:17:38.033 "uuid": "2808f6ba-d1fc-4e69-af1f-e1c75e802373", 00:17:38.033 "strip_size_kb": 0, 00:17:38.033 "state": "configuring", 00:17:38.033 "raid_level": "raid1", 00:17:38.033 "superblock": true, 00:17:38.033 "num_base_bdevs": 3, 00:17:38.033 "num_base_bdevs_discovered": 2, 00:17:38.033 "num_base_bdevs_operational": 3, 00:17:38.033 "base_bdevs_list": [ 00:17:38.033 { 00:17:38.033 "name": "BaseBdev1", 00:17:38.033 "uuid": "c901f625-2404-4cbf-9b68-3fbd77968ed9", 00:17:38.033 "is_configured": true, 00:17:38.033 "data_offset": 2048, 00:17:38.033 "data_size": 63488 00:17:38.033 }, 00:17:38.033 { 00:17:38.033 "name": null, 00:17:38.033 "uuid": "1e6de215-0239-4278-9869-2e05b0e3a893", 00:17:38.033 "is_configured": false, 00:17:38.033 "data_offset": 2048, 00:17:38.033 "data_size": 63488 00:17:38.033 }, 00:17:38.033 { 00:17:38.033 "name": "BaseBdev3", 00:17:38.033 "uuid": 
"3c69dcab-93d1-4f21-a97f-e04b02cc1abc", 00:17:38.033 "is_configured": true, 00:17:38.033 "data_offset": 2048, 00:17:38.033 "data_size": 63488 00:17:38.033 } 00:17:38.033 ] 00:17:38.033 }' 00:17:38.033 20:31:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:38.033 20:31:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:38.602 20:31:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:38.602 20:31:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:17:38.861 20:31:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:17:38.861 20:31:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:17:39.121 [2024-07-15 20:31:31.275452] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:17:39.121 20:31:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:39.121 20:31:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:39.121 20:31:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:39.121 20:31:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:39.121 20:31:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:39.121 20:31:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:39.121 20:31:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 
-- # local raid_bdev_info 00:17:39.121 20:31:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:39.121 20:31:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:39.121 20:31:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:39.121 20:31:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:39.121 20:31:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:39.381 20:31:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:39.381 "name": "Existed_Raid", 00:17:39.381 "uuid": "2808f6ba-d1fc-4e69-af1f-e1c75e802373", 00:17:39.381 "strip_size_kb": 0, 00:17:39.381 "state": "configuring", 00:17:39.381 "raid_level": "raid1", 00:17:39.381 "superblock": true, 00:17:39.381 "num_base_bdevs": 3, 00:17:39.381 "num_base_bdevs_discovered": 1, 00:17:39.381 "num_base_bdevs_operational": 3, 00:17:39.381 "base_bdevs_list": [ 00:17:39.381 { 00:17:39.381 "name": "BaseBdev1", 00:17:39.381 "uuid": "c901f625-2404-4cbf-9b68-3fbd77968ed9", 00:17:39.381 "is_configured": true, 00:17:39.381 "data_offset": 2048, 00:17:39.381 "data_size": 63488 00:17:39.381 }, 00:17:39.381 { 00:17:39.381 "name": null, 00:17:39.381 "uuid": "1e6de215-0239-4278-9869-2e05b0e3a893", 00:17:39.381 "is_configured": false, 00:17:39.381 "data_offset": 2048, 00:17:39.381 "data_size": 63488 00:17:39.381 }, 00:17:39.381 { 00:17:39.381 "name": null, 00:17:39.381 "uuid": "3c69dcab-93d1-4f21-a97f-e04b02cc1abc", 00:17:39.381 "is_configured": false, 00:17:39.381 "data_offset": 2048, 00:17:39.381 "data_size": 63488 00:17:39.381 } 00:17:39.381 ] 00:17:39.381 }' 00:17:39.381 20:31:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # 
xtrace_disable 00:17:39.381 20:31:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:39.950 20:31:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:39.950 20:31:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:17:40.209 20:31:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:17:40.209 20:31:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:17:40.468 [2024-07-15 20:31:32.639097] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:40.468 20:31:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:40.468 20:31:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:40.468 20:31:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:40.468 20:31:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:40.468 20:31:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:40.468 20:31:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:40.468 20:31:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:40.468 20:31:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:40.468 20:31:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:17:40.468 20:31:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:40.468 20:31:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:40.468 20:31:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:40.728 20:31:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:40.728 "name": "Existed_Raid", 00:17:40.728 "uuid": "2808f6ba-d1fc-4e69-af1f-e1c75e802373", 00:17:40.728 "strip_size_kb": 0, 00:17:40.728 "state": "configuring", 00:17:40.728 "raid_level": "raid1", 00:17:40.728 "superblock": true, 00:17:40.728 "num_base_bdevs": 3, 00:17:40.728 "num_base_bdevs_discovered": 2, 00:17:40.728 "num_base_bdevs_operational": 3, 00:17:40.728 "base_bdevs_list": [ 00:17:40.728 { 00:17:40.728 "name": "BaseBdev1", 00:17:40.728 "uuid": "c901f625-2404-4cbf-9b68-3fbd77968ed9", 00:17:40.728 "is_configured": true, 00:17:40.728 "data_offset": 2048, 00:17:40.728 "data_size": 63488 00:17:40.728 }, 00:17:40.728 { 00:17:40.728 "name": null, 00:17:40.728 "uuid": "1e6de215-0239-4278-9869-2e05b0e3a893", 00:17:40.728 "is_configured": false, 00:17:40.728 "data_offset": 2048, 00:17:40.728 "data_size": 63488 00:17:40.728 }, 00:17:40.728 { 00:17:40.728 "name": "BaseBdev3", 00:17:40.728 "uuid": "3c69dcab-93d1-4f21-a97f-e04b02cc1abc", 00:17:40.728 "is_configured": true, 00:17:40.728 "data_offset": 2048, 00:17:40.728 "data_size": 63488 00:17:40.728 } 00:17:40.728 ] 00:17:40.728 }' 00:17:40.728 20:31:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:40.728 20:31:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:41.293 20:31:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:41.293 20:31:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:17:41.551 20:31:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:17:41.551 20:31:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:17:41.809 [2024-07-15 20:31:33.986685] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:41.809 20:31:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:41.809 20:31:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:41.809 20:31:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:41.809 20:31:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:41.809 20:31:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:41.809 20:31:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:41.809 20:31:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:41.809 20:31:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:41.809 20:31:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:41.809 20:31:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:41.809 20:31:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:41.809 20:31:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:42.068 20:31:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:42.068 "name": "Existed_Raid", 00:17:42.068 "uuid": "2808f6ba-d1fc-4e69-af1f-e1c75e802373", 00:17:42.068 "strip_size_kb": 0, 00:17:42.068 "state": "configuring", 00:17:42.068 "raid_level": "raid1", 00:17:42.068 "superblock": true, 00:17:42.068 "num_base_bdevs": 3, 00:17:42.068 "num_base_bdevs_discovered": 1, 00:17:42.068 "num_base_bdevs_operational": 3, 00:17:42.068 "base_bdevs_list": [ 00:17:42.068 { 00:17:42.068 "name": null, 00:17:42.068 "uuid": "c901f625-2404-4cbf-9b68-3fbd77968ed9", 00:17:42.068 "is_configured": false, 00:17:42.068 "data_offset": 2048, 00:17:42.068 "data_size": 63488 00:17:42.068 }, 00:17:42.068 { 00:17:42.068 "name": null, 00:17:42.068 "uuid": "1e6de215-0239-4278-9869-2e05b0e3a893", 00:17:42.068 "is_configured": false, 00:17:42.068 "data_offset": 2048, 00:17:42.068 "data_size": 63488 00:17:42.068 }, 00:17:42.068 { 00:17:42.068 "name": "BaseBdev3", 00:17:42.068 "uuid": "3c69dcab-93d1-4f21-a97f-e04b02cc1abc", 00:17:42.068 "is_configured": true, 00:17:42.068 "data_offset": 2048, 00:17:42.068 "data_size": 63488 00:17:42.068 } 00:17:42.068 ] 00:17:42.068 }' 00:17:42.068 20:31:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:42.068 20:31:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:42.635 20:31:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:42.635 20:31:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq 
'.[0].base_bdevs_list[0].is_configured' 00:17:42.893 20:31:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:17:42.893 20:31:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:17:43.151 [2024-07-15 20:31:35.321963] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:43.151 20:31:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:43.151 20:31:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:43.151 20:31:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:43.151 20:31:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:43.151 20:31:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:43.151 20:31:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:43.151 20:31:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:43.151 20:31:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:43.151 20:31:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:43.151 20:31:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:43.151 20:31:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:43.151 20:31:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] 
| select(.name == "Existed_Raid")' 00:17:43.409 20:31:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:43.409 "name": "Existed_Raid", 00:17:43.409 "uuid": "2808f6ba-d1fc-4e69-af1f-e1c75e802373", 00:17:43.409 "strip_size_kb": 0, 00:17:43.409 "state": "configuring", 00:17:43.410 "raid_level": "raid1", 00:17:43.410 "superblock": true, 00:17:43.410 "num_base_bdevs": 3, 00:17:43.410 "num_base_bdevs_discovered": 2, 00:17:43.410 "num_base_bdevs_operational": 3, 00:17:43.410 "base_bdevs_list": [ 00:17:43.410 { 00:17:43.410 "name": null, 00:17:43.410 "uuid": "c901f625-2404-4cbf-9b68-3fbd77968ed9", 00:17:43.410 "is_configured": false, 00:17:43.410 "data_offset": 2048, 00:17:43.410 "data_size": 63488 00:17:43.410 }, 00:17:43.410 { 00:17:43.410 "name": "BaseBdev2", 00:17:43.410 "uuid": "1e6de215-0239-4278-9869-2e05b0e3a893", 00:17:43.410 "is_configured": true, 00:17:43.410 "data_offset": 2048, 00:17:43.410 "data_size": 63488 00:17:43.410 }, 00:17:43.410 { 00:17:43.410 "name": "BaseBdev3", 00:17:43.410 "uuid": "3c69dcab-93d1-4f21-a97f-e04b02cc1abc", 00:17:43.410 "is_configured": true, 00:17:43.410 "data_offset": 2048, 00:17:43.410 "data_size": 63488 00:17:43.410 } 00:17:43.410 ] 00:17:43.410 }' 00:17:43.410 20:31:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:43.410 20:31:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:43.976 20:31:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:43.976 20:31:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:17:44.234 20:31:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:17:44.234 20:31:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:44.234 20:31:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:17:44.492 20:31:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u c901f625-2404-4cbf-9b68-3fbd77968ed9 00:17:44.750 [2024-07-15 20:31:36.903106] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:17:44.750 [2024-07-15 20:31:36.903285] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x21ed1b0 00:17:44.750 [2024-07-15 20:31:36.903299] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:44.750 [2024-07-15 20:31:36.903489] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x23a94f0 00:17:44.750 [2024-07-15 20:31:36.903623] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x21ed1b0 00:17:44.750 [2024-07-15 20:31:36.903633] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x21ed1b0 00:17:44.750 [2024-07-15 20:31:36.903734] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:44.750 NewBaseBdev 00:17:44.750 20:31:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:17:44.750 20:31:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:17:44.750 20:31:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:44.750 20:31:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:17:44.750 20:31:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:44.750 
20:31:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:44.750 20:31:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:45.008 20:31:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:17:45.266 [ 00:17:45.266 { 00:17:45.266 "name": "NewBaseBdev", 00:17:45.266 "aliases": [ 00:17:45.266 "c901f625-2404-4cbf-9b68-3fbd77968ed9" 00:17:45.266 ], 00:17:45.266 "product_name": "Malloc disk", 00:17:45.266 "block_size": 512, 00:17:45.266 "num_blocks": 65536, 00:17:45.266 "uuid": "c901f625-2404-4cbf-9b68-3fbd77968ed9", 00:17:45.266 "assigned_rate_limits": { 00:17:45.266 "rw_ios_per_sec": 0, 00:17:45.266 "rw_mbytes_per_sec": 0, 00:17:45.266 "r_mbytes_per_sec": 0, 00:17:45.266 "w_mbytes_per_sec": 0 00:17:45.266 }, 00:17:45.266 "claimed": true, 00:17:45.266 "claim_type": "exclusive_write", 00:17:45.266 "zoned": false, 00:17:45.266 "supported_io_types": { 00:17:45.266 "read": true, 00:17:45.266 "write": true, 00:17:45.266 "unmap": true, 00:17:45.266 "flush": true, 00:17:45.266 "reset": true, 00:17:45.266 "nvme_admin": false, 00:17:45.266 "nvme_io": false, 00:17:45.266 "nvme_io_md": false, 00:17:45.266 "write_zeroes": true, 00:17:45.266 "zcopy": true, 00:17:45.266 "get_zone_info": false, 00:17:45.266 "zone_management": false, 00:17:45.266 "zone_append": false, 00:17:45.266 "compare": false, 00:17:45.266 "compare_and_write": false, 00:17:45.266 "abort": true, 00:17:45.266 "seek_hole": false, 00:17:45.266 "seek_data": false, 00:17:45.266 "copy": true, 00:17:45.266 "nvme_iov_md": false 00:17:45.266 }, 00:17:45.266 "memory_domains": [ 00:17:45.266 { 00:17:45.266 "dma_device_id": "system", 00:17:45.266 "dma_device_type": 1 00:17:45.266 
}, 00:17:45.266 { 00:17:45.266 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:45.266 "dma_device_type": 2 00:17:45.266 } 00:17:45.266 ], 00:17:45.266 "driver_specific": {} 00:17:45.266 } 00:17:45.266 ] 00:17:45.266 20:31:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:17:45.266 20:31:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:17:45.266 20:31:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:45.266 20:31:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:45.266 20:31:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:45.266 20:31:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:45.266 20:31:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:45.266 20:31:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:45.266 20:31:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:45.266 20:31:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:45.266 20:31:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:45.266 20:31:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:45.266 20:31:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:45.524 20:31:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:45.524 "name": "Existed_Raid", 00:17:45.524 "uuid": 
"2808f6ba-d1fc-4e69-af1f-e1c75e802373", 00:17:45.524 "strip_size_kb": 0, 00:17:45.524 "state": "online", 00:17:45.524 "raid_level": "raid1", 00:17:45.524 "superblock": true, 00:17:45.524 "num_base_bdevs": 3, 00:17:45.524 "num_base_bdevs_discovered": 3, 00:17:45.524 "num_base_bdevs_operational": 3, 00:17:45.524 "base_bdevs_list": [ 00:17:45.524 { 00:17:45.524 "name": "NewBaseBdev", 00:17:45.524 "uuid": "c901f625-2404-4cbf-9b68-3fbd77968ed9", 00:17:45.524 "is_configured": true, 00:17:45.524 "data_offset": 2048, 00:17:45.524 "data_size": 63488 00:17:45.524 }, 00:17:45.524 { 00:17:45.524 "name": "BaseBdev2", 00:17:45.524 "uuid": "1e6de215-0239-4278-9869-2e05b0e3a893", 00:17:45.524 "is_configured": true, 00:17:45.524 "data_offset": 2048, 00:17:45.524 "data_size": 63488 00:17:45.524 }, 00:17:45.524 { 00:17:45.524 "name": "BaseBdev3", 00:17:45.524 "uuid": "3c69dcab-93d1-4f21-a97f-e04b02cc1abc", 00:17:45.524 "is_configured": true, 00:17:45.524 "data_offset": 2048, 00:17:45.524 "data_size": 63488 00:17:45.524 } 00:17:45.524 ] 00:17:45.524 }' 00:17:45.524 20:31:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:45.524 20:31:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:46.091 20:31:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:17:46.091 20:31:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:17:46.091 20:31:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:46.091 20:31:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:46.091 20:31:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:46.091 20:31:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:17:46.091 20:31:38 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:17:46.091 20:31:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:46.349 [2024-07-15 20:31:38.483585] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:46.349 20:31:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:46.349 "name": "Existed_Raid", 00:17:46.349 "aliases": [ 00:17:46.349 "2808f6ba-d1fc-4e69-af1f-e1c75e802373" 00:17:46.349 ], 00:17:46.349 "product_name": "Raid Volume", 00:17:46.349 "block_size": 512, 00:17:46.349 "num_blocks": 63488, 00:17:46.349 "uuid": "2808f6ba-d1fc-4e69-af1f-e1c75e802373", 00:17:46.349 "assigned_rate_limits": { 00:17:46.349 "rw_ios_per_sec": 0, 00:17:46.349 "rw_mbytes_per_sec": 0, 00:17:46.349 "r_mbytes_per_sec": 0, 00:17:46.349 "w_mbytes_per_sec": 0 00:17:46.349 }, 00:17:46.349 "claimed": false, 00:17:46.349 "zoned": false, 00:17:46.349 "supported_io_types": { 00:17:46.349 "read": true, 00:17:46.349 "write": true, 00:17:46.349 "unmap": false, 00:17:46.349 "flush": false, 00:17:46.349 "reset": true, 00:17:46.349 "nvme_admin": false, 00:17:46.349 "nvme_io": false, 00:17:46.349 "nvme_io_md": false, 00:17:46.349 "write_zeroes": true, 00:17:46.349 "zcopy": false, 00:17:46.349 "get_zone_info": false, 00:17:46.349 "zone_management": false, 00:17:46.349 "zone_append": false, 00:17:46.349 "compare": false, 00:17:46.349 "compare_and_write": false, 00:17:46.349 "abort": false, 00:17:46.349 "seek_hole": false, 00:17:46.349 "seek_data": false, 00:17:46.349 "copy": false, 00:17:46.349 "nvme_iov_md": false 00:17:46.349 }, 00:17:46.349 "memory_domains": [ 00:17:46.349 { 00:17:46.349 "dma_device_id": "system", 00:17:46.349 "dma_device_type": 1 00:17:46.349 }, 00:17:46.349 { 00:17:46.349 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:46.349 
"dma_device_type": 2 00:17:46.349 }, 00:17:46.349 { 00:17:46.349 "dma_device_id": "system", 00:17:46.349 "dma_device_type": 1 00:17:46.349 }, 00:17:46.349 { 00:17:46.349 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:46.349 "dma_device_type": 2 00:17:46.349 }, 00:17:46.349 { 00:17:46.349 "dma_device_id": "system", 00:17:46.349 "dma_device_type": 1 00:17:46.349 }, 00:17:46.349 { 00:17:46.349 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:46.349 "dma_device_type": 2 00:17:46.349 } 00:17:46.349 ], 00:17:46.349 "driver_specific": { 00:17:46.349 "raid": { 00:17:46.349 "uuid": "2808f6ba-d1fc-4e69-af1f-e1c75e802373", 00:17:46.349 "strip_size_kb": 0, 00:17:46.349 "state": "online", 00:17:46.349 "raid_level": "raid1", 00:17:46.349 "superblock": true, 00:17:46.349 "num_base_bdevs": 3, 00:17:46.349 "num_base_bdevs_discovered": 3, 00:17:46.349 "num_base_bdevs_operational": 3, 00:17:46.349 "base_bdevs_list": [ 00:17:46.349 { 00:17:46.349 "name": "NewBaseBdev", 00:17:46.349 "uuid": "c901f625-2404-4cbf-9b68-3fbd77968ed9", 00:17:46.349 "is_configured": true, 00:17:46.349 "data_offset": 2048, 00:17:46.349 "data_size": 63488 00:17:46.349 }, 00:17:46.349 { 00:17:46.349 "name": "BaseBdev2", 00:17:46.349 "uuid": "1e6de215-0239-4278-9869-2e05b0e3a893", 00:17:46.349 "is_configured": true, 00:17:46.349 "data_offset": 2048, 00:17:46.349 "data_size": 63488 00:17:46.349 }, 00:17:46.349 { 00:17:46.349 "name": "BaseBdev3", 00:17:46.349 "uuid": "3c69dcab-93d1-4f21-a97f-e04b02cc1abc", 00:17:46.349 "is_configured": true, 00:17:46.349 "data_offset": 2048, 00:17:46.349 "data_size": 63488 00:17:46.349 } 00:17:46.349 ] 00:17:46.349 } 00:17:46.349 } 00:17:46.350 }' 00:17:46.350 20:31:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:46.350 20:31:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:17:46.350 BaseBdev2 00:17:46.350 
BaseBdev3' 00:17:46.350 20:31:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:46.350 20:31:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:17:46.350 20:31:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:46.633 20:31:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:46.633 "name": "NewBaseBdev", 00:17:46.633 "aliases": [ 00:17:46.633 "c901f625-2404-4cbf-9b68-3fbd77968ed9" 00:17:46.633 ], 00:17:46.633 "product_name": "Malloc disk", 00:17:46.633 "block_size": 512, 00:17:46.633 "num_blocks": 65536, 00:17:46.633 "uuid": "c901f625-2404-4cbf-9b68-3fbd77968ed9", 00:17:46.633 "assigned_rate_limits": { 00:17:46.633 "rw_ios_per_sec": 0, 00:17:46.633 "rw_mbytes_per_sec": 0, 00:17:46.633 "r_mbytes_per_sec": 0, 00:17:46.633 "w_mbytes_per_sec": 0 00:17:46.633 }, 00:17:46.633 "claimed": true, 00:17:46.634 "claim_type": "exclusive_write", 00:17:46.634 "zoned": false, 00:17:46.634 "supported_io_types": { 00:17:46.634 "read": true, 00:17:46.634 "write": true, 00:17:46.634 "unmap": true, 00:17:46.634 "flush": true, 00:17:46.634 "reset": true, 00:17:46.634 "nvme_admin": false, 00:17:46.634 "nvme_io": false, 00:17:46.634 "nvme_io_md": false, 00:17:46.634 "write_zeroes": true, 00:17:46.634 "zcopy": true, 00:17:46.634 "get_zone_info": false, 00:17:46.634 "zone_management": false, 00:17:46.634 "zone_append": false, 00:17:46.634 "compare": false, 00:17:46.634 "compare_and_write": false, 00:17:46.634 "abort": true, 00:17:46.634 "seek_hole": false, 00:17:46.634 "seek_data": false, 00:17:46.634 "copy": true, 00:17:46.634 "nvme_iov_md": false 00:17:46.634 }, 00:17:46.634 "memory_domains": [ 00:17:46.634 { 00:17:46.634 "dma_device_id": "system", 00:17:46.634 "dma_device_type": 1 00:17:46.634 }, 00:17:46.634 { 
00:17:46.634 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:46.634 "dma_device_type": 2 00:17:46.634 } 00:17:46.634 ], 00:17:46.634 "driver_specific": {} 00:17:46.634 }' 00:17:46.634 20:31:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:46.634 20:31:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:46.634 20:31:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:46.634 20:31:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:46.634 20:31:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:46.634 20:31:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:46.634 20:31:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:46.892 20:31:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:46.892 20:31:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:46.893 20:31:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:46.893 20:31:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:46.893 20:31:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:46.893 20:31:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:46.893 20:31:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:17:46.893 20:31:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:47.151 20:31:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:47.151 "name": 
"BaseBdev2", 00:17:47.151 "aliases": [ 00:17:47.151 "1e6de215-0239-4278-9869-2e05b0e3a893" 00:17:47.151 ], 00:17:47.151 "product_name": "Malloc disk", 00:17:47.151 "block_size": 512, 00:17:47.151 "num_blocks": 65536, 00:17:47.151 "uuid": "1e6de215-0239-4278-9869-2e05b0e3a893", 00:17:47.151 "assigned_rate_limits": { 00:17:47.151 "rw_ios_per_sec": 0, 00:17:47.151 "rw_mbytes_per_sec": 0, 00:17:47.151 "r_mbytes_per_sec": 0, 00:17:47.151 "w_mbytes_per_sec": 0 00:17:47.151 }, 00:17:47.151 "claimed": true, 00:17:47.151 "claim_type": "exclusive_write", 00:17:47.151 "zoned": false, 00:17:47.151 "supported_io_types": { 00:17:47.151 "read": true, 00:17:47.151 "write": true, 00:17:47.151 "unmap": true, 00:17:47.151 "flush": true, 00:17:47.151 "reset": true, 00:17:47.151 "nvme_admin": false, 00:17:47.151 "nvme_io": false, 00:17:47.151 "nvme_io_md": false, 00:17:47.151 "write_zeroes": true, 00:17:47.151 "zcopy": true, 00:17:47.151 "get_zone_info": false, 00:17:47.151 "zone_management": false, 00:17:47.151 "zone_append": false, 00:17:47.151 "compare": false, 00:17:47.151 "compare_and_write": false, 00:17:47.151 "abort": true, 00:17:47.151 "seek_hole": false, 00:17:47.151 "seek_data": false, 00:17:47.151 "copy": true, 00:17:47.151 "nvme_iov_md": false 00:17:47.151 }, 00:17:47.151 "memory_domains": [ 00:17:47.151 { 00:17:47.151 "dma_device_id": "system", 00:17:47.151 "dma_device_type": 1 00:17:47.151 }, 00:17:47.151 { 00:17:47.151 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:47.151 "dma_device_type": 2 00:17:47.151 } 00:17:47.151 ], 00:17:47.151 "driver_specific": {} 00:17:47.151 }' 00:17:47.151 20:31:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:47.151 20:31:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:47.151 20:31:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:47.151 20:31:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 
-- # jq .md_size 00:17:47.151 20:31:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:47.410 20:31:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:47.410 20:31:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:47.410 20:31:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:47.410 20:31:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:47.410 20:31:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:47.410 20:31:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:47.410 20:31:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:47.410 20:31:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:47.410 20:31:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:47.410 20:31:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:17:47.670 20:31:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:47.670 "name": "BaseBdev3", 00:17:47.670 "aliases": [ 00:17:47.670 "3c69dcab-93d1-4f21-a97f-e04b02cc1abc" 00:17:47.670 ], 00:17:47.670 "product_name": "Malloc disk", 00:17:47.670 "block_size": 512, 00:17:47.670 "num_blocks": 65536, 00:17:47.670 "uuid": "3c69dcab-93d1-4f21-a97f-e04b02cc1abc", 00:17:47.670 "assigned_rate_limits": { 00:17:47.670 "rw_ios_per_sec": 0, 00:17:47.670 "rw_mbytes_per_sec": 0, 00:17:47.670 "r_mbytes_per_sec": 0, 00:17:47.670 "w_mbytes_per_sec": 0 00:17:47.670 }, 00:17:47.670 "claimed": true, 00:17:47.670 "claim_type": "exclusive_write", 00:17:47.670 "zoned": 
false, 00:17:47.670 "supported_io_types": { 00:17:47.670 "read": true, 00:17:47.670 "write": true, 00:17:47.670 "unmap": true, 00:17:47.670 "flush": true, 00:17:47.670 "reset": true, 00:17:47.670 "nvme_admin": false, 00:17:47.670 "nvme_io": false, 00:17:47.670 "nvme_io_md": false, 00:17:47.670 "write_zeroes": true, 00:17:47.670 "zcopy": true, 00:17:47.670 "get_zone_info": false, 00:17:47.670 "zone_management": false, 00:17:47.670 "zone_append": false, 00:17:47.670 "compare": false, 00:17:47.670 "compare_and_write": false, 00:17:47.670 "abort": true, 00:17:47.670 "seek_hole": false, 00:17:47.670 "seek_data": false, 00:17:47.670 "copy": true, 00:17:47.670 "nvme_iov_md": false 00:17:47.670 }, 00:17:47.670 "memory_domains": [ 00:17:47.670 { 00:17:47.670 "dma_device_id": "system", 00:17:47.670 "dma_device_type": 1 00:17:47.670 }, 00:17:47.670 { 00:17:47.670 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:47.670 "dma_device_type": 2 00:17:47.670 } 00:17:47.670 ], 00:17:47.670 "driver_specific": {} 00:17:47.670 }' 00:17:47.670 20:31:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:47.670 20:31:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:47.670 20:31:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:47.670 20:31:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:47.670 20:31:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:47.670 20:31:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:47.670 20:31:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:47.929 20:31:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:47.929 20:31:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:47.929 20:31:40 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:47.929 20:31:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:47.929 20:31:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:47.929 20:31:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:48.188 [2024-07-15 20:31:40.444520] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:48.188 [2024-07-15 20:31:40.444548] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:48.188 [2024-07-15 20:31:40.444607] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:48.188 [2024-07-15 20:31:40.444896] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:48.188 [2024-07-15 20:31:40.444909] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x21ed1b0 name Existed_Raid, state offline 00:17:48.188 20:31:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 1402327 00:17:48.188 20:31:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 1402327 ']' 00:17:48.188 20:31:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 1402327 00:17:48.188 20:31:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:17:48.188 20:31:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:17:48.188 20:31:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1402327 00:17:48.188 20:31:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 
00:17:48.188 20:31:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:17:48.188 20:31:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1402327' 00:17:48.188 killing process with pid 1402327 00:17:48.188 20:31:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 1402327 00:17:48.188 [2024-07-15 20:31:40.523267] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:48.188 20:31:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 1402327 00:17:48.447 [2024-07-15 20:31:40.576937] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:48.706 20:31:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:17:48.706 00:17:48.706 real 0m33.679s 00:17:48.706 user 1m1.551s 00:17:48.706 sys 0m5.804s 00:17:48.706 20:31:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:17:48.706 20:31:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:48.706 ************************************ 00:17:48.707 END TEST raid_state_function_test_sb 00:17:48.707 ************************************ 00:17:48.707 20:31:41 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:17:48.707 20:31:41 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid1 3 00:17:48.707 20:31:41 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:17:48.707 20:31:41 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:17:48.707 20:31:41 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:48.707 ************************************ 00:17:48.707 START TEST raid_superblock_test 00:17:48.707 ************************************ 00:17:48.707 20:31:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # 
raid_superblock_test raid1 3 00:17:48.707 20:31:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:17:48.707 20:31:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=3 00:17:48.707 20:31:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:17:48.707 20:31:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:17:48.707 20:31:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:17:48.707 20:31:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:17:48.707 20:31:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:17:48.707 20:31:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:17:48.707 20:31:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:17:48.707 20:31:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:17:48.707 20:31:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:17:48.707 20:31:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:17:48.707 20:31:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:17:48.707 20:31:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:17:48.707 20:31:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:17:48.707 20:31:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=1407318 00:17:48.707 20:31:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 1407318 /var/tmp/spdk-raid.sock 00:17:48.707 20:31:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r 
/var/tmp/spdk-raid.sock -L bdev_raid 00:17:48.707 20:31:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 1407318 ']' 00:17:48.707 20:31:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:48.707 20:31:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:17:48.707 20:31:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:48.707 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:48.707 20:31:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:17:48.707 20:31:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:48.966 [2024-07-15 20:31:41.131240] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
00:17:48.966 [2024-07-15 20:31:41.131313] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1407318 ] 00:17:48.966 [2024-07-15 20:31:41.261287] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:49.225 [2024-07-15 20:31:41.364159] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:17:49.225 [2024-07-15 20:31:41.418700] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:49.225 [2024-07-15 20:31:41.418734] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:49.794 20:31:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:17:49.794 20:31:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:17:49.794 20:31:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:17:49.794 20:31:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:17:49.794 20:31:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:17:49.794 20:31:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:17:49.794 20:31:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:17:49.794 20:31:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:17:49.794 20:31:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:17:49.794 20:31:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:17:49.794 20:31:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:17:50.053 malloc1 00:17:50.053 20:31:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:17:50.312 [2024-07-15 20:31:42.533780] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:17:50.312 [2024-07-15 20:31:42.533827] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:50.312 [2024-07-15 20:31:42.533849] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1efc570 00:17:50.312 [2024-07-15 20:31:42.533861] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:50.312 [2024-07-15 20:31:42.535617] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:50.312 [2024-07-15 20:31:42.535646] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:17:50.313 pt1 00:17:50.313 20:31:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:17:50.313 20:31:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:17:50.313 20:31:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:17:50.313 20:31:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:17:50.313 20:31:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:17:50.313 20:31:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:17:50.313 20:31:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:17:50.313 20:31:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:17:50.313 20:31:42 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:17:50.572 malloc2 00:17:50.572 20:31:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:17:50.831 [2024-07-15 20:31:43.029099] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:17:50.831 [2024-07-15 20:31:43.029144] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:50.831 [2024-07-15 20:31:43.029162] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1efd970 00:17:50.831 [2024-07-15 20:31:43.029175] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:50.831 [2024-07-15 20:31:43.030788] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:50.831 [2024-07-15 20:31:43.030815] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:17:50.831 pt2 00:17:50.831 20:31:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:17:50.831 20:31:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:17:50.831 20:31:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:17:50.831 20:31:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:17:50.831 20:31:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:17:50.831 20:31:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:17:50.831 20:31:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:17:50.831 20:31:43 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:17:50.831 20:31:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:17:51.095 malloc3 00:17:51.095 20:31:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:17:51.355 [2024-07-15 20:31:43.514995] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:17:51.355 [2024-07-15 20:31:43.515042] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:51.355 [2024-07-15 20:31:43.515061] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2094340 00:17:51.355 [2024-07-15 20:31:43.515074] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:51.355 [2024-07-15 20:31:43.516605] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:51.355 [2024-07-15 20:31:43.516633] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:17:51.355 pt3 00:17:51.355 20:31:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:17:51.355 20:31:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:17:51.355 20:31:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:17:51.614 [2024-07-15 20:31:43.755641] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:17:51.614 [2024-07-15 20:31:43.756974] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 
00:17:51.614 [2024-07-15 20:31:43.757029] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:17:51.614 [2024-07-15 20:31:43.757181] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1ef4ea0 00:17:51.614 [2024-07-15 20:31:43.757193] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:51.614 [2024-07-15 20:31:43.757395] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1efc240 00:17:51.614 [2024-07-15 20:31:43.757544] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1ef4ea0 00:17:51.614 [2024-07-15 20:31:43.757555] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1ef4ea0 00:17:51.614 [2024-07-15 20:31:43.757653] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:51.614 20:31:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:17:51.614 20:31:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:51.614 20:31:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:51.614 20:31:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:51.614 20:31:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:51.614 20:31:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:51.614 20:31:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:51.614 20:31:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:51.614 20:31:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:51.614 20:31:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:51.614 20:31:43 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:51.614 20:31:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:51.873 20:31:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:51.873 "name": "raid_bdev1", 00:17:51.873 "uuid": "0aeb0bd9-2185-4440-9084-00c436447ae1", 00:17:51.873 "strip_size_kb": 0, 00:17:51.873 "state": "online", 00:17:51.873 "raid_level": "raid1", 00:17:51.873 "superblock": true, 00:17:51.873 "num_base_bdevs": 3, 00:17:51.873 "num_base_bdevs_discovered": 3, 00:17:51.873 "num_base_bdevs_operational": 3, 00:17:51.873 "base_bdevs_list": [ 00:17:51.873 { 00:17:51.873 "name": "pt1", 00:17:51.873 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:51.873 "is_configured": true, 00:17:51.873 "data_offset": 2048, 00:17:51.873 "data_size": 63488 00:17:51.873 }, 00:17:51.873 { 00:17:51.873 "name": "pt2", 00:17:51.873 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:51.873 "is_configured": true, 00:17:51.873 "data_offset": 2048, 00:17:51.873 "data_size": 63488 00:17:51.873 }, 00:17:51.873 { 00:17:51.873 "name": "pt3", 00:17:51.873 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:51.873 "is_configured": true, 00:17:51.873 "data_offset": 2048, 00:17:51.873 "data_size": 63488 00:17:51.873 } 00:17:51.873 ] 00:17:51.873 }' 00:17:51.873 20:31:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:51.873 20:31:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:52.441 20:31:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:17:52.441 20:31:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:17:52.441 20:31:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 
-- # local raid_bdev_info 00:17:52.441 20:31:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:52.441 20:31:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:52.441 20:31:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:17:52.441 20:31:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:52.441 20:31:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:52.441 [2024-07-15 20:31:44.794640] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:52.700 20:31:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:52.700 "name": "raid_bdev1", 00:17:52.700 "aliases": [ 00:17:52.700 "0aeb0bd9-2185-4440-9084-00c436447ae1" 00:17:52.700 ], 00:17:52.700 "product_name": "Raid Volume", 00:17:52.700 "block_size": 512, 00:17:52.700 "num_blocks": 63488, 00:17:52.700 "uuid": "0aeb0bd9-2185-4440-9084-00c436447ae1", 00:17:52.700 "assigned_rate_limits": { 00:17:52.700 "rw_ios_per_sec": 0, 00:17:52.700 "rw_mbytes_per_sec": 0, 00:17:52.700 "r_mbytes_per_sec": 0, 00:17:52.700 "w_mbytes_per_sec": 0 00:17:52.700 }, 00:17:52.700 "claimed": false, 00:17:52.700 "zoned": false, 00:17:52.700 "supported_io_types": { 00:17:52.700 "read": true, 00:17:52.700 "write": true, 00:17:52.700 "unmap": false, 00:17:52.700 "flush": false, 00:17:52.700 "reset": true, 00:17:52.700 "nvme_admin": false, 00:17:52.700 "nvme_io": false, 00:17:52.700 "nvme_io_md": false, 00:17:52.700 "write_zeroes": true, 00:17:52.700 "zcopy": false, 00:17:52.700 "get_zone_info": false, 00:17:52.700 "zone_management": false, 00:17:52.700 "zone_append": false, 00:17:52.700 "compare": false, 00:17:52.700 "compare_and_write": false, 00:17:52.700 "abort": false, 00:17:52.700 "seek_hole": false, 
00:17:52.700 "seek_data": false, 00:17:52.700 "copy": false, 00:17:52.700 "nvme_iov_md": false 00:17:52.700 }, 00:17:52.700 "memory_domains": [ 00:17:52.700 { 00:17:52.700 "dma_device_id": "system", 00:17:52.700 "dma_device_type": 1 00:17:52.700 }, 00:17:52.700 { 00:17:52.700 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:52.700 "dma_device_type": 2 00:17:52.700 }, 00:17:52.700 { 00:17:52.700 "dma_device_id": "system", 00:17:52.700 "dma_device_type": 1 00:17:52.700 }, 00:17:52.700 { 00:17:52.700 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:52.700 "dma_device_type": 2 00:17:52.700 }, 00:17:52.700 { 00:17:52.700 "dma_device_id": "system", 00:17:52.700 "dma_device_type": 1 00:17:52.700 }, 00:17:52.700 { 00:17:52.700 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:52.700 "dma_device_type": 2 00:17:52.700 } 00:17:52.700 ], 00:17:52.700 "driver_specific": { 00:17:52.700 "raid": { 00:17:52.700 "uuid": "0aeb0bd9-2185-4440-9084-00c436447ae1", 00:17:52.700 "strip_size_kb": 0, 00:17:52.700 "state": "online", 00:17:52.700 "raid_level": "raid1", 00:17:52.700 "superblock": true, 00:17:52.700 "num_base_bdevs": 3, 00:17:52.700 "num_base_bdevs_discovered": 3, 00:17:52.700 "num_base_bdevs_operational": 3, 00:17:52.700 "base_bdevs_list": [ 00:17:52.700 { 00:17:52.700 "name": "pt1", 00:17:52.700 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:52.700 "is_configured": true, 00:17:52.700 "data_offset": 2048, 00:17:52.700 "data_size": 63488 00:17:52.700 }, 00:17:52.700 { 00:17:52.700 "name": "pt2", 00:17:52.700 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:52.700 "is_configured": true, 00:17:52.700 "data_offset": 2048, 00:17:52.700 "data_size": 63488 00:17:52.700 }, 00:17:52.700 { 00:17:52.700 "name": "pt3", 00:17:52.700 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:52.700 "is_configured": true, 00:17:52.700 "data_offset": 2048, 00:17:52.700 "data_size": 63488 00:17:52.700 } 00:17:52.700 ] 00:17:52.700 } 00:17:52.700 } 00:17:52.700 }' 00:17:52.700 20:31:44 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:52.700 20:31:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:17:52.700 pt2 00:17:52.700 pt3' 00:17:52.700 20:31:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:52.700 20:31:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:17:52.700 20:31:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:52.960 20:31:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:52.960 "name": "pt1", 00:17:52.960 "aliases": [ 00:17:52.960 "00000000-0000-0000-0000-000000000001" 00:17:52.960 ], 00:17:52.960 "product_name": "passthru", 00:17:52.960 "block_size": 512, 00:17:52.960 "num_blocks": 65536, 00:17:52.960 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:52.960 "assigned_rate_limits": { 00:17:52.960 "rw_ios_per_sec": 0, 00:17:52.960 "rw_mbytes_per_sec": 0, 00:17:52.960 "r_mbytes_per_sec": 0, 00:17:52.960 "w_mbytes_per_sec": 0 00:17:52.960 }, 00:17:52.960 "claimed": true, 00:17:52.960 "claim_type": "exclusive_write", 00:17:52.960 "zoned": false, 00:17:52.960 "supported_io_types": { 00:17:52.960 "read": true, 00:17:52.960 "write": true, 00:17:52.960 "unmap": true, 00:17:52.960 "flush": true, 00:17:52.960 "reset": true, 00:17:52.960 "nvme_admin": false, 00:17:52.960 "nvme_io": false, 00:17:52.960 "nvme_io_md": false, 00:17:52.960 "write_zeroes": true, 00:17:52.960 "zcopy": true, 00:17:52.960 "get_zone_info": false, 00:17:52.960 "zone_management": false, 00:17:52.960 "zone_append": false, 00:17:52.960 "compare": false, 00:17:52.960 "compare_and_write": false, 00:17:52.960 "abort": true, 00:17:52.960 "seek_hole": false, 00:17:52.960 "seek_data": false, 
00:17:52.960 "copy": true, 00:17:52.960 "nvme_iov_md": false 00:17:52.960 }, 00:17:52.960 "memory_domains": [ 00:17:52.960 { 00:17:52.960 "dma_device_id": "system", 00:17:52.960 "dma_device_type": 1 00:17:52.960 }, 00:17:52.960 { 00:17:52.960 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:52.960 "dma_device_type": 2 00:17:52.960 } 00:17:52.960 ], 00:17:52.960 "driver_specific": { 00:17:52.960 "passthru": { 00:17:52.960 "name": "pt1", 00:17:52.960 "base_bdev_name": "malloc1" 00:17:52.960 } 00:17:52.960 } 00:17:52.960 }' 00:17:52.960 20:31:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:52.960 20:31:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:52.960 20:31:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:52.960 20:31:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:52.960 20:31:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:52.960 20:31:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:52.960 20:31:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:52.960 20:31:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:53.219 20:31:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:53.219 20:31:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:53.219 20:31:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:53.219 20:31:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:53.219 20:31:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:53.219 20:31:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:53.219 20:31:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:17:53.478 20:31:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:53.478 "name": "pt2", 00:17:53.478 "aliases": [ 00:17:53.478 "00000000-0000-0000-0000-000000000002" 00:17:53.478 ], 00:17:53.478 "product_name": "passthru", 00:17:53.478 "block_size": 512, 00:17:53.478 "num_blocks": 65536, 00:17:53.478 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:53.478 "assigned_rate_limits": { 00:17:53.478 "rw_ios_per_sec": 0, 00:17:53.478 "rw_mbytes_per_sec": 0, 00:17:53.478 "r_mbytes_per_sec": 0, 00:17:53.478 "w_mbytes_per_sec": 0 00:17:53.478 }, 00:17:53.478 "claimed": true, 00:17:53.478 "claim_type": "exclusive_write", 00:17:53.478 "zoned": false, 00:17:53.478 "supported_io_types": { 00:17:53.478 "read": true, 00:17:53.478 "write": true, 00:17:53.478 "unmap": true, 00:17:53.478 "flush": true, 00:17:53.478 "reset": true, 00:17:53.478 "nvme_admin": false, 00:17:53.478 "nvme_io": false, 00:17:53.478 "nvme_io_md": false, 00:17:53.478 "write_zeroes": true, 00:17:53.478 "zcopy": true, 00:17:53.478 "get_zone_info": false, 00:17:53.478 "zone_management": false, 00:17:53.478 "zone_append": false, 00:17:53.478 "compare": false, 00:17:53.478 "compare_and_write": false, 00:17:53.478 "abort": true, 00:17:53.478 "seek_hole": false, 00:17:53.478 "seek_data": false, 00:17:53.478 "copy": true, 00:17:53.478 "nvme_iov_md": false 00:17:53.478 }, 00:17:53.478 "memory_domains": [ 00:17:53.478 { 00:17:53.478 "dma_device_id": "system", 00:17:53.478 "dma_device_type": 1 00:17:53.478 }, 00:17:53.478 { 00:17:53.478 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:53.478 "dma_device_type": 2 00:17:53.478 } 00:17:53.478 ], 00:17:53.478 "driver_specific": { 00:17:53.478 "passthru": { 00:17:53.478 "name": "pt2", 00:17:53.478 "base_bdev_name": "malloc2" 00:17:53.478 } 00:17:53.478 } 00:17:53.478 }' 00:17:53.478 20:31:45 bdev_raid.raid_superblock_test 
-- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:53.478 20:31:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:53.478 20:31:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:53.478 20:31:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:53.771 20:31:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:53.771 20:31:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:53.771 20:31:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:53.771 20:31:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:53.771 20:31:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:53.771 20:31:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:53.771 20:31:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:53.771 20:31:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:53.771 20:31:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:53.771 20:31:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:17:53.771 20:31:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:54.031 20:31:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:54.031 "name": "pt3", 00:17:54.031 "aliases": [ 00:17:54.031 "00000000-0000-0000-0000-000000000003" 00:17:54.031 ], 00:17:54.031 "product_name": "passthru", 00:17:54.031 "block_size": 512, 00:17:54.031 "num_blocks": 65536, 00:17:54.031 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:54.031 "assigned_rate_limits": { 00:17:54.031 "rw_ios_per_sec": 0, 
00:17:54.031 "rw_mbytes_per_sec": 0, 00:17:54.031 "r_mbytes_per_sec": 0, 00:17:54.031 "w_mbytes_per_sec": 0 00:17:54.031 }, 00:17:54.031 "claimed": true, 00:17:54.031 "claim_type": "exclusive_write", 00:17:54.031 "zoned": false, 00:17:54.031 "supported_io_types": { 00:17:54.031 "read": true, 00:17:54.031 "write": true, 00:17:54.031 "unmap": true, 00:17:54.031 "flush": true, 00:17:54.031 "reset": true, 00:17:54.031 "nvme_admin": false, 00:17:54.031 "nvme_io": false, 00:17:54.031 "nvme_io_md": false, 00:17:54.031 "write_zeroes": true, 00:17:54.031 "zcopy": true, 00:17:54.031 "get_zone_info": false, 00:17:54.031 "zone_management": false, 00:17:54.031 "zone_append": false, 00:17:54.031 "compare": false, 00:17:54.031 "compare_and_write": false, 00:17:54.031 "abort": true, 00:17:54.031 "seek_hole": false, 00:17:54.031 "seek_data": false, 00:17:54.031 "copy": true, 00:17:54.031 "nvme_iov_md": false 00:17:54.031 }, 00:17:54.031 "memory_domains": [ 00:17:54.031 { 00:17:54.031 "dma_device_id": "system", 00:17:54.031 "dma_device_type": 1 00:17:54.031 }, 00:17:54.031 { 00:17:54.031 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:54.031 "dma_device_type": 2 00:17:54.031 } 00:17:54.031 ], 00:17:54.031 "driver_specific": { 00:17:54.031 "passthru": { 00:17:54.031 "name": "pt3", 00:17:54.031 "base_bdev_name": "malloc3" 00:17:54.031 } 00:17:54.031 } 00:17:54.031 }' 00:17:54.031 20:31:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:54.290 20:31:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:54.290 20:31:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:54.290 20:31:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:54.290 20:31:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:54.290 20:31:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:54.290 20:31:46 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:54.549 20:31:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:54.549 20:31:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:54.549 20:31:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:54.549 20:31:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:54.549 20:31:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:54.549 20:31:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:54.549 20:31:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:17:55.116 [2024-07-15 20:31:47.341397] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:55.116 20:31:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=0aeb0bd9-2185-4440-9084-00c436447ae1 00:17:55.116 20:31:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 0aeb0bd9-2185-4440-9084-00c436447ae1 ']' 00:17:55.116 20:31:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:17:55.375 [2024-07-15 20:31:47.597798] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:55.375 [2024-07-15 20:31:47.597818] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:55.375 [2024-07-15 20:31:47.597865] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:55.375 [2024-07-15 20:31:47.597941] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 
00:17:55.375 [2024-07-15 20:31:47.597954] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ef4ea0 name raid_bdev1, state offline 00:17:55.375 20:31:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:55.375 20:31:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:17:55.635 20:31:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:17:55.635 20:31:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:17:55.635 20:31:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:17:55.635 20:31:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:17:55.894 20:31:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:17:55.894 20:31:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:17:56.153 20:31:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:17:56.153 20:31:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:17:56.412 20:31:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:17:56.412 20:31:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:17:56.671 20:31:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 
00:17:56.671 20:31:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:17:56.671 20:31:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:17:56.671 20:31:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:17:56.671 20:31:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:56.671 20:31:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:17:56.671 20:31:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:56.671 20:31:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:17:56.671 20:31:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:56.671 20:31:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:17:56.671 20:31:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:56.671 20:31:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:17:56.671 20:31:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 
malloc2 malloc3' -n raid_bdev1 00:17:56.672 [2024-07-15 20:31:49.049705] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:17:56.931 [2024-07-15 20:31:49.051058] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:17:56.931 [2024-07-15 20:31:49.051100] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:17:56.931 [2024-07-15 20:31:49.051145] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:17:56.931 [2024-07-15 20:31:49.051183] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:17:56.931 [2024-07-15 20:31:49.051206] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:17:56.931 [2024-07-15 20:31:49.051224] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:56.931 [2024-07-15 20:31:49.051234] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x209fff0 name raid_bdev1, state configuring 00:17:56.931 request: 00:17:56.931 { 00:17:56.931 "name": "raid_bdev1", 00:17:56.931 "raid_level": "raid1", 00:17:56.931 "base_bdevs": [ 00:17:56.931 "malloc1", 00:17:56.931 "malloc2", 00:17:56.931 "malloc3" 00:17:56.931 ], 00:17:56.931 "superblock": false, 00:17:56.931 "method": "bdev_raid_create", 00:17:56.931 "req_id": 1 00:17:56.931 } 00:17:56.931 Got JSON-RPC error response 00:17:56.931 response: 00:17:56.931 { 00:17:56.931 "code": -17, 00:17:56.931 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:17:56.931 } 00:17:56.931 20:31:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:17:56.931 20:31:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:17:56.931 20:31:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 
-- # [[ -n '' ]] 00:17:56.931 20:31:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:17:56.931 20:31:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:17:56.931 20:31:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:57.190 20:31:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:17:57.190 20:31:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:17:57.190 20:31:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:17:57.190 [2024-07-15 20:31:49.526916] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:17:57.190 [2024-07-15 20:31:49.526966] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:57.190 [2024-07-15 20:31:49.526989] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1efc7a0 00:17:57.190 [2024-07-15 20:31:49.527002] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:57.190 [2024-07-15 20:31:49.528644] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:57.190 [2024-07-15 20:31:49.528671] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:17:57.190 [2024-07-15 20:31:49.528738] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:17:57.190 [2024-07-15 20:31:49.528765] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:17:57.190 pt1 00:17:57.190 20:31:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:17:57.190 20:31:49 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:57.190 20:31:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:57.190 20:31:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:57.190 20:31:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:57.190 20:31:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:57.190 20:31:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:57.190 20:31:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:57.190 20:31:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:57.190 20:31:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:57.190 20:31:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:57.190 20:31:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:57.759 20:31:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:57.759 "name": "raid_bdev1", 00:17:57.759 "uuid": "0aeb0bd9-2185-4440-9084-00c436447ae1", 00:17:57.759 "strip_size_kb": 0, 00:17:57.759 "state": "configuring", 00:17:57.759 "raid_level": "raid1", 00:17:57.759 "superblock": true, 00:17:57.759 "num_base_bdevs": 3, 00:17:57.759 "num_base_bdevs_discovered": 1, 00:17:57.759 "num_base_bdevs_operational": 3, 00:17:57.759 "base_bdevs_list": [ 00:17:57.759 { 00:17:57.759 "name": "pt1", 00:17:57.759 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:57.759 "is_configured": true, 00:17:57.759 "data_offset": 2048, 00:17:57.759 "data_size": 63488 00:17:57.759 }, 
00:17:57.759 { 00:17:57.759 "name": null, 00:17:57.759 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:57.759 "is_configured": false, 00:17:57.759 "data_offset": 2048, 00:17:57.759 "data_size": 63488 00:17:57.759 }, 00:17:57.759 { 00:17:57.759 "name": null, 00:17:57.759 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:57.759 "is_configured": false, 00:17:57.759 "data_offset": 2048, 00:17:57.759 "data_size": 63488 00:17:57.759 } 00:17:57.759 ] 00:17:57.759 }' 00:17:57.759 20:31:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:57.759 20:31:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:58.327 20:31:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 3 -gt 2 ']' 00:17:58.327 20:31:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:17:58.327 [2024-07-15 20:31:50.657939] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:17:58.327 [2024-07-15 20:31:50.657994] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:58.327 [2024-07-15 20:31:50.658013] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ef3a10 00:17:58.327 [2024-07-15 20:31:50.658026] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:58.327 [2024-07-15 20:31:50.658366] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:58.327 [2024-07-15 20:31:50.658383] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:17:58.327 [2024-07-15 20:31:50.658448] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:17:58.327 [2024-07-15 20:31:50.658466] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:17:58.327 
pt2 00:17:58.327 20:31:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:17:58.587 [2024-07-15 20:31:50.898594] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:17:58.587 20:31:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:17:58.587 20:31:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:58.587 20:31:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:58.587 20:31:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:58.587 20:31:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:58.587 20:31:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:58.587 20:31:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:58.587 20:31:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:58.587 20:31:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:58.587 20:31:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:58.587 20:31:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:58.587 20:31:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:58.846 20:31:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:58.846 "name": "raid_bdev1", 00:17:58.846 "uuid": "0aeb0bd9-2185-4440-9084-00c436447ae1", 00:17:58.846 "strip_size_kb": 0, 00:17:58.846 "state": 
"configuring", 00:17:58.846 "raid_level": "raid1", 00:17:58.846 "superblock": true, 00:17:58.846 "num_base_bdevs": 3, 00:17:58.846 "num_base_bdevs_discovered": 1, 00:17:58.846 "num_base_bdevs_operational": 3, 00:17:58.846 "base_bdevs_list": [ 00:17:58.846 { 00:17:58.846 "name": "pt1", 00:17:58.846 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:58.846 "is_configured": true, 00:17:58.846 "data_offset": 2048, 00:17:58.847 "data_size": 63488 00:17:58.847 }, 00:17:58.847 { 00:17:58.847 "name": null, 00:17:58.847 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:58.847 "is_configured": false, 00:17:58.847 "data_offset": 2048, 00:17:58.847 "data_size": 63488 00:17:58.847 }, 00:17:58.847 { 00:17:58.847 "name": null, 00:17:58.847 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:58.847 "is_configured": false, 00:17:58.847 "data_offset": 2048, 00:17:58.847 "data_size": 63488 00:17:58.847 } 00:17:58.847 ] 00:17:58.847 }' 00:17:58.847 20:31:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:58.847 20:31:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:59.784 20:31:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:17:59.784 20:31:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:17:59.784 20:31:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:17:59.784 [2024-07-15 20:31:52.033674] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:17:59.784 [2024-07-15 20:31:52.033721] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:59.784 [2024-07-15 20:31:52.033743] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1efca10 00:17:59.784 [2024-07-15 20:31:52.033761] 
vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:59.784 [2024-07-15 20:31:52.034104] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:59.784 [2024-07-15 20:31:52.034122] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:17:59.784 [2024-07-15 20:31:52.034183] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:17:59.784 [2024-07-15 20:31:52.034201] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:17:59.784 pt2 00:17:59.784 20:31:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:17:59.784 20:31:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:17:59.784 20:31:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:18:00.043 [2024-07-15 20:31:52.282337] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:18:00.043 [2024-07-15 20:31:52.282375] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:00.043 [2024-07-15 20:31:52.282391] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ef36c0 00:18:00.043 [2024-07-15 20:31:52.282403] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:00.043 [2024-07-15 20:31:52.282692] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:00.043 [2024-07-15 20:31:52.282708] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:18:00.043 [2024-07-15 20:31:52.282760] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:18:00.043 [2024-07-15 20:31:52.282777] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 
00:18:00.043 [2024-07-15 20:31:52.282880] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2096c00 00:18:00.043 [2024-07-15 20:31:52.282891] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:18:00.043 [2024-07-15 20:31:52.283064] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1ef6610 00:18:00.043 [2024-07-15 20:31:52.283192] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2096c00 00:18:00.043 [2024-07-15 20:31:52.283202] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2096c00 00:18:00.043 [2024-07-15 20:31:52.283298] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:00.043 pt3 00:18:00.043 20:31:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:18:00.043 20:31:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:18:00.043 20:31:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:18:00.043 20:31:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:00.043 20:31:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:00.043 20:31:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:00.044 20:31:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:00.044 20:31:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:00.044 20:31:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:00.044 20:31:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:00.044 20:31:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:00.044 
20:31:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:00.044 20:31:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:00.044 20:31:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:00.302 20:31:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:00.302 "name": "raid_bdev1", 00:18:00.302 "uuid": "0aeb0bd9-2185-4440-9084-00c436447ae1", 00:18:00.302 "strip_size_kb": 0, 00:18:00.302 "state": "online", 00:18:00.302 "raid_level": "raid1", 00:18:00.302 "superblock": true, 00:18:00.302 "num_base_bdevs": 3, 00:18:00.302 "num_base_bdevs_discovered": 3, 00:18:00.302 "num_base_bdevs_operational": 3, 00:18:00.302 "base_bdevs_list": [ 00:18:00.302 { 00:18:00.302 "name": "pt1", 00:18:00.302 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:00.302 "is_configured": true, 00:18:00.302 "data_offset": 2048, 00:18:00.302 "data_size": 63488 00:18:00.302 }, 00:18:00.302 { 00:18:00.302 "name": "pt2", 00:18:00.302 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:00.302 "is_configured": true, 00:18:00.302 "data_offset": 2048, 00:18:00.302 "data_size": 63488 00:18:00.302 }, 00:18:00.302 { 00:18:00.302 "name": "pt3", 00:18:00.302 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:00.303 "is_configured": true, 00:18:00.303 "data_offset": 2048, 00:18:00.303 "data_size": 63488 00:18:00.303 } 00:18:00.303 ] 00:18:00.303 }' 00:18:00.303 20:31:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:00.303 20:31:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:00.870 20:31:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:18:00.870 20:31:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local 
raid_bdev_name=raid_bdev1 00:18:00.870 20:31:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:00.870 20:31:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:00.870 20:31:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:00.870 20:31:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:18:00.870 20:31:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:00.870 20:31:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:01.129 [2024-07-15 20:31:53.313351] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:01.129 20:31:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:01.129 "name": "raid_bdev1", 00:18:01.129 "aliases": [ 00:18:01.129 "0aeb0bd9-2185-4440-9084-00c436447ae1" 00:18:01.129 ], 00:18:01.129 "product_name": "Raid Volume", 00:18:01.129 "block_size": 512, 00:18:01.129 "num_blocks": 63488, 00:18:01.129 "uuid": "0aeb0bd9-2185-4440-9084-00c436447ae1", 00:18:01.129 "assigned_rate_limits": { 00:18:01.129 "rw_ios_per_sec": 0, 00:18:01.129 "rw_mbytes_per_sec": 0, 00:18:01.129 "r_mbytes_per_sec": 0, 00:18:01.129 "w_mbytes_per_sec": 0 00:18:01.129 }, 00:18:01.129 "claimed": false, 00:18:01.129 "zoned": false, 00:18:01.129 "supported_io_types": { 00:18:01.129 "read": true, 00:18:01.129 "write": true, 00:18:01.129 "unmap": false, 00:18:01.129 "flush": false, 00:18:01.129 "reset": true, 00:18:01.129 "nvme_admin": false, 00:18:01.129 "nvme_io": false, 00:18:01.129 "nvme_io_md": false, 00:18:01.129 "write_zeroes": true, 00:18:01.129 "zcopy": false, 00:18:01.129 "get_zone_info": false, 00:18:01.129 "zone_management": false, 00:18:01.129 "zone_append": false, 00:18:01.129 "compare": false, 
00:18:01.129 "compare_and_write": false, 00:18:01.129 "abort": false, 00:18:01.129 "seek_hole": false, 00:18:01.129 "seek_data": false, 00:18:01.129 "copy": false, 00:18:01.129 "nvme_iov_md": false 00:18:01.129 }, 00:18:01.129 "memory_domains": [ 00:18:01.129 { 00:18:01.129 "dma_device_id": "system", 00:18:01.129 "dma_device_type": 1 00:18:01.129 }, 00:18:01.129 { 00:18:01.129 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:01.129 "dma_device_type": 2 00:18:01.129 }, 00:18:01.129 { 00:18:01.129 "dma_device_id": "system", 00:18:01.129 "dma_device_type": 1 00:18:01.129 }, 00:18:01.129 { 00:18:01.129 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:01.129 "dma_device_type": 2 00:18:01.129 }, 00:18:01.129 { 00:18:01.129 "dma_device_id": "system", 00:18:01.129 "dma_device_type": 1 00:18:01.129 }, 00:18:01.129 { 00:18:01.129 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:01.129 "dma_device_type": 2 00:18:01.129 } 00:18:01.129 ], 00:18:01.129 "driver_specific": { 00:18:01.129 "raid": { 00:18:01.129 "uuid": "0aeb0bd9-2185-4440-9084-00c436447ae1", 00:18:01.129 "strip_size_kb": 0, 00:18:01.129 "state": "online", 00:18:01.129 "raid_level": "raid1", 00:18:01.129 "superblock": true, 00:18:01.129 "num_base_bdevs": 3, 00:18:01.129 "num_base_bdevs_discovered": 3, 00:18:01.129 "num_base_bdevs_operational": 3, 00:18:01.129 "base_bdevs_list": [ 00:18:01.129 { 00:18:01.129 "name": "pt1", 00:18:01.129 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:01.129 "is_configured": true, 00:18:01.129 "data_offset": 2048, 00:18:01.129 "data_size": 63488 00:18:01.129 }, 00:18:01.129 { 00:18:01.129 "name": "pt2", 00:18:01.129 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:01.129 "is_configured": true, 00:18:01.129 "data_offset": 2048, 00:18:01.129 "data_size": 63488 00:18:01.129 }, 00:18:01.129 { 00:18:01.129 "name": "pt3", 00:18:01.129 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:01.129 "is_configured": true, 00:18:01.129 "data_offset": 2048, 00:18:01.129 "data_size": 63488 
00:18:01.129 } 00:18:01.129 ] 00:18:01.129 } 00:18:01.129 } 00:18:01.129 }' 00:18:01.129 20:31:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:01.129 20:31:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:18:01.129 pt2 00:18:01.129 pt3' 00:18:01.129 20:31:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:01.129 20:31:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:18:01.129 20:31:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:01.388 20:31:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:01.388 "name": "pt1", 00:18:01.388 "aliases": [ 00:18:01.388 "00000000-0000-0000-0000-000000000001" 00:18:01.388 ], 00:18:01.388 "product_name": "passthru", 00:18:01.388 "block_size": 512, 00:18:01.388 "num_blocks": 65536, 00:18:01.388 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:01.388 "assigned_rate_limits": { 00:18:01.388 "rw_ios_per_sec": 0, 00:18:01.388 "rw_mbytes_per_sec": 0, 00:18:01.388 "r_mbytes_per_sec": 0, 00:18:01.388 "w_mbytes_per_sec": 0 00:18:01.388 }, 00:18:01.388 "claimed": true, 00:18:01.388 "claim_type": "exclusive_write", 00:18:01.388 "zoned": false, 00:18:01.388 "supported_io_types": { 00:18:01.388 "read": true, 00:18:01.388 "write": true, 00:18:01.388 "unmap": true, 00:18:01.388 "flush": true, 00:18:01.388 "reset": true, 00:18:01.388 "nvme_admin": false, 00:18:01.388 "nvme_io": false, 00:18:01.388 "nvme_io_md": false, 00:18:01.388 "write_zeroes": true, 00:18:01.388 "zcopy": true, 00:18:01.388 "get_zone_info": false, 00:18:01.388 "zone_management": false, 00:18:01.388 "zone_append": false, 00:18:01.388 "compare": false, 00:18:01.388 "compare_and_write": false, 
00:18:01.388 "abort": true, 00:18:01.388 "seek_hole": false, 00:18:01.388 "seek_data": false, 00:18:01.388 "copy": true, 00:18:01.388 "nvme_iov_md": false 00:18:01.388 }, 00:18:01.388 "memory_domains": [ 00:18:01.388 { 00:18:01.388 "dma_device_id": "system", 00:18:01.388 "dma_device_type": 1 00:18:01.388 }, 00:18:01.388 { 00:18:01.388 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:01.388 "dma_device_type": 2 00:18:01.388 } 00:18:01.388 ], 00:18:01.388 "driver_specific": { 00:18:01.388 "passthru": { 00:18:01.388 "name": "pt1", 00:18:01.388 "base_bdev_name": "malloc1" 00:18:01.388 } 00:18:01.388 } 00:18:01.388 }' 00:18:01.388 20:31:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:01.388 20:31:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:01.388 20:31:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:01.388 20:31:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:01.647 20:31:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:01.647 20:31:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:01.647 20:31:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:01.648 20:31:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:01.648 20:31:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:01.648 20:31:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:01.648 20:31:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:01.648 20:31:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:01.648 20:31:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:01.648 20:31:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:18:01.648 20:31:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:01.907 20:31:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:01.907 "name": "pt2", 00:18:01.907 "aliases": [ 00:18:01.907 "00000000-0000-0000-0000-000000000002" 00:18:01.907 ], 00:18:01.907 "product_name": "passthru", 00:18:01.907 "block_size": 512, 00:18:01.907 "num_blocks": 65536, 00:18:01.907 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:01.907 "assigned_rate_limits": { 00:18:01.907 "rw_ios_per_sec": 0, 00:18:01.907 "rw_mbytes_per_sec": 0, 00:18:01.907 "r_mbytes_per_sec": 0, 00:18:01.907 "w_mbytes_per_sec": 0 00:18:01.907 }, 00:18:01.907 "claimed": true, 00:18:01.907 "claim_type": "exclusive_write", 00:18:01.907 "zoned": false, 00:18:01.907 "supported_io_types": { 00:18:01.907 "read": true, 00:18:01.907 "write": true, 00:18:01.907 "unmap": true, 00:18:01.907 "flush": true, 00:18:01.907 "reset": true, 00:18:01.907 "nvme_admin": false, 00:18:01.907 "nvme_io": false, 00:18:01.907 "nvme_io_md": false, 00:18:01.907 "write_zeroes": true, 00:18:01.907 "zcopy": true, 00:18:01.907 "get_zone_info": false, 00:18:01.907 "zone_management": false, 00:18:01.907 "zone_append": false, 00:18:01.907 "compare": false, 00:18:01.907 "compare_and_write": false, 00:18:01.907 "abort": true, 00:18:01.907 "seek_hole": false, 00:18:01.907 "seek_data": false, 00:18:01.907 "copy": true, 00:18:01.907 "nvme_iov_md": false 00:18:01.907 }, 00:18:01.907 "memory_domains": [ 00:18:01.907 { 00:18:01.907 "dma_device_id": "system", 00:18:01.907 "dma_device_type": 1 00:18:01.907 }, 00:18:01.907 { 00:18:01.907 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:01.907 "dma_device_type": 2 00:18:01.907 } 00:18:01.907 ], 00:18:01.907 "driver_specific": { 00:18:01.907 "passthru": { 00:18:01.907 "name": "pt2", 00:18:01.907 "base_bdev_name": "malloc2" 
00:18:01.907 } 00:18:01.907 } 00:18:01.907 }' 00:18:01.907 20:31:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:02.166 20:31:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:02.166 20:31:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:02.166 20:31:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:02.166 20:31:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:02.166 20:31:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:02.166 20:31:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:02.166 20:31:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:02.424 20:31:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:02.424 20:31:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:02.424 20:31:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:02.424 20:31:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:02.424 20:31:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:02.424 20:31:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:18:02.424 20:31:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:02.681 20:31:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:02.681 "name": "pt3", 00:18:02.681 "aliases": [ 00:18:02.681 "00000000-0000-0000-0000-000000000003" 00:18:02.681 ], 00:18:02.681 "product_name": "passthru", 00:18:02.681 "block_size": 512, 00:18:02.681 "num_blocks": 65536, 00:18:02.681 "uuid": 
"00000000-0000-0000-0000-000000000003", 00:18:02.681 "assigned_rate_limits": { 00:18:02.681 "rw_ios_per_sec": 0, 00:18:02.681 "rw_mbytes_per_sec": 0, 00:18:02.681 "r_mbytes_per_sec": 0, 00:18:02.681 "w_mbytes_per_sec": 0 00:18:02.681 }, 00:18:02.681 "claimed": true, 00:18:02.681 "claim_type": "exclusive_write", 00:18:02.681 "zoned": false, 00:18:02.681 "supported_io_types": { 00:18:02.681 "read": true, 00:18:02.681 "write": true, 00:18:02.681 "unmap": true, 00:18:02.681 "flush": true, 00:18:02.681 "reset": true, 00:18:02.681 "nvme_admin": false, 00:18:02.681 "nvme_io": false, 00:18:02.682 "nvme_io_md": false, 00:18:02.682 "write_zeroes": true, 00:18:02.682 "zcopy": true, 00:18:02.682 "get_zone_info": false, 00:18:02.682 "zone_management": false, 00:18:02.682 "zone_append": false, 00:18:02.682 "compare": false, 00:18:02.682 "compare_and_write": false, 00:18:02.682 "abort": true, 00:18:02.682 "seek_hole": false, 00:18:02.682 "seek_data": false, 00:18:02.682 "copy": true, 00:18:02.682 "nvme_iov_md": false 00:18:02.682 }, 00:18:02.682 "memory_domains": [ 00:18:02.682 { 00:18:02.682 "dma_device_id": "system", 00:18:02.682 "dma_device_type": 1 00:18:02.682 }, 00:18:02.682 { 00:18:02.682 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:02.682 "dma_device_type": 2 00:18:02.682 } 00:18:02.682 ], 00:18:02.682 "driver_specific": { 00:18:02.682 "passthru": { 00:18:02.682 "name": "pt3", 00:18:02.682 "base_bdev_name": "malloc3" 00:18:02.682 } 00:18:02.682 } 00:18:02.682 }' 00:18:02.682 20:31:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:02.682 20:31:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:02.682 20:31:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:02.682 20:31:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:02.938 20:31:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:02.938 20:31:55 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:02.938 20:31:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:02.938 20:31:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:02.938 20:31:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:02.938 20:31:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:02.938 20:31:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:03.195 20:31:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:03.195 20:31:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:03.195 20:31:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:18:03.452 [2024-07-15 20:31:55.579371] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:03.452 20:31:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 0aeb0bd9-2185-4440-9084-00c436447ae1 '!=' 0aeb0bd9-2185-4440-9084-00c436447ae1 ']' 00:18:03.452 20:31:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:18:03.452 20:31:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:18:03.452 20:31:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:18:03.453 20:31:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:18:03.453 [2024-07-15 20:31:55.823760] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:18:03.710 20:31:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 
00:18:03.710 20:31:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:03.710 20:31:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:03.710 20:31:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:03.710 20:31:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:03.710 20:31:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:18:03.710 20:31:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:03.710 20:31:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:03.710 20:31:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:03.710 20:31:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:03.710 20:31:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:03.710 20:31:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:03.967 20:31:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:03.967 "name": "raid_bdev1", 00:18:03.967 "uuid": "0aeb0bd9-2185-4440-9084-00c436447ae1", 00:18:03.967 "strip_size_kb": 0, 00:18:03.967 "state": "online", 00:18:03.967 "raid_level": "raid1", 00:18:03.967 "superblock": true, 00:18:03.967 "num_base_bdevs": 3, 00:18:03.967 "num_base_bdevs_discovered": 2, 00:18:03.967 "num_base_bdevs_operational": 2, 00:18:03.967 "base_bdevs_list": [ 00:18:03.967 { 00:18:03.967 "name": null, 00:18:03.967 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:03.967 "is_configured": false, 00:18:03.967 "data_offset": 2048, 00:18:03.967 "data_size": 63488 
00:18:03.967 }, 00:18:03.967 { 00:18:03.967 "name": "pt2", 00:18:03.967 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:03.967 "is_configured": true, 00:18:03.967 "data_offset": 2048, 00:18:03.967 "data_size": 63488 00:18:03.967 }, 00:18:03.967 { 00:18:03.967 "name": "pt3", 00:18:03.967 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:03.967 "is_configured": true, 00:18:03.967 "data_offset": 2048, 00:18:03.967 "data_size": 63488 00:18:03.967 } 00:18:03.967 ] 00:18:03.967 }' 00:18:03.967 20:31:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:03.967 20:31:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:04.530 20:31:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:18:04.788 [2024-07-15 20:31:56.970769] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:04.788 [2024-07-15 20:31:56.970803] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:04.788 [2024-07-15 20:31:56.970855] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:04.788 [2024-07-15 20:31:56.970908] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:04.788 [2024-07-15 20:31:56.970920] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2096c00 name raid_bdev1, state offline 00:18:04.788 20:31:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:04.788 20:31:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:18:05.046 20:31:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:18:05.046 20:31:57 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:18:05.046 20:31:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:18:05.046 20:31:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:18:05.046 20:31:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:18:05.303 20:31:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:18:05.303 20:31:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:18:05.303 20:31:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:18:05.562 20:31:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:18:05.562 20:31:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:18:05.562 20:31:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:18:05.562 20:31:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:18:05.562 20:31:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:18:05.820 [2024-07-15 20:31:57.969397] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:18:05.820 [2024-07-15 20:31:57.969443] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:05.820 [2024-07-15 20:31:57.969460] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ef4310 00:18:05.820 [2024-07-15 20:31:57.969472] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:05.820 
[2024-07-15 20:31:57.971065] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:05.820 [2024-07-15 20:31:57.971093] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:18:05.820 [2024-07-15 20:31:57.971158] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:18:05.820 [2024-07-15 20:31:57.971184] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:18:05.820 pt2 00:18:05.820 20:31:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:18:05.820 20:31:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:05.820 20:31:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:05.820 20:31:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:05.820 20:31:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:05.820 20:31:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:18:05.820 20:31:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:05.820 20:31:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:05.820 20:31:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:05.820 20:31:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:05.820 20:31:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:05.820 20:31:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:06.079 20:31:58 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:06.079 "name": "raid_bdev1", 00:18:06.079 "uuid": "0aeb0bd9-2185-4440-9084-00c436447ae1", 00:18:06.079 "strip_size_kb": 0, 00:18:06.079 "state": "configuring", 00:18:06.079 "raid_level": "raid1", 00:18:06.079 "superblock": true, 00:18:06.079 "num_base_bdevs": 3, 00:18:06.079 "num_base_bdevs_discovered": 1, 00:18:06.079 "num_base_bdevs_operational": 2, 00:18:06.079 "base_bdevs_list": [ 00:18:06.079 { 00:18:06.079 "name": null, 00:18:06.079 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:06.079 "is_configured": false, 00:18:06.079 "data_offset": 2048, 00:18:06.079 "data_size": 63488 00:18:06.079 }, 00:18:06.079 { 00:18:06.079 "name": "pt2", 00:18:06.079 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:06.079 "is_configured": true, 00:18:06.079 "data_offset": 2048, 00:18:06.079 "data_size": 63488 00:18:06.079 }, 00:18:06.079 { 00:18:06.079 "name": null, 00:18:06.079 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:06.079 "is_configured": false, 00:18:06.079 "data_offset": 2048, 00:18:06.079 "data_size": 63488 00:18:06.079 } 00:18:06.079 ] 00:18:06.079 }' 00:18:06.079 20:31:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:06.079 20:31:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:06.646 20:31:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i++ )) 00:18:06.646 20:31:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:18:06.646 20:31:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@518 -- # i=2 00:18:06.646 20:31:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:18:06.941 [2024-07-15 20:31:59.080360] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 
00:18:06.941 [2024-07-15 20:31:59.080410] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:06.941 [2024-07-15 20:31:59.080430] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ef2ec0 00:18:06.941 [2024-07-15 20:31:59.080443] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:06.941 [2024-07-15 20:31:59.080772] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:06.941 [2024-07-15 20:31:59.080789] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:18:06.941 [2024-07-15 20:31:59.080850] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:18:06.941 [2024-07-15 20:31:59.080869] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:18:06.941 [2024-07-15 20:31:59.080975] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2094cc0 00:18:06.941 [2024-07-15 20:31:59.080987] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:18:06.941 [2024-07-15 20:31:59.081151] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20956d0 00:18:06.941 [2024-07-15 20:31:59.081279] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2094cc0 00:18:06.941 [2024-07-15 20:31:59.081289] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2094cc0 00:18:06.941 [2024-07-15 20:31:59.081385] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:06.941 pt3 00:18:06.941 20:31:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:18:06.941 20:31:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:06.941 20:31:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:06.941 
20:31:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:06.941 20:31:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:06.941 20:31:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:18:06.941 20:31:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:06.941 20:31:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:06.941 20:31:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:06.941 20:31:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:06.941 20:31:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:06.941 20:31:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:07.207 20:31:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:07.207 "name": "raid_bdev1", 00:18:07.207 "uuid": "0aeb0bd9-2185-4440-9084-00c436447ae1", 00:18:07.207 "strip_size_kb": 0, 00:18:07.207 "state": "online", 00:18:07.207 "raid_level": "raid1", 00:18:07.207 "superblock": true, 00:18:07.207 "num_base_bdevs": 3, 00:18:07.207 "num_base_bdevs_discovered": 2, 00:18:07.207 "num_base_bdevs_operational": 2, 00:18:07.207 "base_bdevs_list": [ 00:18:07.207 { 00:18:07.207 "name": null, 00:18:07.207 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:07.207 "is_configured": false, 00:18:07.207 "data_offset": 2048, 00:18:07.207 "data_size": 63488 00:18:07.207 }, 00:18:07.207 { 00:18:07.207 "name": "pt2", 00:18:07.207 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:07.207 "is_configured": true, 00:18:07.207 "data_offset": 2048, 00:18:07.207 "data_size": 63488 00:18:07.207 }, 00:18:07.207 
{ 00:18:07.207 "name": "pt3", 00:18:07.207 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:07.207 "is_configured": true, 00:18:07.207 "data_offset": 2048, 00:18:07.207 "data_size": 63488 00:18:07.207 } 00:18:07.207 ] 00:18:07.207 }' 00:18:07.207 20:31:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:07.207 20:31:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:07.773 20:31:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:18:08.031 [2024-07-15 20:32:00.195318] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:08.031 [2024-07-15 20:32:00.195345] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:08.031 [2024-07-15 20:32:00.195397] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:08.031 [2024-07-15 20:32:00.195450] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:08.031 [2024-07-15 20:32:00.195462] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2094cc0 name raid_bdev1, state offline 00:18:08.031 20:32:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:08.031 20:32:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:18:08.289 20:32:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:18:08.289 20:32:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:18:08.289 20:32:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@531 -- # '[' 3 -gt 2 ']' 00:18:08.289 20:32:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@533 -- # i=2 00:18:08.289 20:32:00 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@534 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:18:08.547 20:32:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:18:08.805 [2024-07-15 20:32:00.941317] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:18:08.805 [2024-07-15 20:32:00.941362] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:08.805 [2024-07-15 20:32:00.941379] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ef2ec0 00:18:08.805 [2024-07-15 20:32:00.941392] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:08.805 [2024-07-15 20:32:00.942977] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:08.805 [2024-07-15 20:32:00.943003] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:18:08.805 [2024-07-15 20:32:00.943071] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:18:08.805 [2024-07-15 20:32:00.943096] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:18:08.805 [2024-07-15 20:32:00.943188] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:18:08.805 [2024-07-15 20:32:00.943201] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:08.805 [2024-07-15 20:32:00.943214] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2094f40 name raid_bdev1, state configuring 00:18:08.805 [2024-07-15 20:32:00.943237] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:18:08.805 pt1 00:18:08.806 20:32:00 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # '[' 3 -gt 2 ']' 00:18:08.806 20:32:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@544 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:18:08.806 20:32:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:08.806 20:32:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:08.806 20:32:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:08.806 20:32:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:08.806 20:32:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:18:08.806 20:32:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:08.806 20:32:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:08.806 20:32:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:08.806 20:32:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:08.806 20:32:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:08.806 20:32:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:09.064 20:32:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:09.064 "name": "raid_bdev1", 00:18:09.064 "uuid": "0aeb0bd9-2185-4440-9084-00c436447ae1", 00:18:09.064 "strip_size_kb": 0, 00:18:09.064 "state": "configuring", 00:18:09.064 "raid_level": "raid1", 00:18:09.064 "superblock": true, 00:18:09.064 "num_base_bdevs": 3, 00:18:09.064 "num_base_bdevs_discovered": 1, 00:18:09.064 "num_base_bdevs_operational": 2, 00:18:09.064 
"base_bdevs_list": [ 00:18:09.064 { 00:18:09.064 "name": null, 00:18:09.064 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:09.064 "is_configured": false, 00:18:09.064 "data_offset": 2048, 00:18:09.064 "data_size": 63488 00:18:09.064 }, 00:18:09.064 { 00:18:09.064 "name": "pt2", 00:18:09.064 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:09.064 "is_configured": true, 00:18:09.064 "data_offset": 2048, 00:18:09.064 "data_size": 63488 00:18:09.064 }, 00:18:09.064 { 00:18:09.064 "name": null, 00:18:09.064 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:09.064 "is_configured": false, 00:18:09.064 "data_offset": 2048, 00:18:09.064 "data_size": 63488 00:18:09.064 } 00:18:09.064 ] 00:18:09.064 }' 00:18:09.064 20:32:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:09.064 20:32:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:09.631 20:32:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs configuring 00:18:09.631 20:32:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:18:09.889 20:32:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # [[ false == \f\a\l\s\e ]] 00:18:09.889 20:32:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@548 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:18:10.148 [2024-07-15 20:32:02.273004] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:18:10.148 [2024-07-15 20:32:02.273054] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:10.148 [2024-07-15 20:32:02.273073] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ef60c0 00:18:10.148 [2024-07-15 
20:32:02.273091] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:10.148 [2024-07-15 20:32:02.273434] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:10.148 [2024-07-15 20:32:02.273452] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:18:10.148 [2024-07-15 20:32:02.273513] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:18:10.148 [2024-07-15 20:32:02.273531] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:18:10.148 [2024-07-15 20:32:02.273629] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1ef6a40 00:18:10.148 [2024-07-15 20:32:02.273640] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:18:10.148 [2024-07-15 20:32:02.273804] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20956c0 00:18:10.148 [2024-07-15 20:32:02.273942] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1ef6a40 00:18:10.148 [2024-07-15 20:32:02.273954] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1ef6a40 00:18:10.148 [2024-07-15 20:32:02.274050] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:10.148 pt3 00:18:10.148 20:32:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:18:10.148 20:32:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:10.148 20:32:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:10.148 20:32:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:10.148 20:32:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:10.148 20:32:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # 
local num_base_bdevs_operational=2 00:18:10.148 20:32:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:10.148 20:32:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:10.148 20:32:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:10.148 20:32:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:10.148 20:32:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:10.148 20:32:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:10.407 20:32:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:10.407 "name": "raid_bdev1", 00:18:10.407 "uuid": "0aeb0bd9-2185-4440-9084-00c436447ae1", 00:18:10.407 "strip_size_kb": 0, 00:18:10.407 "state": "online", 00:18:10.407 "raid_level": "raid1", 00:18:10.407 "superblock": true, 00:18:10.407 "num_base_bdevs": 3, 00:18:10.407 "num_base_bdevs_discovered": 2, 00:18:10.407 "num_base_bdevs_operational": 2, 00:18:10.407 "base_bdevs_list": [ 00:18:10.407 { 00:18:10.407 "name": null, 00:18:10.407 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:10.407 "is_configured": false, 00:18:10.407 "data_offset": 2048, 00:18:10.407 "data_size": 63488 00:18:10.407 }, 00:18:10.407 { 00:18:10.407 "name": "pt2", 00:18:10.407 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:10.407 "is_configured": true, 00:18:10.407 "data_offset": 2048, 00:18:10.407 "data_size": 63488 00:18:10.407 }, 00:18:10.407 { 00:18:10.407 "name": "pt3", 00:18:10.407 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:10.407 "is_configured": true, 00:18:10.407 "data_offset": 2048, 00:18:10.407 "data_size": 63488 00:18:10.407 } 00:18:10.407 ] 00:18:10.407 }' 00:18:10.407 20:32:02 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:10.407 20:32:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:10.974 20:32:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:18:10.974 20:32:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:18:11.232 20:32:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:18:11.232 20:32:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:11.232 20:32:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:18:11.491 [2024-07-15 20:32:03.628851] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:11.491 20:32:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' 0aeb0bd9-2185-4440-9084-00c436447ae1 '!=' 0aeb0bd9-2185-4440-9084-00c436447ae1 ']' 00:18:11.491 20:32:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 1407318 00:18:11.491 20:32:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 1407318 ']' 00:18:11.491 20:32:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 1407318 00:18:11.491 20:32:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:18:11.491 20:32:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:18:11.491 20:32:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1407318 00:18:11.491 20:32:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:18:11.491 20:32:03 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:18:11.491 20:32:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1407318' 00:18:11.491 killing process with pid 1407318 00:18:11.491 20:32:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 1407318 00:18:11.491 [2024-07-15 20:32:03.715722] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:11.491 [2024-07-15 20:32:03.715775] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:11.491 [2024-07-15 20:32:03.715829] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:11.491 [2024-07-15 20:32:03.715841] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ef6a40 name raid_bdev1, state offline 00:18:11.491 20:32:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 1407318 00:18:11.491 [2024-07-15 20:32:03.742857] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:11.751 20:32:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:18:11.751 00:18:11.751 real 0m22.895s 00:18:11.751 user 0m42.015s 00:18:11.751 sys 0m4.005s 00:18:11.751 20:32:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:18:11.751 20:32:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:11.751 ************************************ 00:18:11.751 END TEST raid_superblock_test 00:18:11.751 ************************************ 00:18:11.751 20:32:04 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:18:11.751 20:32:04 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid1 3 read 00:18:11.751 20:32:04 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:18:11.751 20:32:04 bdev_raid -- common/autotest_common.sh@1105 -- # 
xtrace_disable 00:18:11.751 20:32:04 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:11.751 ************************************ 00:18:11.751 START TEST raid_read_error_test 00:18:11.751 ************************************ 00:18:11.751 20:32:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 3 read 00:18:11.751 20:32:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:18:11.751 20:32:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:18:11.751 20:32:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:18:11.751 20:32:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:18:11.751 20:32:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:11.751 20:32:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:18:11.751 20:32:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:11.751 20:32:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:11.751 20:32:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:18:11.751 20:32:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:11.751 20:32:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:11.751 20:32:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:18:11.751 20:32:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:11.751 20:32:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:11.751 20:32:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:18:11.751 20:32:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # 
local base_bdevs 00:18:11.751 20:32:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:18:11.751 20:32:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:18:11.751 20:32:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:18:11.751 20:32:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:18:11.751 20:32:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:18:11.751 20:32:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:18:11.751 20:32:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:18:11.751 20:32:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:18:11.751 20:32:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.q445xw98r2 00:18:11.751 20:32:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1410751 00:18:11.751 20:32:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1410751 /var/tmp/spdk-raid.sock 00:18:11.751 20:32:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:18:11.751 20:32:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 1410751 ']' 00:18:11.751 20:32:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:11.751 20:32:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:18:11.751 20:32:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
00:18:11.751 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:11.751 20:32:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:18:11.751 20:32:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:11.751 [2024-07-15 20:32:04.121524] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 00:18:11.751 [2024-07-15 20:32:04.121590] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1410751 ] 00:18:12.010 [2024-07-15 20:32:04.248125] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:12.010 [2024-07-15 20:32:04.345347] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:18:12.269 [2024-07-15 20:32:04.407133] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:12.269 [2024-07-15 20:32:04.407169] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:12.837 20:32:05 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:18:12.837 20:32:05 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:18:12.837 20:32:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:12.837 20:32:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:18:13.096 BaseBdev1_malloc 00:18:13.096 20:32:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:18:13.354 true 00:18:13.354 20:32:05 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:18:13.613 [2024-07-15 20:32:05.786036] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:18:13.613 [2024-07-15 20:32:05.786081] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:13.613 [2024-07-15 20:32:05.786102] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16500d0 00:18:13.613 [2024-07-15 20:32:05.786114] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:13.613 [2024-07-15 20:32:05.787900] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:13.613 [2024-07-15 20:32:05.787935] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:18:13.613 BaseBdev1 00:18:13.613 20:32:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:13.613 20:32:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:18:13.872 BaseBdev2_malloc 00:18:13.872 20:32:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:18:14.131 true 00:18:14.131 20:32:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:18:14.389 [2024-07-15 20:32:06.537856] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:18:14.389 [2024-07-15 20:32:06.537903] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:14.389 [2024-07-15 20:32:06.537933] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1654910 00:18:14.389 [2024-07-15 20:32:06.537946] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:14.389 [2024-07-15 20:32:06.539549] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:14.389 [2024-07-15 20:32:06.539578] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:18:14.389 BaseBdev2 00:18:14.389 20:32:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:14.389 20:32:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:18:14.648 BaseBdev3_malloc 00:18:14.648 20:32:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:18:14.907 true 00:18:14.907 20:32:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:18:14.907 [2024-07-15 20:32:07.273687] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:18:14.907 [2024-07-15 20:32:07.273733] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:14.907 [2024-07-15 20:32:07.273755] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1656bd0 00:18:14.907 [2024-07-15 20:32:07.273768] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:14.907 [2024-07-15 20:32:07.275361] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 
00:18:14.907 [2024-07-15 20:32:07.275389] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:18:14.907 BaseBdev3 00:18:15.167 20:32:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:18:15.167 [2024-07-15 20:32:07.518357] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:15.167 [2024-07-15 20:32:07.519706] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:15.167 [2024-07-15 20:32:07.519776] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:15.167 [2024-07-15 20:32:07.520002] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1658280 00:18:15.167 [2024-07-15 20:32:07.520015] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:18:15.167 [2024-07-15 20:32:07.520218] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1657e20 00:18:15.167 [2024-07-15 20:32:07.520374] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1658280 00:18:15.167 [2024-07-15 20:32:07.520384] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1658280 00:18:15.167 [2024-07-15 20:32:07.520495] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:15.167 20:32:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:18:15.167 20:32:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:15.167 20:32:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:15.167 20:32:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local 
raid_level=raid1 00:18:15.167 20:32:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:15.167 20:32:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:15.167 20:32:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:15.167 20:32:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:15.167 20:32:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:15.167 20:32:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:15.167 20:32:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:15.167 20:32:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:15.426 20:32:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:15.426 "name": "raid_bdev1", 00:18:15.426 "uuid": "f0a10d5c-8aed-46ce-bb46-043883db86fe", 00:18:15.426 "strip_size_kb": 0, 00:18:15.426 "state": "online", 00:18:15.426 "raid_level": "raid1", 00:18:15.426 "superblock": true, 00:18:15.426 "num_base_bdevs": 3, 00:18:15.426 "num_base_bdevs_discovered": 3, 00:18:15.426 "num_base_bdevs_operational": 3, 00:18:15.426 "base_bdevs_list": [ 00:18:15.426 { 00:18:15.426 "name": "BaseBdev1", 00:18:15.426 "uuid": "ec6c137e-4804-512a-a003-bc0478da82da", 00:18:15.426 "is_configured": true, 00:18:15.426 "data_offset": 2048, 00:18:15.426 "data_size": 63488 00:18:15.426 }, 00:18:15.426 { 00:18:15.426 "name": "BaseBdev2", 00:18:15.426 "uuid": "aa4e50e5-0b05-5c91-8ab5-9a172d0e336d", 00:18:15.426 "is_configured": true, 00:18:15.426 "data_offset": 2048, 00:18:15.426 "data_size": 63488 00:18:15.426 }, 00:18:15.426 { 00:18:15.426 "name": "BaseBdev3", 00:18:15.426 "uuid": 
"8dcda26e-4c32-5a60-9984-09da7e9c0b20", 00:18:15.426 "is_configured": true, 00:18:15.426 "data_offset": 2048, 00:18:15.426 "data_size": 63488 00:18:15.426 } 00:18:15.426 ] 00:18:15.426 }' 00:18:15.427 20:32:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:15.427 20:32:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:16.362 20:32:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:18:16.362 20:32:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:18:16.362 [2024-07-15 20:32:08.509283] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x14a5e00 00:18:17.298 20:32:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:18:17.557 20:32:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:18:17.557 20:32:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:18:17.557 20:32:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ read = \w\r\i\t\e ]] 00:18:17.557 20:32:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:18:17.557 20:32:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:18:17.557 20:32:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:17.557 20:32:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:17.557 20:32:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:17.557 20:32:09 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:17.557 20:32:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:17.557 20:32:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:17.557 20:32:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:17.557 20:32:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:17.557 20:32:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:17.557 20:32:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:17.557 20:32:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:17.557 20:32:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:17.557 "name": "raid_bdev1", 00:18:17.557 "uuid": "f0a10d5c-8aed-46ce-bb46-043883db86fe", 00:18:17.557 "strip_size_kb": 0, 00:18:17.557 "state": "online", 00:18:17.557 "raid_level": "raid1", 00:18:17.557 "superblock": true, 00:18:17.557 "num_base_bdevs": 3, 00:18:17.557 "num_base_bdevs_discovered": 3, 00:18:17.557 "num_base_bdevs_operational": 3, 00:18:17.557 "base_bdevs_list": [ 00:18:17.557 { 00:18:17.557 "name": "BaseBdev1", 00:18:17.557 "uuid": "ec6c137e-4804-512a-a003-bc0478da82da", 00:18:17.557 "is_configured": true, 00:18:17.557 "data_offset": 2048, 00:18:17.557 "data_size": 63488 00:18:17.557 }, 00:18:17.557 { 00:18:17.557 "name": "BaseBdev2", 00:18:17.557 "uuid": "aa4e50e5-0b05-5c91-8ab5-9a172d0e336d", 00:18:17.557 "is_configured": true, 00:18:17.557 "data_offset": 2048, 00:18:17.557 "data_size": 63488 00:18:17.557 }, 00:18:17.557 { 00:18:17.557 "name": "BaseBdev3", 00:18:17.557 "uuid": "8dcda26e-4c32-5a60-9984-09da7e9c0b20", 00:18:17.557 "is_configured": true, 
00:18:17.557 "data_offset": 2048, 00:18:17.558 "data_size": 63488 00:18:17.558 } 00:18:17.558 ] 00:18:17.558 }' 00:18:17.558 20:32:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:17.558 20:32:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:18.512 20:32:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:18:18.512 [2024-07-15 20:32:10.764939] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:18.512 [2024-07-15 20:32:10.764976] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:18.512 [2024-07-15 20:32:10.768193] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:18.512 [2024-07-15 20:32:10.768227] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:18.512 [2024-07-15 20:32:10.768326] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:18.512 [2024-07-15 20:32:10.768338] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1658280 name raid_bdev1, state offline 00:18:18.512 0 00:18:18.512 20:32:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1410751 00:18:18.512 20:32:10 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 1410751 ']' 00:18:18.512 20:32:10 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 1410751 00:18:18.512 20:32:10 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:18:18.512 20:32:10 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:18:18.512 20:32:10 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1410751 00:18:18.512 20:32:10 
bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:18:18.512 20:32:10 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:18:18.512 20:32:10 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1410751' 00:18:18.512 killing process with pid 1410751 00:18:18.512 20:32:10 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 1410751 00:18:18.512 [2024-07-15 20:32:10.848687] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:18.512 20:32:10 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 1410751 00:18:18.512 [2024-07-15 20:32:10.869043] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:18.770 20:32:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:18:18.770 20:32:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.q445xw98r2 00:18:18.770 20:32:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:18:18.770 20:32:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:18:18.770 20:32:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:18:18.771 20:32:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:18:18.771 20:32:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:18:18.771 20:32:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:18:18.771 00:18:18.771 real 0m7.048s 00:18:18.771 user 0m11.188s 00:18:18.771 sys 0m1.207s 00:18:18.771 20:32:11 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:18:18.771 20:32:11 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:18.771 ************************************ 00:18:18.771 END TEST 
raid_read_error_test 00:18:18.771 ************************************ 00:18:18.771 20:32:11 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:18:18.771 20:32:11 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid1 3 write 00:18:18.771 20:32:11 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:18:18.771 20:32:11 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:18:18.771 20:32:11 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:19.030 ************************************ 00:18:19.030 START TEST raid_write_error_test 00:18:19.030 ************************************ 00:18:19.030 20:32:11 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 3 write 00:18:19.030 20:32:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:18:19.030 20:32:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:18:19.030 20:32:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:18:19.030 20:32:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:18:19.030 20:32:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:19.030 20:32:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:18:19.030 20:32:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:19.030 20:32:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:19.030 20:32:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:18:19.030 20:32:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:19.030 20:32:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:19.030 20:32:11 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:18:19.030 20:32:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:19.030 20:32:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:19.030 20:32:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:18:19.030 20:32:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:18:19.030 20:32:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:18:19.030 20:32:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:18:19.030 20:32:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:18:19.030 20:32:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:18:19.030 20:32:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:18:19.030 20:32:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:18:19.030 20:32:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:18:19.030 20:32:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:18:19.030 20:32:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.Aa1NFmq0RE 00:18:19.030 20:32:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1411731 00:18:19.030 20:32:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:18:19.030 20:32:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1411731 /var/tmp/spdk-raid.sock 00:18:19.030 20:32:11 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@829 -- # '[' -z 1411731 ']' 00:18:19.030 20:32:11 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:19.030 20:32:11 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:18:19.030 20:32:11 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:19.030 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:19.030 20:32:11 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:18:19.030 20:32:11 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:19.030 [2024-07-15 20:32:11.263956] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 00:18:19.030 [2024-07-15 20:32:11.264046] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1411731 ] 00:18:19.030 [2024-07-15 20:32:11.406378] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:19.289 [2024-07-15 20:32:11.509138] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:18:19.289 [2024-07-15 20:32:11.571624] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:19.289 [2024-07-15 20:32:11.571673] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:19.856 20:32:12 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:18:19.856 20:32:12 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:18:19.856 20:32:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:19.856 
20:32:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:18:20.114 BaseBdev1_malloc 00:18:20.114 20:32:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:18:20.391 true 00:18:20.391 20:32:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:18:20.649 [2024-07-15 20:32:12.909476] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:18:20.649 [2024-07-15 20:32:12.909521] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:20.649 [2024-07-15 20:32:12.909542] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26e20d0 00:18:20.649 [2024-07-15 20:32:12.909554] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:20.649 [2024-07-15 20:32:12.911481] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:20.649 [2024-07-15 20:32:12.911511] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:18:20.649 BaseBdev1 00:18:20.650 20:32:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:20.650 20:32:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:18:20.908 BaseBdev2_malloc 00:18:20.908 20:32:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_error_create BaseBdev2_malloc 00:18:21.167 true 00:18:21.167 20:32:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:18:21.444 [2024-07-15 20:32:13.648043] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:18:21.444 [2024-07-15 20:32:13.648087] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:21.444 [2024-07-15 20:32:13.648110] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26e6910 00:18:21.444 [2024-07-15 20:32:13.648122] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:21.444 [2024-07-15 20:32:13.649751] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:21.444 [2024-07-15 20:32:13.649779] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:18:21.444 BaseBdev2 00:18:21.444 20:32:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:21.444 20:32:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:18:21.717 BaseBdev3_malloc 00:18:21.717 20:32:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:18:21.976 true 00:18:21.976 20:32:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:18:22.234 [2024-07-15 20:32:14.379827] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 
00:18:22.234 [2024-07-15 20:32:14.379872] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:22.234 [2024-07-15 20:32:14.379894] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26e8bd0 00:18:22.234 [2024-07-15 20:32:14.379907] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:22.234 [2024-07-15 20:32:14.381532] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:22.234 [2024-07-15 20:32:14.381559] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:18:22.234 BaseBdev3 00:18:22.235 20:32:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:18:22.493 [2024-07-15 20:32:14.624553] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:22.493 [2024-07-15 20:32:14.625924] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:22.493 [2024-07-15 20:32:14.626005] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:22.493 [2024-07-15 20:32:14.626221] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x26ea280 00:18:22.493 [2024-07-15 20:32:14.626233] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:18:22.493 [2024-07-15 20:32:14.626431] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x26e9e20 00:18:22.493 [2024-07-15 20:32:14.626585] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x26ea280 00:18:22.493 [2024-07-15 20:32:14.626595] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x26ea280 00:18:22.493 [2024-07-15 20:32:14.626703] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: 
raid_bdev_destroy_cb 00:18:22.493 20:32:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:18:22.493 20:32:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:22.493 20:32:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:22.493 20:32:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:22.493 20:32:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:22.493 20:32:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:22.493 20:32:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:22.493 20:32:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:22.493 20:32:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:22.493 20:32:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:22.493 20:32:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:22.493 20:32:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:22.752 20:32:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:22.752 "name": "raid_bdev1", 00:18:22.752 "uuid": "0f4ac7ea-4253-497d-8dea-15de7022735a", 00:18:22.752 "strip_size_kb": 0, 00:18:22.752 "state": "online", 00:18:22.752 "raid_level": "raid1", 00:18:22.752 "superblock": true, 00:18:22.752 "num_base_bdevs": 3, 00:18:22.752 "num_base_bdevs_discovered": 3, 00:18:22.752 "num_base_bdevs_operational": 3, 00:18:22.752 "base_bdevs_list": [ 00:18:22.752 { 00:18:22.752 "name": "BaseBdev1", 
00:18:22.752 "uuid": "48c57fba-1f16-5f34-b0cd-a3c2f79c2613", 00:18:22.752 "is_configured": true, 00:18:22.752 "data_offset": 2048, 00:18:22.752 "data_size": 63488 00:18:22.752 }, 00:18:22.752 { 00:18:22.752 "name": "BaseBdev2", 00:18:22.753 "uuid": "b7c86f7a-ed88-5ec0-af1b-c372b8747e8e", 00:18:22.753 "is_configured": true, 00:18:22.753 "data_offset": 2048, 00:18:22.753 "data_size": 63488 00:18:22.753 }, 00:18:22.753 { 00:18:22.753 "name": "BaseBdev3", 00:18:22.753 "uuid": "10569cc2-91dc-5db0-9d50-0bec29a3f4e0", 00:18:22.753 "is_configured": true, 00:18:22.753 "data_offset": 2048, 00:18:22.753 "data_size": 63488 00:18:22.753 } 00:18:22.753 ] 00:18:22.753 }' 00:18:22.753 20:32:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:22.753 20:32:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:23.320 20:32:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:18:23.320 20:32:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:18:23.320 [2024-07-15 20:32:15.587385] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2537e00 00:18:24.258 20:32:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:18:24.517 [2024-07-15 20:32:16.671260] bdev_raid.c:2221:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1' 00:18:24.517 [2024-07-15 20:32:16.671319] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:18:24.517 [2024-07-15 20:32:16.671514] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x2537e00 00:18:24.517 20:32:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # 
local expected_num_base_bdevs 00:18:24.517 20:32:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:18:24.517 20:32:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ write = \w\r\i\t\e ]] 00:18:24.517 20:32:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # expected_num_base_bdevs=2 00:18:24.517 20:32:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:18:24.517 20:32:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:24.517 20:32:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:24.517 20:32:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:24.517 20:32:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:24.517 20:32:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:18:24.517 20:32:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:24.517 20:32:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:24.517 20:32:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:24.517 20:32:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:24.517 20:32:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:24.517 20:32:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:24.517 20:32:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:24.517 "name": "raid_bdev1", 00:18:24.517 "uuid": "0f4ac7ea-4253-497d-8dea-15de7022735a", 
00:18:24.517 "strip_size_kb": 0, 00:18:24.517 "state": "online", 00:18:24.517 "raid_level": "raid1", 00:18:24.517 "superblock": true, 00:18:24.517 "num_base_bdevs": 3, 00:18:24.517 "num_base_bdevs_discovered": 2, 00:18:24.517 "num_base_bdevs_operational": 2, 00:18:24.517 "base_bdevs_list": [ 00:18:24.517 { 00:18:24.517 "name": null, 00:18:24.517 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:24.517 "is_configured": false, 00:18:24.517 "data_offset": 2048, 00:18:24.517 "data_size": 63488 00:18:24.517 }, 00:18:24.517 { 00:18:24.517 "name": "BaseBdev2", 00:18:24.517 "uuid": "b7c86f7a-ed88-5ec0-af1b-c372b8747e8e", 00:18:24.517 "is_configured": true, 00:18:24.517 "data_offset": 2048, 00:18:24.517 "data_size": 63488 00:18:24.517 }, 00:18:24.517 { 00:18:24.517 "name": "BaseBdev3", 00:18:24.517 "uuid": "10569cc2-91dc-5db0-9d50-0bec29a3f4e0", 00:18:24.517 "is_configured": true, 00:18:24.517 "data_offset": 2048, 00:18:24.517 "data_size": 63488 00:18:24.517 } 00:18:24.517 ] 00:18:24.517 }' 00:18:24.518 20:32:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:24.518 20:32:16 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:25.455 20:32:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:18:25.455 [2024-07-15 20:32:17.644975] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:25.455 [2024-07-15 20:32:17.645015] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:25.455 [2024-07-15 20:32:17.648146] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:25.455 [2024-07-15 20:32:17.648177] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:25.455 [2024-07-15 20:32:17.648251] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 
0, going to free all in destruct 00:18:25.455 [2024-07-15 20:32:17.648263] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x26ea280 name raid_bdev1, state offline 00:18:25.455 0 00:18:25.455 20:32:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1411731 00:18:25.455 20:32:17 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 1411731 ']' 00:18:25.455 20:32:17 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 1411731 00:18:25.455 20:32:17 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:18:25.455 20:32:17 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:18:25.455 20:32:17 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1411731 00:18:25.455 20:32:17 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:18:25.455 20:32:17 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:18:25.455 20:32:17 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1411731' 00:18:25.455 killing process with pid 1411731 00:18:25.455 20:32:17 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 1411731 00:18:25.455 [2024-07-15 20:32:17.729034] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:25.455 20:32:17 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 1411731 00:18:25.455 [2024-07-15 20:32:17.749866] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:25.715 20:32:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.Aa1NFmq0RE 00:18:25.715 20:32:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:18:25.715 20:32:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk 
'{print $6}' 00:18:25.715 20:32:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:18:25.715 20:32:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:18:25.715 20:32:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:18:25.715 20:32:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:18:25.715 20:32:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:18:25.715 00:18:25.715 real 0m6.798s 00:18:25.715 user 0m10.665s 00:18:25.715 sys 0m1.226s 00:18:25.715 20:32:17 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:18:25.715 20:32:17 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:25.715 ************************************ 00:18:25.715 END TEST raid_write_error_test 00:18:25.715 ************************************ 00:18:25.715 20:32:18 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:18:25.715 20:32:18 bdev_raid -- bdev/bdev_raid.sh@865 -- # for n in {2..4} 00:18:25.715 20:32:18 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:18:25.715 20:32:18 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid0 4 false 00:18:25.715 20:32:18 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:18:25.715 20:32:18 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:18:25.715 20:32:18 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:25.715 ************************************ 00:18:25.715 START TEST raid_state_function_test 00:18:25.715 ************************************ 00:18:25.715 20:32:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 4 false 00:18:25.715 20:32:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:18:25.715 
20:32:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:18:25.715 20:32:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:18:25.715 20:32:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:18:25.715 20:32:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:18:25.715 20:32:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:25.715 20:32:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:18:25.715 20:32:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:25.715 20:32:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:25.715 20:32:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:18:25.715 20:32:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:25.715 20:32:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:25.715 20:32:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:18:25.715 20:32:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:25.715 20:32:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:25.715 20:32:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:18:25.715 20:32:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:25.715 20:32:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:25.715 20:32:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:18:25.715 20:32:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 
-- # local base_bdevs 00:18:25.715 20:32:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:18:25.715 20:32:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:18:25.715 20:32:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:18:25.715 20:32:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:18:25.715 20:32:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:18:25.715 20:32:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:18:25.715 20:32:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:18:25.715 20:32:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:18:25.715 20:32:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:18:25.715 20:32:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=1412708 00:18:25.715 20:32:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1412708' 00:18:25.715 Process raid pid: 1412708 00:18:25.715 20:32:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:18:25.715 20:32:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 1412708 /var/tmp/spdk-raid.sock 00:18:25.715 20:32:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 1412708 ']' 00:18:25.715 20:32:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:25.715 20:32:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 
00:18:25.716 20:32:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:25.716 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:25.716 20:32:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:18:25.716 20:32:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:25.975 [2024-07-15 20:32:18.148236] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 00:18:25.975 [2024-07-15 20:32:18.148303] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:18:25.975 [2024-07-15 20:32:18.277032] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:26.233 [2024-07-15 20:32:18.375329] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:18:26.233 [2024-07-15 20:32:18.434377] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:26.233 [2024-07-15 20:32:18.434411] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:27.168 20:32:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:18:27.168 20:32:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:18:27.168 20:32:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:27.427 [2024-07-15 20:32:19.571859] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:27.427 [2024-07-15 
20:32:19.571902] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:27.427 [2024-07-15 20:32:19.571913] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:18:27.427 [2024-07-15 20:32:19.571931] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:18:27.427 [2024-07-15 20:32:19.571940] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:18:27.427 [2024-07-15 20:32:19.571951] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:18:27.427 [2024-07-15 20:32:19.571960] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:18:27.427 [2024-07-15 20:32:19.571971] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:18:27.427 20:32:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:27.427 20:32:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:27.427 20:32:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:27.427 20:32:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:27.427 20:32:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:27.428 20:32:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:27.428 20:32:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:27.428 20:32:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:27.428 20:32:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:27.428 20:32:19 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:27.428 20:32:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:27.428 20:32:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:27.687 20:32:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:27.687 "name": "Existed_Raid", 00:18:27.687 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:27.687 "strip_size_kb": 64, 00:18:27.687 "state": "configuring", 00:18:27.687 "raid_level": "raid0", 00:18:27.687 "superblock": false, 00:18:27.687 "num_base_bdevs": 4, 00:18:27.687 "num_base_bdevs_discovered": 0, 00:18:27.687 "num_base_bdevs_operational": 4, 00:18:27.687 "base_bdevs_list": [ 00:18:27.687 { 00:18:27.687 "name": "BaseBdev1", 00:18:27.687 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:27.687 "is_configured": false, 00:18:27.687 "data_offset": 0, 00:18:27.687 "data_size": 0 00:18:27.687 }, 00:18:27.687 { 00:18:27.687 "name": "BaseBdev2", 00:18:27.687 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:27.687 "is_configured": false, 00:18:27.687 "data_offset": 0, 00:18:27.687 "data_size": 0 00:18:27.687 }, 00:18:27.687 { 00:18:27.687 "name": "BaseBdev3", 00:18:27.687 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:27.687 "is_configured": false, 00:18:27.687 "data_offset": 0, 00:18:27.687 "data_size": 0 00:18:27.687 }, 00:18:27.687 { 00:18:27.687 "name": "BaseBdev4", 00:18:27.687 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:27.687 "is_configured": false, 00:18:27.687 "data_offset": 0, 00:18:27.687 "data_size": 0 00:18:27.687 } 00:18:27.687 ] 00:18:27.687 }' 00:18:27.687 20:32:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:27.687 20:32:19 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@10 -- # set +x 00:18:28.256 20:32:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:28.515 [2024-07-15 20:32:20.646569] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:28.515 [2024-07-15 20:32:20.646602] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xba2aa0 name Existed_Raid, state configuring 00:18:28.515 20:32:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:28.774 [2024-07-15 20:32:21.143890] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:28.774 [2024-07-15 20:32:21.143923] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:28.775 [2024-07-15 20:32:21.143938] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:18:28.775 [2024-07-15 20:32:21.143950] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:18:28.775 [2024-07-15 20:32:21.143959] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:18:28.775 [2024-07-15 20:32:21.143970] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:18:28.775 [2024-07-15 20:32:21.143978] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:18:28.775 [2024-07-15 20:32:21.143997] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:18:29.034 20:32:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:18:29.034 [2024-07-15 20:32:21.411592] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:29.034 BaseBdev1 00:18:29.294 20:32:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:18:29.294 20:32:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:18:29.294 20:32:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:29.294 20:32:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:18:29.294 20:32:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:29.294 20:32:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:29.294 20:32:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:29.862 20:32:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:18:29.862 [ 00:18:29.862 { 00:18:29.862 "name": "BaseBdev1", 00:18:29.862 "aliases": [ 00:18:29.862 "75fd06ad-d3fb-4226-8cd6-bc87e4b4645d" 00:18:29.862 ], 00:18:29.862 "product_name": "Malloc disk", 00:18:29.862 "block_size": 512, 00:18:29.862 "num_blocks": 65536, 00:18:29.862 "uuid": "75fd06ad-d3fb-4226-8cd6-bc87e4b4645d", 00:18:29.862 "assigned_rate_limits": { 00:18:29.862 "rw_ios_per_sec": 0, 00:18:29.862 "rw_mbytes_per_sec": 0, 00:18:29.862 "r_mbytes_per_sec": 0, 00:18:29.862 "w_mbytes_per_sec": 0 00:18:29.862 }, 00:18:29.862 "claimed": true, 00:18:29.862 "claim_type": "exclusive_write", 00:18:29.862 "zoned": false, 00:18:29.862 "supported_io_types": { 00:18:29.862 "read": 
true, 00:18:29.862 "write": true, 00:18:29.862 "unmap": true, 00:18:29.862 "flush": true, 00:18:29.862 "reset": true, 00:18:29.862 "nvme_admin": false, 00:18:29.862 "nvme_io": false, 00:18:29.862 "nvme_io_md": false, 00:18:29.862 "write_zeroes": true, 00:18:29.862 "zcopy": true, 00:18:29.862 "get_zone_info": false, 00:18:29.862 "zone_management": false, 00:18:29.862 "zone_append": false, 00:18:29.862 "compare": false, 00:18:29.862 "compare_and_write": false, 00:18:29.862 "abort": true, 00:18:29.862 "seek_hole": false, 00:18:29.862 "seek_data": false, 00:18:29.862 "copy": true, 00:18:29.862 "nvme_iov_md": false 00:18:29.862 }, 00:18:29.862 "memory_domains": [ 00:18:29.862 { 00:18:29.862 "dma_device_id": "system", 00:18:29.862 "dma_device_type": 1 00:18:29.862 }, 00:18:29.862 { 00:18:29.862 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:29.862 "dma_device_type": 2 00:18:29.862 } 00:18:29.862 ], 00:18:29.862 "driver_specific": {} 00:18:29.862 } 00:18:29.862 ] 00:18:29.862 20:32:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:18:29.862 20:32:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:29.862 20:32:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:29.862 20:32:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:29.862 20:32:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:29.862 20:32:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:29.862 20:32:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:29.862 20:32:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:29.862 20:32:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # 
local num_base_bdevs 00:18:29.862 20:32:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:29.862 20:32:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:29.862 20:32:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:29.862 20:32:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:30.122 20:32:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:30.122 "name": "Existed_Raid", 00:18:30.122 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:30.122 "strip_size_kb": 64, 00:18:30.122 "state": "configuring", 00:18:30.122 "raid_level": "raid0", 00:18:30.122 "superblock": false, 00:18:30.122 "num_base_bdevs": 4, 00:18:30.122 "num_base_bdevs_discovered": 1, 00:18:30.122 "num_base_bdevs_operational": 4, 00:18:30.122 "base_bdevs_list": [ 00:18:30.122 { 00:18:30.122 "name": "BaseBdev1", 00:18:30.122 "uuid": "75fd06ad-d3fb-4226-8cd6-bc87e4b4645d", 00:18:30.122 "is_configured": true, 00:18:30.122 "data_offset": 0, 00:18:30.122 "data_size": 65536 00:18:30.122 }, 00:18:30.122 { 00:18:30.122 "name": "BaseBdev2", 00:18:30.122 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:30.122 "is_configured": false, 00:18:30.122 "data_offset": 0, 00:18:30.122 "data_size": 0 00:18:30.122 }, 00:18:30.122 { 00:18:30.122 "name": "BaseBdev3", 00:18:30.122 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:30.122 "is_configured": false, 00:18:30.122 "data_offset": 0, 00:18:30.122 "data_size": 0 00:18:30.122 }, 00:18:30.122 { 00:18:30.122 "name": "BaseBdev4", 00:18:30.122 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:30.122 "is_configured": false, 00:18:30.122 "data_offset": 0, 00:18:30.122 "data_size": 0 00:18:30.122 } 00:18:30.122 ] 00:18:30.122 }' 
00:18:30.122 20:32:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:30.122 20:32:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:30.690 20:32:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:31.257 [2024-07-15 20:32:23.477086] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:31.257 [2024-07-15 20:32:23.477128] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xba2310 name Existed_Raid, state configuring 00:18:31.257 20:32:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:31.516 [2024-07-15 20:32:23.733796] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:31.516 [2024-07-15 20:32:23.735289] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:18:31.516 [2024-07-15 20:32:23.735322] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:18:31.516 [2024-07-15 20:32:23.735332] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:18:31.516 [2024-07-15 20:32:23.735344] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:18:31.516 [2024-07-15 20:32:23.735353] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:18:31.516 [2024-07-15 20:32:23.735364] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:18:31.516 20:32:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:18:31.516 20:32:23 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:31.516 20:32:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:31.516 20:32:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:31.516 20:32:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:31.516 20:32:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:31.516 20:32:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:31.516 20:32:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:31.516 20:32:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:31.516 20:32:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:31.516 20:32:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:31.516 20:32:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:31.516 20:32:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:31.516 20:32:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:31.775 20:32:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:31.775 "name": "Existed_Raid", 00:18:31.775 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:31.775 "strip_size_kb": 64, 00:18:31.775 "state": "configuring", 00:18:31.775 "raid_level": "raid0", 00:18:31.775 "superblock": false, 00:18:31.775 "num_base_bdevs": 4, 00:18:31.775 
"num_base_bdevs_discovered": 1, 00:18:31.775 "num_base_bdevs_operational": 4, 00:18:31.775 "base_bdevs_list": [ 00:18:31.775 { 00:18:31.775 "name": "BaseBdev1", 00:18:31.775 "uuid": "75fd06ad-d3fb-4226-8cd6-bc87e4b4645d", 00:18:31.775 "is_configured": true, 00:18:31.775 "data_offset": 0, 00:18:31.775 "data_size": 65536 00:18:31.775 }, 00:18:31.775 { 00:18:31.775 "name": "BaseBdev2", 00:18:31.775 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:31.775 "is_configured": false, 00:18:31.775 "data_offset": 0, 00:18:31.775 "data_size": 0 00:18:31.775 }, 00:18:31.775 { 00:18:31.775 "name": "BaseBdev3", 00:18:31.775 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:31.775 "is_configured": false, 00:18:31.775 "data_offset": 0, 00:18:31.775 "data_size": 0 00:18:31.775 }, 00:18:31.775 { 00:18:31.775 "name": "BaseBdev4", 00:18:31.775 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:31.775 "is_configured": false, 00:18:31.775 "data_offset": 0, 00:18:31.775 "data_size": 0 00:18:31.775 } 00:18:31.775 ] 00:18:31.775 }' 00:18:31.775 20:32:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:31.775 20:32:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:32.343 20:32:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:18:32.602 [2024-07-15 20:32:24.731878] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:32.602 BaseBdev2 00:18:32.602 20:32:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:18:32.602 20:32:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:18:32.602 20:32:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:32.602 20:32:24 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:18:32.602 20:32:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:32.602 20:32:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:32.602 20:32:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:32.861 20:32:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:18:32.861 [ 00:18:32.861 { 00:18:32.861 "name": "BaseBdev2", 00:18:32.861 "aliases": [ 00:18:32.861 "366be763-61e7-44dc-9006-1df9314e3880" 00:18:32.861 ], 00:18:32.861 "product_name": "Malloc disk", 00:18:32.861 "block_size": 512, 00:18:32.861 "num_blocks": 65536, 00:18:32.861 "uuid": "366be763-61e7-44dc-9006-1df9314e3880", 00:18:32.861 "assigned_rate_limits": { 00:18:32.861 "rw_ios_per_sec": 0, 00:18:32.861 "rw_mbytes_per_sec": 0, 00:18:32.861 "r_mbytes_per_sec": 0, 00:18:32.861 "w_mbytes_per_sec": 0 00:18:32.861 }, 00:18:32.861 "claimed": true, 00:18:32.861 "claim_type": "exclusive_write", 00:18:32.861 "zoned": false, 00:18:32.861 "supported_io_types": { 00:18:32.861 "read": true, 00:18:32.861 "write": true, 00:18:32.861 "unmap": true, 00:18:32.861 "flush": true, 00:18:32.861 "reset": true, 00:18:32.861 "nvme_admin": false, 00:18:32.861 "nvme_io": false, 00:18:32.861 "nvme_io_md": false, 00:18:32.861 "write_zeroes": true, 00:18:32.861 "zcopy": true, 00:18:32.861 "get_zone_info": false, 00:18:32.861 "zone_management": false, 00:18:32.861 "zone_append": false, 00:18:32.861 "compare": false, 00:18:32.861 "compare_and_write": false, 00:18:32.861 "abort": true, 00:18:32.861 "seek_hole": false, 00:18:32.861 "seek_data": false, 00:18:32.861 "copy": 
true, 00:18:32.861 "nvme_iov_md": false 00:18:32.861 }, 00:18:32.861 "memory_domains": [ 00:18:32.861 { 00:18:32.861 "dma_device_id": "system", 00:18:32.861 "dma_device_type": 1 00:18:32.861 }, 00:18:32.862 { 00:18:32.862 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:32.862 "dma_device_type": 2 00:18:32.862 } 00:18:32.862 ], 00:18:32.862 "driver_specific": {} 00:18:32.862 } 00:18:32.862 ] 00:18:33.120 20:32:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:18:33.120 20:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:18:33.120 20:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:33.120 20:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:33.120 20:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:33.120 20:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:33.120 20:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:33.120 20:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:33.120 20:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:33.120 20:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:33.120 20:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:33.120 20:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:33.120 20:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:33.120 20:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:33.120 20:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:33.120 20:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:33.120 "name": "Existed_Raid", 00:18:33.120 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:33.120 "strip_size_kb": 64, 00:18:33.120 "state": "configuring", 00:18:33.120 "raid_level": "raid0", 00:18:33.120 "superblock": false, 00:18:33.120 "num_base_bdevs": 4, 00:18:33.120 "num_base_bdevs_discovered": 2, 00:18:33.120 "num_base_bdevs_operational": 4, 00:18:33.120 "base_bdevs_list": [ 00:18:33.120 { 00:18:33.120 "name": "BaseBdev1", 00:18:33.120 "uuid": "75fd06ad-d3fb-4226-8cd6-bc87e4b4645d", 00:18:33.120 "is_configured": true, 00:18:33.120 "data_offset": 0, 00:18:33.120 "data_size": 65536 00:18:33.120 }, 00:18:33.120 { 00:18:33.120 "name": "BaseBdev2", 00:18:33.120 "uuid": "366be763-61e7-44dc-9006-1df9314e3880", 00:18:33.120 "is_configured": true, 00:18:33.120 "data_offset": 0, 00:18:33.120 "data_size": 65536 00:18:33.120 }, 00:18:33.120 { 00:18:33.120 "name": "BaseBdev3", 00:18:33.120 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:33.120 "is_configured": false, 00:18:33.120 "data_offset": 0, 00:18:33.120 "data_size": 0 00:18:33.120 }, 00:18:33.120 { 00:18:33.120 "name": "BaseBdev4", 00:18:33.120 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:33.120 "is_configured": false, 00:18:33.120 "data_offset": 0, 00:18:33.120 "data_size": 0 00:18:33.120 } 00:18:33.120 ] 00:18:33.120 }' 00:18:33.120 20:32:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:33.120 20:32:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:34.058 20:32:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:18:34.058 [2024-07-15 20:32:26.247266] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:34.058 BaseBdev3 00:18:34.058 20:32:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:18:34.058 20:32:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:18:34.058 20:32:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:34.058 20:32:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:18:34.058 20:32:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:34.058 20:32:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:34.058 20:32:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:34.316 20:32:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:18:34.574 [ 00:18:34.574 { 00:18:34.574 "name": "BaseBdev3", 00:18:34.574 "aliases": [ 00:18:34.574 "21209c3d-a3ba-4654-913c-9ebd970e158f" 00:18:34.574 ], 00:18:34.574 "product_name": "Malloc disk", 00:18:34.574 "block_size": 512, 00:18:34.574 "num_blocks": 65536, 00:18:34.574 "uuid": "21209c3d-a3ba-4654-913c-9ebd970e158f", 00:18:34.575 "assigned_rate_limits": { 00:18:34.575 "rw_ios_per_sec": 0, 00:18:34.575 "rw_mbytes_per_sec": 0, 00:18:34.575 "r_mbytes_per_sec": 0, 00:18:34.575 "w_mbytes_per_sec": 0 00:18:34.575 }, 00:18:34.575 "claimed": true, 00:18:34.575 "claim_type": "exclusive_write", 00:18:34.575 "zoned": 
false, 00:18:34.575 "supported_io_types": { 00:18:34.575 "read": true, 00:18:34.575 "write": true, 00:18:34.575 "unmap": true, 00:18:34.575 "flush": true, 00:18:34.575 "reset": true, 00:18:34.575 "nvme_admin": false, 00:18:34.575 "nvme_io": false, 00:18:34.575 "nvme_io_md": false, 00:18:34.575 "write_zeroes": true, 00:18:34.575 "zcopy": true, 00:18:34.575 "get_zone_info": false, 00:18:34.575 "zone_management": false, 00:18:34.575 "zone_append": false, 00:18:34.575 "compare": false, 00:18:34.575 "compare_and_write": false, 00:18:34.575 "abort": true, 00:18:34.575 "seek_hole": false, 00:18:34.575 "seek_data": false, 00:18:34.575 "copy": true, 00:18:34.575 "nvme_iov_md": false 00:18:34.575 }, 00:18:34.575 "memory_domains": [ 00:18:34.575 { 00:18:34.575 "dma_device_id": "system", 00:18:34.575 "dma_device_type": 1 00:18:34.575 }, 00:18:34.575 { 00:18:34.575 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:34.575 "dma_device_type": 2 00:18:34.575 } 00:18:34.575 ], 00:18:34.575 "driver_specific": {} 00:18:34.575 } 00:18:34.575 ] 00:18:34.575 20:32:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:18:34.575 20:32:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:18:34.575 20:32:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:34.575 20:32:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:34.575 20:32:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:34.575 20:32:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:34.575 20:32:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:34.575 20:32:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:34.575 20:32:26 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:34.575 20:32:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:34.575 20:32:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:34.575 20:32:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:34.575 20:32:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:34.575 20:32:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:34.575 20:32:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:34.833 20:32:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:34.833 "name": "Existed_Raid", 00:18:34.833 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:34.833 "strip_size_kb": 64, 00:18:34.833 "state": "configuring", 00:18:34.833 "raid_level": "raid0", 00:18:34.833 "superblock": false, 00:18:34.833 "num_base_bdevs": 4, 00:18:34.833 "num_base_bdevs_discovered": 3, 00:18:34.833 "num_base_bdevs_operational": 4, 00:18:34.833 "base_bdevs_list": [ 00:18:34.833 { 00:18:34.833 "name": "BaseBdev1", 00:18:34.833 "uuid": "75fd06ad-d3fb-4226-8cd6-bc87e4b4645d", 00:18:34.833 "is_configured": true, 00:18:34.833 "data_offset": 0, 00:18:34.833 "data_size": 65536 00:18:34.833 }, 00:18:34.833 { 00:18:34.833 "name": "BaseBdev2", 00:18:34.833 "uuid": "366be763-61e7-44dc-9006-1df9314e3880", 00:18:34.833 "is_configured": true, 00:18:34.833 "data_offset": 0, 00:18:34.833 "data_size": 65536 00:18:34.833 }, 00:18:34.833 { 00:18:34.833 "name": "BaseBdev3", 00:18:34.833 "uuid": "21209c3d-a3ba-4654-913c-9ebd970e158f", 00:18:34.833 "is_configured": true, 00:18:34.833 "data_offset": 0, 
00:18:34.833 "data_size": 65536 00:18:34.833 }, 00:18:34.833 { 00:18:34.833 "name": "BaseBdev4", 00:18:34.833 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:34.833 "is_configured": false, 00:18:34.833 "data_offset": 0, 00:18:34.833 "data_size": 0 00:18:34.833 } 00:18:34.833 ] 00:18:34.833 }' 00:18:34.833 20:32:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:34.833 20:32:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:35.400 20:32:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:18:35.400 [2024-07-15 20:32:27.762730] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:18:35.400 [2024-07-15 20:32:27.762767] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xba3350 00:18:35.400 [2024-07-15 20:32:27.762775] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:18:35.400 [2024-07-15 20:32:27.763031] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xba3020 00:18:35.400 [2024-07-15 20:32:27.763150] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xba3350 00:18:35.400 [2024-07-15 20:32:27.763160] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xba3350 00:18:35.400 [2024-07-15 20:32:27.763321] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:35.400 BaseBdev4 00:18:35.659 20:32:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:18:35.659 20:32:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:18:35.659 20:32:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:35.659 20:32:27 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:18:35.659 20:32:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:35.659 20:32:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:35.659 20:32:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:35.920 20:32:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:18:35.920 [ 00:18:35.920 { 00:18:35.920 "name": "BaseBdev4", 00:18:35.920 "aliases": [ 00:18:35.920 "491bd7b7-dd6e-4921-8727-b36c81b1774d" 00:18:35.920 ], 00:18:35.920 "product_name": "Malloc disk", 00:18:35.920 "block_size": 512, 00:18:35.920 "num_blocks": 65536, 00:18:35.920 "uuid": "491bd7b7-dd6e-4921-8727-b36c81b1774d", 00:18:35.920 "assigned_rate_limits": { 00:18:35.920 "rw_ios_per_sec": 0, 00:18:35.920 "rw_mbytes_per_sec": 0, 00:18:35.920 "r_mbytes_per_sec": 0, 00:18:35.920 "w_mbytes_per_sec": 0 00:18:35.920 }, 00:18:35.920 "claimed": true, 00:18:35.920 "claim_type": "exclusive_write", 00:18:35.920 "zoned": false, 00:18:35.920 "supported_io_types": { 00:18:35.920 "read": true, 00:18:35.920 "write": true, 00:18:35.920 "unmap": true, 00:18:35.920 "flush": true, 00:18:35.920 "reset": true, 00:18:35.920 "nvme_admin": false, 00:18:35.920 "nvme_io": false, 00:18:35.920 "nvme_io_md": false, 00:18:35.920 "write_zeroes": true, 00:18:35.920 "zcopy": true, 00:18:35.920 "get_zone_info": false, 00:18:35.920 "zone_management": false, 00:18:35.920 "zone_append": false, 00:18:35.920 "compare": false, 00:18:35.920 "compare_and_write": false, 00:18:35.920 "abort": true, 00:18:35.920 "seek_hole": false, 00:18:35.920 "seek_data": false, 00:18:35.920 "copy": 
true, 00:18:35.920 "nvme_iov_md": false 00:18:35.920 }, 00:18:35.920 "memory_domains": [ 00:18:35.920 { 00:18:35.920 "dma_device_id": "system", 00:18:35.920 "dma_device_type": 1 00:18:35.920 }, 00:18:35.920 { 00:18:35.920 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:35.920 "dma_device_type": 2 00:18:35.920 } 00:18:35.920 ], 00:18:35.920 "driver_specific": {} 00:18:35.920 } 00:18:35.920 ] 00:18:36.221 20:32:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:18:36.221 20:32:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:18:36.221 20:32:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:36.221 20:32:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:18:36.221 20:32:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:36.221 20:32:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:36.221 20:32:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:36.221 20:32:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:36.221 20:32:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:36.221 20:32:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:36.221 20:32:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:36.221 20:32:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:36.221 20:32:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:36.221 20:32:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:36.221 20:32:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:36.221 20:32:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:36.221 "name": "Existed_Raid", 00:18:36.221 "uuid": "50e712e6-4918-4f0c-a590-270d359cf70e", 00:18:36.221 "strip_size_kb": 64, 00:18:36.221 "state": "online", 00:18:36.221 "raid_level": "raid0", 00:18:36.221 "superblock": false, 00:18:36.221 "num_base_bdevs": 4, 00:18:36.221 "num_base_bdevs_discovered": 4, 00:18:36.221 "num_base_bdevs_operational": 4, 00:18:36.221 "base_bdevs_list": [ 00:18:36.221 { 00:18:36.221 "name": "BaseBdev1", 00:18:36.221 "uuid": "75fd06ad-d3fb-4226-8cd6-bc87e4b4645d", 00:18:36.221 "is_configured": true, 00:18:36.221 "data_offset": 0, 00:18:36.221 "data_size": 65536 00:18:36.221 }, 00:18:36.221 { 00:18:36.221 "name": "BaseBdev2", 00:18:36.221 "uuid": "366be763-61e7-44dc-9006-1df9314e3880", 00:18:36.221 "is_configured": true, 00:18:36.221 "data_offset": 0, 00:18:36.221 "data_size": 65536 00:18:36.221 }, 00:18:36.221 { 00:18:36.221 "name": "BaseBdev3", 00:18:36.221 "uuid": "21209c3d-a3ba-4654-913c-9ebd970e158f", 00:18:36.221 "is_configured": true, 00:18:36.221 "data_offset": 0, 00:18:36.221 "data_size": 65536 00:18:36.221 }, 00:18:36.221 { 00:18:36.221 "name": "BaseBdev4", 00:18:36.221 "uuid": "491bd7b7-dd6e-4921-8727-b36c81b1774d", 00:18:36.221 "is_configured": true, 00:18:36.221 "data_offset": 0, 00:18:36.221 "data_size": 65536 00:18:36.221 } 00:18:36.221 ] 00:18:36.221 }' 00:18:36.221 20:32:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:36.221 20:32:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:36.787 20:32:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties 
Existed_Raid 00:18:36.787 20:32:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:18:36.787 20:32:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:36.787 20:32:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:36.787 20:32:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:36.787 20:32:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:18:36.787 20:32:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:18:36.787 20:32:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:37.045 [2024-07-15 20:32:29.363336] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:37.045 20:32:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:37.045 "name": "Existed_Raid", 00:18:37.045 "aliases": [ 00:18:37.045 "50e712e6-4918-4f0c-a590-270d359cf70e" 00:18:37.045 ], 00:18:37.045 "product_name": "Raid Volume", 00:18:37.045 "block_size": 512, 00:18:37.045 "num_blocks": 262144, 00:18:37.045 "uuid": "50e712e6-4918-4f0c-a590-270d359cf70e", 00:18:37.045 "assigned_rate_limits": { 00:18:37.045 "rw_ios_per_sec": 0, 00:18:37.045 "rw_mbytes_per_sec": 0, 00:18:37.045 "r_mbytes_per_sec": 0, 00:18:37.045 "w_mbytes_per_sec": 0 00:18:37.045 }, 00:18:37.045 "claimed": false, 00:18:37.045 "zoned": false, 00:18:37.045 "supported_io_types": { 00:18:37.045 "read": true, 00:18:37.045 "write": true, 00:18:37.045 "unmap": true, 00:18:37.045 "flush": true, 00:18:37.045 "reset": true, 00:18:37.045 "nvme_admin": false, 00:18:37.045 "nvme_io": false, 00:18:37.045 "nvme_io_md": false, 00:18:37.045 "write_zeroes": true, 00:18:37.045 "zcopy": false, 00:18:37.045 
"get_zone_info": false, 00:18:37.045 "zone_management": false, 00:18:37.045 "zone_append": false, 00:18:37.045 "compare": false, 00:18:37.045 "compare_and_write": false, 00:18:37.045 "abort": false, 00:18:37.045 "seek_hole": false, 00:18:37.045 "seek_data": false, 00:18:37.045 "copy": false, 00:18:37.045 "nvme_iov_md": false 00:18:37.045 }, 00:18:37.045 "memory_domains": [ 00:18:37.045 { 00:18:37.045 "dma_device_id": "system", 00:18:37.045 "dma_device_type": 1 00:18:37.045 }, 00:18:37.045 { 00:18:37.045 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:37.045 "dma_device_type": 2 00:18:37.045 }, 00:18:37.045 { 00:18:37.045 "dma_device_id": "system", 00:18:37.045 "dma_device_type": 1 00:18:37.045 }, 00:18:37.045 { 00:18:37.045 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:37.045 "dma_device_type": 2 00:18:37.045 }, 00:18:37.045 { 00:18:37.045 "dma_device_id": "system", 00:18:37.045 "dma_device_type": 1 00:18:37.045 }, 00:18:37.045 { 00:18:37.045 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:37.045 "dma_device_type": 2 00:18:37.045 }, 00:18:37.045 { 00:18:37.045 "dma_device_id": "system", 00:18:37.045 "dma_device_type": 1 00:18:37.045 }, 00:18:37.045 { 00:18:37.045 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:37.045 "dma_device_type": 2 00:18:37.045 } 00:18:37.045 ], 00:18:37.045 "driver_specific": { 00:18:37.045 "raid": { 00:18:37.045 "uuid": "50e712e6-4918-4f0c-a590-270d359cf70e", 00:18:37.045 "strip_size_kb": 64, 00:18:37.045 "state": "online", 00:18:37.045 "raid_level": "raid0", 00:18:37.045 "superblock": false, 00:18:37.045 "num_base_bdevs": 4, 00:18:37.045 "num_base_bdevs_discovered": 4, 00:18:37.045 "num_base_bdevs_operational": 4, 00:18:37.045 "base_bdevs_list": [ 00:18:37.045 { 00:18:37.045 "name": "BaseBdev1", 00:18:37.045 "uuid": "75fd06ad-d3fb-4226-8cd6-bc87e4b4645d", 00:18:37.045 "is_configured": true, 00:18:37.045 "data_offset": 0, 00:18:37.045 "data_size": 65536 00:18:37.045 }, 00:18:37.045 { 00:18:37.045 "name": "BaseBdev2", 00:18:37.045 "uuid": 
"366be763-61e7-44dc-9006-1df9314e3880", 00:18:37.045 "is_configured": true, 00:18:37.045 "data_offset": 0, 00:18:37.045 "data_size": 65536 00:18:37.045 }, 00:18:37.045 { 00:18:37.045 "name": "BaseBdev3", 00:18:37.045 "uuid": "21209c3d-a3ba-4654-913c-9ebd970e158f", 00:18:37.045 "is_configured": true, 00:18:37.045 "data_offset": 0, 00:18:37.045 "data_size": 65536 00:18:37.045 }, 00:18:37.045 { 00:18:37.045 "name": "BaseBdev4", 00:18:37.045 "uuid": "491bd7b7-dd6e-4921-8727-b36c81b1774d", 00:18:37.045 "is_configured": true, 00:18:37.045 "data_offset": 0, 00:18:37.045 "data_size": 65536 00:18:37.045 } 00:18:37.045 ] 00:18:37.045 } 00:18:37.045 } 00:18:37.045 }' 00:18:37.045 20:32:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:37.304 20:32:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:18:37.304 BaseBdev2 00:18:37.304 BaseBdev3 00:18:37.304 BaseBdev4' 00:18:37.304 20:32:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:37.304 20:32:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:18:37.304 20:32:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:37.304 20:32:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:37.304 "name": "BaseBdev1", 00:18:37.304 "aliases": [ 00:18:37.304 "75fd06ad-d3fb-4226-8cd6-bc87e4b4645d" 00:18:37.304 ], 00:18:37.304 "product_name": "Malloc disk", 00:18:37.304 "block_size": 512, 00:18:37.304 "num_blocks": 65536, 00:18:37.304 "uuid": "75fd06ad-d3fb-4226-8cd6-bc87e4b4645d", 00:18:37.304 "assigned_rate_limits": { 00:18:37.304 "rw_ios_per_sec": 0, 00:18:37.304 "rw_mbytes_per_sec": 0, 00:18:37.304 "r_mbytes_per_sec": 0, 
00:18:37.304 "w_mbytes_per_sec": 0 00:18:37.304 }, 00:18:37.304 "claimed": true, 00:18:37.304 "claim_type": "exclusive_write", 00:18:37.304 "zoned": false, 00:18:37.304 "supported_io_types": { 00:18:37.304 "read": true, 00:18:37.304 "write": true, 00:18:37.304 "unmap": true, 00:18:37.304 "flush": true, 00:18:37.304 "reset": true, 00:18:37.304 "nvme_admin": false, 00:18:37.304 "nvme_io": false, 00:18:37.304 "nvme_io_md": false, 00:18:37.304 "write_zeroes": true, 00:18:37.304 "zcopy": true, 00:18:37.304 "get_zone_info": false, 00:18:37.304 "zone_management": false, 00:18:37.304 "zone_append": false, 00:18:37.304 "compare": false, 00:18:37.304 "compare_and_write": false, 00:18:37.304 "abort": true, 00:18:37.304 "seek_hole": false, 00:18:37.304 "seek_data": false, 00:18:37.304 "copy": true, 00:18:37.304 "nvme_iov_md": false 00:18:37.304 }, 00:18:37.304 "memory_domains": [ 00:18:37.304 { 00:18:37.304 "dma_device_id": "system", 00:18:37.304 "dma_device_type": 1 00:18:37.304 }, 00:18:37.304 { 00:18:37.304 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:37.304 "dma_device_type": 2 00:18:37.304 } 00:18:37.304 ], 00:18:37.304 "driver_specific": {} 00:18:37.304 }' 00:18:37.304 20:32:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:37.562 20:32:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:37.562 20:32:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:37.562 20:32:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:37.562 20:32:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:37.562 20:32:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:37.562 20:32:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:37.562 20:32:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 
00:18:37.562 20:32:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:37.562 20:32:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:37.820 20:32:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:37.820 20:32:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:37.820 20:32:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:37.820 20:32:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:18:37.820 20:32:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:38.079 20:32:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:38.079 "name": "BaseBdev2", 00:18:38.079 "aliases": [ 00:18:38.079 "366be763-61e7-44dc-9006-1df9314e3880" 00:18:38.079 ], 00:18:38.079 "product_name": "Malloc disk", 00:18:38.079 "block_size": 512, 00:18:38.079 "num_blocks": 65536, 00:18:38.079 "uuid": "366be763-61e7-44dc-9006-1df9314e3880", 00:18:38.079 "assigned_rate_limits": { 00:18:38.079 "rw_ios_per_sec": 0, 00:18:38.079 "rw_mbytes_per_sec": 0, 00:18:38.079 "r_mbytes_per_sec": 0, 00:18:38.079 "w_mbytes_per_sec": 0 00:18:38.079 }, 00:18:38.079 "claimed": true, 00:18:38.079 "claim_type": "exclusive_write", 00:18:38.079 "zoned": false, 00:18:38.079 "supported_io_types": { 00:18:38.079 "read": true, 00:18:38.079 "write": true, 00:18:38.079 "unmap": true, 00:18:38.079 "flush": true, 00:18:38.079 "reset": true, 00:18:38.079 "nvme_admin": false, 00:18:38.079 "nvme_io": false, 00:18:38.079 "nvme_io_md": false, 00:18:38.079 "write_zeroes": true, 00:18:38.079 "zcopy": true, 00:18:38.079 "get_zone_info": false, 00:18:38.079 "zone_management": false, 00:18:38.079 "zone_append": false, 00:18:38.079 
"compare": false, 00:18:38.079 "compare_and_write": false, 00:18:38.079 "abort": true, 00:18:38.079 "seek_hole": false, 00:18:38.079 "seek_data": false, 00:18:38.079 "copy": true, 00:18:38.079 "nvme_iov_md": false 00:18:38.079 }, 00:18:38.079 "memory_domains": [ 00:18:38.079 { 00:18:38.079 "dma_device_id": "system", 00:18:38.079 "dma_device_type": 1 00:18:38.079 }, 00:18:38.079 { 00:18:38.079 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:38.079 "dma_device_type": 2 00:18:38.079 } 00:18:38.079 ], 00:18:38.079 "driver_specific": {} 00:18:38.079 }' 00:18:38.079 20:32:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:38.079 20:32:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:38.079 20:32:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:38.079 20:32:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:38.079 20:32:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:38.079 20:32:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:38.079 20:32:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:38.337 20:32:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:38.337 20:32:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:38.337 20:32:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:38.337 20:32:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:38.337 20:32:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:38.337 20:32:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:38.337 20:32:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:18:38.337 20:32:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:38.596 20:32:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:38.596 "name": "BaseBdev3", 00:18:38.596 "aliases": [ 00:18:38.596 "21209c3d-a3ba-4654-913c-9ebd970e158f" 00:18:38.596 ], 00:18:38.596 "product_name": "Malloc disk", 00:18:38.596 "block_size": 512, 00:18:38.596 "num_blocks": 65536, 00:18:38.596 "uuid": "21209c3d-a3ba-4654-913c-9ebd970e158f", 00:18:38.596 "assigned_rate_limits": { 00:18:38.596 "rw_ios_per_sec": 0, 00:18:38.596 "rw_mbytes_per_sec": 0, 00:18:38.596 "r_mbytes_per_sec": 0, 00:18:38.596 "w_mbytes_per_sec": 0 00:18:38.596 }, 00:18:38.596 "claimed": true, 00:18:38.596 "claim_type": "exclusive_write", 00:18:38.596 "zoned": false, 00:18:38.596 "supported_io_types": { 00:18:38.596 "read": true, 00:18:38.596 "write": true, 00:18:38.596 "unmap": true, 00:18:38.596 "flush": true, 00:18:38.596 "reset": true, 00:18:38.596 "nvme_admin": false, 00:18:38.596 "nvme_io": false, 00:18:38.596 "nvme_io_md": false, 00:18:38.596 "write_zeroes": true, 00:18:38.596 "zcopy": true, 00:18:38.596 "get_zone_info": false, 00:18:38.596 "zone_management": false, 00:18:38.596 "zone_append": false, 00:18:38.596 "compare": false, 00:18:38.596 "compare_and_write": false, 00:18:38.596 "abort": true, 00:18:38.596 "seek_hole": false, 00:18:38.596 "seek_data": false, 00:18:38.596 "copy": true, 00:18:38.596 "nvme_iov_md": false 00:18:38.596 }, 00:18:38.596 "memory_domains": [ 00:18:38.596 { 00:18:38.596 "dma_device_id": "system", 00:18:38.596 "dma_device_type": 1 00:18:38.596 }, 00:18:38.596 { 00:18:38.596 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:38.596 "dma_device_type": 2 00:18:38.596 } 00:18:38.596 ], 00:18:38.596 "driver_specific": {} 00:18:38.596 }' 00:18:38.596 20:32:30 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:38.596 20:32:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:38.596 20:32:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:38.596 20:32:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:38.854 20:32:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:38.854 20:32:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:38.854 20:32:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:38.854 20:32:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:38.854 20:32:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:38.854 20:32:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:38.854 20:32:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:38.854 20:32:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:38.854 20:32:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:38.854 20:32:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:18:38.855 20:32:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:39.113 20:32:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:39.113 "name": "BaseBdev4", 00:18:39.113 "aliases": [ 00:18:39.113 "491bd7b7-dd6e-4921-8727-b36c81b1774d" 00:18:39.113 ], 00:18:39.113 "product_name": "Malloc disk", 00:18:39.113 "block_size": 512, 00:18:39.113 "num_blocks": 65536, 00:18:39.113 "uuid": "491bd7b7-dd6e-4921-8727-b36c81b1774d", 
00:18:39.113 "assigned_rate_limits": { 00:18:39.113 "rw_ios_per_sec": 0, 00:18:39.113 "rw_mbytes_per_sec": 0, 00:18:39.113 "r_mbytes_per_sec": 0, 00:18:39.113 "w_mbytes_per_sec": 0 00:18:39.113 }, 00:18:39.113 "claimed": true, 00:18:39.113 "claim_type": "exclusive_write", 00:18:39.113 "zoned": false, 00:18:39.113 "supported_io_types": { 00:18:39.113 "read": true, 00:18:39.113 "write": true, 00:18:39.113 "unmap": true, 00:18:39.113 "flush": true, 00:18:39.113 "reset": true, 00:18:39.113 "nvme_admin": false, 00:18:39.113 "nvme_io": false, 00:18:39.113 "nvme_io_md": false, 00:18:39.113 "write_zeroes": true, 00:18:39.113 "zcopy": true, 00:18:39.113 "get_zone_info": false, 00:18:39.113 "zone_management": false, 00:18:39.113 "zone_append": false, 00:18:39.113 "compare": false, 00:18:39.113 "compare_and_write": false, 00:18:39.113 "abort": true, 00:18:39.113 "seek_hole": false, 00:18:39.113 "seek_data": false, 00:18:39.113 "copy": true, 00:18:39.113 "nvme_iov_md": false 00:18:39.113 }, 00:18:39.113 "memory_domains": [ 00:18:39.113 { 00:18:39.113 "dma_device_id": "system", 00:18:39.113 "dma_device_type": 1 00:18:39.113 }, 00:18:39.113 { 00:18:39.113 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:39.113 "dma_device_type": 2 00:18:39.113 } 00:18:39.113 ], 00:18:39.113 "driver_specific": {} 00:18:39.113 }' 00:18:39.113 20:32:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:39.371 20:32:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:39.371 20:32:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:39.371 20:32:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:39.371 20:32:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:39.371 20:32:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:39.371 20:32:31 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:39.371 20:32:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:39.371 20:32:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:39.371 20:32:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:39.628 20:32:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:39.628 20:32:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:39.628 20:32:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:18:39.885 [2024-07-15 20:32:32.034142] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:18:39.885 [2024-07-15 20:32:32.034170] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:39.885 [2024-07-15 20:32:32.034218] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:39.885 20:32:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:18:39.885 20:32:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:18:39.885 20:32:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:18:39.885 20:32:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:18:39.885 20:32:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:18:39.885 20:32:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 3 00:18:39.885 20:32:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:39.885 20:32:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local 
expected_state=offline 00:18:39.885 20:32:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:39.885 20:32:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:39.885 20:32:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:39.885 20:32:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:39.885 20:32:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:39.885 20:32:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:39.885 20:32:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:39.885 20:32:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:39.885 20:32:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:40.143 20:32:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:40.143 "name": "Existed_Raid", 00:18:40.143 "uuid": "50e712e6-4918-4f0c-a590-270d359cf70e", 00:18:40.143 "strip_size_kb": 64, 00:18:40.143 "state": "offline", 00:18:40.143 "raid_level": "raid0", 00:18:40.143 "superblock": false, 00:18:40.143 "num_base_bdevs": 4, 00:18:40.143 "num_base_bdevs_discovered": 3, 00:18:40.143 "num_base_bdevs_operational": 3, 00:18:40.143 "base_bdevs_list": [ 00:18:40.143 { 00:18:40.143 "name": null, 00:18:40.143 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:40.143 "is_configured": false, 00:18:40.143 "data_offset": 0, 00:18:40.143 "data_size": 65536 00:18:40.143 }, 00:18:40.143 { 00:18:40.143 "name": "BaseBdev2", 00:18:40.143 "uuid": "366be763-61e7-44dc-9006-1df9314e3880", 00:18:40.143 "is_configured": true, 
00:18:40.143 "data_offset": 0, 00:18:40.143 "data_size": 65536 00:18:40.143 }, 00:18:40.143 { 00:18:40.143 "name": "BaseBdev3", 00:18:40.143 "uuid": "21209c3d-a3ba-4654-913c-9ebd970e158f", 00:18:40.143 "is_configured": true, 00:18:40.143 "data_offset": 0, 00:18:40.143 "data_size": 65536 00:18:40.143 }, 00:18:40.143 { 00:18:40.143 "name": "BaseBdev4", 00:18:40.143 "uuid": "491bd7b7-dd6e-4921-8727-b36c81b1774d", 00:18:40.143 "is_configured": true, 00:18:40.143 "data_offset": 0, 00:18:40.143 "data_size": 65536 00:18:40.143 } 00:18:40.143 ] 00:18:40.143 }' 00:18:40.143 20:32:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:40.143 20:32:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:40.708 20:32:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:18:40.708 20:32:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:40.708 20:32:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:18:40.708 20:32:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:40.964 20:32:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:18:40.964 20:32:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:40.964 20:32:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:18:41.221 [2024-07-15 20:32:33.362709] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:18:41.221 20:32:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:18:41.221 20:32:33 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:41.221 20:32:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:41.221 20:32:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:18:41.479 20:32:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:18:41.479 20:32:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:41.479 20:32:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:18:41.735 [2024-07-15 20:32:33.866987] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:18:41.735 20:32:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:18:41.735 20:32:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:41.735 20:32:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:41.735 20:32:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:18:41.992 20:32:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:18:41.992 20:32:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:41.992 20:32:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:18:41.992 [2024-07-15 20:32:34.366863] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:18:41.993 
[2024-07-15 20:32:34.366909] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xba3350 name Existed_Raid, state offline 00:18:42.250 20:32:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:18:42.250 20:32:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:42.250 20:32:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:42.250 20:32:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:18:42.507 20:32:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:18:42.507 20:32:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:18:42.507 20:32:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:18:42.508 20:32:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:18:42.508 20:32:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:42.508 20:32:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:18:42.508 BaseBdev2 00:18:42.508 20:32:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:18:42.508 20:32:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:18:42.508 20:32:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:42.508 20:32:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:18:42.508 20:32:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:42.508 20:32:34 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:42.508 20:32:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:42.765 20:32:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:18:43.023 [ 00:18:43.023 { 00:18:43.023 "name": "BaseBdev2", 00:18:43.023 "aliases": [ 00:18:43.023 "9648b814-d012-4edd-9dfe-91f9754cd763" 00:18:43.023 ], 00:18:43.023 "product_name": "Malloc disk", 00:18:43.023 "block_size": 512, 00:18:43.023 "num_blocks": 65536, 00:18:43.023 "uuid": "9648b814-d012-4edd-9dfe-91f9754cd763", 00:18:43.023 "assigned_rate_limits": { 00:18:43.023 "rw_ios_per_sec": 0, 00:18:43.023 "rw_mbytes_per_sec": 0, 00:18:43.023 "r_mbytes_per_sec": 0, 00:18:43.023 "w_mbytes_per_sec": 0 00:18:43.023 }, 00:18:43.023 "claimed": false, 00:18:43.023 "zoned": false, 00:18:43.023 "supported_io_types": { 00:18:43.023 "read": true, 00:18:43.023 "write": true, 00:18:43.023 "unmap": true, 00:18:43.023 "flush": true, 00:18:43.023 "reset": true, 00:18:43.023 "nvme_admin": false, 00:18:43.023 "nvme_io": false, 00:18:43.023 "nvme_io_md": false, 00:18:43.023 "write_zeroes": true, 00:18:43.023 "zcopy": true, 00:18:43.023 "get_zone_info": false, 00:18:43.023 "zone_management": false, 00:18:43.023 "zone_append": false, 00:18:43.023 "compare": false, 00:18:43.023 "compare_and_write": false, 00:18:43.023 "abort": true, 00:18:43.023 "seek_hole": false, 00:18:43.023 "seek_data": false, 00:18:43.023 "copy": true, 00:18:43.023 "nvme_iov_md": false 00:18:43.023 }, 00:18:43.023 "memory_domains": [ 00:18:43.023 { 00:18:43.023 "dma_device_id": "system", 00:18:43.023 "dma_device_type": 1 00:18:43.023 }, 00:18:43.023 { 00:18:43.023 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:18:43.023 "dma_device_type": 2 00:18:43.023 } 00:18:43.023 ], 00:18:43.023 "driver_specific": {} 00:18:43.023 } 00:18:43.023 ] 00:18:43.023 20:32:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:18:43.023 20:32:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:18:43.023 20:32:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:43.023 20:32:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:18:43.280 BaseBdev3 00:18:43.280 20:32:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:18:43.280 20:32:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:18:43.280 20:32:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:43.280 20:32:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:18:43.280 20:32:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:43.280 20:32:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:43.280 20:32:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:43.536 20:32:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:18:43.794 [ 00:18:43.794 { 00:18:43.794 "name": "BaseBdev3", 00:18:43.794 "aliases": [ 00:18:43.794 "ce0e7454-867e-4388-98a4-7299de907acf" 00:18:43.794 ], 00:18:43.794 "product_name": "Malloc 
disk", 00:18:43.794 "block_size": 512, 00:18:43.794 "num_blocks": 65536, 00:18:43.794 "uuid": "ce0e7454-867e-4388-98a4-7299de907acf", 00:18:43.794 "assigned_rate_limits": { 00:18:43.794 "rw_ios_per_sec": 0, 00:18:43.794 "rw_mbytes_per_sec": 0, 00:18:43.794 "r_mbytes_per_sec": 0, 00:18:43.794 "w_mbytes_per_sec": 0 00:18:43.794 }, 00:18:43.794 "claimed": false, 00:18:43.794 "zoned": false, 00:18:43.794 "supported_io_types": { 00:18:43.794 "read": true, 00:18:43.794 "write": true, 00:18:43.794 "unmap": true, 00:18:43.794 "flush": true, 00:18:43.794 "reset": true, 00:18:43.794 "nvme_admin": false, 00:18:43.794 "nvme_io": false, 00:18:43.794 "nvme_io_md": false, 00:18:43.794 "write_zeroes": true, 00:18:43.794 "zcopy": true, 00:18:43.794 "get_zone_info": false, 00:18:43.794 "zone_management": false, 00:18:43.794 "zone_append": false, 00:18:43.794 "compare": false, 00:18:43.794 "compare_and_write": false, 00:18:43.794 "abort": true, 00:18:43.794 "seek_hole": false, 00:18:43.794 "seek_data": false, 00:18:43.794 "copy": true, 00:18:43.794 "nvme_iov_md": false 00:18:43.794 }, 00:18:43.794 "memory_domains": [ 00:18:43.794 { 00:18:43.794 "dma_device_id": "system", 00:18:43.794 "dma_device_type": 1 00:18:43.794 }, 00:18:43.794 { 00:18:43.794 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:43.794 "dma_device_type": 2 00:18:43.794 } 00:18:43.794 ], 00:18:43.794 "driver_specific": {} 00:18:43.794 } 00:18:43.794 ] 00:18:43.794 20:32:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:18:43.794 20:32:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:18:43.794 20:32:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:43.794 20:32:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:18:44.051 BaseBdev4 00:18:44.051 20:32:36 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:18:44.051 20:32:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:18:44.051 20:32:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:44.051 20:32:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:18:44.051 20:32:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:44.051 20:32:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:44.051 20:32:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:44.308 20:32:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:18:44.566 [ 00:18:44.566 { 00:18:44.566 "name": "BaseBdev4", 00:18:44.566 "aliases": [ 00:18:44.566 "500e1292-6406-4a54-8c7a-673f47f64168" 00:18:44.566 ], 00:18:44.566 "product_name": "Malloc disk", 00:18:44.566 "block_size": 512, 00:18:44.566 "num_blocks": 65536, 00:18:44.566 "uuid": "500e1292-6406-4a54-8c7a-673f47f64168", 00:18:44.566 "assigned_rate_limits": { 00:18:44.566 "rw_ios_per_sec": 0, 00:18:44.566 "rw_mbytes_per_sec": 0, 00:18:44.566 "r_mbytes_per_sec": 0, 00:18:44.566 "w_mbytes_per_sec": 0 00:18:44.566 }, 00:18:44.566 "claimed": false, 00:18:44.566 "zoned": false, 00:18:44.566 "supported_io_types": { 00:18:44.566 "read": true, 00:18:44.566 "write": true, 00:18:44.566 "unmap": true, 00:18:44.566 "flush": true, 00:18:44.566 "reset": true, 00:18:44.566 "nvme_admin": false, 00:18:44.566 "nvme_io": false, 00:18:44.566 "nvme_io_md": false, 00:18:44.566 "write_zeroes": true, 00:18:44.566 "zcopy": true, 
00:18:44.566 "get_zone_info": false, 00:18:44.566 "zone_management": false, 00:18:44.566 "zone_append": false, 00:18:44.566 "compare": false, 00:18:44.566 "compare_and_write": false, 00:18:44.566 "abort": true, 00:18:44.566 "seek_hole": false, 00:18:44.566 "seek_data": false, 00:18:44.566 "copy": true, 00:18:44.566 "nvme_iov_md": false 00:18:44.566 }, 00:18:44.566 "memory_domains": [ 00:18:44.566 { 00:18:44.566 "dma_device_id": "system", 00:18:44.566 "dma_device_type": 1 00:18:44.566 }, 00:18:44.566 { 00:18:44.566 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:44.566 "dma_device_type": 2 00:18:44.566 } 00:18:44.566 ], 00:18:44.566 "driver_specific": {} 00:18:44.566 } 00:18:44.566 ] 00:18:44.566 20:32:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:18:44.566 20:32:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:18:44.566 20:32:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:44.566 20:32:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:44.824 [2024-07-15 20:32:37.072871] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:44.824 [2024-07-15 20:32:37.072911] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:44.824 [2024-07-15 20:32:37.072937] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:44.824 [2024-07-15 20:32:37.074260] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:44.824 [2024-07-15 20:32:37.074300] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:18:44.824 20:32:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # 
verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:44.824 20:32:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:44.824 20:32:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:44.824 20:32:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:44.824 20:32:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:44.824 20:32:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:44.824 20:32:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:44.824 20:32:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:44.824 20:32:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:44.824 20:32:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:44.824 20:32:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:44.824 20:32:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:45.082 20:32:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:45.082 "name": "Existed_Raid", 00:18:45.082 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:45.082 "strip_size_kb": 64, 00:18:45.082 "state": "configuring", 00:18:45.082 "raid_level": "raid0", 00:18:45.082 "superblock": false, 00:18:45.082 "num_base_bdevs": 4, 00:18:45.082 "num_base_bdevs_discovered": 3, 00:18:45.082 "num_base_bdevs_operational": 4, 00:18:45.082 "base_bdevs_list": [ 00:18:45.082 { 00:18:45.082 "name": "BaseBdev1", 00:18:45.082 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:18:45.082 "is_configured": false, 00:18:45.082 "data_offset": 0, 00:18:45.082 "data_size": 0 00:18:45.082 }, 00:18:45.082 { 00:18:45.082 "name": "BaseBdev2", 00:18:45.082 "uuid": "9648b814-d012-4edd-9dfe-91f9754cd763", 00:18:45.082 "is_configured": true, 00:18:45.082 "data_offset": 0, 00:18:45.082 "data_size": 65536 00:18:45.082 }, 00:18:45.082 { 00:18:45.082 "name": "BaseBdev3", 00:18:45.082 "uuid": "ce0e7454-867e-4388-98a4-7299de907acf", 00:18:45.082 "is_configured": true, 00:18:45.082 "data_offset": 0, 00:18:45.082 "data_size": 65536 00:18:45.082 }, 00:18:45.082 { 00:18:45.082 "name": "BaseBdev4", 00:18:45.082 "uuid": "500e1292-6406-4a54-8c7a-673f47f64168", 00:18:45.082 "is_configured": true, 00:18:45.082 "data_offset": 0, 00:18:45.082 "data_size": 65536 00:18:45.082 } 00:18:45.082 ] 00:18:45.082 }' 00:18:45.082 20:32:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:45.082 20:32:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:45.647 20:32:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:18:45.906 [2024-07-15 20:32:38.163743] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:18:45.906 20:32:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:45.906 20:32:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:45.906 20:32:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:45.906 20:32:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:45.906 20:32:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:45.906 
20:32:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:45.906 20:32:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:45.906 20:32:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:45.906 20:32:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:45.906 20:32:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:45.906 20:32:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:45.906 20:32:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:46.164 20:32:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:46.164 "name": "Existed_Raid", 00:18:46.164 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:46.164 "strip_size_kb": 64, 00:18:46.164 "state": "configuring", 00:18:46.164 "raid_level": "raid0", 00:18:46.164 "superblock": false, 00:18:46.164 "num_base_bdevs": 4, 00:18:46.164 "num_base_bdevs_discovered": 2, 00:18:46.164 "num_base_bdevs_operational": 4, 00:18:46.164 "base_bdevs_list": [ 00:18:46.164 { 00:18:46.164 "name": "BaseBdev1", 00:18:46.164 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:46.164 "is_configured": false, 00:18:46.164 "data_offset": 0, 00:18:46.164 "data_size": 0 00:18:46.164 }, 00:18:46.164 { 00:18:46.164 "name": null, 00:18:46.164 "uuid": "9648b814-d012-4edd-9dfe-91f9754cd763", 00:18:46.164 "is_configured": false, 00:18:46.164 "data_offset": 0, 00:18:46.164 "data_size": 65536 00:18:46.164 }, 00:18:46.164 { 00:18:46.164 "name": "BaseBdev3", 00:18:46.164 "uuid": "ce0e7454-867e-4388-98a4-7299de907acf", 00:18:46.164 "is_configured": true, 00:18:46.164 "data_offset": 0, 
00:18:46.164 "data_size": 65536 00:18:46.164 }, 00:18:46.164 { 00:18:46.165 "name": "BaseBdev4", 00:18:46.165 "uuid": "500e1292-6406-4a54-8c7a-673f47f64168", 00:18:46.165 "is_configured": true, 00:18:46.165 "data_offset": 0, 00:18:46.165 "data_size": 65536 00:18:46.165 } 00:18:46.165 ] 00:18:46.165 }' 00:18:46.165 20:32:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:46.165 20:32:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:46.731 20:32:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:46.731 20:32:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:18:46.991 20:32:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:18:46.991 20:32:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:18:47.250 [2024-07-15 20:32:39.519919] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:47.250 BaseBdev1 00:18:47.250 20:32:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:18:47.250 20:32:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:18:47.250 20:32:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:47.250 20:32:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:18:47.250 20:32:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:47.250 20:32:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:47.250 
20:32:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:47.509 20:32:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:18:47.767 [ 00:18:47.767 { 00:18:47.767 "name": "BaseBdev1", 00:18:47.767 "aliases": [ 00:18:47.767 "74980ed7-25e6-4e41-b94f-423156274b6c" 00:18:47.767 ], 00:18:47.767 "product_name": "Malloc disk", 00:18:47.767 "block_size": 512, 00:18:47.767 "num_blocks": 65536, 00:18:47.767 "uuid": "74980ed7-25e6-4e41-b94f-423156274b6c", 00:18:47.767 "assigned_rate_limits": { 00:18:47.767 "rw_ios_per_sec": 0, 00:18:47.767 "rw_mbytes_per_sec": 0, 00:18:47.767 "r_mbytes_per_sec": 0, 00:18:47.767 "w_mbytes_per_sec": 0 00:18:47.767 }, 00:18:47.767 "claimed": true, 00:18:47.767 "claim_type": "exclusive_write", 00:18:47.767 "zoned": false, 00:18:47.767 "supported_io_types": { 00:18:47.767 "read": true, 00:18:47.767 "write": true, 00:18:47.767 "unmap": true, 00:18:47.767 "flush": true, 00:18:47.767 "reset": true, 00:18:47.767 "nvme_admin": false, 00:18:47.767 "nvme_io": false, 00:18:47.767 "nvme_io_md": false, 00:18:47.767 "write_zeroes": true, 00:18:47.767 "zcopy": true, 00:18:47.767 "get_zone_info": false, 00:18:47.767 "zone_management": false, 00:18:47.767 "zone_append": false, 00:18:47.767 "compare": false, 00:18:47.767 "compare_and_write": false, 00:18:47.767 "abort": true, 00:18:47.767 "seek_hole": false, 00:18:47.767 "seek_data": false, 00:18:47.767 "copy": true, 00:18:47.767 "nvme_iov_md": false 00:18:47.767 }, 00:18:47.767 "memory_domains": [ 00:18:47.767 { 00:18:47.767 "dma_device_id": "system", 00:18:47.767 "dma_device_type": 1 00:18:47.767 }, 00:18:47.767 { 00:18:47.767 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:47.767 "dma_device_type": 2 00:18:47.767 } 
00:18:47.767 ], 00:18:47.767 "driver_specific": {} 00:18:47.767 } 00:18:47.767 ] 00:18:47.767 20:32:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:18:47.767 20:32:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:47.767 20:32:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:47.767 20:32:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:47.767 20:32:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:47.767 20:32:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:47.767 20:32:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:47.767 20:32:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:47.767 20:32:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:47.767 20:32:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:47.767 20:32:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:47.767 20:32:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:47.767 20:32:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:48.033 20:32:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:48.033 "name": "Existed_Raid", 00:18:48.033 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:48.033 "strip_size_kb": 64, 00:18:48.033 "state": "configuring", 00:18:48.033 "raid_level": "raid0", 00:18:48.033 
"superblock": false, 00:18:48.033 "num_base_bdevs": 4, 00:18:48.033 "num_base_bdevs_discovered": 3, 00:18:48.033 "num_base_bdevs_operational": 4, 00:18:48.033 "base_bdevs_list": [ 00:18:48.033 { 00:18:48.033 "name": "BaseBdev1", 00:18:48.033 "uuid": "74980ed7-25e6-4e41-b94f-423156274b6c", 00:18:48.033 "is_configured": true, 00:18:48.033 "data_offset": 0, 00:18:48.033 "data_size": 65536 00:18:48.033 }, 00:18:48.033 { 00:18:48.033 "name": null, 00:18:48.033 "uuid": "9648b814-d012-4edd-9dfe-91f9754cd763", 00:18:48.033 "is_configured": false, 00:18:48.033 "data_offset": 0, 00:18:48.033 "data_size": 65536 00:18:48.033 }, 00:18:48.033 { 00:18:48.033 "name": "BaseBdev3", 00:18:48.033 "uuid": "ce0e7454-867e-4388-98a4-7299de907acf", 00:18:48.033 "is_configured": true, 00:18:48.033 "data_offset": 0, 00:18:48.033 "data_size": 65536 00:18:48.033 }, 00:18:48.033 { 00:18:48.033 "name": "BaseBdev4", 00:18:48.033 "uuid": "500e1292-6406-4a54-8c7a-673f47f64168", 00:18:48.033 "is_configured": true, 00:18:48.033 "data_offset": 0, 00:18:48.033 "data_size": 65536 00:18:48.033 } 00:18:48.033 ] 00:18:48.033 }' 00:18:48.033 20:32:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:48.033 20:32:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:48.601 20:32:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:48.601 20:32:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:18:48.859 20:32:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:18:48.859 20:32:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:18:49.427 [2024-07-15 
20:32:41.625539] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:18:49.427 20:32:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:49.427 20:32:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:49.427 20:32:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:49.427 20:32:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:49.427 20:32:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:49.427 20:32:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:49.427 20:32:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:49.427 20:32:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:49.427 20:32:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:49.427 20:32:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:49.427 20:32:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:49.427 20:32:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:49.685 20:32:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:49.685 "name": "Existed_Raid", 00:18:49.685 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:49.685 "strip_size_kb": 64, 00:18:49.685 "state": "configuring", 00:18:49.685 "raid_level": "raid0", 00:18:49.685 "superblock": false, 00:18:49.685 "num_base_bdevs": 4, 00:18:49.685 "num_base_bdevs_discovered": 2, 
00:18:49.685 "num_base_bdevs_operational": 4, 00:18:49.685 "base_bdevs_list": [ 00:18:49.685 { 00:18:49.685 "name": "BaseBdev1", 00:18:49.685 "uuid": "74980ed7-25e6-4e41-b94f-423156274b6c", 00:18:49.685 "is_configured": true, 00:18:49.685 "data_offset": 0, 00:18:49.685 "data_size": 65536 00:18:49.685 }, 00:18:49.685 { 00:18:49.685 "name": null, 00:18:49.685 "uuid": "9648b814-d012-4edd-9dfe-91f9754cd763", 00:18:49.685 "is_configured": false, 00:18:49.685 "data_offset": 0, 00:18:49.685 "data_size": 65536 00:18:49.685 }, 00:18:49.685 { 00:18:49.685 "name": null, 00:18:49.685 "uuid": "ce0e7454-867e-4388-98a4-7299de907acf", 00:18:49.685 "is_configured": false, 00:18:49.685 "data_offset": 0, 00:18:49.685 "data_size": 65536 00:18:49.685 }, 00:18:49.685 { 00:18:49.685 "name": "BaseBdev4", 00:18:49.685 "uuid": "500e1292-6406-4a54-8c7a-673f47f64168", 00:18:49.685 "is_configured": true, 00:18:49.685 "data_offset": 0, 00:18:49.685 "data_size": 65536 00:18:49.685 } 00:18:49.685 ] 00:18:49.685 }' 00:18:49.685 20:32:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:49.685 20:32:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:50.280 20:32:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:18:50.280 20:32:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:50.552 20:32:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:18:50.552 20:32:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:18:50.812 [2024-07-15 20:32:42.977137] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 
00:18:50.812 20:32:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:50.812 20:32:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:50.812 20:32:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:50.812 20:32:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:50.812 20:32:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:50.812 20:32:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:50.812 20:32:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:50.812 20:32:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:50.812 20:32:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:50.812 20:32:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:50.812 20:32:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:50.812 20:32:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:51.081 20:32:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:51.081 "name": "Existed_Raid", 00:18:51.081 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:51.081 "strip_size_kb": 64, 00:18:51.081 "state": "configuring", 00:18:51.081 "raid_level": "raid0", 00:18:51.081 "superblock": false, 00:18:51.081 "num_base_bdevs": 4, 00:18:51.081 "num_base_bdevs_discovered": 3, 00:18:51.081 "num_base_bdevs_operational": 4, 00:18:51.081 "base_bdevs_list": [ 
00:18:51.081 { 00:18:51.081 "name": "BaseBdev1", 00:18:51.081 "uuid": "74980ed7-25e6-4e41-b94f-423156274b6c", 00:18:51.081 "is_configured": true, 00:18:51.081 "data_offset": 0, 00:18:51.081 "data_size": 65536 00:18:51.081 }, 00:18:51.081 { 00:18:51.081 "name": null, 00:18:51.081 "uuid": "9648b814-d012-4edd-9dfe-91f9754cd763", 00:18:51.081 "is_configured": false, 00:18:51.081 "data_offset": 0, 00:18:51.081 "data_size": 65536 00:18:51.081 }, 00:18:51.081 { 00:18:51.081 "name": "BaseBdev3", 00:18:51.081 "uuid": "ce0e7454-867e-4388-98a4-7299de907acf", 00:18:51.081 "is_configured": true, 00:18:51.081 "data_offset": 0, 00:18:51.081 "data_size": 65536 00:18:51.081 }, 00:18:51.081 { 00:18:51.081 "name": "BaseBdev4", 00:18:51.081 "uuid": "500e1292-6406-4a54-8c7a-673f47f64168", 00:18:51.081 "is_configured": true, 00:18:51.081 "data_offset": 0, 00:18:51.081 "data_size": 65536 00:18:51.081 } 00:18:51.081 ] 00:18:51.081 }' 00:18:51.081 20:32:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:51.081 20:32:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:51.654 20:32:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:51.654 20:32:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:18:51.912 20:32:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:18:51.912 20:32:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:18:52.171 [2024-07-15 20:32:44.312699] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:18:52.171 20:32:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state 
Existed_Raid configuring raid0 64 4 00:18:52.171 20:32:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:52.171 20:32:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:52.171 20:32:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:52.171 20:32:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:52.171 20:32:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:52.171 20:32:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:52.171 20:32:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:52.171 20:32:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:52.171 20:32:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:52.171 20:32:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:52.171 20:32:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:52.430 20:32:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:52.430 "name": "Existed_Raid", 00:18:52.430 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:52.430 "strip_size_kb": 64, 00:18:52.430 "state": "configuring", 00:18:52.430 "raid_level": "raid0", 00:18:52.430 "superblock": false, 00:18:52.430 "num_base_bdevs": 4, 00:18:52.430 "num_base_bdevs_discovered": 2, 00:18:52.430 "num_base_bdevs_operational": 4, 00:18:52.430 "base_bdevs_list": [ 00:18:52.430 { 00:18:52.430 "name": null, 00:18:52.430 "uuid": "74980ed7-25e6-4e41-b94f-423156274b6c", 
00:18:52.430 "is_configured": false, 00:18:52.430 "data_offset": 0, 00:18:52.430 "data_size": 65536 00:18:52.430 }, 00:18:52.430 { 00:18:52.430 "name": null, 00:18:52.430 "uuid": "9648b814-d012-4edd-9dfe-91f9754cd763", 00:18:52.430 "is_configured": false, 00:18:52.430 "data_offset": 0, 00:18:52.430 "data_size": 65536 00:18:52.430 }, 00:18:52.430 { 00:18:52.430 "name": "BaseBdev3", 00:18:52.430 "uuid": "ce0e7454-867e-4388-98a4-7299de907acf", 00:18:52.430 "is_configured": true, 00:18:52.430 "data_offset": 0, 00:18:52.430 "data_size": 65536 00:18:52.430 }, 00:18:52.430 { 00:18:52.430 "name": "BaseBdev4", 00:18:52.430 "uuid": "500e1292-6406-4a54-8c7a-673f47f64168", 00:18:52.430 "is_configured": true, 00:18:52.430 "data_offset": 0, 00:18:52.430 "data_size": 65536 00:18:52.430 } 00:18:52.430 ] 00:18:52.430 }' 00:18:52.430 20:32:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:52.430 20:32:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:52.999 20:32:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:52.999 20:32:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:18:52.999 20:32:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:18:52.999 20:32:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:18:53.258 [2024-07-15 20:32:45.575418] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:53.258 20:32:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:53.258 20:32:45 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:53.258 20:32:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:53.258 20:32:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:53.258 20:32:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:53.258 20:32:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:53.258 20:32:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:53.258 20:32:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:53.258 20:32:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:53.258 20:32:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:53.258 20:32:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:53.258 20:32:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:53.518 20:32:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:53.518 "name": "Existed_Raid", 00:18:53.518 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:53.518 "strip_size_kb": 64, 00:18:53.518 "state": "configuring", 00:18:53.518 "raid_level": "raid0", 00:18:53.518 "superblock": false, 00:18:53.518 "num_base_bdevs": 4, 00:18:53.518 "num_base_bdevs_discovered": 3, 00:18:53.518 "num_base_bdevs_operational": 4, 00:18:53.518 "base_bdevs_list": [ 00:18:53.518 { 00:18:53.518 "name": null, 00:18:53.518 "uuid": "74980ed7-25e6-4e41-b94f-423156274b6c", 00:18:53.518 "is_configured": false, 00:18:53.518 "data_offset": 0, 
00:18:53.518 "data_size": 65536 00:18:53.518 }, 00:18:53.518 { 00:18:53.518 "name": "BaseBdev2", 00:18:53.518 "uuid": "9648b814-d012-4edd-9dfe-91f9754cd763", 00:18:53.518 "is_configured": true, 00:18:53.518 "data_offset": 0, 00:18:53.518 "data_size": 65536 00:18:53.518 }, 00:18:53.518 { 00:18:53.518 "name": "BaseBdev3", 00:18:53.518 "uuid": "ce0e7454-867e-4388-98a4-7299de907acf", 00:18:53.518 "is_configured": true, 00:18:53.518 "data_offset": 0, 00:18:53.518 "data_size": 65536 00:18:53.518 }, 00:18:53.518 { 00:18:53.518 "name": "BaseBdev4", 00:18:53.518 "uuid": "500e1292-6406-4a54-8c7a-673f47f64168", 00:18:53.518 "is_configured": true, 00:18:53.518 "data_offset": 0, 00:18:53.518 "data_size": 65536 00:18:53.518 } 00:18:53.518 ] 00:18:53.518 }' 00:18:53.518 20:32:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:53.518 20:32:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:54.086 20:32:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:54.086 20:32:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:18:54.346 20:32:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:18:54.346 20:32:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:18:54.346 20:32:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:54.604 20:32:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 74980ed7-25e6-4e41-b94f-423156274b6c 00:18:55.172 
[2024-07-15 20:32:47.448987] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:18:55.172 [2024-07-15 20:32:47.449026] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xba7040 00:18:55.172 [2024-07-15 20:32:47.449035] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:18:55.172 [2024-07-15 20:32:47.449233] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xba2a70 00:18:55.172 [2024-07-15 20:32:47.449348] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xba7040 00:18:55.172 [2024-07-15 20:32:47.449358] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xba7040 00:18:55.172 [2024-07-15 20:32:47.449522] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:55.172 NewBaseBdev 00:18:55.172 20:32:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:18:55.172 20:32:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:18:55.172 20:32:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:55.172 20:32:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:18:55.172 20:32:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:55.172 20:32:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:55.172 20:32:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:55.432 20:32:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 
00:18:55.691 [ 00:18:55.691 { 00:18:55.691 "name": "NewBaseBdev", 00:18:55.691 "aliases": [ 00:18:55.691 "74980ed7-25e6-4e41-b94f-423156274b6c" 00:18:55.691 ], 00:18:55.691 "product_name": "Malloc disk", 00:18:55.691 "block_size": 512, 00:18:55.691 "num_blocks": 65536, 00:18:55.691 "uuid": "74980ed7-25e6-4e41-b94f-423156274b6c", 00:18:55.691 "assigned_rate_limits": { 00:18:55.691 "rw_ios_per_sec": 0, 00:18:55.691 "rw_mbytes_per_sec": 0, 00:18:55.691 "r_mbytes_per_sec": 0, 00:18:55.691 "w_mbytes_per_sec": 0 00:18:55.691 }, 00:18:55.691 "claimed": true, 00:18:55.691 "claim_type": "exclusive_write", 00:18:55.691 "zoned": false, 00:18:55.691 "supported_io_types": { 00:18:55.691 "read": true, 00:18:55.691 "write": true, 00:18:55.691 "unmap": true, 00:18:55.691 "flush": true, 00:18:55.691 "reset": true, 00:18:55.691 "nvme_admin": false, 00:18:55.691 "nvme_io": false, 00:18:55.691 "nvme_io_md": false, 00:18:55.691 "write_zeroes": true, 00:18:55.691 "zcopy": true, 00:18:55.691 "get_zone_info": false, 00:18:55.691 "zone_management": false, 00:18:55.691 "zone_append": false, 00:18:55.691 "compare": false, 00:18:55.691 "compare_and_write": false, 00:18:55.691 "abort": true, 00:18:55.691 "seek_hole": false, 00:18:55.691 "seek_data": false, 00:18:55.691 "copy": true, 00:18:55.691 "nvme_iov_md": false 00:18:55.691 }, 00:18:55.691 "memory_domains": [ 00:18:55.691 { 00:18:55.691 "dma_device_id": "system", 00:18:55.691 "dma_device_type": 1 00:18:55.691 }, 00:18:55.691 { 00:18:55.691 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:55.691 "dma_device_type": 2 00:18:55.691 } 00:18:55.691 ], 00:18:55.691 "driver_specific": {} 00:18:55.691 } 00:18:55.691 ] 00:18:55.691 20:32:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:18:55.691 20:32:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:18:55.691 20:32:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # 
local raid_bdev_name=Existed_Raid 00:18:55.691 20:32:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:55.691 20:32:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:55.691 20:32:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:55.691 20:32:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:55.691 20:32:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:55.691 20:32:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:55.691 20:32:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:55.691 20:32:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:55.691 20:32:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:55.691 20:32:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:55.951 20:32:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:55.951 "name": "Existed_Raid", 00:18:55.951 "uuid": "a336c7af-d2ae-4f14-a718-37ca5553a2b3", 00:18:55.951 "strip_size_kb": 64, 00:18:55.951 "state": "online", 00:18:55.951 "raid_level": "raid0", 00:18:55.951 "superblock": false, 00:18:55.951 "num_base_bdevs": 4, 00:18:55.951 "num_base_bdevs_discovered": 4, 00:18:55.951 "num_base_bdevs_operational": 4, 00:18:55.951 "base_bdevs_list": [ 00:18:55.951 { 00:18:55.951 "name": "NewBaseBdev", 00:18:55.951 "uuid": "74980ed7-25e6-4e41-b94f-423156274b6c", 00:18:55.951 "is_configured": true, 00:18:55.951 "data_offset": 0, 00:18:55.951 "data_size": 65536 00:18:55.951 }, 00:18:55.951 { 
00:18:55.951 "name": "BaseBdev2", 00:18:55.951 "uuid": "9648b814-d012-4edd-9dfe-91f9754cd763", 00:18:55.951 "is_configured": true, 00:18:55.951 "data_offset": 0, 00:18:55.951 "data_size": 65536 00:18:55.951 }, 00:18:55.951 { 00:18:55.951 "name": "BaseBdev3", 00:18:55.951 "uuid": "ce0e7454-867e-4388-98a4-7299de907acf", 00:18:55.951 "is_configured": true, 00:18:55.951 "data_offset": 0, 00:18:55.951 "data_size": 65536 00:18:55.951 }, 00:18:55.951 { 00:18:55.951 "name": "BaseBdev4", 00:18:55.951 "uuid": "500e1292-6406-4a54-8c7a-673f47f64168", 00:18:55.951 "is_configured": true, 00:18:55.951 "data_offset": 0, 00:18:55.951 "data_size": 65536 00:18:55.951 } 00:18:55.951 ] 00:18:55.951 }' 00:18:55.951 20:32:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:55.951 20:32:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:56.518 20:32:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:18:56.518 20:32:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:18:56.518 20:32:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:56.518 20:32:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:56.518 20:32:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:56.518 20:32:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:18:56.518 20:32:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:18:56.518 20:32:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:56.777 [2024-07-15 20:32:49.037526] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 
00:18:56.777 20:32:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:56.777 "name": "Existed_Raid", 00:18:56.777 "aliases": [ 00:18:56.777 "a336c7af-d2ae-4f14-a718-37ca5553a2b3" 00:18:56.777 ], 00:18:56.777 "product_name": "Raid Volume", 00:18:56.777 "block_size": 512, 00:18:56.777 "num_blocks": 262144, 00:18:56.777 "uuid": "a336c7af-d2ae-4f14-a718-37ca5553a2b3", 00:18:56.777 "assigned_rate_limits": { 00:18:56.777 "rw_ios_per_sec": 0, 00:18:56.777 "rw_mbytes_per_sec": 0, 00:18:56.777 "r_mbytes_per_sec": 0, 00:18:56.777 "w_mbytes_per_sec": 0 00:18:56.777 }, 00:18:56.777 "claimed": false, 00:18:56.777 "zoned": false, 00:18:56.777 "supported_io_types": { 00:18:56.777 "read": true, 00:18:56.777 "write": true, 00:18:56.777 "unmap": true, 00:18:56.777 "flush": true, 00:18:56.777 "reset": true, 00:18:56.777 "nvme_admin": false, 00:18:56.777 "nvme_io": false, 00:18:56.777 "nvme_io_md": false, 00:18:56.777 "write_zeroes": true, 00:18:56.777 "zcopy": false, 00:18:56.777 "get_zone_info": false, 00:18:56.777 "zone_management": false, 00:18:56.777 "zone_append": false, 00:18:56.777 "compare": false, 00:18:56.777 "compare_and_write": false, 00:18:56.778 "abort": false, 00:18:56.778 "seek_hole": false, 00:18:56.778 "seek_data": false, 00:18:56.778 "copy": false, 00:18:56.778 "nvme_iov_md": false 00:18:56.778 }, 00:18:56.778 "memory_domains": [ 00:18:56.778 { 00:18:56.778 "dma_device_id": "system", 00:18:56.778 "dma_device_type": 1 00:18:56.778 }, 00:18:56.778 { 00:18:56.778 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:56.778 "dma_device_type": 2 00:18:56.778 }, 00:18:56.778 { 00:18:56.778 "dma_device_id": "system", 00:18:56.778 "dma_device_type": 1 00:18:56.778 }, 00:18:56.778 { 00:18:56.778 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:56.778 "dma_device_type": 2 00:18:56.778 }, 00:18:56.778 { 00:18:56.778 "dma_device_id": "system", 00:18:56.778 "dma_device_type": 1 00:18:56.778 }, 00:18:56.778 { 00:18:56.778 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:18:56.778 "dma_device_type": 2 00:18:56.778 }, 00:18:56.778 { 00:18:56.778 "dma_device_id": "system", 00:18:56.778 "dma_device_type": 1 00:18:56.778 }, 00:18:56.778 { 00:18:56.778 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:56.778 "dma_device_type": 2 00:18:56.778 } 00:18:56.778 ], 00:18:56.778 "driver_specific": { 00:18:56.778 "raid": { 00:18:56.778 "uuid": "a336c7af-d2ae-4f14-a718-37ca5553a2b3", 00:18:56.778 "strip_size_kb": 64, 00:18:56.778 "state": "online", 00:18:56.778 "raid_level": "raid0", 00:18:56.778 "superblock": false, 00:18:56.778 "num_base_bdevs": 4, 00:18:56.778 "num_base_bdevs_discovered": 4, 00:18:56.778 "num_base_bdevs_operational": 4, 00:18:56.778 "base_bdevs_list": [ 00:18:56.778 { 00:18:56.778 "name": "NewBaseBdev", 00:18:56.778 "uuid": "74980ed7-25e6-4e41-b94f-423156274b6c", 00:18:56.778 "is_configured": true, 00:18:56.778 "data_offset": 0, 00:18:56.778 "data_size": 65536 00:18:56.778 }, 00:18:56.778 { 00:18:56.778 "name": "BaseBdev2", 00:18:56.778 "uuid": "9648b814-d012-4edd-9dfe-91f9754cd763", 00:18:56.778 "is_configured": true, 00:18:56.778 "data_offset": 0, 00:18:56.778 "data_size": 65536 00:18:56.778 }, 00:18:56.778 { 00:18:56.778 "name": "BaseBdev3", 00:18:56.778 "uuid": "ce0e7454-867e-4388-98a4-7299de907acf", 00:18:56.778 "is_configured": true, 00:18:56.778 "data_offset": 0, 00:18:56.778 "data_size": 65536 00:18:56.778 }, 00:18:56.778 { 00:18:56.778 "name": "BaseBdev4", 00:18:56.778 "uuid": "500e1292-6406-4a54-8c7a-673f47f64168", 00:18:56.778 "is_configured": true, 00:18:56.778 "data_offset": 0, 00:18:56.778 "data_size": 65536 00:18:56.778 } 00:18:56.778 ] 00:18:56.778 } 00:18:56.778 } 00:18:56.778 }' 00:18:56.778 20:32:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:56.778 20:32:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:18:56.778 
BaseBdev2 00:18:56.778 BaseBdev3 00:18:56.778 BaseBdev4' 00:18:56.778 20:32:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:56.778 20:32:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:18:56.778 20:32:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:57.037 20:32:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:57.037 "name": "NewBaseBdev", 00:18:57.037 "aliases": [ 00:18:57.037 "74980ed7-25e6-4e41-b94f-423156274b6c" 00:18:57.037 ], 00:18:57.037 "product_name": "Malloc disk", 00:18:57.037 "block_size": 512, 00:18:57.037 "num_blocks": 65536, 00:18:57.037 "uuid": "74980ed7-25e6-4e41-b94f-423156274b6c", 00:18:57.037 "assigned_rate_limits": { 00:18:57.037 "rw_ios_per_sec": 0, 00:18:57.037 "rw_mbytes_per_sec": 0, 00:18:57.037 "r_mbytes_per_sec": 0, 00:18:57.037 "w_mbytes_per_sec": 0 00:18:57.037 }, 00:18:57.037 "claimed": true, 00:18:57.037 "claim_type": "exclusive_write", 00:18:57.037 "zoned": false, 00:18:57.037 "supported_io_types": { 00:18:57.037 "read": true, 00:18:57.037 "write": true, 00:18:57.037 "unmap": true, 00:18:57.037 "flush": true, 00:18:57.037 "reset": true, 00:18:57.037 "nvme_admin": false, 00:18:57.037 "nvme_io": false, 00:18:57.037 "nvme_io_md": false, 00:18:57.037 "write_zeroes": true, 00:18:57.037 "zcopy": true, 00:18:57.037 "get_zone_info": false, 00:18:57.037 "zone_management": false, 00:18:57.037 "zone_append": false, 00:18:57.037 "compare": false, 00:18:57.037 "compare_and_write": false, 00:18:57.037 "abort": true, 00:18:57.037 "seek_hole": false, 00:18:57.037 "seek_data": false, 00:18:57.037 "copy": true, 00:18:57.037 "nvme_iov_md": false 00:18:57.037 }, 00:18:57.037 "memory_domains": [ 00:18:57.037 { 00:18:57.037 "dma_device_id": "system", 00:18:57.037 "dma_device_type": 1 
00:18:57.037 }, 00:18:57.037 { 00:18:57.037 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:57.037 "dma_device_type": 2 00:18:57.037 } 00:18:57.037 ], 00:18:57.037 "driver_specific": {} 00:18:57.037 }' 00:18:57.037 20:32:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:57.037 20:32:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:57.296 20:32:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:57.296 20:32:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:57.296 20:32:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:57.296 20:32:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:57.296 20:32:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:57.296 20:32:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:57.296 20:32:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:57.296 20:32:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:57.296 20:32:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:57.554 20:32:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:57.554 20:32:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:57.554 20:32:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:18:57.554 20:32:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:57.813 20:32:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:57.813 "name": "BaseBdev2", 
00:18:57.813 "aliases": [ 00:18:57.813 "9648b814-d012-4edd-9dfe-91f9754cd763" 00:18:57.813 ], 00:18:57.813 "product_name": "Malloc disk", 00:18:57.813 "block_size": 512, 00:18:57.813 "num_blocks": 65536, 00:18:57.813 "uuid": "9648b814-d012-4edd-9dfe-91f9754cd763", 00:18:57.813 "assigned_rate_limits": { 00:18:57.813 "rw_ios_per_sec": 0, 00:18:57.813 "rw_mbytes_per_sec": 0, 00:18:57.813 "r_mbytes_per_sec": 0, 00:18:57.813 "w_mbytes_per_sec": 0 00:18:57.813 }, 00:18:57.813 "claimed": true, 00:18:57.813 "claim_type": "exclusive_write", 00:18:57.813 "zoned": false, 00:18:57.813 "supported_io_types": { 00:18:57.813 "read": true, 00:18:57.813 "write": true, 00:18:57.813 "unmap": true, 00:18:57.813 "flush": true, 00:18:57.813 "reset": true, 00:18:57.813 "nvme_admin": false, 00:18:57.813 "nvme_io": false, 00:18:57.813 "nvme_io_md": false, 00:18:57.813 "write_zeroes": true, 00:18:57.813 "zcopy": true, 00:18:57.813 "get_zone_info": false, 00:18:57.813 "zone_management": false, 00:18:57.813 "zone_append": false, 00:18:57.813 "compare": false, 00:18:57.813 "compare_and_write": false, 00:18:57.813 "abort": true, 00:18:57.813 "seek_hole": false, 00:18:57.813 "seek_data": false, 00:18:57.813 "copy": true, 00:18:57.813 "nvme_iov_md": false 00:18:57.813 }, 00:18:57.813 "memory_domains": [ 00:18:57.813 { 00:18:57.813 "dma_device_id": "system", 00:18:57.813 "dma_device_type": 1 00:18:57.813 }, 00:18:57.813 { 00:18:57.813 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:57.813 "dma_device_type": 2 00:18:57.813 } 00:18:57.813 ], 00:18:57.813 "driver_specific": {} 00:18:57.813 }' 00:18:57.813 20:32:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:57.813 20:32:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:57.813 20:32:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:57.813 20:32:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 
00:18:57.813 20:32:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:57.813 20:32:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:57.813 20:32:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:57.813 20:32:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:58.071 20:32:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:58.071 20:32:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:58.071 20:32:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:58.071 20:32:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:58.071 20:32:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:58.071 20:32:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:18:58.071 20:32:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:58.330 20:32:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:58.330 "name": "BaseBdev3", 00:18:58.330 "aliases": [ 00:18:58.330 "ce0e7454-867e-4388-98a4-7299de907acf" 00:18:58.330 ], 00:18:58.330 "product_name": "Malloc disk", 00:18:58.330 "block_size": 512, 00:18:58.330 "num_blocks": 65536, 00:18:58.330 "uuid": "ce0e7454-867e-4388-98a4-7299de907acf", 00:18:58.330 "assigned_rate_limits": { 00:18:58.330 "rw_ios_per_sec": 0, 00:18:58.330 "rw_mbytes_per_sec": 0, 00:18:58.330 "r_mbytes_per_sec": 0, 00:18:58.330 "w_mbytes_per_sec": 0 00:18:58.330 }, 00:18:58.330 "claimed": true, 00:18:58.330 "claim_type": "exclusive_write", 00:18:58.330 "zoned": false, 00:18:58.330 "supported_io_types": { 00:18:58.330 
"read": true, 00:18:58.330 "write": true, 00:18:58.330 "unmap": true, 00:18:58.330 "flush": true, 00:18:58.330 "reset": true, 00:18:58.330 "nvme_admin": false, 00:18:58.330 "nvme_io": false, 00:18:58.330 "nvme_io_md": false, 00:18:58.330 "write_zeroes": true, 00:18:58.330 "zcopy": true, 00:18:58.330 "get_zone_info": false, 00:18:58.330 "zone_management": false, 00:18:58.330 "zone_append": false, 00:18:58.330 "compare": false, 00:18:58.330 "compare_and_write": false, 00:18:58.330 "abort": true, 00:18:58.330 "seek_hole": false, 00:18:58.330 "seek_data": false, 00:18:58.330 "copy": true, 00:18:58.330 "nvme_iov_md": false 00:18:58.330 }, 00:18:58.330 "memory_domains": [ 00:18:58.330 { 00:18:58.330 "dma_device_id": "system", 00:18:58.330 "dma_device_type": 1 00:18:58.330 }, 00:18:58.330 { 00:18:58.330 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:58.330 "dma_device_type": 2 00:18:58.330 } 00:18:58.330 ], 00:18:58.330 "driver_specific": {} 00:18:58.330 }' 00:18:58.330 20:32:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:58.330 20:32:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:58.330 20:32:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:58.330 20:32:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:58.589 20:32:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:58.589 20:32:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:58.589 20:32:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:58.589 20:32:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:58.589 20:32:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:58.589 20:32:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 
00:18:58.589 20:32:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:58.589 20:32:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:58.589 20:32:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:58.589 20:32:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:18:58.847 20:32:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:58.847 20:32:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:58.847 "name": "BaseBdev4", 00:18:58.847 "aliases": [ 00:18:58.847 "500e1292-6406-4a54-8c7a-673f47f64168" 00:18:58.847 ], 00:18:58.847 "product_name": "Malloc disk", 00:18:58.847 "block_size": 512, 00:18:58.847 "num_blocks": 65536, 00:18:58.847 "uuid": "500e1292-6406-4a54-8c7a-673f47f64168", 00:18:58.847 "assigned_rate_limits": { 00:18:58.847 "rw_ios_per_sec": 0, 00:18:58.847 "rw_mbytes_per_sec": 0, 00:18:58.847 "r_mbytes_per_sec": 0, 00:18:58.847 "w_mbytes_per_sec": 0 00:18:58.847 }, 00:18:58.847 "claimed": true, 00:18:58.847 "claim_type": "exclusive_write", 00:18:58.847 "zoned": false, 00:18:58.847 "supported_io_types": { 00:18:58.847 "read": true, 00:18:58.847 "write": true, 00:18:58.847 "unmap": true, 00:18:58.847 "flush": true, 00:18:58.847 "reset": true, 00:18:58.847 "nvme_admin": false, 00:18:58.847 "nvme_io": false, 00:18:58.847 "nvme_io_md": false, 00:18:58.847 "write_zeroes": true, 00:18:58.847 "zcopy": true, 00:18:58.847 "get_zone_info": false, 00:18:58.847 "zone_management": false, 00:18:58.847 "zone_append": false, 00:18:58.847 "compare": false, 00:18:58.847 "compare_and_write": false, 00:18:58.847 "abort": true, 00:18:58.847 "seek_hole": false, 00:18:58.847 "seek_data": false, 00:18:58.847 "copy": true, 00:18:58.847 "nvme_iov_md": 
false 00:18:58.847 }, 00:18:58.847 "memory_domains": [ 00:18:58.847 { 00:18:58.847 "dma_device_id": "system", 00:18:58.847 "dma_device_type": 1 00:18:58.847 }, 00:18:58.847 { 00:18:58.847 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:58.847 "dma_device_type": 2 00:18:58.847 } 00:18:58.847 ], 00:18:58.847 "driver_specific": {} 00:18:58.847 }' 00:18:58.847 20:32:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:59.106 20:32:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:59.106 20:32:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:59.106 20:32:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:59.106 20:32:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:59.106 20:32:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:59.106 20:32:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:59.106 20:32:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:59.364 20:32:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:59.364 20:32:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:59.364 20:32:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:59.364 20:32:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:59.364 20:32:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:59.624 [2024-07-15 20:32:51.792524] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:59.624 [2024-07-15 20:32:51.792549] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid 
bdev state changing from online to offline 00:18:59.624 [2024-07-15 20:32:51.792604] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:59.624 [2024-07-15 20:32:51.792662] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:59.624 [2024-07-15 20:32:51.792675] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xba7040 name Existed_Raid, state offline 00:18:59.624 20:32:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 1412708 00:18:59.624 20:32:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 1412708 ']' 00:18:59.624 20:32:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 1412708 00:18:59.624 20:32:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:18:59.624 20:32:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:18:59.624 20:32:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1412708 00:18:59.624 20:32:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:18:59.624 20:32:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:18:59.624 20:32:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1412708' 00:18:59.624 killing process with pid 1412708 00:18:59.624 20:32:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 1412708 00:18:59.624 [2024-07-15 20:32:51.865714] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:59.624 20:32:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 1412708 00:18:59.624 [2024-07-15 20:32:51.902977] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:59.883 
20:32:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:18:59.883 00:18:59.883 real 0m34.050s 00:18:59.883 user 1m2.500s 00:18:59.883 sys 0m6.054s 00:18:59.883 20:32:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:18:59.883 20:32:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:59.883 ************************************ 00:18:59.883 END TEST raid_state_function_test 00:18:59.883 ************************************ 00:18:59.883 20:32:52 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:18:59.883 20:32:52 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 4 true 00:18:59.883 20:32:52 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:18:59.883 20:32:52 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:18:59.883 20:32:52 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:59.883 ************************************ 00:18:59.883 START TEST raid_state_function_test_sb 00:18:59.883 ************************************ 00:18:59.883 20:32:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 4 true 00:18:59.883 20:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:18:59.883 20:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:18:59.883 20:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:18:59.883 20:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:18:59.883 20:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:18:59.883 20:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:59.883 20:32:52 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:18:59.883 20:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:59.883 20:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:59.883 20:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:18:59.883 20:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:59.883 20:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:59.883 20:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:18:59.883 20:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:59.883 20:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:59.883 20:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:18:59.883 20:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:59.883 20:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:59.883 20:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:18:59.883 20:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:18:59.883 20:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:18:59.883 20:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:18:59.883 20:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:18:59.883 20:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:18:59.883 
20:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:18:59.884 20:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:18:59.884 20:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:18:59.884 20:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:18:59.884 20:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:18:59.884 20:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=1417759 00:18:59.884 20:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1417759' 00:18:59.884 Process raid pid: 1417759 00:18:59.884 20:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:18:59.884 20:32:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 1417759 /var/tmp/spdk-raid.sock 00:18:59.884 20:32:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 1417759 ']' 00:18:59.884 20:32:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:59.884 20:32:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:18:59.884 20:32:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:59.884 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:18:59.884 20:32:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:18:59.884 20:32:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:00.143 [2024-07-15 20:32:52.291055] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 00:19:00.143 [2024-07-15 20:32:52.291126] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:19:00.143 [2024-07-15 20:32:52.423627] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:00.402 [2024-07-15 20:32:52.525464] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:19:00.402 [2024-07-15 20:32:52.583796] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:00.402 [2024-07-15 20:32:52.583827] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:00.970 20:32:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:19:00.970 20:32:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:19:00.970 20:32:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:01.228 [2024-07-15 20:32:53.445730] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:19:01.228 [2024-07-15 20:32:53.445777] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:19:01.228 [2024-07-15 20:32:53.445788] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:19:01.228 [2024-07-15 20:32:53.445800] 
bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:19:01.228 [2024-07-15 20:32:53.445809] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:19:01.228 [2024-07-15 20:32:53.445821] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:19:01.228 [2024-07-15 20:32:53.445830] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:19:01.228 [2024-07-15 20:32:53.445841] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:19:01.228 20:32:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:01.228 20:32:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:01.228 20:32:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:01.228 20:32:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:01.228 20:32:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:01.228 20:32:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:01.228 20:32:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:01.228 20:32:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:01.228 20:32:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:01.228 20:32:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:01.228 20:32:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:19:01.228 20:32:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:01.486 20:32:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:01.486 "name": "Existed_Raid", 00:19:01.486 "uuid": "06d77314-bb92-4e7f-944b-011df3b614af", 00:19:01.486 "strip_size_kb": 64, 00:19:01.486 "state": "configuring", 00:19:01.486 "raid_level": "raid0", 00:19:01.486 "superblock": true, 00:19:01.486 "num_base_bdevs": 4, 00:19:01.486 "num_base_bdevs_discovered": 0, 00:19:01.486 "num_base_bdevs_operational": 4, 00:19:01.486 "base_bdevs_list": [ 00:19:01.486 { 00:19:01.486 "name": "BaseBdev1", 00:19:01.486 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:01.486 "is_configured": false, 00:19:01.486 "data_offset": 0, 00:19:01.486 "data_size": 0 00:19:01.486 }, 00:19:01.486 { 00:19:01.486 "name": "BaseBdev2", 00:19:01.486 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:01.486 "is_configured": false, 00:19:01.486 "data_offset": 0, 00:19:01.486 "data_size": 0 00:19:01.486 }, 00:19:01.486 { 00:19:01.486 "name": "BaseBdev3", 00:19:01.486 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:01.486 "is_configured": false, 00:19:01.486 "data_offset": 0, 00:19:01.486 "data_size": 0 00:19:01.486 }, 00:19:01.486 { 00:19:01.486 "name": "BaseBdev4", 00:19:01.486 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:01.486 "is_configured": false, 00:19:01.486 "data_offset": 0, 00:19:01.487 "data_size": 0 00:19:01.487 } 00:19:01.487 ] 00:19:01.487 }' 00:19:01.487 20:32:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:01.487 20:32:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:02.053 20:32:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete 
Existed_Raid 00:19:02.311 [2024-07-15 20:32:54.552506] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:19:02.312 [2024-07-15 20:32:54.552541] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12e5aa0 name Existed_Raid, state configuring 00:19:02.312 20:32:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:02.570 [2024-07-15 20:32:54.733020] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:19:02.570 [2024-07-15 20:32:54.733051] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:19:02.570 [2024-07-15 20:32:54.733061] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:19:02.570 [2024-07-15 20:32:54.733073] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:19:02.570 [2024-07-15 20:32:54.733081] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:19:02.570 [2024-07-15 20:32:54.733092] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:19:02.570 [2024-07-15 20:32:54.733101] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:19:02.570 [2024-07-15 20:32:54.733112] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:19:02.570 20:32:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:19:02.570 [2024-07-15 20:32:54.915322] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:02.570 BaseBdev1 00:19:02.570 20:32:54 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:19:02.570 20:32:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:19:02.570 20:32:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:02.570 20:32:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:19:02.570 20:32:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:02.570 20:32:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:02.570 20:32:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:02.827 20:32:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:19:03.085 [ 00:19:03.085 { 00:19:03.085 "name": "BaseBdev1", 00:19:03.085 "aliases": [ 00:19:03.085 "f6f3128e-765a-471a-afb7-674adf0149d7" 00:19:03.085 ], 00:19:03.085 "product_name": "Malloc disk", 00:19:03.085 "block_size": 512, 00:19:03.085 "num_blocks": 65536, 00:19:03.085 "uuid": "f6f3128e-765a-471a-afb7-674adf0149d7", 00:19:03.085 "assigned_rate_limits": { 00:19:03.085 "rw_ios_per_sec": 0, 00:19:03.085 "rw_mbytes_per_sec": 0, 00:19:03.085 "r_mbytes_per_sec": 0, 00:19:03.085 "w_mbytes_per_sec": 0 00:19:03.085 }, 00:19:03.085 "claimed": true, 00:19:03.085 "claim_type": "exclusive_write", 00:19:03.085 "zoned": false, 00:19:03.085 "supported_io_types": { 00:19:03.085 "read": true, 00:19:03.085 "write": true, 00:19:03.085 "unmap": true, 00:19:03.085 "flush": true, 00:19:03.085 "reset": true, 00:19:03.085 "nvme_admin": false, 00:19:03.085 "nvme_io": false, 00:19:03.085 "nvme_io_md": 
false, 00:19:03.085 "write_zeroes": true, 00:19:03.085 "zcopy": true, 00:19:03.085 "get_zone_info": false, 00:19:03.085 "zone_management": false, 00:19:03.085 "zone_append": false, 00:19:03.085 "compare": false, 00:19:03.085 "compare_and_write": false, 00:19:03.085 "abort": true, 00:19:03.085 "seek_hole": false, 00:19:03.085 "seek_data": false, 00:19:03.085 "copy": true, 00:19:03.085 "nvme_iov_md": false 00:19:03.085 }, 00:19:03.085 "memory_domains": [ 00:19:03.085 { 00:19:03.085 "dma_device_id": "system", 00:19:03.085 "dma_device_type": 1 00:19:03.085 }, 00:19:03.085 { 00:19:03.085 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:03.085 "dma_device_type": 2 00:19:03.085 } 00:19:03.085 ], 00:19:03.085 "driver_specific": {} 00:19:03.085 } 00:19:03.085 ] 00:19:03.085 20:32:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:19:03.085 20:32:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:03.085 20:32:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:03.085 20:32:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:03.085 20:32:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:03.085 20:32:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:03.085 20:32:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:03.085 20:32:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:03.085 20:32:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:03.085 20:32:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:03.085 20:32:55 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:03.085 20:32:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:03.085 20:32:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:03.343 20:32:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:03.343 "name": "Existed_Raid", 00:19:03.343 "uuid": "f5197277-55f6-4b46-841e-e895a4750106", 00:19:03.343 "strip_size_kb": 64, 00:19:03.343 "state": "configuring", 00:19:03.343 "raid_level": "raid0", 00:19:03.343 "superblock": true, 00:19:03.343 "num_base_bdevs": 4, 00:19:03.343 "num_base_bdevs_discovered": 1, 00:19:03.343 "num_base_bdevs_operational": 4, 00:19:03.343 "base_bdevs_list": [ 00:19:03.343 { 00:19:03.343 "name": "BaseBdev1", 00:19:03.343 "uuid": "f6f3128e-765a-471a-afb7-674adf0149d7", 00:19:03.343 "is_configured": true, 00:19:03.343 "data_offset": 2048, 00:19:03.343 "data_size": 63488 00:19:03.343 }, 00:19:03.343 { 00:19:03.343 "name": "BaseBdev2", 00:19:03.343 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:03.343 "is_configured": false, 00:19:03.343 "data_offset": 0, 00:19:03.343 "data_size": 0 00:19:03.343 }, 00:19:03.343 { 00:19:03.343 "name": "BaseBdev3", 00:19:03.343 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:03.343 "is_configured": false, 00:19:03.343 "data_offset": 0, 00:19:03.343 "data_size": 0 00:19:03.343 }, 00:19:03.343 { 00:19:03.343 "name": "BaseBdev4", 00:19:03.343 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:03.343 "is_configured": false, 00:19:03.343 "data_offset": 0, 00:19:03.343 "data_size": 0 00:19:03.343 } 00:19:03.343 ] 00:19:03.343 }' 00:19:03.343 20:32:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:03.343 20:32:55 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:03.907 20:32:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:19:03.907 [2024-07-15 20:32:56.282956] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:19:03.907 [2024-07-15 20:32:56.282998] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12e5310 name Existed_Raid, state configuring 00:19:04.168 20:32:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:04.168 [2024-07-15 20:32:56.463485] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:04.168 [2024-07-15 20:32:56.464939] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:19:04.168 [2024-07-15 20:32:56.464975] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:19:04.168 [2024-07-15 20:32:56.464985] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:19:04.168 [2024-07-15 20:32:56.464996] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:19:04.168 [2024-07-15 20:32:56.465005] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:19:04.168 [2024-07-15 20:32:56.465017] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:19:04.168 20:32:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:19:04.168 20:32:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:04.168 20:32:56 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:04.168 20:32:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:04.168 20:32:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:04.168 20:32:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:04.168 20:32:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:04.168 20:32:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:04.168 20:32:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:04.168 20:32:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:04.168 20:32:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:04.168 20:32:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:04.168 20:32:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:04.168 20:32:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:04.439 20:32:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:04.439 "name": "Existed_Raid", 00:19:04.439 "uuid": "9d2680c6-f682-4f14-bafd-b48355d38e4d", 00:19:04.439 "strip_size_kb": 64, 00:19:04.439 "state": "configuring", 00:19:04.439 "raid_level": "raid0", 00:19:04.439 "superblock": true, 00:19:04.439 "num_base_bdevs": 4, 00:19:04.439 "num_base_bdevs_discovered": 1, 00:19:04.439 "num_base_bdevs_operational": 4, 00:19:04.439 
"base_bdevs_list": [ 00:19:04.439 { 00:19:04.439 "name": "BaseBdev1", 00:19:04.439 "uuid": "f6f3128e-765a-471a-afb7-674adf0149d7", 00:19:04.439 "is_configured": true, 00:19:04.439 "data_offset": 2048, 00:19:04.439 "data_size": 63488 00:19:04.439 }, 00:19:04.439 { 00:19:04.439 "name": "BaseBdev2", 00:19:04.439 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:04.439 "is_configured": false, 00:19:04.439 "data_offset": 0, 00:19:04.439 "data_size": 0 00:19:04.439 }, 00:19:04.439 { 00:19:04.439 "name": "BaseBdev3", 00:19:04.439 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:04.439 "is_configured": false, 00:19:04.439 "data_offset": 0, 00:19:04.439 "data_size": 0 00:19:04.439 }, 00:19:04.439 { 00:19:04.439 "name": "BaseBdev4", 00:19:04.439 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:04.439 "is_configured": false, 00:19:04.439 "data_offset": 0, 00:19:04.439 "data_size": 0 00:19:04.439 } 00:19:04.439 ] 00:19:04.439 }' 00:19:04.439 20:32:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:04.439 20:32:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:05.005 20:32:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:19:05.263 [2024-07-15 20:32:57.425499] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:05.263 BaseBdev2 00:19:05.264 20:32:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:19:05.264 20:32:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:19:05.264 20:32:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:05.264 20:32:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:19:05.264 
20:32:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:05.264 20:32:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:05.264 20:32:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:05.521 20:32:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:19:05.780 [ 00:19:05.780 { 00:19:05.780 "name": "BaseBdev2", 00:19:05.780 "aliases": [ 00:19:05.780 "ab97e35a-dcd4-483c-84b1-762500e89ddd" 00:19:05.780 ], 00:19:05.780 "product_name": "Malloc disk", 00:19:05.780 "block_size": 512, 00:19:05.780 "num_blocks": 65536, 00:19:05.780 "uuid": "ab97e35a-dcd4-483c-84b1-762500e89ddd", 00:19:05.780 "assigned_rate_limits": { 00:19:05.780 "rw_ios_per_sec": 0, 00:19:05.780 "rw_mbytes_per_sec": 0, 00:19:05.780 "r_mbytes_per_sec": 0, 00:19:05.780 "w_mbytes_per_sec": 0 00:19:05.780 }, 00:19:05.780 "claimed": true, 00:19:05.780 "claim_type": "exclusive_write", 00:19:05.780 "zoned": false, 00:19:05.780 "supported_io_types": { 00:19:05.780 "read": true, 00:19:05.780 "write": true, 00:19:05.780 "unmap": true, 00:19:05.780 "flush": true, 00:19:05.780 "reset": true, 00:19:05.780 "nvme_admin": false, 00:19:05.780 "nvme_io": false, 00:19:05.780 "nvme_io_md": false, 00:19:05.780 "write_zeroes": true, 00:19:05.780 "zcopy": true, 00:19:05.780 "get_zone_info": false, 00:19:05.780 "zone_management": false, 00:19:05.780 "zone_append": false, 00:19:05.780 "compare": false, 00:19:05.780 "compare_and_write": false, 00:19:05.780 "abort": true, 00:19:05.780 "seek_hole": false, 00:19:05.780 "seek_data": false, 00:19:05.780 "copy": true, 00:19:05.780 "nvme_iov_md": false 00:19:05.780 }, 00:19:05.780 
"memory_domains": [ 00:19:05.780 { 00:19:05.780 "dma_device_id": "system", 00:19:05.780 "dma_device_type": 1 00:19:05.780 }, 00:19:05.780 { 00:19:05.780 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:05.780 "dma_device_type": 2 00:19:05.780 } 00:19:05.780 ], 00:19:05.780 "driver_specific": {} 00:19:05.780 } 00:19:05.780 ] 00:19:05.780 20:32:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:19:05.780 20:32:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:19:05.780 20:32:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:05.780 20:32:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:05.780 20:32:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:05.780 20:32:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:05.780 20:32:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:05.780 20:32:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:05.780 20:32:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:05.780 20:32:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:05.780 20:32:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:05.780 20:32:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:05.780 20:32:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:05.780 20:32:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:05.780 20:32:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:06.039 20:32:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:06.039 "name": "Existed_Raid", 00:19:06.039 "uuid": "9d2680c6-f682-4f14-bafd-b48355d38e4d", 00:19:06.039 "strip_size_kb": 64, 00:19:06.039 "state": "configuring", 00:19:06.039 "raid_level": "raid0", 00:19:06.039 "superblock": true, 00:19:06.039 "num_base_bdevs": 4, 00:19:06.039 "num_base_bdevs_discovered": 2, 00:19:06.039 "num_base_bdevs_operational": 4, 00:19:06.039 "base_bdevs_list": [ 00:19:06.039 { 00:19:06.039 "name": "BaseBdev1", 00:19:06.039 "uuid": "f6f3128e-765a-471a-afb7-674adf0149d7", 00:19:06.039 "is_configured": true, 00:19:06.039 "data_offset": 2048, 00:19:06.039 "data_size": 63488 00:19:06.039 }, 00:19:06.039 { 00:19:06.039 "name": "BaseBdev2", 00:19:06.039 "uuid": "ab97e35a-dcd4-483c-84b1-762500e89ddd", 00:19:06.039 "is_configured": true, 00:19:06.039 "data_offset": 2048, 00:19:06.039 "data_size": 63488 00:19:06.039 }, 00:19:06.039 { 00:19:06.039 "name": "BaseBdev3", 00:19:06.039 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:06.039 "is_configured": false, 00:19:06.039 "data_offset": 0, 00:19:06.039 "data_size": 0 00:19:06.039 }, 00:19:06.039 { 00:19:06.039 "name": "BaseBdev4", 00:19:06.039 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:06.039 "is_configured": false, 00:19:06.039 "data_offset": 0, 00:19:06.039 "data_size": 0 00:19:06.039 } 00:19:06.039 ] 00:19:06.039 }' 00:19:06.039 20:32:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:06.039 20:32:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:06.607 20:32:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:19:06.865 [2024-07-15 20:32:59.033243] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:06.865 BaseBdev3 00:19:06.865 20:32:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:19:06.865 20:32:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:19:06.865 20:32:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:06.865 20:32:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:19:06.865 20:32:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:06.865 20:32:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:06.865 20:32:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:07.124 20:32:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:19:07.383 [ 00:19:07.383 { 00:19:07.383 "name": "BaseBdev3", 00:19:07.383 "aliases": [ 00:19:07.383 "693853c6-d266-45f2-81a6-ea3d63811839" 00:19:07.383 ], 00:19:07.383 "product_name": "Malloc disk", 00:19:07.383 "block_size": 512, 00:19:07.383 "num_blocks": 65536, 00:19:07.383 "uuid": "693853c6-d266-45f2-81a6-ea3d63811839", 00:19:07.383 "assigned_rate_limits": { 00:19:07.383 "rw_ios_per_sec": 0, 00:19:07.383 "rw_mbytes_per_sec": 0, 00:19:07.383 "r_mbytes_per_sec": 0, 00:19:07.383 "w_mbytes_per_sec": 0 00:19:07.383 }, 00:19:07.383 "claimed": true, 00:19:07.383 "claim_type": "exclusive_write", 00:19:07.383 "zoned": false, 00:19:07.383 "supported_io_types": { 
00:19:07.383 "read": true, 00:19:07.383 "write": true, 00:19:07.383 "unmap": true, 00:19:07.383 "flush": true, 00:19:07.383 "reset": true, 00:19:07.383 "nvme_admin": false, 00:19:07.383 "nvme_io": false, 00:19:07.383 "nvme_io_md": false, 00:19:07.383 "write_zeroes": true, 00:19:07.383 "zcopy": true, 00:19:07.383 "get_zone_info": false, 00:19:07.383 "zone_management": false, 00:19:07.383 "zone_append": false, 00:19:07.383 "compare": false, 00:19:07.383 "compare_and_write": false, 00:19:07.383 "abort": true, 00:19:07.383 "seek_hole": false, 00:19:07.383 "seek_data": false, 00:19:07.383 "copy": true, 00:19:07.383 "nvme_iov_md": false 00:19:07.383 }, 00:19:07.383 "memory_domains": [ 00:19:07.383 { 00:19:07.383 "dma_device_id": "system", 00:19:07.383 "dma_device_type": 1 00:19:07.383 }, 00:19:07.383 { 00:19:07.383 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:07.383 "dma_device_type": 2 00:19:07.383 } 00:19:07.383 ], 00:19:07.383 "driver_specific": {} 00:19:07.383 } 00:19:07.383 ] 00:19:07.383 20:32:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:19:07.383 20:32:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:19:07.383 20:32:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:07.383 20:32:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:07.383 20:32:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:07.383 20:32:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:07.383 20:32:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:07.383 20:32:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:07.383 20:32:59 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:07.383 20:32:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:07.383 20:32:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:07.383 20:32:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:07.383 20:32:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:07.383 20:32:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:07.383 20:32:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:07.642 20:32:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:07.642 "name": "Existed_Raid", 00:19:07.642 "uuid": "9d2680c6-f682-4f14-bafd-b48355d38e4d", 00:19:07.642 "strip_size_kb": 64, 00:19:07.642 "state": "configuring", 00:19:07.642 "raid_level": "raid0", 00:19:07.642 "superblock": true, 00:19:07.642 "num_base_bdevs": 4, 00:19:07.642 "num_base_bdevs_discovered": 3, 00:19:07.642 "num_base_bdevs_operational": 4, 00:19:07.642 "base_bdevs_list": [ 00:19:07.642 { 00:19:07.642 "name": "BaseBdev1", 00:19:07.642 "uuid": "f6f3128e-765a-471a-afb7-674adf0149d7", 00:19:07.642 "is_configured": true, 00:19:07.642 "data_offset": 2048, 00:19:07.642 "data_size": 63488 00:19:07.642 }, 00:19:07.642 { 00:19:07.642 "name": "BaseBdev2", 00:19:07.642 "uuid": "ab97e35a-dcd4-483c-84b1-762500e89ddd", 00:19:07.642 "is_configured": true, 00:19:07.642 "data_offset": 2048, 00:19:07.642 "data_size": 63488 00:19:07.642 }, 00:19:07.642 { 00:19:07.642 "name": "BaseBdev3", 00:19:07.642 "uuid": "693853c6-d266-45f2-81a6-ea3d63811839", 00:19:07.642 "is_configured": true, 00:19:07.642 "data_offset": 2048, 00:19:07.642 
"data_size": 63488 00:19:07.642 }, 00:19:07.642 { 00:19:07.642 "name": "BaseBdev4", 00:19:07.642 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:07.642 "is_configured": false, 00:19:07.642 "data_offset": 0, 00:19:07.642 "data_size": 0 00:19:07.642 } 00:19:07.642 ] 00:19:07.642 }' 00:19:07.642 20:32:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:07.642 20:32:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:08.211 20:33:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:19:08.470 [2024-07-15 20:33:00.640973] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:19:08.470 [2024-07-15 20:33:00.641151] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x12e6350 00:19:08.470 [2024-07-15 20:33:00.641166] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:19:08.470 [2024-07-15 20:33:00.641350] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x12e6020 00:19:08.470 [2024-07-15 20:33:00.641468] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x12e6350 00:19:08.470 [2024-07-15 20:33:00.641478] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x12e6350 00:19:08.470 [2024-07-15 20:33:00.641567] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:08.470 BaseBdev4 00:19:08.470 20:33:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:19:08.470 20:33:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:19:08.470 20:33:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:08.470 20:33:00 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:19:08.470 20:33:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:08.470 20:33:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:08.470 20:33:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:08.729 20:33:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:19:08.988 [ 00:19:08.988 { 00:19:08.988 "name": "BaseBdev4", 00:19:08.988 "aliases": [ 00:19:08.988 "b0cda2c1-cfbd-4b97-9c66-8a42f71e9019" 00:19:08.988 ], 00:19:08.988 "product_name": "Malloc disk", 00:19:08.988 "block_size": 512, 00:19:08.988 "num_blocks": 65536, 00:19:08.988 "uuid": "b0cda2c1-cfbd-4b97-9c66-8a42f71e9019", 00:19:08.988 "assigned_rate_limits": { 00:19:08.988 "rw_ios_per_sec": 0, 00:19:08.988 "rw_mbytes_per_sec": 0, 00:19:08.988 "r_mbytes_per_sec": 0, 00:19:08.988 "w_mbytes_per_sec": 0 00:19:08.988 }, 00:19:08.988 "claimed": true, 00:19:08.988 "claim_type": "exclusive_write", 00:19:08.988 "zoned": false, 00:19:08.988 "supported_io_types": { 00:19:08.988 "read": true, 00:19:08.988 "write": true, 00:19:08.988 "unmap": true, 00:19:08.988 "flush": true, 00:19:08.988 "reset": true, 00:19:08.988 "nvme_admin": false, 00:19:08.988 "nvme_io": false, 00:19:08.988 "nvme_io_md": false, 00:19:08.988 "write_zeroes": true, 00:19:08.988 "zcopy": true, 00:19:08.988 "get_zone_info": false, 00:19:08.988 "zone_management": false, 00:19:08.988 "zone_append": false, 00:19:08.988 "compare": false, 00:19:08.988 "compare_and_write": false, 00:19:08.988 "abort": true, 00:19:08.988 "seek_hole": false, 00:19:08.988 "seek_data": false, 
00:19:08.988 "copy": true, 00:19:08.988 "nvme_iov_md": false 00:19:08.988 }, 00:19:08.988 "memory_domains": [ 00:19:08.988 { 00:19:08.988 "dma_device_id": "system", 00:19:08.988 "dma_device_type": 1 00:19:08.988 }, 00:19:08.988 { 00:19:08.988 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:08.988 "dma_device_type": 2 00:19:08.988 } 00:19:08.988 ], 00:19:08.988 "driver_specific": {} 00:19:08.988 } 00:19:08.988 ] 00:19:08.988 20:33:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:19:08.988 20:33:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:19:08.988 20:33:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:08.988 20:33:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:19:08.988 20:33:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:08.988 20:33:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:08.988 20:33:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:08.988 20:33:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:08.988 20:33:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:08.988 20:33:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:08.988 20:33:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:08.988 20:33:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:08.988 20:33:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:08.988 20:33:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:08.988 20:33:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:08.988 20:33:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:08.988 "name": "Existed_Raid", 00:19:08.988 "uuid": "9d2680c6-f682-4f14-bafd-b48355d38e4d", 00:19:08.989 "strip_size_kb": 64, 00:19:08.989 "state": "online", 00:19:08.989 "raid_level": "raid0", 00:19:08.989 "superblock": true, 00:19:08.989 "num_base_bdevs": 4, 00:19:08.989 "num_base_bdevs_discovered": 4, 00:19:08.989 "num_base_bdevs_operational": 4, 00:19:08.989 "base_bdevs_list": [ 00:19:08.989 { 00:19:08.989 "name": "BaseBdev1", 00:19:08.989 "uuid": "f6f3128e-765a-471a-afb7-674adf0149d7", 00:19:08.989 "is_configured": true, 00:19:08.989 "data_offset": 2048, 00:19:08.989 "data_size": 63488 00:19:08.989 }, 00:19:08.989 { 00:19:08.989 "name": "BaseBdev2", 00:19:08.989 "uuid": "ab97e35a-dcd4-483c-84b1-762500e89ddd", 00:19:08.989 "is_configured": true, 00:19:08.989 "data_offset": 2048, 00:19:08.989 "data_size": 63488 00:19:08.989 }, 00:19:08.989 { 00:19:08.989 "name": "BaseBdev3", 00:19:08.989 "uuid": "693853c6-d266-45f2-81a6-ea3d63811839", 00:19:08.989 "is_configured": true, 00:19:08.989 "data_offset": 2048, 00:19:08.989 "data_size": 63488 00:19:08.989 }, 00:19:08.989 { 00:19:08.989 "name": "BaseBdev4", 00:19:08.989 "uuid": "b0cda2c1-cfbd-4b97-9c66-8a42f71e9019", 00:19:08.989 "is_configured": true, 00:19:08.989 "data_offset": 2048, 00:19:08.989 "data_size": 63488 00:19:08.989 } 00:19:08.989 ] 00:19:08.989 }' 00:19:08.989 20:33:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:08.989 20:33:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:09.934 20:33:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # 
verify_raid_bdev_properties Existed_Raid 00:19:09.934 20:33:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:19:09.934 20:33:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:19:09.934 20:33:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:19:09.934 20:33:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:19:09.934 20:33:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:19:09.934 20:33:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:19:09.934 20:33:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:19:09.934 [2024-07-15 20:33:02.201437] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:09.934 20:33:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:19:09.934 "name": "Existed_Raid", 00:19:09.934 "aliases": [ 00:19:09.934 "9d2680c6-f682-4f14-bafd-b48355d38e4d" 00:19:09.934 ], 00:19:09.934 "product_name": "Raid Volume", 00:19:09.934 "block_size": 512, 00:19:09.934 "num_blocks": 253952, 00:19:09.934 "uuid": "9d2680c6-f682-4f14-bafd-b48355d38e4d", 00:19:09.934 "assigned_rate_limits": { 00:19:09.934 "rw_ios_per_sec": 0, 00:19:09.934 "rw_mbytes_per_sec": 0, 00:19:09.934 "r_mbytes_per_sec": 0, 00:19:09.934 "w_mbytes_per_sec": 0 00:19:09.934 }, 00:19:09.934 "claimed": false, 00:19:09.934 "zoned": false, 00:19:09.934 "supported_io_types": { 00:19:09.934 "read": true, 00:19:09.934 "write": true, 00:19:09.934 "unmap": true, 00:19:09.934 "flush": true, 00:19:09.934 "reset": true, 00:19:09.934 "nvme_admin": false, 00:19:09.934 "nvme_io": false, 00:19:09.934 "nvme_io_md": false, 00:19:09.934 
"write_zeroes": true, 00:19:09.934 "zcopy": false, 00:19:09.934 "get_zone_info": false, 00:19:09.934 "zone_management": false, 00:19:09.934 "zone_append": false, 00:19:09.934 "compare": false, 00:19:09.934 "compare_and_write": false, 00:19:09.934 "abort": false, 00:19:09.934 "seek_hole": false, 00:19:09.934 "seek_data": false, 00:19:09.934 "copy": false, 00:19:09.934 "nvme_iov_md": false 00:19:09.934 }, 00:19:09.934 "memory_domains": [ 00:19:09.934 { 00:19:09.934 "dma_device_id": "system", 00:19:09.934 "dma_device_type": 1 00:19:09.934 }, 00:19:09.934 { 00:19:09.934 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:09.934 "dma_device_type": 2 00:19:09.934 }, 00:19:09.934 { 00:19:09.934 "dma_device_id": "system", 00:19:09.934 "dma_device_type": 1 00:19:09.934 }, 00:19:09.934 { 00:19:09.934 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:09.934 "dma_device_type": 2 00:19:09.934 }, 00:19:09.934 { 00:19:09.934 "dma_device_id": "system", 00:19:09.934 "dma_device_type": 1 00:19:09.934 }, 00:19:09.934 { 00:19:09.934 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:09.934 "dma_device_type": 2 00:19:09.934 }, 00:19:09.934 { 00:19:09.934 "dma_device_id": "system", 00:19:09.934 "dma_device_type": 1 00:19:09.934 }, 00:19:09.934 { 00:19:09.934 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:09.934 "dma_device_type": 2 00:19:09.934 } 00:19:09.934 ], 00:19:09.934 "driver_specific": { 00:19:09.934 "raid": { 00:19:09.934 "uuid": "9d2680c6-f682-4f14-bafd-b48355d38e4d", 00:19:09.934 "strip_size_kb": 64, 00:19:09.934 "state": "online", 00:19:09.934 "raid_level": "raid0", 00:19:09.934 "superblock": true, 00:19:09.934 "num_base_bdevs": 4, 00:19:09.934 "num_base_bdevs_discovered": 4, 00:19:09.934 "num_base_bdevs_operational": 4, 00:19:09.934 "base_bdevs_list": [ 00:19:09.934 { 00:19:09.934 "name": "BaseBdev1", 00:19:09.934 "uuid": "f6f3128e-765a-471a-afb7-674adf0149d7", 00:19:09.934 "is_configured": true, 00:19:09.934 "data_offset": 2048, 00:19:09.934 "data_size": 63488 00:19:09.934 }, 
00:19:09.934 { 00:19:09.934 "name": "BaseBdev2", 00:19:09.934 "uuid": "ab97e35a-dcd4-483c-84b1-762500e89ddd", 00:19:09.934 "is_configured": true, 00:19:09.934 "data_offset": 2048, 00:19:09.934 "data_size": 63488 00:19:09.934 }, 00:19:09.934 { 00:19:09.934 "name": "BaseBdev3", 00:19:09.934 "uuid": "693853c6-d266-45f2-81a6-ea3d63811839", 00:19:09.934 "is_configured": true, 00:19:09.934 "data_offset": 2048, 00:19:09.934 "data_size": 63488 00:19:09.934 }, 00:19:09.934 { 00:19:09.934 "name": "BaseBdev4", 00:19:09.934 "uuid": "b0cda2c1-cfbd-4b97-9c66-8a42f71e9019", 00:19:09.934 "is_configured": true, 00:19:09.934 "data_offset": 2048, 00:19:09.934 "data_size": 63488 00:19:09.934 } 00:19:09.934 ] 00:19:09.934 } 00:19:09.934 } 00:19:09.934 }' 00:19:09.934 20:33:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:09.934 20:33:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:19:09.934 BaseBdev2 00:19:09.934 BaseBdev3 00:19:09.934 BaseBdev4' 00:19:09.934 20:33:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:09.934 20:33:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:19:09.934 20:33:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:10.196 20:33:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:10.196 "name": "BaseBdev1", 00:19:10.196 "aliases": [ 00:19:10.196 "f6f3128e-765a-471a-afb7-674adf0149d7" 00:19:10.196 ], 00:19:10.196 "product_name": "Malloc disk", 00:19:10.196 "block_size": 512, 00:19:10.196 "num_blocks": 65536, 00:19:10.196 "uuid": "f6f3128e-765a-471a-afb7-674adf0149d7", 00:19:10.196 "assigned_rate_limits": { 00:19:10.196 
"rw_ios_per_sec": 0, 00:19:10.196 "rw_mbytes_per_sec": 0, 00:19:10.196 "r_mbytes_per_sec": 0, 00:19:10.196 "w_mbytes_per_sec": 0 00:19:10.196 }, 00:19:10.196 "claimed": true, 00:19:10.196 "claim_type": "exclusive_write", 00:19:10.196 "zoned": false, 00:19:10.196 "supported_io_types": { 00:19:10.196 "read": true, 00:19:10.196 "write": true, 00:19:10.196 "unmap": true, 00:19:10.196 "flush": true, 00:19:10.196 "reset": true, 00:19:10.196 "nvme_admin": false, 00:19:10.196 "nvme_io": false, 00:19:10.196 "nvme_io_md": false, 00:19:10.196 "write_zeroes": true, 00:19:10.196 "zcopy": true, 00:19:10.196 "get_zone_info": false, 00:19:10.196 "zone_management": false, 00:19:10.196 "zone_append": false, 00:19:10.196 "compare": false, 00:19:10.196 "compare_and_write": false, 00:19:10.196 "abort": true, 00:19:10.196 "seek_hole": false, 00:19:10.196 "seek_data": false, 00:19:10.196 "copy": true, 00:19:10.196 "nvme_iov_md": false 00:19:10.196 }, 00:19:10.196 "memory_domains": [ 00:19:10.196 { 00:19:10.196 "dma_device_id": "system", 00:19:10.196 "dma_device_type": 1 00:19:10.196 }, 00:19:10.196 { 00:19:10.196 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:10.196 "dma_device_type": 2 00:19:10.196 } 00:19:10.196 ], 00:19:10.196 "driver_specific": {} 00:19:10.196 }' 00:19:10.196 20:33:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:10.196 20:33:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:10.454 20:33:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:10.454 20:33:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:10.454 20:33:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:10.454 20:33:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:10.454 20:33:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq 
.md_interleave 00:19:10.454 20:33:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:10.454 20:33:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:10.454 20:33:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:10.713 20:33:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:10.713 20:33:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:10.713 20:33:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:10.713 20:33:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:19:10.713 20:33:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:10.972 20:33:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:10.972 "name": "BaseBdev2", 00:19:10.972 "aliases": [ 00:19:10.972 "ab97e35a-dcd4-483c-84b1-762500e89ddd" 00:19:10.972 ], 00:19:10.972 "product_name": "Malloc disk", 00:19:10.972 "block_size": 512, 00:19:10.972 "num_blocks": 65536, 00:19:10.972 "uuid": "ab97e35a-dcd4-483c-84b1-762500e89ddd", 00:19:10.972 "assigned_rate_limits": { 00:19:10.972 "rw_ios_per_sec": 0, 00:19:10.972 "rw_mbytes_per_sec": 0, 00:19:10.972 "r_mbytes_per_sec": 0, 00:19:10.972 "w_mbytes_per_sec": 0 00:19:10.972 }, 00:19:10.972 "claimed": true, 00:19:10.972 "claim_type": "exclusive_write", 00:19:10.972 "zoned": false, 00:19:10.972 "supported_io_types": { 00:19:10.972 "read": true, 00:19:10.972 "write": true, 00:19:10.972 "unmap": true, 00:19:10.972 "flush": true, 00:19:10.972 "reset": true, 00:19:10.972 "nvme_admin": false, 00:19:10.972 "nvme_io": false, 00:19:10.972 "nvme_io_md": false, 00:19:10.972 "write_zeroes": true, 
00:19:10.972 "zcopy": true, 00:19:10.972 "get_zone_info": false, 00:19:10.972 "zone_management": false, 00:19:10.972 "zone_append": false, 00:19:10.972 "compare": false, 00:19:10.972 "compare_and_write": false, 00:19:10.972 "abort": true, 00:19:10.972 "seek_hole": false, 00:19:10.972 "seek_data": false, 00:19:10.972 "copy": true, 00:19:10.972 "nvme_iov_md": false 00:19:10.972 }, 00:19:10.972 "memory_domains": [ 00:19:10.972 { 00:19:10.972 "dma_device_id": "system", 00:19:10.972 "dma_device_type": 1 00:19:10.972 }, 00:19:10.972 { 00:19:10.972 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:10.972 "dma_device_type": 2 00:19:10.972 } 00:19:10.972 ], 00:19:10.972 "driver_specific": {} 00:19:10.972 }' 00:19:10.972 20:33:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:10.972 20:33:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:10.972 20:33:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:10.972 20:33:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:10.972 20:33:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:10.972 20:33:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:10.972 20:33:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:11.232 20:33:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:11.232 20:33:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:11.232 20:33:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:11.232 20:33:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:11.232 20:33:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:11.232 20:33:03 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:11.232 20:33:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:19:11.232 20:33:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:11.491 20:33:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:11.491 "name": "BaseBdev3", 00:19:11.491 "aliases": [ 00:19:11.491 "693853c6-d266-45f2-81a6-ea3d63811839" 00:19:11.491 ], 00:19:11.491 "product_name": "Malloc disk", 00:19:11.491 "block_size": 512, 00:19:11.491 "num_blocks": 65536, 00:19:11.491 "uuid": "693853c6-d266-45f2-81a6-ea3d63811839", 00:19:11.491 "assigned_rate_limits": { 00:19:11.491 "rw_ios_per_sec": 0, 00:19:11.491 "rw_mbytes_per_sec": 0, 00:19:11.491 "r_mbytes_per_sec": 0, 00:19:11.491 "w_mbytes_per_sec": 0 00:19:11.491 }, 00:19:11.491 "claimed": true, 00:19:11.491 "claim_type": "exclusive_write", 00:19:11.491 "zoned": false, 00:19:11.491 "supported_io_types": { 00:19:11.491 "read": true, 00:19:11.491 "write": true, 00:19:11.491 "unmap": true, 00:19:11.491 "flush": true, 00:19:11.491 "reset": true, 00:19:11.491 "nvme_admin": false, 00:19:11.491 "nvme_io": false, 00:19:11.491 "nvme_io_md": false, 00:19:11.491 "write_zeroes": true, 00:19:11.491 "zcopy": true, 00:19:11.491 "get_zone_info": false, 00:19:11.491 "zone_management": false, 00:19:11.491 "zone_append": false, 00:19:11.491 "compare": false, 00:19:11.491 "compare_and_write": false, 00:19:11.491 "abort": true, 00:19:11.491 "seek_hole": false, 00:19:11.491 "seek_data": false, 00:19:11.491 "copy": true, 00:19:11.491 "nvme_iov_md": false 00:19:11.491 }, 00:19:11.491 "memory_domains": [ 00:19:11.491 { 00:19:11.491 "dma_device_id": "system", 00:19:11.491 "dma_device_type": 1 00:19:11.491 }, 00:19:11.491 { 00:19:11.491 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:19:11.491 "dma_device_type": 2 00:19:11.491 } 00:19:11.491 ], 00:19:11.491 "driver_specific": {} 00:19:11.491 }' 00:19:11.491 20:33:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:11.491 20:33:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:11.491 20:33:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:11.491 20:33:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:11.750 20:33:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:11.750 20:33:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:11.750 20:33:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:11.750 20:33:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:11.750 20:33:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:11.750 20:33:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:11.750 20:33:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:11.750 20:33:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:11.750 20:33:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:11.750 20:33:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:19:11.750 20:33:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:12.010 20:33:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:12.010 "name": "BaseBdev4", 00:19:12.010 
"aliases": [ 00:19:12.010 "b0cda2c1-cfbd-4b97-9c66-8a42f71e9019" 00:19:12.010 ], 00:19:12.010 "product_name": "Malloc disk", 00:19:12.010 "block_size": 512, 00:19:12.010 "num_blocks": 65536, 00:19:12.010 "uuid": "b0cda2c1-cfbd-4b97-9c66-8a42f71e9019", 00:19:12.010 "assigned_rate_limits": { 00:19:12.010 "rw_ios_per_sec": 0, 00:19:12.010 "rw_mbytes_per_sec": 0, 00:19:12.010 "r_mbytes_per_sec": 0, 00:19:12.010 "w_mbytes_per_sec": 0 00:19:12.010 }, 00:19:12.010 "claimed": true, 00:19:12.010 "claim_type": "exclusive_write", 00:19:12.010 "zoned": false, 00:19:12.010 "supported_io_types": { 00:19:12.010 "read": true, 00:19:12.010 "write": true, 00:19:12.010 "unmap": true, 00:19:12.010 "flush": true, 00:19:12.010 "reset": true, 00:19:12.010 "nvme_admin": false, 00:19:12.010 "nvme_io": false, 00:19:12.010 "nvme_io_md": false, 00:19:12.010 "write_zeroes": true, 00:19:12.010 "zcopy": true, 00:19:12.010 "get_zone_info": false, 00:19:12.010 "zone_management": false, 00:19:12.010 "zone_append": false, 00:19:12.010 "compare": false, 00:19:12.010 "compare_and_write": false, 00:19:12.010 "abort": true, 00:19:12.010 "seek_hole": false, 00:19:12.010 "seek_data": false, 00:19:12.010 "copy": true, 00:19:12.010 "nvme_iov_md": false 00:19:12.010 }, 00:19:12.010 "memory_domains": [ 00:19:12.010 { 00:19:12.010 "dma_device_id": "system", 00:19:12.010 "dma_device_type": 1 00:19:12.010 }, 00:19:12.010 { 00:19:12.010 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:12.010 "dma_device_type": 2 00:19:12.010 } 00:19:12.010 ], 00:19:12.010 "driver_specific": {} 00:19:12.010 }' 00:19:12.010 20:33:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:12.267 20:33:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:12.267 20:33:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:12.267 20:33:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 
00:19:12.267 20:33:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:12.267 20:33:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:12.267 20:33:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:12.267 20:33:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:12.267 20:33:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:12.267 20:33:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:12.527 20:33:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:12.527 20:33:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:12.527 20:33:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:19:12.527 [2024-07-15 20:33:04.852309] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:19:12.527 [2024-07-15 20:33:04.852339] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:12.527 [2024-07-15 20:33:04.852388] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:12.527 20:33:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:19:12.527 20:33:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:19:12.527 20:33:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:19:12.527 20:33:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:19:12.527 20:33:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:19:12.527 20:33:04 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 3 00:19:12.527 20:33:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:12.527 20:33:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:19:12.527 20:33:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:12.527 20:33:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:12.527 20:33:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:12.527 20:33:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:12.527 20:33:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:12.527 20:33:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:12.527 20:33:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:12.527 20:33:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:12.527 20:33:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:12.786 20:33:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:12.786 "name": "Existed_Raid", 00:19:12.786 "uuid": "9d2680c6-f682-4f14-bafd-b48355d38e4d", 00:19:12.786 "strip_size_kb": 64, 00:19:12.786 "state": "offline", 00:19:12.786 "raid_level": "raid0", 00:19:12.786 "superblock": true, 00:19:12.786 "num_base_bdevs": 4, 00:19:12.786 "num_base_bdevs_discovered": 3, 00:19:12.786 "num_base_bdevs_operational": 3, 00:19:12.786 "base_bdevs_list": [ 
00:19:12.786 { 00:19:12.786 "name": null, 00:19:12.786 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:12.786 "is_configured": false, 00:19:12.786 "data_offset": 2048, 00:19:12.786 "data_size": 63488 00:19:12.786 }, 00:19:12.786 { 00:19:12.786 "name": "BaseBdev2", 00:19:12.786 "uuid": "ab97e35a-dcd4-483c-84b1-762500e89ddd", 00:19:12.786 "is_configured": true, 00:19:12.786 "data_offset": 2048, 00:19:12.786 "data_size": 63488 00:19:12.786 }, 00:19:12.786 { 00:19:12.786 "name": "BaseBdev3", 00:19:12.786 "uuid": "693853c6-d266-45f2-81a6-ea3d63811839", 00:19:12.786 "is_configured": true, 00:19:12.786 "data_offset": 2048, 00:19:12.786 "data_size": 63488 00:19:12.786 }, 00:19:12.786 { 00:19:12.786 "name": "BaseBdev4", 00:19:12.786 "uuid": "b0cda2c1-cfbd-4b97-9c66-8a42f71e9019", 00:19:12.786 "is_configured": true, 00:19:12.786 "data_offset": 2048, 00:19:12.786 "data_size": 63488 00:19:12.786 } 00:19:12.786 ] 00:19:12.786 }' 00:19:12.786 20:33:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:12.786 20:33:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:13.353 20:33:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:19:13.353 20:33:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:13.612 20:33:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:13.612 20:33:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:19:13.612 20:33:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:19:13.612 20:33:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:19:13.612 20:33:05 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:19:13.870 [2024-07-15 20:33:06.205020] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:19:13.870 20:33:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:19:13.870 20:33:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:13.870 20:33:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:13.870 20:33:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:19:14.129 20:33:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:19:14.129 20:33:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:19:14.129 20:33:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:19:14.387 [2024-07-15 20:33:06.688924] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:19:14.387 20:33:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:19:14.387 20:33:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:14.387 20:33:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:14.387 20:33:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:19:14.646 20:33:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:19:14.646 
20:33:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:19:14.646 20:33:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:19:14.905 [2024-07-15 20:33:07.184797] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:19:14.905 [2024-07-15 20:33:07.184846] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12e6350 name Existed_Raid, state offline 00:19:14.905 20:33:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:19:14.905 20:33:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:14.905 20:33:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:14.905 20:33:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:19:15.164 20:33:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:19:15.164 20:33:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:19:15.164 20:33:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:19:15.164 20:33:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:19:15.164 20:33:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:15.164 20:33:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:19:15.422 BaseBdev2 00:19:15.422 20:33:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev 
BaseBdev2 00:19:15.422 20:33:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:19:15.422 20:33:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:15.422 20:33:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:19:15.422 20:33:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:15.422 20:33:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:15.422 20:33:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:15.681 20:33:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:19:15.940 [ 00:19:15.940 { 00:19:15.940 "name": "BaseBdev2", 00:19:15.940 "aliases": [ 00:19:15.940 "60400b26-7daa-4a46-8edf-48fb53276e03" 00:19:15.940 ], 00:19:15.940 "product_name": "Malloc disk", 00:19:15.940 "block_size": 512, 00:19:15.940 "num_blocks": 65536, 00:19:15.940 "uuid": "60400b26-7daa-4a46-8edf-48fb53276e03", 00:19:15.940 "assigned_rate_limits": { 00:19:15.940 "rw_ios_per_sec": 0, 00:19:15.940 "rw_mbytes_per_sec": 0, 00:19:15.940 "r_mbytes_per_sec": 0, 00:19:15.940 "w_mbytes_per_sec": 0 00:19:15.940 }, 00:19:15.940 "claimed": false, 00:19:15.940 "zoned": false, 00:19:15.940 "supported_io_types": { 00:19:15.940 "read": true, 00:19:15.940 "write": true, 00:19:15.940 "unmap": true, 00:19:15.940 "flush": true, 00:19:15.940 "reset": true, 00:19:15.940 "nvme_admin": false, 00:19:15.940 "nvme_io": false, 00:19:15.940 "nvme_io_md": false, 00:19:15.940 "write_zeroes": true, 00:19:15.940 "zcopy": true, 00:19:15.940 "get_zone_info": false, 00:19:15.940 
"zone_management": false, 00:19:15.940 "zone_append": false, 00:19:15.940 "compare": false, 00:19:15.940 "compare_and_write": false, 00:19:15.940 "abort": true, 00:19:15.940 "seek_hole": false, 00:19:15.940 "seek_data": false, 00:19:15.940 "copy": true, 00:19:15.940 "nvme_iov_md": false 00:19:15.940 }, 00:19:15.940 "memory_domains": [ 00:19:15.940 { 00:19:15.940 "dma_device_id": "system", 00:19:15.940 "dma_device_type": 1 00:19:15.940 }, 00:19:15.940 { 00:19:15.940 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:15.940 "dma_device_type": 2 00:19:15.940 } 00:19:15.940 ], 00:19:15.940 "driver_specific": {} 00:19:15.940 } 00:19:15.940 ] 00:19:15.940 20:33:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:19:15.940 20:33:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:19:15.940 20:33:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:15.940 20:33:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:19:16.199 BaseBdev3 00:19:16.199 20:33:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:19:16.199 20:33:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:19:16.199 20:33:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:16.199 20:33:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:19:16.199 20:33:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:16.199 20:33:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:16.199 20:33:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:16.457 20:33:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:19:16.715 [ 00:19:16.715 { 00:19:16.715 "name": "BaseBdev3", 00:19:16.715 "aliases": [ 00:19:16.716 "98c6b520-0227-4b9f-a7e0-b4120c7fc4b3" 00:19:16.716 ], 00:19:16.716 "product_name": "Malloc disk", 00:19:16.716 "block_size": 512, 00:19:16.716 "num_blocks": 65536, 00:19:16.716 "uuid": "98c6b520-0227-4b9f-a7e0-b4120c7fc4b3", 00:19:16.716 "assigned_rate_limits": { 00:19:16.716 "rw_ios_per_sec": 0, 00:19:16.716 "rw_mbytes_per_sec": 0, 00:19:16.716 "r_mbytes_per_sec": 0, 00:19:16.716 "w_mbytes_per_sec": 0 00:19:16.716 }, 00:19:16.716 "claimed": false, 00:19:16.716 "zoned": false, 00:19:16.716 "supported_io_types": { 00:19:16.716 "read": true, 00:19:16.716 "write": true, 00:19:16.716 "unmap": true, 00:19:16.716 "flush": true, 00:19:16.716 "reset": true, 00:19:16.716 "nvme_admin": false, 00:19:16.716 "nvme_io": false, 00:19:16.716 "nvme_io_md": false, 00:19:16.716 "write_zeroes": true, 00:19:16.716 "zcopy": true, 00:19:16.716 "get_zone_info": false, 00:19:16.716 "zone_management": false, 00:19:16.716 "zone_append": false, 00:19:16.716 "compare": false, 00:19:16.716 "compare_and_write": false, 00:19:16.716 "abort": true, 00:19:16.716 "seek_hole": false, 00:19:16.716 "seek_data": false, 00:19:16.716 "copy": true, 00:19:16.716 "nvme_iov_md": false 00:19:16.716 }, 00:19:16.716 "memory_domains": [ 00:19:16.716 { 00:19:16.716 "dma_device_id": "system", 00:19:16.716 "dma_device_type": 1 00:19:16.716 }, 00:19:16.716 { 00:19:16.716 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:16.716 "dma_device_type": 2 00:19:16.716 } 00:19:16.716 ], 00:19:16.716 "driver_specific": {} 00:19:16.716 } 00:19:16.716 ] 00:19:16.716 20:33:08 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:19:16.716 20:33:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:19:16.716 20:33:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:16.716 20:33:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:19:16.975 BaseBdev4 00:19:16.975 20:33:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:19:16.975 20:33:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:19:16.975 20:33:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:16.975 20:33:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:19:16.975 20:33:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:16.975 20:33:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:16.975 20:33:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:17.234 20:33:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:19:17.493 [ 00:19:17.493 { 00:19:17.493 "name": "BaseBdev4", 00:19:17.493 "aliases": [ 00:19:17.493 "e14df952-cb92-48e8-bd85-b4d377ed96a3" 00:19:17.493 ], 00:19:17.493 "product_name": "Malloc disk", 00:19:17.493 "block_size": 512, 00:19:17.493 "num_blocks": 65536, 00:19:17.493 "uuid": "e14df952-cb92-48e8-bd85-b4d377ed96a3", 
00:19:17.493 "assigned_rate_limits": { 00:19:17.493 "rw_ios_per_sec": 0, 00:19:17.493 "rw_mbytes_per_sec": 0, 00:19:17.493 "r_mbytes_per_sec": 0, 00:19:17.493 "w_mbytes_per_sec": 0 00:19:17.493 }, 00:19:17.493 "claimed": false, 00:19:17.493 "zoned": false, 00:19:17.493 "supported_io_types": { 00:19:17.493 "read": true, 00:19:17.493 "write": true, 00:19:17.493 "unmap": true, 00:19:17.493 "flush": true, 00:19:17.493 "reset": true, 00:19:17.493 "nvme_admin": false, 00:19:17.493 "nvme_io": false, 00:19:17.493 "nvme_io_md": false, 00:19:17.493 "write_zeroes": true, 00:19:17.493 "zcopy": true, 00:19:17.493 "get_zone_info": false, 00:19:17.493 "zone_management": false, 00:19:17.493 "zone_append": false, 00:19:17.493 "compare": false, 00:19:17.493 "compare_and_write": false, 00:19:17.493 "abort": true, 00:19:17.493 "seek_hole": false, 00:19:17.493 "seek_data": false, 00:19:17.493 "copy": true, 00:19:17.493 "nvme_iov_md": false 00:19:17.493 }, 00:19:17.493 "memory_domains": [ 00:19:17.493 { 00:19:17.493 "dma_device_id": "system", 00:19:17.493 "dma_device_type": 1 00:19:17.493 }, 00:19:17.493 { 00:19:17.493 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:17.493 "dma_device_type": 2 00:19:17.493 } 00:19:17.493 ], 00:19:17.493 "driver_specific": {} 00:19:17.493 } 00:19:17.493 ] 00:19:17.493 20:33:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:19:17.493 20:33:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:19:17.493 20:33:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:17.493 20:33:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:17.493 [2024-07-15 20:33:09.868446] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with 
name: BaseBdev1 00:19:17.493 [2024-07-15 20:33:09.868492] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:19:17.493 [2024-07-15 20:33:09.868512] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:17.493 [2024-07-15 20:33:09.869915] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:17.493 [2024-07-15 20:33:09.869970] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:19:17.752 20:33:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:17.752 20:33:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:17.752 20:33:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:17.752 20:33:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:17.752 20:33:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:17.752 20:33:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:17.752 20:33:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:17.752 20:33:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:17.752 20:33:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:17.752 20:33:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:17.752 20:33:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:17.752 20:33:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 
-- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:18.010 20:33:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:18.010 "name": "Existed_Raid", 00:19:18.010 "uuid": "0f74acc7-a823-45bf-b479-e4a6ca61ef98", 00:19:18.010 "strip_size_kb": 64, 00:19:18.010 "state": "configuring", 00:19:18.010 "raid_level": "raid0", 00:19:18.010 "superblock": true, 00:19:18.010 "num_base_bdevs": 4, 00:19:18.010 "num_base_bdevs_discovered": 3, 00:19:18.010 "num_base_bdevs_operational": 4, 00:19:18.010 "base_bdevs_list": [ 00:19:18.010 { 00:19:18.010 "name": "BaseBdev1", 00:19:18.010 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:18.010 "is_configured": false, 00:19:18.010 "data_offset": 0, 00:19:18.010 "data_size": 0 00:19:18.010 }, 00:19:18.010 { 00:19:18.010 "name": "BaseBdev2", 00:19:18.010 "uuid": "60400b26-7daa-4a46-8edf-48fb53276e03", 00:19:18.010 "is_configured": true, 00:19:18.010 "data_offset": 2048, 00:19:18.010 "data_size": 63488 00:19:18.010 }, 00:19:18.010 { 00:19:18.010 "name": "BaseBdev3", 00:19:18.010 "uuid": "98c6b520-0227-4b9f-a7e0-b4120c7fc4b3", 00:19:18.010 "is_configured": true, 00:19:18.010 "data_offset": 2048, 00:19:18.010 "data_size": 63488 00:19:18.010 }, 00:19:18.010 { 00:19:18.010 "name": "BaseBdev4", 00:19:18.010 "uuid": "e14df952-cb92-48e8-bd85-b4d377ed96a3", 00:19:18.010 "is_configured": true, 00:19:18.010 "data_offset": 2048, 00:19:18.010 "data_size": 63488 00:19:18.010 } 00:19:18.010 ] 00:19:18.010 }' 00:19:18.010 20:33:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:18.010 20:33:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:18.656 20:33:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:19:18.656 [2024-07-15 20:33:10.963307] 
bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:19:18.656 20:33:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:18.656 20:33:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:18.656 20:33:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:18.656 20:33:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:18.656 20:33:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:18.656 20:33:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:18.656 20:33:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:18.656 20:33:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:18.656 20:33:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:18.656 20:33:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:18.656 20:33:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:18.656 20:33:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:18.915 20:33:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:18.915 "name": "Existed_Raid", 00:19:18.915 "uuid": "0f74acc7-a823-45bf-b479-e4a6ca61ef98", 00:19:18.915 "strip_size_kb": 64, 00:19:18.915 "state": "configuring", 00:19:18.915 "raid_level": "raid0", 00:19:18.915 "superblock": true, 00:19:18.915 "num_base_bdevs": 4, 00:19:18.915 
"num_base_bdevs_discovered": 2, 00:19:18.915 "num_base_bdevs_operational": 4, 00:19:18.915 "base_bdevs_list": [ 00:19:18.915 { 00:19:18.915 "name": "BaseBdev1", 00:19:18.915 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:18.915 "is_configured": false, 00:19:18.915 "data_offset": 0, 00:19:18.915 "data_size": 0 00:19:18.915 }, 00:19:18.915 { 00:19:18.915 "name": null, 00:19:18.915 "uuid": "60400b26-7daa-4a46-8edf-48fb53276e03", 00:19:18.915 "is_configured": false, 00:19:18.915 "data_offset": 2048, 00:19:18.915 "data_size": 63488 00:19:18.915 }, 00:19:18.915 { 00:19:18.915 "name": "BaseBdev3", 00:19:18.915 "uuid": "98c6b520-0227-4b9f-a7e0-b4120c7fc4b3", 00:19:18.915 "is_configured": true, 00:19:18.915 "data_offset": 2048, 00:19:18.915 "data_size": 63488 00:19:18.915 }, 00:19:18.915 { 00:19:18.915 "name": "BaseBdev4", 00:19:18.915 "uuid": "e14df952-cb92-48e8-bd85-b4d377ed96a3", 00:19:18.915 "is_configured": true, 00:19:18.915 "data_offset": 2048, 00:19:18.915 "data_size": 63488 00:19:18.915 } 00:19:18.915 ] 00:19:18.915 }' 00:19:18.915 20:33:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:18.915 20:33:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:19.480 20:33:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:19.480 20:33:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:19:19.738 20:33:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:19:19.738 20:33:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:19:19.996 [2024-07-15 20:33:12.315479] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:19.996 BaseBdev1 00:19:19.996 20:33:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:19:19.996 20:33:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:19:19.996 20:33:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:19.996 20:33:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:19:19.996 20:33:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:19.996 20:33:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:19.996 20:33:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:20.254 20:33:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:19:20.512 [ 00:19:20.512 { 00:19:20.512 "name": "BaseBdev1", 00:19:20.512 "aliases": [ 00:19:20.512 "9f6d4214-2f5e-421e-9893-c571eeacad57" 00:19:20.512 ], 00:19:20.512 "product_name": "Malloc disk", 00:19:20.512 "block_size": 512, 00:19:20.512 "num_blocks": 65536, 00:19:20.512 "uuid": "9f6d4214-2f5e-421e-9893-c571eeacad57", 00:19:20.512 "assigned_rate_limits": { 00:19:20.512 "rw_ios_per_sec": 0, 00:19:20.512 "rw_mbytes_per_sec": 0, 00:19:20.512 "r_mbytes_per_sec": 0, 00:19:20.512 "w_mbytes_per_sec": 0 00:19:20.512 }, 00:19:20.512 "claimed": true, 00:19:20.512 "claim_type": "exclusive_write", 00:19:20.512 "zoned": false, 00:19:20.512 "supported_io_types": { 00:19:20.512 "read": true, 00:19:20.512 "write": true, 00:19:20.512 "unmap": true, 00:19:20.512 "flush": 
true, 00:19:20.512 "reset": true, 00:19:20.512 "nvme_admin": false, 00:19:20.512 "nvme_io": false, 00:19:20.512 "nvme_io_md": false, 00:19:20.512 "write_zeroes": true, 00:19:20.512 "zcopy": true, 00:19:20.512 "get_zone_info": false, 00:19:20.512 "zone_management": false, 00:19:20.512 "zone_append": false, 00:19:20.512 "compare": false, 00:19:20.512 "compare_and_write": false, 00:19:20.512 "abort": true, 00:19:20.512 "seek_hole": false, 00:19:20.512 "seek_data": false, 00:19:20.512 "copy": true, 00:19:20.512 "nvme_iov_md": false 00:19:20.512 }, 00:19:20.512 "memory_domains": [ 00:19:20.512 { 00:19:20.512 "dma_device_id": "system", 00:19:20.512 "dma_device_type": 1 00:19:20.512 }, 00:19:20.512 { 00:19:20.512 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:20.512 "dma_device_type": 2 00:19:20.512 } 00:19:20.512 ], 00:19:20.512 "driver_specific": {} 00:19:20.512 } 00:19:20.512 ] 00:19:20.512 20:33:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:19:20.512 20:33:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:20.512 20:33:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:20.512 20:33:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:20.512 20:33:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:20.512 20:33:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:20.512 20:33:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:20.512 20:33:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:20.512 20:33:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:20.512 20:33:12 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:20.512 20:33:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:20.512 20:33:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:20.512 20:33:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:20.770 20:33:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:20.771 "name": "Existed_Raid", 00:19:20.771 "uuid": "0f74acc7-a823-45bf-b479-e4a6ca61ef98", 00:19:20.771 "strip_size_kb": 64, 00:19:20.771 "state": "configuring", 00:19:20.771 "raid_level": "raid0", 00:19:20.771 "superblock": true, 00:19:20.771 "num_base_bdevs": 4, 00:19:20.771 "num_base_bdevs_discovered": 3, 00:19:20.771 "num_base_bdevs_operational": 4, 00:19:20.771 "base_bdevs_list": [ 00:19:20.771 { 00:19:20.771 "name": "BaseBdev1", 00:19:20.771 "uuid": "9f6d4214-2f5e-421e-9893-c571eeacad57", 00:19:20.771 "is_configured": true, 00:19:20.771 "data_offset": 2048, 00:19:20.771 "data_size": 63488 00:19:20.771 }, 00:19:20.771 { 00:19:20.771 "name": null, 00:19:20.771 "uuid": "60400b26-7daa-4a46-8edf-48fb53276e03", 00:19:20.771 "is_configured": false, 00:19:20.771 "data_offset": 2048, 00:19:20.771 "data_size": 63488 00:19:20.771 }, 00:19:20.771 { 00:19:20.771 "name": "BaseBdev3", 00:19:20.771 "uuid": "98c6b520-0227-4b9f-a7e0-b4120c7fc4b3", 00:19:20.771 "is_configured": true, 00:19:20.771 "data_offset": 2048, 00:19:20.771 "data_size": 63488 00:19:20.771 }, 00:19:20.771 { 00:19:20.771 "name": "BaseBdev4", 00:19:20.771 "uuid": "e14df952-cb92-48e8-bd85-b4d377ed96a3", 00:19:20.771 "is_configured": true, 00:19:20.771 "data_offset": 2048, 00:19:20.771 "data_size": 63488 00:19:20.771 } 00:19:20.771 ] 00:19:20.771 }' 00:19:20.771 
20:33:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:20.771 20:33:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:21.346 20:33:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:19:21.346 20:33:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:21.604 20:33:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:19:21.604 20:33:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:19:21.863 [2024-07-15 20:33:14.048090] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:19:21.863 20:33:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:21.863 20:33:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:21.863 20:33:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:21.863 20:33:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:21.863 20:33:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:21.863 20:33:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:21.863 20:33:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:21.863 20:33:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:21.863 20:33:14 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:21.863 20:33:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:21.863 20:33:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:21.863 20:33:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:22.121 20:33:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:22.121 "name": "Existed_Raid", 00:19:22.121 "uuid": "0f74acc7-a823-45bf-b479-e4a6ca61ef98", 00:19:22.121 "strip_size_kb": 64, 00:19:22.121 "state": "configuring", 00:19:22.121 "raid_level": "raid0", 00:19:22.121 "superblock": true, 00:19:22.121 "num_base_bdevs": 4, 00:19:22.121 "num_base_bdevs_discovered": 2, 00:19:22.121 "num_base_bdevs_operational": 4, 00:19:22.121 "base_bdevs_list": [ 00:19:22.121 { 00:19:22.121 "name": "BaseBdev1", 00:19:22.121 "uuid": "9f6d4214-2f5e-421e-9893-c571eeacad57", 00:19:22.121 "is_configured": true, 00:19:22.121 "data_offset": 2048, 00:19:22.121 "data_size": 63488 00:19:22.121 }, 00:19:22.121 { 00:19:22.121 "name": null, 00:19:22.121 "uuid": "60400b26-7daa-4a46-8edf-48fb53276e03", 00:19:22.121 "is_configured": false, 00:19:22.121 "data_offset": 2048, 00:19:22.121 "data_size": 63488 00:19:22.121 }, 00:19:22.121 { 00:19:22.121 "name": null, 00:19:22.121 "uuid": "98c6b520-0227-4b9f-a7e0-b4120c7fc4b3", 00:19:22.121 "is_configured": false, 00:19:22.121 "data_offset": 2048, 00:19:22.121 "data_size": 63488 00:19:22.121 }, 00:19:22.121 { 00:19:22.121 "name": "BaseBdev4", 00:19:22.121 "uuid": "e14df952-cb92-48e8-bd85-b4d377ed96a3", 00:19:22.121 "is_configured": true, 00:19:22.121 "data_offset": 2048, 00:19:22.121 "data_size": 63488 00:19:22.121 } 00:19:22.121 ] 00:19:22.121 }' 00:19:22.121 20:33:14 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:22.122 20:33:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:22.688 20:33:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:22.688 20:33:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:19:22.946 20:33:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:19:22.946 20:33:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:19:23.205 [2024-07-15 20:33:15.359595] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:23.205 20:33:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:23.205 20:33:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:23.205 20:33:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:23.205 20:33:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:23.205 20:33:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:23.205 20:33:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:23.205 20:33:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:23.205 20:33:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:23.205 20:33:15 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:23.205 20:33:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:23.205 20:33:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:23.205 20:33:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:23.463 20:33:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:23.463 "name": "Existed_Raid", 00:19:23.463 "uuid": "0f74acc7-a823-45bf-b479-e4a6ca61ef98", 00:19:23.463 "strip_size_kb": 64, 00:19:23.463 "state": "configuring", 00:19:23.463 "raid_level": "raid0", 00:19:23.463 "superblock": true, 00:19:23.463 "num_base_bdevs": 4, 00:19:23.463 "num_base_bdevs_discovered": 3, 00:19:23.463 "num_base_bdevs_operational": 4, 00:19:23.463 "base_bdevs_list": [ 00:19:23.463 { 00:19:23.463 "name": "BaseBdev1", 00:19:23.463 "uuid": "9f6d4214-2f5e-421e-9893-c571eeacad57", 00:19:23.463 "is_configured": true, 00:19:23.463 "data_offset": 2048, 00:19:23.463 "data_size": 63488 00:19:23.463 }, 00:19:23.463 { 00:19:23.463 "name": null, 00:19:23.463 "uuid": "60400b26-7daa-4a46-8edf-48fb53276e03", 00:19:23.463 "is_configured": false, 00:19:23.463 "data_offset": 2048, 00:19:23.463 "data_size": 63488 00:19:23.463 }, 00:19:23.463 { 00:19:23.463 "name": "BaseBdev3", 00:19:23.463 "uuid": "98c6b520-0227-4b9f-a7e0-b4120c7fc4b3", 00:19:23.463 "is_configured": true, 00:19:23.463 "data_offset": 2048, 00:19:23.463 "data_size": 63488 00:19:23.463 }, 00:19:23.463 { 00:19:23.463 "name": "BaseBdev4", 00:19:23.463 "uuid": "e14df952-cb92-48e8-bd85-b4d377ed96a3", 00:19:23.463 "is_configured": true, 00:19:23.463 "data_offset": 2048, 00:19:23.463 "data_size": 63488 00:19:23.463 } 00:19:23.463 ] 00:19:23.463 }' 00:19:23.463 20:33:15 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:23.463 20:33:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:24.060 20:33:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:24.060 20:33:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:19:24.318 20:33:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:19:24.318 20:33:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:19:24.318 [2024-07-15 20:33:16.622963] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:19:24.318 20:33:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:24.318 20:33:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:24.318 20:33:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:24.318 20:33:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:24.318 20:33:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:24.318 20:33:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:24.318 20:33:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:24.318 20:33:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:24.318 20:33:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:19:24.318 20:33:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:24.318 20:33:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:24.318 20:33:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:24.577 20:33:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:24.577 "name": "Existed_Raid", 00:19:24.577 "uuid": "0f74acc7-a823-45bf-b479-e4a6ca61ef98", 00:19:24.577 "strip_size_kb": 64, 00:19:24.577 "state": "configuring", 00:19:24.577 "raid_level": "raid0", 00:19:24.577 "superblock": true, 00:19:24.577 "num_base_bdevs": 4, 00:19:24.577 "num_base_bdevs_discovered": 2, 00:19:24.577 "num_base_bdevs_operational": 4, 00:19:24.577 "base_bdevs_list": [ 00:19:24.577 { 00:19:24.577 "name": null, 00:19:24.577 "uuid": "9f6d4214-2f5e-421e-9893-c571eeacad57", 00:19:24.577 "is_configured": false, 00:19:24.577 "data_offset": 2048, 00:19:24.577 "data_size": 63488 00:19:24.577 }, 00:19:24.577 { 00:19:24.577 "name": null, 00:19:24.577 "uuid": "60400b26-7daa-4a46-8edf-48fb53276e03", 00:19:24.577 "is_configured": false, 00:19:24.577 "data_offset": 2048, 00:19:24.577 "data_size": 63488 00:19:24.577 }, 00:19:24.577 { 00:19:24.577 "name": "BaseBdev3", 00:19:24.577 "uuid": "98c6b520-0227-4b9f-a7e0-b4120c7fc4b3", 00:19:24.577 "is_configured": true, 00:19:24.577 "data_offset": 2048, 00:19:24.577 "data_size": 63488 00:19:24.577 }, 00:19:24.577 { 00:19:24.577 "name": "BaseBdev4", 00:19:24.577 "uuid": "e14df952-cb92-48e8-bd85-b4d377ed96a3", 00:19:24.577 "is_configured": true, 00:19:24.577 "data_offset": 2048, 00:19:24.577 "data_size": 63488 00:19:24.577 } 00:19:24.577 ] 00:19:24.577 }' 00:19:24.577 20:33:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # 
xtrace_disable 00:19:24.577 20:33:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:25.950 20:33:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:25.950 20:33:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:19:26.210 20:33:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:19:26.210 20:33:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:19:26.469 [2024-07-15 20:33:18.634820] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:26.469 20:33:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:26.469 20:33:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:26.469 20:33:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:26.469 20:33:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:26.469 20:33:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:26.469 20:33:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:26.469 20:33:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:26.469 20:33:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:26.469 20:33:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:19:26.469 20:33:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:26.469 20:33:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:26.469 20:33:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:26.728 20:33:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:26.728 "name": "Existed_Raid", 00:19:26.728 "uuid": "0f74acc7-a823-45bf-b479-e4a6ca61ef98", 00:19:26.728 "strip_size_kb": 64, 00:19:26.728 "state": "configuring", 00:19:26.728 "raid_level": "raid0", 00:19:26.728 "superblock": true, 00:19:26.728 "num_base_bdevs": 4, 00:19:26.728 "num_base_bdevs_discovered": 3, 00:19:26.728 "num_base_bdevs_operational": 4, 00:19:26.728 "base_bdevs_list": [ 00:19:26.728 { 00:19:26.728 "name": null, 00:19:26.728 "uuid": "9f6d4214-2f5e-421e-9893-c571eeacad57", 00:19:26.728 "is_configured": false, 00:19:26.728 "data_offset": 2048, 00:19:26.728 "data_size": 63488 00:19:26.728 }, 00:19:26.728 { 00:19:26.728 "name": "BaseBdev2", 00:19:26.728 "uuid": "60400b26-7daa-4a46-8edf-48fb53276e03", 00:19:26.728 "is_configured": true, 00:19:26.728 "data_offset": 2048, 00:19:26.728 "data_size": 63488 00:19:26.728 }, 00:19:26.728 { 00:19:26.728 "name": "BaseBdev3", 00:19:26.728 "uuid": "98c6b520-0227-4b9f-a7e0-b4120c7fc4b3", 00:19:26.728 "is_configured": true, 00:19:26.728 "data_offset": 2048, 00:19:26.728 "data_size": 63488 00:19:26.728 }, 00:19:26.728 { 00:19:26.728 "name": "BaseBdev4", 00:19:26.728 "uuid": "e14df952-cb92-48e8-bd85-b4d377ed96a3", 00:19:26.728 "is_configured": true, 00:19:26.728 "data_offset": 2048, 00:19:26.728 "data_size": 63488 00:19:26.728 } 00:19:26.728 ] 00:19:26.728 }' 00:19:26.728 20:33:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # 
xtrace_disable 00:19:26.728 20:33:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:27.665 20:33:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:27.665 20:33:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:19:27.665 20:33:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:19:27.924 20:33:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:27.924 20:33:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:19:27.924 20:33:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 9f6d4214-2f5e-421e-9893-c571eeacad57 00:19:28.182 [2024-07-15 20:33:20.532261] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:19:28.182 [2024-07-15 20:33:20.532435] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x12ec470 00:19:28.182 [2024-07-15 20:33:20.532448] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:19:28.182 [2024-07-15 20:33:20.532627] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x12dcc40 00:19:28.182 [2024-07-15 20:33:20.532743] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x12ec470 00:19:28.182 [2024-07-15 20:33:20.532753] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x12ec470 00:19:28.182 [2024-07-15 20:33:20.532846] bdev_raid.c: 
331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:28.182 NewBaseBdev 00:19:28.182 20:33:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:19:28.182 20:33:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:19:28.182 20:33:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:28.182 20:33:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:19:28.182 20:33:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:28.182 20:33:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:28.182 20:33:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:28.442 20:33:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:19:28.701 [ 00:19:28.702 { 00:19:28.702 "name": "NewBaseBdev", 00:19:28.702 "aliases": [ 00:19:28.702 "9f6d4214-2f5e-421e-9893-c571eeacad57" 00:19:28.702 ], 00:19:28.702 "product_name": "Malloc disk", 00:19:28.702 "block_size": 512, 00:19:28.702 "num_blocks": 65536, 00:19:28.702 "uuid": "9f6d4214-2f5e-421e-9893-c571eeacad57", 00:19:28.702 "assigned_rate_limits": { 00:19:28.702 "rw_ios_per_sec": 0, 00:19:28.702 "rw_mbytes_per_sec": 0, 00:19:28.702 "r_mbytes_per_sec": 0, 00:19:28.702 "w_mbytes_per_sec": 0 00:19:28.702 }, 00:19:28.702 "claimed": true, 00:19:28.702 "claim_type": "exclusive_write", 00:19:28.702 "zoned": false, 00:19:28.702 "supported_io_types": { 00:19:28.702 "read": true, 00:19:28.702 "write": true, 00:19:28.702 "unmap": true, 00:19:28.702 "flush": true, 
00:19:28.702 "reset": true, 00:19:28.702 "nvme_admin": false, 00:19:28.702 "nvme_io": false, 00:19:28.702 "nvme_io_md": false, 00:19:28.702 "write_zeroes": true, 00:19:28.702 "zcopy": true, 00:19:28.702 "get_zone_info": false, 00:19:28.702 "zone_management": false, 00:19:28.702 "zone_append": false, 00:19:28.702 "compare": false, 00:19:28.702 "compare_and_write": false, 00:19:28.702 "abort": true, 00:19:28.702 "seek_hole": false, 00:19:28.702 "seek_data": false, 00:19:28.702 "copy": true, 00:19:28.702 "nvme_iov_md": false 00:19:28.702 }, 00:19:28.702 "memory_domains": [ 00:19:28.702 { 00:19:28.702 "dma_device_id": "system", 00:19:28.702 "dma_device_type": 1 00:19:28.702 }, 00:19:28.702 { 00:19:28.702 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:28.702 "dma_device_type": 2 00:19:28.702 } 00:19:28.702 ], 00:19:28.702 "driver_specific": {} 00:19:28.702 } 00:19:28.702 ] 00:19:28.702 20:33:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:19:28.702 20:33:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:19:28.702 20:33:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:28.702 20:33:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:28.702 20:33:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:28.702 20:33:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:28.702 20:33:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:28.702 20:33:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:28.702 20:33:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:28.702 20:33:21 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:28.702 20:33:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:28.702 20:33:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:28.702 20:33:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:28.961 20:33:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:28.961 "name": "Existed_Raid", 00:19:28.961 "uuid": "0f74acc7-a823-45bf-b479-e4a6ca61ef98", 00:19:28.961 "strip_size_kb": 64, 00:19:28.961 "state": "online", 00:19:28.961 "raid_level": "raid0", 00:19:28.961 "superblock": true, 00:19:28.961 "num_base_bdevs": 4, 00:19:28.961 "num_base_bdevs_discovered": 4, 00:19:28.961 "num_base_bdevs_operational": 4, 00:19:28.961 "base_bdevs_list": [ 00:19:28.961 { 00:19:28.961 "name": "NewBaseBdev", 00:19:28.961 "uuid": "9f6d4214-2f5e-421e-9893-c571eeacad57", 00:19:28.961 "is_configured": true, 00:19:28.961 "data_offset": 2048, 00:19:28.961 "data_size": 63488 00:19:28.961 }, 00:19:28.961 { 00:19:28.961 "name": "BaseBdev2", 00:19:28.961 "uuid": "60400b26-7daa-4a46-8edf-48fb53276e03", 00:19:28.961 "is_configured": true, 00:19:28.961 "data_offset": 2048, 00:19:28.961 "data_size": 63488 00:19:28.961 }, 00:19:28.961 { 00:19:28.961 "name": "BaseBdev3", 00:19:28.961 "uuid": "98c6b520-0227-4b9f-a7e0-b4120c7fc4b3", 00:19:28.961 "is_configured": true, 00:19:28.961 "data_offset": 2048, 00:19:28.961 "data_size": 63488 00:19:28.961 }, 00:19:28.961 { 00:19:28.961 "name": "BaseBdev4", 00:19:28.961 "uuid": "e14df952-cb92-48e8-bd85-b4d377ed96a3", 00:19:28.961 "is_configured": true, 00:19:28.961 "data_offset": 2048, 00:19:28.961 "data_size": 63488 00:19:28.961 } 00:19:28.961 ] 00:19:28.961 }' 00:19:28.961 
20:33:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:28.961 20:33:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:29.900 20:33:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:19:29.900 20:33:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:19:29.900 20:33:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:19:29.900 20:33:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:19:29.900 20:33:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:19:29.900 20:33:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:19:29.900 20:33:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:19:29.900 20:33:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:19:29.900 [2024-07-15 20:33:22.152887] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:29.900 20:33:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:19:29.900 "name": "Existed_Raid", 00:19:29.900 "aliases": [ 00:19:29.900 "0f74acc7-a823-45bf-b479-e4a6ca61ef98" 00:19:29.900 ], 00:19:29.900 "product_name": "Raid Volume", 00:19:29.900 "block_size": 512, 00:19:29.900 "num_blocks": 253952, 00:19:29.900 "uuid": "0f74acc7-a823-45bf-b479-e4a6ca61ef98", 00:19:29.900 "assigned_rate_limits": { 00:19:29.900 "rw_ios_per_sec": 0, 00:19:29.900 "rw_mbytes_per_sec": 0, 00:19:29.900 "r_mbytes_per_sec": 0, 00:19:29.900 "w_mbytes_per_sec": 0 00:19:29.900 }, 00:19:29.900 "claimed": false, 00:19:29.900 "zoned": false, 00:19:29.900 
"supported_io_types": { 00:19:29.900 "read": true, 00:19:29.900 "write": true, 00:19:29.900 "unmap": true, 00:19:29.900 "flush": true, 00:19:29.900 "reset": true, 00:19:29.900 "nvme_admin": false, 00:19:29.900 "nvme_io": false, 00:19:29.900 "nvme_io_md": false, 00:19:29.900 "write_zeroes": true, 00:19:29.900 "zcopy": false, 00:19:29.900 "get_zone_info": false, 00:19:29.900 "zone_management": false, 00:19:29.900 "zone_append": false, 00:19:29.900 "compare": false, 00:19:29.900 "compare_and_write": false, 00:19:29.900 "abort": false, 00:19:29.900 "seek_hole": false, 00:19:29.900 "seek_data": false, 00:19:29.900 "copy": false, 00:19:29.900 "nvme_iov_md": false 00:19:29.900 }, 00:19:29.900 "memory_domains": [ 00:19:29.900 { 00:19:29.900 "dma_device_id": "system", 00:19:29.900 "dma_device_type": 1 00:19:29.900 }, 00:19:29.900 { 00:19:29.900 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:29.900 "dma_device_type": 2 00:19:29.900 }, 00:19:29.900 { 00:19:29.900 "dma_device_id": "system", 00:19:29.900 "dma_device_type": 1 00:19:29.900 }, 00:19:29.900 { 00:19:29.900 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:29.900 "dma_device_type": 2 00:19:29.900 }, 00:19:29.900 { 00:19:29.900 "dma_device_id": "system", 00:19:29.900 "dma_device_type": 1 00:19:29.900 }, 00:19:29.900 { 00:19:29.900 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:29.900 "dma_device_type": 2 00:19:29.900 }, 00:19:29.900 { 00:19:29.900 "dma_device_id": "system", 00:19:29.900 "dma_device_type": 1 00:19:29.900 }, 00:19:29.900 { 00:19:29.900 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:29.900 "dma_device_type": 2 00:19:29.900 } 00:19:29.900 ], 00:19:29.900 "driver_specific": { 00:19:29.900 "raid": { 00:19:29.900 "uuid": "0f74acc7-a823-45bf-b479-e4a6ca61ef98", 00:19:29.900 "strip_size_kb": 64, 00:19:29.900 "state": "online", 00:19:29.900 "raid_level": "raid0", 00:19:29.900 "superblock": true, 00:19:29.900 "num_base_bdevs": 4, 00:19:29.900 "num_base_bdevs_discovered": 4, 00:19:29.900 
"num_base_bdevs_operational": 4, 00:19:29.900 "base_bdevs_list": [ 00:19:29.900 { 00:19:29.900 "name": "NewBaseBdev", 00:19:29.900 "uuid": "9f6d4214-2f5e-421e-9893-c571eeacad57", 00:19:29.900 "is_configured": true, 00:19:29.900 "data_offset": 2048, 00:19:29.900 "data_size": 63488 00:19:29.900 }, 00:19:29.900 { 00:19:29.900 "name": "BaseBdev2", 00:19:29.900 "uuid": "60400b26-7daa-4a46-8edf-48fb53276e03", 00:19:29.900 "is_configured": true, 00:19:29.900 "data_offset": 2048, 00:19:29.900 "data_size": 63488 00:19:29.900 }, 00:19:29.900 { 00:19:29.900 "name": "BaseBdev3", 00:19:29.900 "uuid": "98c6b520-0227-4b9f-a7e0-b4120c7fc4b3", 00:19:29.900 "is_configured": true, 00:19:29.900 "data_offset": 2048, 00:19:29.900 "data_size": 63488 00:19:29.900 }, 00:19:29.900 { 00:19:29.900 "name": "BaseBdev4", 00:19:29.900 "uuid": "e14df952-cb92-48e8-bd85-b4d377ed96a3", 00:19:29.900 "is_configured": true, 00:19:29.900 "data_offset": 2048, 00:19:29.900 "data_size": 63488 00:19:29.900 } 00:19:29.900 ] 00:19:29.900 } 00:19:29.900 } 00:19:29.900 }' 00:19:29.900 20:33:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:29.900 20:33:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:19:29.900 BaseBdev2 00:19:29.900 BaseBdev3 00:19:29.900 BaseBdev4' 00:19:29.900 20:33:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:29.900 20:33:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:19:29.900 20:33:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:30.159 20:33:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:30.159 "name": "NewBaseBdev", 00:19:30.159 
"aliases": [ 00:19:30.159 "9f6d4214-2f5e-421e-9893-c571eeacad57" 00:19:30.159 ], 00:19:30.159 "product_name": "Malloc disk", 00:19:30.159 "block_size": 512, 00:19:30.159 "num_blocks": 65536, 00:19:30.159 "uuid": "9f6d4214-2f5e-421e-9893-c571eeacad57", 00:19:30.159 "assigned_rate_limits": { 00:19:30.159 "rw_ios_per_sec": 0, 00:19:30.159 "rw_mbytes_per_sec": 0, 00:19:30.159 "r_mbytes_per_sec": 0, 00:19:30.159 "w_mbytes_per_sec": 0 00:19:30.159 }, 00:19:30.159 "claimed": true, 00:19:30.159 "claim_type": "exclusive_write", 00:19:30.159 "zoned": false, 00:19:30.159 "supported_io_types": { 00:19:30.159 "read": true, 00:19:30.159 "write": true, 00:19:30.159 "unmap": true, 00:19:30.159 "flush": true, 00:19:30.159 "reset": true, 00:19:30.159 "nvme_admin": false, 00:19:30.159 "nvme_io": false, 00:19:30.159 "nvme_io_md": false, 00:19:30.159 "write_zeroes": true, 00:19:30.159 "zcopy": true, 00:19:30.159 "get_zone_info": false, 00:19:30.159 "zone_management": false, 00:19:30.159 "zone_append": false, 00:19:30.159 "compare": false, 00:19:30.159 "compare_and_write": false, 00:19:30.159 "abort": true, 00:19:30.159 "seek_hole": false, 00:19:30.159 "seek_data": false, 00:19:30.159 "copy": true, 00:19:30.159 "nvme_iov_md": false 00:19:30.159 }, 00:19:30.159 "memory_domains": [ 00:19:30.159 { 00:19:30.159 "dma_device_id": "system", 00:19:30.159 "dma_device_type": 1 00:19:30.159 }, 00:19:30.159 { 00:19:30.159 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:30.159 "dma_device_type": 2 00:19:30.159 } 00:19:30.159 ], 00:19:30.159 "driver_specific": {} 00:19:30.159 }' 00:19:30.159 20:33:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:30.159 20:33:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:30.418 20:33:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:30.418 20:33:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 
00:19:30.418 20:33:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:30.418 20:33:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:30.418 20:33:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:30.418 20:33:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:30.677 20:33:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:30.677 20:33:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:30.677 20:33:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:30.677 20:33:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:30.677 20:33:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:30.677 20:33:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:19:30.677 20:33:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:31.245 20:33:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:31.245 "name": "BaseBdev2", 00:19:31.245 "aliases": [ 00:19:31.245 "60400b26-7daa-4a46-8edf-48fb53276e03" 00:19:31.245 ], 00:19:31.245 "product_name": "Malloc disk", 00:19:31.245 "block_size": 512, 00:19:31.245 "num_blocks": 65536, 00:19:31.245 "uuid": "60400b26-7daa-4a46-8edf-48fb53276e03", 00:19:31.245 "assigned_rate_limits": { 00:19:31.245 "rw_ios_per_sec": 0, 00:19:31.245 "rw_mbytes_per_sec": 0, 00:19:31.245 "r_mbytes_per_sec": 0, 00:19:31.245 "w_mbytes_per_sec": 0 00:19:31.245 }, 00:19:31.245 "claimed": true, 00:19:31.245 "claim_type": "exclusive_write", 00:19:31.245 "zoned": false, 00:19:31.245 
"supported_io_types": { 00:19:31.245 "read": true, 00:19:31.245 "write": true, 00:19:31.245 "unmap": true, 00:19:31.245 "flush": true, 00:19:31.245 "reset": true, 00:19:31.245 "nvme_admin": false, 00:19:31.245 "nvme_io": false, 00:19:31.245 "nvme_io_md": false, 00:19:31.245 "write_zeroes": true, 00:19:31.245 "zcopy": true, 00:19:31.245 "get_zone_info": false, 00:19:31.245 "zone_management": false, 00:19:31.245 "zone_append": false, 00:19:31.245 "compare": false, 00:19:31.245 "compare_and_write": false, 00:19:31.245 "abort": true, 00:19:31.245 "seek_hole": false, 00:19:31.245 "seek_data": false, 00:19:31.245 "copy": true, 00:19:31.245 "nvme_iov_md": false 00:19:31.245 }, 00:19:31.245 "memory_domains": [ 00:19:31.245 { 00:19:31.245 "dma_device_id": "system", 00:19:31.245 "dma_device_type": 1 00:19:31.245 }, 00:19:31.245 { 00:19:31.245 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:31.245 "dma_device_type": 2 00:19:31.245 } 00:19:31.245 ], 00:19:31.245 "driver_specific": {} 00:19:31.245 }' 00:19:31.245 20:33:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:31.245 20:33:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:31.245 20:33:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:31.245 20:33:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:31.504 20:33:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:31.504 20:33:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:31.504 20:33:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:31.504 20:33:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:31.504 20:33:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:31.504 20:33:23 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:31.504 20:33:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:31.763 20:33:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:31.763 20:33:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:31.763 20:33:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:19:31.763 20:33:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:32.022 20:33:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:32.022 "name": "BaseBdev3", 00:19:32.022 "aliases": [ 00:19:32.022 "98c6b520-0227-4b9f-a7e0-b4120c7fc4b3" 00:19:32.022 ], 00:19:32.022 "product_name": "Malloc disk", 00:19:32.022 "block_size": 512, 00:19:32.022 "num_blocks": 65536, 00:19:32.022 "uuid": "98c6b520-0227-4b9f-a7e0-b4120c7fc4b3", 00:19:32.022 "assigned_rate_limits": { 00:19:32.022 "rw_ios_per_sec": 0, 00:19:32.022 "rw_mbytes_per_sec": 0, 00:19:32.022 "r_mbytes_per_sec": 0, 00:19:32.022 "w_mbytes_per_sec": 0 00:19:32.022 }, 00:19:32.022 "claimed": true, 00:19:32.022 "claim_type": "exclusive_write", 00:19:32.022 "zoned": false, 00:19:32.022 "supported_io_types": { 00:19:32.022 "read": true, 00:19:32.022 "write": true, 00:19:32.022 "unmap": true, 00:19:32.022 "flush": true, 00:19:32.022 "reset": true, 00:19:32.022 "nvme_admin": false, 00:19:32.022 "nvme_io": false, 00:19:32.022 "nvme_io_md": false, 00:19:32.022 "write_zeroes": true, 00:19:32.022 "zcopy": true, 00:19:32.022 "get_zone_info": false, 00:19:32.022 "zone_management": false, 00:19:32.022 "zone_append": false, 00:19:32.022 "compare": false, 00:19:32.022 "compare_and_write": false, 00:19:32.022 "abort": true, 00:19:32.022 
"seek_hole": false, 00:19:32.022 "seek_data": false, 00:19:32.022 "copy": true, 00:19:32.022 "nvme_iov_md": false 00:19:32.022 }, 00:19:32.022 "memory_domains": [ 00:19:32.022 { 00:19:32.022 "dma_device_id": "system", 00:19:32.022 "dma_device_type": 1 00:19:32.022 }, 00:19:32.022 { 00:19:32.022 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:32.022 "dma_device_type": 2 00:19:32.022 } 00:19:32.022 ], 00:19:32.022 "driver_specific": {} 00:19:32.022 }' 00:19:32.022 20:33:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:32.022 20:33:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:32.022 20:33:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:32.022 20:33:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:32.281 20:33:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:32.281 20:33:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:32.282 20:33:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:32.282 20:33:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:32.282 20:33:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:32.282 20:33:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:32.282 20:33:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:32.541 20:33:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:32.541 20:33:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:32.541 20:33:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:19:32.541 20:33:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:32.818 20:33:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:32.818 "name": "BaseBdev4", 00:19:32.818 "aliases": [ 00:19:32.818 "e14df952-cb92-48e8-bd85-b4d377ed96a3" 00:19:32.818 ], 00:19:32.818 "product_name": "Malloc disk", 00:19:32.818 "block_size": 512, 00:19:32.818 "num_blocks": 65536, 00:19:32.818 "uuid": "e14df952-cb92-48e8-bd85-b4d377ed96a3", 00:19:32.818 "assigned_rate_limits": { 00:19:32.818 "rw_ios_per_sec": 0, 00:19:32.818 "rw_mbytes_per_sec": 0, 00:19:32.818 "r_mbytes_per_sec": 0, 00:19:32.818 "w_mbytes_per_sec": 0 00:19:32.818 }, 00:19:32.818 "claimed": true, 00:19:32.818 "claim_type": "exclusive_write", 00:19:32.818 "zoned": false, 00:19:32.818 "supported_io_types": { 00:19:32.818 "read": true, 00:19:32.818 "write": true, 00:19:32.818 "unmap": true, 00:19:32.818 "flush": true, 00:19:32.818 "reset": true, 00:19:32.818 "nvme_admin": false, 00:19:32.818 "nvme_io": false, 00:19:32.818 "nvme_io_md": false, 00:19:32.818 "write_zeroes": true, 00:19:32.818 "zcopy": true, 00:19:32.818 "get_zone_info": false, 00:19:32.818 "zone_management": false, 00:19:32.818 "zone_append": false, 00:19:32.818 "compare": false, 00:19:32.818 "compare_and_write": false, 00:19:32.818 "abort": true, 00:19:32.818 "seek_hole": false, 00:19:32.818 "seek_data": false, 00:19:32.818 "copy": true, 00:19:32.818 "nvme_iov_md": false 00:19:32.818 }, 00:19:32.818 "memory_domains": [ 00:19:32.818 { 00:19:32.818 "dma_device_id": "system", 00:19:32.818 "dma_device_type": 1 00:19:32.818 }, 00:19:32.818 { 00:19:32.818 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:32.818 "dma_device_type": 2 00:19:32.818 } 00:19:32.818 ], 00:19:32.818 "driver_specific": {} 00:19:32.818 }' 00:19:32.818 20:33:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:32.818 
20:33:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:32.818 20:33:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:32.818 20:33:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:32.818 20:33:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:33.097 20:33:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:33.097 20:33:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:33.097 20:33:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:33.097 20:33:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:33.097 20:33:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:33.097 20:33:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:33.097 20:33:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:33.097 20:33:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:19:33.355 [2024-07-15 20:33:25.645866] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:19:33.355 [2024-07-15 20:33:25.645894] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:33.355 [2024-07-15 20:33:25.645955] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:33.355 [2024-07-15 20:33:25.646020] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:33.355 [2024-07-15 20:33:25.646032] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 
0x12ec470 name Existed_Raid, state offline 00:19:33.355 20:33:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 1417759 00:19:33.355 20:33:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 1417759 ']' 00:19:33.355 20:33:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 1417759 00:19:33.355 20:33:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:19:33.355 20:33:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:19:33.355 20:33:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1417759 00:19:33.355 20:33:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:19:33.355 20:33:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:19:33.355 20:33:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1417759' 00:19:33.355 killing process with pid 1417759 00:19:33.355 20:33:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 1417759 00:19:33.355 [2024-07-15 20:33:25.717605] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:19:33.355 20:33:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 1417759 00:19:33.615 [2024-07-15 20:33:25.753964] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:19:33.615 20:33:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:19:33.615 00:19:33.615 real 0m33.743s 00:19:33.615 user 1m2.087s 00:19:33.615 sys 0m5.899s 00:19:33.615 20:33:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:19:33.615 20:33:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 
-- # set +x 00:19:33.615 ************************************ 00:19:33.615 END TEST raid_state_function_test_sb 00:19:33.615 ************************************ 00:19:33.872 20:33:26 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:19:33.872 20:33:26 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid0 4 00:19:33.872 20:33:26 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:19:33.872 20:33:26 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:19:33.872 20:33:26 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:19:33.872 ************************************ 00:19:33.872 START TEST raid_superblock_test 00:19:33.872 ************************************ 00:19:33.872 20:33:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid0 4 00:19:33.872 20:33:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid0 00:19:33.872 20:33:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=4 00:19:33.872 20:33:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:19:33.872 20:33:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:19:33.872 20:33:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:19:33.872 20:33:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:19:33.872 20:33:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:19:33.872 20:33:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:19:33.872 20:33:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:19:33.872 20:33:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:19:33.872 20:33:26 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:19:33.872 20:33:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:19:33.872 20:33:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:19:33.872 20:33:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid0 '!=' raid1 ']' 00:19:33.872 20:33:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:19:33.872 20:33:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:19:33.872 20:33:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=1423311 00:19:33.872 20:33:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 1423311 /var/tmp/spdk-raid.sock 00:19:33.872 20:33:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:19:33.872 20:33:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 1423311 ']' 00:19:33.872 20:33:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:19:33.872 20:33:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:19:33.873 20:33:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:19:33.873 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:19:33.873 20:33:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:19:33.873 20:33:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:33.873 [2024-07-15 20:33:26.112122] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
00:19:33.873 [2024-07-15 20:33:26.112191] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1423311 ] 00:19:33.873 [2024-07-15 20:33:26.231818] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:34.131 [2024-07-15 20:33:26.337471] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:19:34.131 [2024-07-15 20:33:26.404814] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:34.131 [2024-07-15 20:33:26.404851] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:35.067 20:33:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:19:35.067 20:33:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:19:35.067 20:33:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:19:35.067 20:33:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:19:35.067 20:33:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:19:35.067 20:33:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:19:35.067 20:33:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:19:35.067 20:33:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:19:35.067 20:33:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:19:35.067 20:33:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:19:35.067 20:33:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:19:35.067 malloc1 00:19:35.067 20:33:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:19:35.325 [2024-07-15 20:33:27.671798] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:19:35.325 [2024-07-15 20:33:27.671840] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:35.325 [2024-07-15 20:33:27.671860] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20b3570 00:19:35.325 [2024-07-15 20:33:27.671872] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:35.325 [2024-07-15 20:33:27.673453] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:35.325 [2024-07-15 20:33:27.673480] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:19:35.325 pt1 00:19:35.325 20:33:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:19:35.325 20:33:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:19:35.325 20:33:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:19:35.325 20:33:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:19:35.325 20:33:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:19:35.325 20:33:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:19:35.325 20:33:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:19:35.325 20:33:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:19:35.325 20:33:27 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:19:35.583 malloc2 00:19:35.583 20:33:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:19:35.842 [2024-07-15 20:33:28.181794] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:19:35.842 [2024-07-15 20:33:28.181835] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:35.842 [2024-07-15 20:33:28.181852] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20b4970 00:19:35.842 [2024-07-15 20:33:28.181864] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:35.842 [2024-07-15 20:33:28.183296] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:35.842 [2024-07-15 20:33:28.183322] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:19:35.842 pt2 00:19:35.842 20:33:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:19:35.842 20:33:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:19:35.842 20:33:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:19:35.842 20:33:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:19:35.842 20:33:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:19:35.842 20:33:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:19:35.842 20:33:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:19:35.842 20:33:28 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:19:35.842 20:33:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:19:36.410 malloc3 00:19:36.410 20:33:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:19:36.978 [2024-07-15 20:33:29.214329] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:19:36.978 [2024-07-15 20:33:29.214375] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:36.978 [2024-07-15 20:33:29.214393] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x224b340 00:19:36.978 [2024-07-15 20:33:29.214406] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:36.978 [2024-07-15 20:33:29.215958] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:36.978 [2024-07-15 20:33:29.215985] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:19:36.978 pt3 00:19:36.978 20:33:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:19:36.978 20:33:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:19:36.978 20:33:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc4 00:19:36.978 20:33:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt4 00:19:36.978 20:33:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:19:36.978 20:33:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:19:36.978 
20:33:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:19:36.978 20:33:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:19:36.978 20:33:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:19:37.547 malloc4 00:19:37.547 20:33:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:19:38.115 [2024-07-15 20:33:30.245643] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:19:38.115 [2024-07-15 20:33:30.245691] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:38.115 [2024-07-15 20:33:30.245719] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x224dc60 00:19:38.115 [2024-07-15 20:33:30.245732] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:38.115 [2024-07-15 20:33:30.247348] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:38.115 [2024-07-15 20:33:30.247375] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:19:38.115 pt4 00:19:38.115 20:33:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:19:38.115 20:33:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:19:38.115 20:33:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:19:38.683 [2024-07-15 20:33:30.758998] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 
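For reference, the setup traced above boils down to a short RPC sequence: four malloc base bdevs wrapped in passthru bdevs, then a raid0 volume with a superblock. The sketch below assembles those commands (taken from the log; socket path and bdev names as logged) and only prints them, so the sequence can be inspected without a running SPDK target — it is not a verbatim excerpt of `bdev_raid.sh`.

```shell
#!/usr/bin/env bash
# Sketch of the RPC sequence raid_superblock_test drives. Assumes an SPDK
# bdev_svc app listening on /var/tmp/spdk-raid.sock, as in this log; the
# commands are assembled and printed rather than executed here.
RPC="scripts/rpc.py -s /var/tmp/spdk-raid.sock"

cmds=()
for i in 1 2 3 4; do
    # 32 MiB malloc base bdev with 512-byte blocks, wrapped in a passthru bdev
    cmds+=("$RPC bdev_malloc_create 32 512 -b malloc$i")
    cmds+=("$RPC bdev_passthru_create -b malloc$i -p pt$i -u 00000000-0000-0000-0000-00000000000$i")
done
# raid0 over the four passthru bdevs, 64 KiB strip size, superblock enabled (-s)
cmds+=("$RPC bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s")

printf '%s\n' "${cmds[@]}"
```

Running the printed commands against a live `/var/tmp/spdk-raid.sock` target should reproduce the `raid_bdev1` configuration that the subsequent `bdev_raid_get_bdevs` dump verifies.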
00:19:38.683 [2024-07-15 20:33:30.760357] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:19:38.683 [2024-07-15 20:33:30.760412] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:19:38.683 [2024-07-15 20:33:30.760454] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:19:38.683 [2024-07-15 20:33:30.760625] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x20ab530 00:19:38.683 [2024-07-15 20:33:30.760636] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:19:38.683 [2024-07-15 20:33:30.760838] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20a9770 00:19:38.683 [2024-07-15 20:33:30.760995] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x20ab530 00:19:38.683 [2024-07-15 20:33:30.761006] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x20ab530 00:19:38.683 [2024-07-15 20:33:30.761103] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:38.683 20:33:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:19:38.683 20:33:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:38.683 20:33:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:38.683 20:33:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:38.683 20:33:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:38.683 20:33:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:38.683 20:33:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:38.683 20:33:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local 
num_base_bdevs 00:19:38.683 20:33:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:38.683 20:33:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:38.683 20:33:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:38.683 20:33:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:38.942 20:33:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:38.942 "name": "raid_bdev1", 00:19:38.942 "uuid": "6695814e-026e-4689-a827-2664888c0240", 00:19:38.942 "strip_size_kb": 64, 00:19:38.942 "state": "online", 00:19:38.942 "raid_level": "raid0", 00:19:38.942 "superblock": true, 00:19:38.942 "num_base_bdevs": 4, 00:19:38.942 "num_base_bdevs_discovered": 4, 00:19:38.942 "num_base_bdevs_operational": 4, 00:19:38.942 "base_bdevs_list": [ 00:19:38.942 { 00:19:38.942 "name": "pt1", 00:19:38.942 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:38.942 "is_configured": true, 00:19:38.942 "data_offset": 2048, 00:19:38.942 "data_size": 63488 00:19:38.942 }, 00:19:38.942 { 00:19:38.942 "name": "pt2", 00:19:38.942 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:38.942 "is_configured": true, 00:19:38.942 "data_offset": 2048, 00:19:38.942 "data_size": 63488 00:19:38.942 }, 00:19:38.942 { 00:19:38.942 "name": "pt3", 00:19:38.942 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:38.942 "is_configured": true, 00:19:38.942 "data_offset": 2048, 00:19:38.942 "data_size": 63488 00:19:38.942 }, 00:19:38.942 { 00:19:38.942 "name": "pt4", 00:19:38.942 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:38.942 "is_configured": true, 00:19:38.942 "data_offset": 2048, 00:19:38.942 "data_size": 63488 00:19:38.942 } 00:19:38.942 ] 00:19:38.942 }' 00:19:38.942 20:33:31 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:38.942 20:33:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:39.879 20:33:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:19:39.879 20:33:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:19:39.879 20:33:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:19:39.879 20:33:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:19:39.879 20:33:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:19:39.879 20:33:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:19:39.879 20:33:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:19:39.879 20:33:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:19:39.879 [2024-07-15 20:33:32.054695] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:39.879 20:33:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:19:39.879 "name": "raid_bdev1", 00:19:39.879 "aliases": [ 00:19:39.879 "6695814e-026e-4689-a827-2664888c0240" 00:19:39.879 ], 00:19:39.879 "product_name": "Raid Volume", 00:19:39.879 "block_size": 512, 00:19:39.879 "num_blocks": 253952, 00:19:39.879 "uuid": "6695814e-026e-4689-a827-2664888c0240", 00:19:39.879 "assigned_rate_limits": { 00:19:39.879 "rw_ios_per_sec": 0, 00:19:39.879 "rw_mbytes_per_sec": 0, 00:19:39.879 "r_mbytes_per_sec": 0, 00:19:39.879 "w_mbytes_per_sec": 0 00:19:39.879 }, 00:19:39.879 "claimed": false, 00:19:39.879 "zoned": false, 00:19:39.879 "supported_io_types": { 00:19:39.879 "read": true, 00:19:39.879 "write": true, 00:19:39.879 
"unmap": true, 00:19:39.879 "flush": true, 00:19:39.879 "reset": true, 00:19:39.879 "nvme_admin": false, 00:19:39.879 "nvme_io": false, 00:19:39.879 "nvme_io_md": false, 00:19:39.879 "write_zeroes": true, 00:19:39.879 "zcopy": false, 00:19:39.879 "get_zone_info": false, 00:19:39.879 "zone_management": false, 00:19:39.879 "zone_append": false, 00:19:39.879 "compare": false, 00:19:39.879 "compare_and_write": false, 00:19:39.879 "abort": false, 00:19:39.879 "seek_hole": false, 00:19:39.879 "seek_data": false, 00:19:39.879 "copy": false, 00:19:39.879 "nvme_iov_md": false 00:19:39.879 }, 00:19:39.879 "memory_domains": [ 00:19:39.879 { 00:19:39.879 "dma_device_id": "system", 00:19:39.879 "dma_device_type": 1 00:19:39.879 }, 00:19:39.879 { 00:19:39.879 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:39.879 "dma_device_type": 2 00:19:39.879 }, 00:19:39.879 { 00:19:39.879 "dma_device_id": "system", 00:19:39.879 "dma_device_type": 1 00:19:39.879 }, 00:19:39.879 { 00:19:39.879 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:39.879 "dma_device_type": 2 00:19:39.879 }, 00:19:39.879 { 00:19:39.879 "dma_device_id": "system", 00:19:39.879 "dma_device_type": 1 00:19:39.879 }, 00:19:39.879 { 00:19:39.879 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:39.879 "dma_device_type": 2 00:19:39.879 }, 00:19:39.879 { 00:19:39.879 "dma_device_id": "system", 00:19:39.879 "dma_device_type": 1 00:19:39.879 }, 00:19:39.879 { 00:19:39.879 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:39.879 "dma_device_type": 2 00:19:39.879 } 00:19:39.879 ], 00:19:39.879 "driver_specific": { 00:19:39.879 "raid": { 00:19:39.879 "uuid": "6695814e-026e-4689-a827-2664888c0240", 00:19:39.879 "strip_size_kb": 64, 00:19:39.879 "state": "online", 00:19:39.879 "raid_level": "raid0", 00:19:39.879 "superblock": true, 00:19:39.879 "num_base_bdevs": 4, 00:19:39.879 "num_base_bdevs_discovered": 4, 00:19:39.879 "num_base_bdevs_operational": 4, 00:19:39.879 "base_bdevs_list": [ 00:19:39.879 { 00:19:39.879 "name": "pt1", 
00:19:39.879 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:39.879 "is_configured": true, 00:19:39.879 "data_offset": 2048, 00:19:39.880 "data_size": 63488 00:19:39.880 }, 00:19:39.880 { 00:19:39.880 "name": "pt2", 00:19:39.880 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:39.880 "is_configured": true, 00:19:39.880 "data_offset": 2048, 00:19:39.880 "data_size": 63488 00:19:39.880 }, 00:19:39.880 { 00:19:39.880 "name": "pt3", 00:19:39.880 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:39.880 "is_configured": true, 00:19:39.880 "data_offset": 2048, 00:19:39.880 "data_size": 63488 00:19:39.880 }, 00:19:39.880 { 00:19:39.880 "name": "pt4", 00:19:39.880 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:39.880 "is_configured": true, 00:19:39.880 "data_offset": 2048, 00:19:39.880 "data_size": 63488 00:19:39.880 } 00:19:39.880 ] 00:19:39.880 } 00:19:39.880 } 00:19:39.880 }' 00:19:39.880 20:33:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:39.880 20:33:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:19:39.880 pt2 00:19:39.880 pt3 00:19:39.880 pt4' 00:19:39.880 20:33:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:39.880 20:33:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:19:39.880 20:33:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:40.448 20:33:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:40.448 "name": "pt1", 00:19:40.448 "aliases": [ 00:19:40.448 "00000000-0000-0000-0000-000000000001" 00:19:40.448 ], 00:19:40.448 "product_name": "passthru", 00:19:40.448 "block_size": 512, 00:19:40.448 "num_blocks": 65536, 00:19:40.448 "uuid": 
"00000000-0000-0000-0000-000000000001", 00:19:40.448 "assigned_rate_limits": { 00:19:40.448 "rw_ios_per_sec": 0, 00:19:40.448 "rw_mbytes_per_sec": 0, 00:19:40.448 "r_mbytes_per_sec": 0, 00:19:40.448 "w_mbytes_per_sec": 0 00:19:40.448 }, 00:19:40.448 "claimed": true, 00:19:40.448 "claim_type": "exclusive_write", 00:19:40.448 "zoned": false, 00:19:40.448 "supported_io_types": { 00:19:40.448 "read": true, 00:19:40.448 "write": true, 00:19:40.448 "unmap": true, 00:19:40.448 "flush": true, 00:19:40.448 "reset": true, 00:19:40.448 "nvme_admin": false, 00:19:40.448 "nvme_io": false, 00:19:40.448 "nvme_io_md": false, 00:19:40.448 "write_zeroes": true, 00:19:40.448 "zcopy": true, 00:19:40.448 "get_zone_info": false, 00:19:40.448 "zone_management": false, 00:19:40.448 "zone_append": false, 00:19:40.448 "compare": false, 00:19:40.448 "compare_and_write": false, 00:19:40.448 "abort": true, 00:19:40.448 "seek_hole": false, 00:19:40.448 "seek_data": false, 00:19:40.448 "copy": true, 00:19:40.448 "nvme_iov_md": false 00:19:40.448 }, 00:19:40.448 "memory_domains": [ 00:19:40.448 { 00:19:40.448 "dma_device_id": "system", 00:19:40.448 "dma_device_type": 1 00:19:40.448 }, 00:19:40.448 { 00:19:40.448 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:40.448 "dma_device_type": 2 00:19:40.448 } 00:19:40.448 ], 00:19:40.448 "driver_specific": { 00:19:40.448 "passthru": { 00:19:40.448 "name": "pt1", 00:19:40.448 "base_bdev_name": "malloc1" 00:19:40.448 } 00:19:40.448 } 00:19:40.448 }' 00:19:40.448 20:33:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:40.448 20:33:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:40.448 20:33:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:40.448 20:33:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:40.448 20:33:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:40.708 20:33:32 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:40.708 20:33:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:40.708 20:33:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:40.708 20:33:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:40.708 20:33:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:40.708 20:33:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:40.980 20:33:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:40.980 20:33:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:40.980 20:33:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:19:40.980 20:33:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:40.980 20:33:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:40.980 "name": "pt2", 00:19:40.980 "aliases": [ 00:19:40.980 "00000000-0000-0000-0000-000000000002" 00:19:40.980 ], 00:19:40.980 "product_name": "passthru", 00:19:40.980 "block_size": 512, 00:19:40.980 "num_blocks": 65536, 00:19:40.980 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:40.980 "assigned_rate_limits": { 00:19:40.980 "rw_ios_per_sec": 0, 00:19:40.980 "rw_mbytes_per_sec": 0, 00:19:40.980 "r_mbytes_per_sec": 0, 00:19:40.980 "w_mbytes_per_sec": 0 00:19:40.980 }, 00:19:40.980 "claimed": true, 00:19:40.980 "claim_type": "exclusive_write", 00:19:40.980 "zoned": false, 00:19:40.980 "supported_io_types": { 00:19:40.980 "read": true, 00:19:40.980 "write": true, 00:19:40.980 "unmap": true, 00:19:40.980 "flush": true, 00:19:40.980 "reset": true, 00:19:40.980 "nvme_admin": false, 00:19:40.980 
"nvme_io": false, 00:19:40.980 "nvme_io_md": false, 00:19:40.980 "write_zeroes": true, 00:19:40.980 "zcopy": true, 00:19:40.980 "get_zone_info": false, 00:19:40.980 "zone_management": false, 00:19:40.980 "zone_append": false, 00:19:40.980 "compare": false, 00:19:40.980 "compare_and_write": false, 00:19:40.980 "abort": true, 00:19:40.980 "seek_hole": false, 00:19:40.980 "seek_data": false, 00:19:40.980 "copy": true, 00:19:40.980 "nvme_iov_md": false 00:19:40.980 }, 00:19:40.980 "memory_domains": [ 00:19:40.980 { 00:19:40.980 "dma_device_id": "system", 00:19:40.980 "dma_device_type": 1 00:19:40.980 }, 00:19:40.980 { 00:19:40.980 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:40.980 "dma_device_type": 2 00:19:40.980 } 00:19:40.980 ], 00:19:40.980 "driver_specific": { 00:19:40.980 "passthru": { 00:19:40.980 "name": "pt2", 00:19:40.980 "base_bdev_name": "malloc2" 00:19:40.980 } 00:19:40.980 } 00:19:40.980 }' 00:19:40.980 20:33:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:41.242 20:33:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:41.242 20:33:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:41.242 20:33:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:41.242 20:33:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:41.242 20:33:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:41.242 20:33:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:41.501 20:33:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:41.501 20:33:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:41.501 20:33:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:41.501 20:33:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:19:41.501 20:33:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:41.501 20:33:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:41.501 20:33:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:19:41.501 20:33:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:42.071 20:33:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:42.071 "name": "pt3", 00:19:42.071 "aliases": [ 00:19:42.071 "00000000-0000-0000-0000-000000000003" 00:19:42.071 ], 00:19:42.071 "product_name": "passthru", 00:19:42.071 "block_size": 512, 00:19:42.071 "num_blocks": 65536, 00:19:42.071 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:42.071 "assigned_rate_limits": { 00:19:42.071 "rw_ios_per_sec": 0, 00:19:42.071 "rw_mbytes_per_sec": 0, 00:19:42.071 "r_mbytes_per_sec": 0, 00:19:42.071 "w_mbytes_per_sec": 0 00:19:42.071 }, 00:19:42.071 "claimed": true, 00:19:42.071 "claim_type": "exclusive_write", 00:19:42.071 "zoned": false, 00:19:42.071 "supported_io_types": { 00:19:42.071 "read": true, 00:19:42.071 "write": true, 00:19:42.071 "unmap": true, 00:19:42.071 "flush": true, 00:19:42.071 "reset": true, 00:19:42.071 "nvme_admin": false, 00:19:42.071 "nvme_io": false, 00:19:42.071 "nvme_io_md": false, 00:19:42.071 "write_zeroes": true, 00:19:42.071 "zcopy": true, 00:19:42.071 "get_zone_info": false, 00:19:42.071 "zone_management": false, 00:19:42.071 "zone_append": false, 00:19:42.071 "compare": false, 00:19:42.071 "compare_and_write": false, 00:19:42.071 "abort": true, 00:19:42.071 "seek_hole": false, 00:19:42.071 "seek_data": false, 00:19:42.071 "copy": true, 00:19:42.071 "nvme_iov_md": false 00:19:42.071 }, 00:19:42.071 "memory_domains": [ 00:19:42.071 { 00:19:42.071 "dma_device_id": "system", 00:19:42.071 
"dma_device_type": 1 00:19:42.071 }, 00:19:42.071 { 00:19:42.071 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:42.071 "dma_device_type": 2 00:19:42.071 } 00:19:42.071 ], 00:19:42.071 "driver_specific": { 00:19:42.071 "passthru": { 00:19:42.071 "name": "pt3", 00:19:42.071 "base_bdev_name": "malloc3" 00:19:42.071 } 00:19:42.071 } 00:19:42.071 }' 00:19:42.071 20:33:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:42.071 20:33:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:42.071 20:33:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:42.071 20:33:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:42.330 20:33:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:42.330 20:33:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:42.330 20:33:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:42.330 20:33:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:42.330 20:33:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:42.330 20:33:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:42.589 20:33:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:42.589 20:33:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:42.589 20:33:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:42.589 20:33:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:19:42.589 20:33:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:42.847 20:33:35 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:42.847 "name": "pt4", 00:19:42.847 "aliases": [ 00:19:42.847 "00000000-0000-0000-0000-000000000004" 00:19:42.847 ], 00:19:42.847 "product_name": "passthru", 00:19:42.847 "block_size": 512, 00:19:42.847 "num_blocks": 65536, 00:19:42.847 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:42.847 "assigned_rate_limits": { 00:19:42.847 "rw_ios_per_sec": 0, 00:19:42.847 "rw_mbytes_per_sec": 0, 00:19:42.847 "r_mbytes_per_sec": 0, 00:19:42.847 "w_mbytes_per_sec": 0 00:19:42.847 }, 00:19:42.847 "claimed": true, 00:19:42.847 "claim_type": "exclusive_write", 00:19:42.847 "zoned": false, 00:19:42.847 "supported_io_types": { 00:19:42.847 "read": true, 00:19:42.847 "write": true, 00:19:42.847 "unmap": true, 00:19:42.847 "flush": true, 00:19:42.847 "reset": true, 00:19:42.847 "nvme_admin": false, 00:19:42.847 "nvme_io": false, 00:19:42.847 "nvme_io_md": false, 00:19:42.847 "write_zeroes": true, 00:19:42.847 "zcopy": true, 00:19:42.847 "get_zone_info": false, 00:19:42.847 "zone_management": false, 00:19:42.847 "zone_append": false, 00:19:42.847 "compare": false, 00:19:42.847 "compare_and_write": false, 00:19:42.847 "abort": true, 00:19:42.847 "seek_hole": false, 00:19:42.847 "seek_data": false, 00:19:42.847 "copy": true, 00:19:42.847 "nvme_iov_md": false 00:19:42.847 }, 00:19:42.847 "memory_domains": [ 00:19:42.847 { 00:19:42.847 "dma_device_id": "system", 00:19:42.847 "dma_device_type": 1 00:19:42.847 }, 00:19:42.847 { 00:19:42.847 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:42.847 "dma_device_type": 2 00:19:42.847 } 00:19:42.847 ], 00:19:42.847 "driver_specific": { 00:19:42.847 "passthru": { 00:19:42.847 "name": "pt4", 00:19:42.847 "base_bdev_name": "malloc4" 00:19:42.847 } 00:19:42.847 } 00:19:42.847 }' 00:19:42.847 20:33:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:42.847 20:33:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:42.847 20:33:35 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:42.847 20:33:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:42.847 20:33:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:43.105 20:33:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:43.105 20:33:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:43.105 20:33:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:43.105 20:33:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:43.105 20:33:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:43.105 20:33:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:43.363 20:33:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:43.363 20:33:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:19:43.363 20:33:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:19:43.363 [2024-07-15 20:33:35.740476] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:43.621 20:33:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=6695814e-026e-4689-a827-2664888c0240 00:19:43.621 20:33:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 6695814e-026e-4689-a827-2664888c0240 ']' 00:19:43.621 20:33:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:19:43.879 [2024-07-15 20:33:36.241509] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:43.879 
[2024-07-15 20:33:36.241534] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:43.879 [2024-07-15 20:33:36.241589] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:43.879 [2024-07-15 20:33:36.241653] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:43.879 [2024-07-15 20:33:36.241671] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x20ab530 name raid_bdev1, state offline 00:19:44.137 20:33:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:44.137 20:33:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:19:44.395 20:33:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:19:44.395 20:33:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:19:44.395 20:33:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:19:44.395 20:33:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:19:44.395 20:33:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:19:44.395 20:33:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:19:44.961 20:33:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:19:44.961 20:33:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:19:45.526 20:33:37 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:19:45.526 20:33:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:19:46.093 20:33:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:19:46.093 20:33:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:19:46.352 20:33:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:19:46.352 20:33:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:19:46.352 20:33:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:19:46.352 20:33:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:19:46.352 20:33:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:46.352 20:33:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:19:46.352 20:33:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:46.352 20:33:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:19:46.352 20:33:38 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:46.352 20:33:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:19:46.352 20:33:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:46.352 20:33:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:19:46.352 20:33:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:19:46.612 [2024-07-15 20:33:38.792126] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:19:46.612 [2024-07-15 20:33:38.793474] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:19:46.612 [2024-07-15 20:33:38.793516] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:19:46.612 [2024-07-15 20:33:38.793549] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:19:46.612 [2024-07-15 20:33:38.793593] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:19:46.612 [2024-07-15 20:33:38.793632] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:19:46.612 [2024-07-15 20:33:38.793655] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:19:46.612 [2024-07-15 20:33:38.793677] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:19:46.612 
[2024-07-15 20:33:38.793695] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:46.612 [2024-07-15 20:33:38.793706] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2256ff0 name raid_bdev1, state configuring 00:19:46.612 request: 00:19:46.612 { 00:19:46.612 "name": "raid_bdev1", 00:19:46.612 "raid_level": "raid0", 00:19:46.612 "base_bdevs": [ 00:19:46.612 "malloc1", 00:19:46.612 "malloc2", 00:19:46.612 "malloc3", 00:19:46.612 "malloc4" 00:19:46.612 ], 00:19:46.612 "strip_size_kb": 64, 00:19:46.612 "superblock": false, 00:19:46.612 "method": "bdev_raid_create", 00:19:46.612 "req_id": 1 00:19:46.612 } 00:19:46.612 Got JSON-RPC error response 00:19:46.612 response: 00:19:46.612 { 00:19:46.612 "code": -17, 00:19:46.612 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:19:46.612 } 00:19:46.612 20:33:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:19:46.612 20:33:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:19:46.613 20:33:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:19:46.613 20:33:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:19:46.613 20:33:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:46.613 20:33:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:19:47.180 20:33:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:19:47.180 20:33:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:19:47.180 20:33:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 
00:19:47.180 [2024-07-15 20:33:39.554250] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:19:47.180 [2024-07-15 20:33:39.554295] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:47.180 [2024-07-15 20:33:39.554318] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20b37a0 00:19:47.180 [2024-07-15 20:33:39.554330] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:47.180 [2024-07-15 20:33:39.555918] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:47.181 [2024-07-15 20:33:39.555954] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:19:47.181 [2024-07-15 20:33:39.556022] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:19:47.181 [2024-07-15 20:33:39.556049] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:19:47.442 pt1 00:19:47.442 20:33:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 4 00:19:47.442 20:33:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:47.442 20:33:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:47.442 20:33:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:47.442 20:33:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:47.442 20:33:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:47.442 20:33:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:47.442 20:33:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:47.442 20:33:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:19:47.442 20:33:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:47.442 20:33:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:47.442 20:33:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:47.768 20:33:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:47.768 "name": "raid_bdev1", 00:19:47.768 "uuid": "6695814e-026e-4689-a827-2664888c0240", 00:19:47.768 "strip_size_kb": 64, 00:19:47.768 "state": "configuring", 00:19:47.768 "raid_level": "raid0", 00:19:47.768 "superblock": true, 00:19:47.768 "num_base_bdevs": 4, 00:19:47.768 "num_base_bdevs_discovered": 1, 00:19:47.768 "num_base_bdevs_operational": 4, 00:19:47.768 "base_bdevs_list": [ 00:19:47.768 { 00:19:47.768 "name": "pt1", 00:19:47.768 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:47.768 "is_configured": true, 00:19:47.768 "data_offset": 2048, 00:19:47.768 "data_size": 63488 00:19:47.768 }, 00:19:47.768 { 00:19:47.768 "name": null, 00:19:47.768 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:47.768 "is_configured": false, 00:19:47.768 "data_offset": 2048, 00:19:47.768 "data_size": 63488 00:19:47.768 }, 00:19:47.768 { 00:19:47.768 "name": null, 00:19:47.768 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:47.768 "is_configured": false, 00:19:47.768 "data_offset": 2048, 00:19:47.768 "data_size": 63488 00:19:47.768 }, 00:19:47.768 { 00:19:47.768 "name": null, 00:19:47.768 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:47.768 "is_configured": false, 00:19:47.768 "data_offset": 2048, 00:19:47.768 "data_size": 63488 00:19:47.768 } 00:19:47.768 ] 00:19:47.768 }' 00:19:47.768 20:33:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:47.768 20:33:40 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:48.336 20:33:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 4 -gt 2 ']' 00:19:48.336 20:33:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:19:48.595 [2024-07-15 20:33:40.909864] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:19:48.595 [2024-07-15 20:33:40.909913] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:48.595 [2024-07-15 20:33:40.909941] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x224c940 00:19:48.595 [2024-07-15 20:33:40.909954] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:48.595 [2024-07-15 20:33:40.910297] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:48.595 [2024-07-15 20:33:40.910314] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:19:48.595 [2024-07-15 20:33:40.910376] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:19:48.595 [2024-07-15 20:33:40.910396] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:19:48.595 pt2 00:19:48.595 20:33:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:19:48.855 [2024-07-15 20:33:41.090339] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:19:48.855 20:33:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 4 00:19:48.855 20:33:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:48.855 20:33:41 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:48.855 20:33:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:48.855 20:33:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:48.855 20:33:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:48.855 20:33:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:48.855 20:33:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:48.855 20:33:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:48.855 20:33:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:48.855 20:33:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:48.855 20:33:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:49.114 20:33:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:49.114 "name": "raid_bdev1", 00:19:49.114 "uuid": "6695814e-026e-4689-a827-2664888c0240", 00:19:49.114 "strip_size_kb": 64, 00:19:49.114 "state": "configuring", 00:19:49.114 "raid_level": "raid0", 00:19:49.114 "superblock": true, 00:19:49.114 "num_base_bdevs": 4, 00:19:49.114 "num_base_bdevs_discovered": 1, 00:19:49.114 "num_base_bdevs_operational": 4, 00:19:49.114 "base_bdevs_list": [ 00:19:49.114 { 00:19:49.114 "name": "pt1", 00:19:49.114 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:49.114 "is_configured": true, 00:19:49.114 "data_offset": 2048, 00:19:49.114 "data_size": 63488 00:19:49.114 }, 00:19:49.114 { 00:19:49.115 "name": null, 00:19:49.115 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:49.115 
"is_configured": false, 00:19:49.115 "data_offset": 2048, 00:19:49.115 "data_size": 63488 00:19:49.115 }, 00:19:49.115 { 00:19:49.115 "name": null, 00:19:49.115 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:49.115 "is_configured": false, 00:19:49.115 "data_offset": 2048, 00:19:49.115 "data_size": 63488 00:19:49.115 }, 00:19:49.115 { 00:19:49.115 "name": null, 00:19:49.115 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:49.115 "is_configured": false, 00:19:49.115 "data_offset": 2048, 00:19:49.115 "data_size": 63488 00:19:49.115 } 00:19:49.115 ] 00:19:49.115 }' 00:19:49.115 20:33:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:49.115 20:33:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:49.683 20:33:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:19:49.683 20:33:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:19:49.683 20:33:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:19:49.942 [2024-07-15 20:33:42.076944] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:19:49.942 [2024-07-15 20:33:42.076991] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:49.942 [2024-07-15 20:33:42.077009] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20aa060 00:19:49.942 [2024-07-15 20:33:42.077022] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:49.942 [2024-07-15 20:33:42.077357] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:49.942 [2024-07-15 20:33:42.077374] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:19:49.942 [2024-07-15 20:33:42.077434] 
bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:19:49.942 [2024-07-15 20:33:42.077453] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:19:49.942 pt2 00:19:49.942 20:33:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:19:49.942 20:33:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:19:49.942 20:33:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:19:50.510 [2024-07-15 20:33:42.586322] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:19:50.510 [2024-07-15 20:33:42.586365] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:50.510 [2024-07-15 20:33:42.586386] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20ac8d0 00:19:50.510 [2024-07-15 20:33:42.586398] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:50.510 [2024-07-15 20:33:42.586725] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:50.510 [2024-07-15 20:33:42.586747] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:19:50.510 [2024-07-15 20:33:42.586808] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:19:50.510 [2024-07-15 20:33:42.586827] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:19:50.510 pt3 00:19:50.510 20:33:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:19:50.510 20:33:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:19:50.510 20:33:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:19:50.510 [2024-07-15 20:33:42.847013] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:19:50.510 [2024-07-15 20:33:42.847047] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:50.510 [2024-07-15 20:33:42.847063] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20adb80 00:19:50.510 [2024-07-15 20:33:42.847074] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:50.510 [2024-07-15 20:33:42.847377] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:50.510 [2024-07-15 20:33:42.847393] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:19:50.510 [2024-07-15 20:33:42.847449] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:19:50.510 [2024-07-15 20:33:42.847467] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:19:50.510 [2024-07-15 20:33:42.847585] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x20aa780 00:19:50.510 [2024-07-15 20:33:42.847595] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:19:50.510 [2024-07-15 20:33:42.847763] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20afd70 00:19:50.510 [2024-07-15 20:33:42.847890] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x20aa780 00:19:50.510 [2024-07-15 20:33:42.847900] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x20aa780 00:19:50.510 [2024-07-15 20:33:42.848005] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:50.510 pt4 00:19:50.510 20:33:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 
00:19:50.510 20:33:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:19:50.510 20:33:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:19:50.510 20:33:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:50.510 20:33:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:50.510 20:33:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:50.510 20:33:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:50.510 20:33:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:50.510 20:33:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:50.510 20:33:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:50.510 20:33:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:50.510 20:33:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:50.510 20:33:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:50.510 20:33:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:50.770 20:33:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:50.770 "name": "raid_bdev1", 00:19:50.770 "uuid": "6695814e-026e-4689-a827-2664888c0240", 00:19:50.770 "strip_size_kb": 64, 00:19:50.770 "state": "online", 00:19:50.770 "raid_level": "raid0", 00:19:50.770 "superblock": true, 00:19:50.770 "num_base_bdevs": 4, 00:19:50.770 "num_base_bdevs_discovered": 4, 00:19:50.770 "num_base_bdevs_operational": 4, 
00:19:50.770 "base_bdevs_list": [ 00:19:50.770 { 00:19:50.770 "name": "pt1", 00:19:50.770 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:50.770 "is_configured": true, 00:19:50.770 "data_offset": 2048, 00:19:50.770 "data_size": 63488 00:19:50.770 }, 00:19:50.770 { 00:19:50.770 "name": "pt2", 00:19:50.770 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:50.770 "is_configured": true, 00:19:50.771 "data_offset": 2048, 00:19:50.771 "data_size": 63488 00:19:50.771 }, 00:19:50.771 { 00:19:50.771 "name": "pt3", 00:19:50.771 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:50.771 "is_configured": true, 00:19:50.771 "data_offset": 2048, 00:19:50.771 "data_size": 63488 00:19:50.771 }, 00:19:50.771 { 00:19:50.771 "name": "pt4", 00:19:50.771 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:50.771 "is_configured": true, 00:19:50.771 "data_offset": 2048, 00:19:50.771 "data_size": 63488 00:19:50.771 } 00:19:50.771 ] 00:19:50.771 }' 00:19:50.771 20:33:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:50.771 20:33:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:51.339 20:33:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:19:51.339 20:33:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:19:51.339 20:33:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:19:51.339 20:33:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:19:51.339 20:33:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:19:51.339 20:33:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:19:51.339 20:33:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 
00:19:51.339 20:33:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:19:51.596 [2024-07-15 20:33:43.934238] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:51.597 20:33:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:19:51.597 "name": "raid_bdev1", 00:19:51.597 "aliases": [ 00:19:51.597 "6695814e-026e-4689-a827-2664888c0240" 00:19:51.597 ], 00:19:51.597 "product_name": "Raid Volume", 00:19:51.597 "block_size": 512, 00:19:51.597 "num_blocks": 253952, 00:19:51.597 "uuid": "6695814e-026e-4689-a827-2664888c0240", 00:19:51.597 "assigned_rate_limits": { 00:19:51.597 "rw_ios_per_sec": 0, 00:19:51.597 "rw_mbytes_per_sec": 0, 00:19:51.597 "r_mbytes_per_sec": 0, 00:19:51.597 "w_mbytes_per_sec": 0 00:19:51.597 }, 00:19:51.597 "claimed": false, 00:19:51.597 "zoned": false, 00:19:51.597 "supported_io_types": { 00:19:51.597 "read": true, 00:19:51.597 "write": true, 00:19:51.597 "unmap": true, 00:19:51.597 "flush": true, 00:19:51.597 "reset": true, 00:19:51.597 "nvme_admin": false, 00:19:51.597 "nvme_io": false, 00:19:51.597 "nvme_io_md": false, 00:19:51.597 "write_zeroes": true, 00:19:51.597 "zcopy": false, 00:19:51.597 "get_zone_info": false, 00:19:51.597 "zone_management": false, 00:19:51.597 "zone_append": false, 00:19:51.597 "compare": false, 00:19:51.597 "compare_and_write": false, 00:19:51.597 "abort": false, 00:19:51.597 "seek_hole": false, 00:19:51.597 "seek_data": false, 00:19:51.597 "copy": false, 00:19:51.597 "nvme_iov_md": false 00:19:51.597 }, 00:19:51.597 "memory_domains": [ 00:19:51.597 { 00:19:51.597 "dma_device_id": "system", 00:19:51.597 "dma_device_type": 1 00:19:51.597 }, 00:19:51.597 { 00:19:51.597 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:51.597 "dma_device_type": 2 00:19:51.597 }, 00:19:51.597 { 00:19:51.597 "dma_device_id": "system", 00:19:51.597 "dma_device_type": 1 00:19:51.597 }, 00:19:51.597 { 00:19:51.597 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 
00:19:51.597 "dma_device_type": 2 00:19:51.597 }, 00:19:51.597 { 00:19:51.597 "dma_device_id": "system", 00:19:51.597 "dma_device_type": 1 00:19:51.597 }, 00:19:51.597 { 00:19:51.597 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:51.597 "dma_device_type": 2 00:19:51.597 }, 00:19:51.597 { 00:19:51.597 "dma_device_id": "system", 00:19:51.597 "dma_device_type": 1 00:19:51.597 }, 00:19:51.597 { 00:19:51.597 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:51.597 "dma_device_type": 2 00:19:51.597 } 00:19:51.597 ], 00:19:51.597 "driver_specific": { 00:19:51.597 "raid": { 00:19:51.597 "uuid": "6695814e-026e-4689-a827-2664888c0240", 00:19:51.597 "strip_size_kb": 64, 00:19:51.597 "state": "online", 00:19:51.597 "raid_level": "raid0", 00:19:51.597 "superblock": true, 00:19:51.597 "num_base_bdevs": 4, 00:19:51.597 "num_base_bdevs_discovered": 4, 00:19:51.597 "num_base_bdevs_operational": 4, 00:19:51.597 "base_bdevs_list": [ 00:19:51.597 { 00:19:51.597 "name": "pt1", 00:19:51.597 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:51.597 "is_configured": true, 00:19:51.597 "data_offset": 2048, 00:19:51.597 "data_size": 63488 00:19:51.597 }, 00:19:51.597 { 00:19:51.597 "name": "pt2", 00:19:51.597 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:51.597 "is_configured": true, 00:19:51.597 "data_offset": 2048, 00:19:51.597 "data_size": 63488 00:19:51.597 }, 00:19:51.597 { 00:19:51.597 "name": "pt3", 00:19:51.597 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:51.597 "is_configured": true, 00:19:51.597 "data_offset": 2048, 00:19:51.597 "data_size": 63488 00:19:51.597 }, 00:19:51.597 { 00:19:51.597 "name": "pt4", 00:19:51.597 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:51.597 "is_configured": true, 00:19:51.597 "data_offset": 2048, 00:19:51.597 "data_size": 63488 00:19:51.597 } 00:19:51.597 ] 00:19:51.597 } 00:19:51.597 } 00:19:51.597 }' 00:19:51.597 20:33:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r 
'.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:51.855 20:33:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:19:51.855 pt2 00:19:51.855 pt3 00:19:51.855 pt4' 00:19:51.855 20:33:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:51.855 20:33:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:19:51.855 20:33:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:52.114 20:33:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:52.114 "name": "pt1", 00:19:52.114 "aliases": [ 00:19:52.114 "00000000-0000-0000-0000-000000000001" 00:19:52.114 ], 00:19:52.114 "product_name": "passthru", 00:19:52.114 "block_size": 512, 00:19:52.114 "num_blocks": 65536, 00:19:52.114 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:52.114 "assigned_rate_limits": { 00:19:52.114 "rw_ios_per_sec": 0, 00:19:52.114 "rw_mbytes_per_sec": 0, 00:19:52.114 "r_mbytes_per_sec": 0, 00:19:52.114 "w_mbytes_per_sec": 0 00:19:52.114 }, 00:19:52.114 "claimed": true, 00:19:52.114 "claim_type": "exclusive_write", 00:19:52.114 "zoned": false, 00:19:52.114 "supported_io_types": { 00:19:52.114 "read": true, 00:19:52.114 "write": true, 00:19:52.114 "unmap": true, 00:19:52.114 "flush": true, 00:19:52.114 "reset": true, 00:19:52.114 "nvme_admin": false, 00:19:52.114 "nvme_io": false, 00:19:52.114 "nvme_io_md": false, 00:19:52.114 "write_zeroes": true, 00:19:52.114 "zcopy": true, 00:19:52.114 "get_zone_info": false, 00:19:52.114 "zone_management": false, 00:19:52.114 "zone_append": false, 00:19:52.114 "compare": false, 00:19:52.114 "compare_and_write": false, 00:19:52.114 "abort": true, 00:19:52.114 "seek_hole": false, 00:19:52.114 "seek_data": false, 00:19:52.114 "copy": true, 00:19:52.114 "nvme_iov_md": 
false 00:19:52.114 }, 00:19:52.114 "memory_domains": [ 00:19:52.114 { 00:19:52.114 "dma_device_id": "system", 00:19:52.114 "dma_device_type": 1 00:19:52.114 }, 00:19:52.114 { 00:19:52.114 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:52.114 "dma_device_type": 2 00:19:52.114 } 00:19:52.114 ], 00:19:52.114 "driver_specific": { 00:19:52.114 "passthru": { 00:19:52.114 "name": "pt1", 00:19:52.114 "base_bdev_name": "malloc1" 00:19:52.114 } 00:19:52.114 } 00:19:52.114 }' 00:19:52.114 20:33:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:52.114 20:33:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:52.114 20:33:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:52.114 20:33:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:52.114 20:33:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:52.114 20:33:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:52.114 20:33:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:52.114 20:33:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:52.373 20:33:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:52.373 20:33:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:52.373 20:33:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:52.373 20:33:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:52.373 20:33:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:52.373 20:33:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:52.373 20:33:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:19:52.632 20:33:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:52.632 "name": "pt2", 00:19:52.632 "aliases": [ 00:19:52.632 "00000000-0000-0000-0000-000000000002" 00:19:52.632 ], 00:19:52.632 "product_name": "passthru", 00:19:52.632 "block_size": 512, 00:19:52.632 "num_blocks": 65536, 00:19:52.632 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:52.632 "assigned_rate_limits": { 00:19:52.632 "rw_ios_per_sec": 0, 00:19:52.632 "rw_mbytes_per_sec": 0, 00:19:52.632 "r_mbytes_per_sec": 0, 00:19:52.632 "w_mbytes_per_sec": 0 00:19:52.632 }, 00:19:52.632 "claimed": true, 00:19:52.632 "claim_type": "exclusive_write", 00:19:52.632 "zoned": false, 00:19:52.632 "supported_io_types": { 00:19:52.632 "read": true, 00:19:52.632 "write": true, 00:19:52.632 "unmap": true, 00:19:52.632 "flush": true, 00:19:52.632 "reset": true, 00:19:52.632 "nvme_admin": false, 00:19:52.632 "nvme_io": false, 00:19:52.632 "nvme_io_md": false, 00:19:52.632 "write_zeroes": true, 00:19:52.632 "zcopy": true, 00:19:52.632 "get_zone_info": false, 00:19:52.632 "zone_management": false, 00:19:52.632 "zone_append": false, 00:19:52.632 "compare": false, 00:19:52.632 "compare_and_write": false, 00:19:52.632 "abort": true, 00:19:52.632 "seek_hole": false, 00:19:52.632 "seek_data": false, 00:19:52.632 "copy": true, 00:19:52.632 "nvme_iov_md": false 00:19:52.632 }, 00:19:52.632 "memory_domains": [ 00:19:52.632 { 00:19:52.632 "dma_device_id": "system", 00:19:52.632 "dma_device_type": 1 00:19:52.632 }, 00:19:52.632 { 00:19:52.632 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:52.632 "dma_device_type": 2 00:19:52.632 } 00:19:52.632 ], 00:19:52.632 "driver_specific": { 00:19:52.632 "passthru": { 00:19:52.632 "name": "pt2", 00:19:52.632 "base_bdev_name": "malloc2" 00:19:52.632 } 00:19:52.632 } 00:19:52.632 }' 00:19:52.632 20:33:44 bdev_raid.raid_superblock_test 
-- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:52.632 20:33:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:52.632 20:33:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:52.632 20:33:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:52.632 20:33:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:52.891 20:33:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:52.891 20:33:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:52.891 20:33:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:52.891 20:33:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:52.891 20:33:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:52.891 20:33:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:52.891 20:33:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:52.891 20:33:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:52.892 20:33:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:19:52.892 20:33:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:53.151 20:33:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:53.151 "name": "pt3", 00:19:53.151 "aliases": [ 00:19:53.151 "00000000-0000-0000-0000-000000000003" 00:19:53.151 ], 00:19:53.151 "product_name": "passthru", 00:19:53.151 "block_size": 512, 00:19:53.151 "num_blocks": 65536, 00:19:53.151 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:53.151 "assigned_rate_limits": { 00:19:53.151 "rw_ios_per_sec": 0, 
00:19:53.151 "rw_mbytes_per_sec": 0, 00:19:53.151 "r_mbytes_per_sec": 0, 00:19:53.151 "w_mbytes_per_sec": 0 00:19:53.151 }, 00:19:53.151 "claimed": true, 00:19:53.151 "claim_type": "exclusive_write", 00:19:53.151 "zoned": false, 00:19:53.151 "supported_io_types": { 00:19:53.151 "read": true, 00:19:53.151 "write": true, 00:19:53.151 "unmap": true, 00:19:53.151 "flush": true, 00:19:53.151 "reset": true, 00:19:53.151 "nvme_admin": false, 00:19:53.151 "nvme_io": false, 00:19:53.151 "nvme_io_md": false, 00:19:53.151 "write_zeroes": true, 00:19:53.151 "zcopy": true, 00:19:53.151 "get_zone_info": false, 00:19:53.151 "zone_management": false, 00:19:53.151 "zone_append": false, 00:19:53.151 "compare": false, 00:19:53.151 "compare_and_write": false, 00:19:53.151 "abort": true, 00:19:53.151 "seek_hole": false, 00:19:53.151 "seek_data": false, 00:19:53.151 "copy": true, 00:19:53.151 "nvme_iov_md": false 00:19:53.151 }, 00:19:53.151 "memory_domains": [ 00:19:53.151 { 00:19:53.151 "dma_device_id": "system", 00:19:53.151 "dma_device_type": 1 00:19:53.151 }, 00:19:53.151 { 00:19:53.151 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:53.151 "dma_device_type": 2 00:19:53.151 } 00:19:53.151 ], 00:19:53.151 "driver_specific": { 00:19:53.151 "passthru": { 00:19:53.151 "name": "pt3", 00:19:53.151 "base_bdev_name": "malloc3" 00:19:53.151 } 00:19:53.151 } 00:19:53.151 }' 00:19:53.151 20:33:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:53.151 20:33:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:53.151 20:33:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:53.151 20:33:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:53.435 20:33:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:53.435 20:33:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:53.435 20:33:45 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:53.435 20:33:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:53.435 20:33:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:53.435 20:33:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:53.435 20:33:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:53.435 20:33:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:53.435 20:33:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:53.435 20:33:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:19:53.435 20:33:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:53.694 20:33:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:53.694 "name": "pt4", 00:19:53.694 "aliases": [ 00:19:53.694 "00000000-0000-0000-0000-000000000004" 00:19:53.694 ], 00:19:53.694 "product_name": "passthru", 00:19:53.694 "block_size": 512, 00:19:53.694 "num_blocks": 65536, 00:19:53.694 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:53.694 "assigned_rate_limits": { 00:19:53.694 "rw_ios_per_sec": 0, 00:19:53.694 "rw_mbytes_per_sec": 0, 00:19:53.694 "r_mbytes_per_sec": 0, 00:19:53.694 "w_mbytes_per_sec": 0 00:19:53.694 }, 00:19:53.694 "claimed": true, 00:19:53.694 "claim_type": "exclusive_write", 00:19:53.694 "zoned": false, 00:19:53.694 "supported_io_types": { 00:19:53.694 "read": true, 00:19:53.694 "write": true, 00:19:53.694 "unmap": true, 00:19:53.694 "flush": true, 00:19:53.694 "reset": true, 00:19:53.694 "nvme_admin": false, 00:19:53.694 "nvme_io": false, 00:19:53.694 "nvme_io_md": false, 00:19:53.694 "write_zeroes": true, 00:19:53.694 "zcopy": 
true, 00:19:53.694 "get_zone_info": false, 00:19:53.694 "zone_management": false, 00:19:53.694 "zone_append": false, 00:19:53.694 "compare": false, 00:19:53.694 "compare_and_write": false, 00:19:53.694 "abort": true, 00:19:53.694 "seek_hole": false, 00:19:53.694 "seek_data": false, 00:19:53.694 "copy": true, 00:19:53.694 "nvme_iov_md": false 00:19:53.694 }, 00:19:53.694 "memory_domains": [ 00:19:53.694 { 00:19:53.694 "dma_device_id": "system", 00:19:53.694 "dma_device_type": 1 00:19:53.694 }, 00:19:53.694 { 00:19:53.694 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:53.694 "dma_device_type": 2 00:19:53.694 } 00:19:53.694 ], 00:19:53.694 "driver_specific": { 00:19:53.694 "passthru": { 00:19:53.694 "name": "pt4", 00:19:53.694 "base_bdev_name": "malloc4" 00:19:53.694 } 00:19:53.694 } 00:19:53.694 }' 00:19:53.694 20:33:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:53.952 20:33:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:53.952 20:33:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:53.952 20:33:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:53.952 20:33:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:53.952 20:33:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:53.952 20:33:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:53.952 20:33:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:53.952 20:33:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:53.952 20:33:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:54.212 20:33:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:54.212 20:33:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 
00:19:54.212 20:33:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:19:54.212 20:33:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:19:54.471 [2024-07-15 20:33:46.613342] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:54.471 20:33:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 6695814e-026e-4689-a827-2664888c0240 '!=' 6695814e-026e-4689-a827-2664888c0240 ']' 00:19:54.471 20:33:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid0 00:19:54.471 20:33:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:19:54.471 20:33:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:19:54.471 20:33:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 1423311 00:19:54.471 20:33:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 1423311 ']' 00:19:54.471 20:33:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 1423311 00:19:54.471 20:33:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:19:54.471 20:33:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:19:54.471 20:33:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1423311 00:19:54.471 20:33:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:19:54.471 20:33:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:19:54.471 20:33:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1423311' 00:19:54.471 killing process with pid 1423311 00:19:54.471 20:33:46 bdev_raid.raid_superblock_test 
-- common/autotest_common.sh@967 -- # kill 1423311 00:19:54.471 [2024-07-15 20:33:46.681636] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:19:54.471 [2024-07-15 20:33:46.681694] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:54.471 [2024-07-15 20:33:46.681755] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:54.471 [2024-07-15 20:33:46.681769] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x20aa780 name raid_bdev1, state offline 00:19:54.471 20:33:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 1423311 00:19:54.471 [2024-07-15 20:33:46.717850] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:19:54.731 20:33:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:19:54.731 00:19:54.731 real 0m20.877s 00:19:54.731 user 0m38.035s 00:19:54.731 sys 0m3.431s 00:19:54.731 20:33:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:19:54.731 20:33:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:54.731 ************************************ 00:19:54.731 END TEST raid_superblock_test 00:19:54.731 ************************************ 00:19:54.731 20:33:46 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:19:54.731 20:33:46 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid0 4 read 00:19:54.731 20:33:46 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:19:54.731 20:33:46 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:19:54.731 20:33:46 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:19:54.731 ************************************ 00:19:54.731 START TEST raid_read_error_test 00:19:54.731 ************************************ 00:19:54.731 20:33:47 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 
-- # raid_io_error_test raid0 4 read 00:19:54.731 20:33:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:19:54.731 20:33:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:19:54.731 20:33:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:19:54.731 20:33:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:19:54.731 20:33:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:54.731 20:33:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:19:54.731 20:33:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:54.731 20:33:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:54.731 20:33:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:19:54.731 20:33:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:54.731 20:33:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:54.731 20:33:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:19:54.731 20:33:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:54.731 20:33:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:54.731 20:33:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:19:54.731 20:33:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:54.731 20:33:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:54.731 20:33:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:19:54.731 20:33:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local 
base_bdevs 00:19:54.731 20:33:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:19:54.731 20:33:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:19:54.731 20:33:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:19:54.731 20:33:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:19:54.731 20:33:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:19:54.731 20:33:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:19:54.732 20:33:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:19:54.732 20:33:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:19:54.732 20:33:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:19:54.732 20:33:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.6SNcZC825R 00:19:54.732 20:33:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1426273 00:19:54.732 20:33:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1426273 /var/tmp/spdk-raid.sock 00:19:54.732 20:33:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:19:54.732 20:33:47 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 1426273 ']' 00:19:54.732 20:33:47 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:19:54.732 20:33:47 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:19:54.732 20:33:47 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting 
for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:19:54.732 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:19:54.732 20:33:47 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:19:54.732 20:33:47 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:54.732 [2024-07-15 20:33:47.081427] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 00:19:54.732 [2024-07-15 20:33:47.081492] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1426273 ] 00:19:54.990 [2024-07-15 20:33:47.212145] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:54.990 [2024-07-15 20:33:47.318310] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:19:55.248 [2024-07-15 20:33:47.385824] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:55.248 [2024-07-15 20:33:47.385859] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:55.813 20:33:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:19:55.813 20:33:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:19:55.813 20:33:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:55.813 20:33:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:19:56.072 BaseBdev1_malloc 00:19:56.072 20:33:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_error_create BaseBdev1_malloc 00:19:56.330 true 00:19:56.330 20:33:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:19:56.588 [2024-07-15 20:33:48.721431] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:19:56.588 [2024-07-15 20:33:48.721473] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:56.588 [2024-07-15 20:33:48.721495] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21420d0 00:19:56.588 [2024-07-15 20:33:48.721509] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:56.588 [2024-07-15 20:33:48.723334] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:56.588 [2024-07-15 20:33:48.723362] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:19:56.588 BaseBdev1 00:19:56.588 20:33:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:56.588 20:33:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:19:56.588 BaseBdev2_malloc 00:19:56.846 20:33:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:19:56.847 true 00:19:56.847 20:33:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:19:57.105 [2024-07-15 20:33:49.449200] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 
00:19:57.105 [2024-07-15 20:33:49.449244] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:57.105 [2024-07-15 20:33:49.449267] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2146910 00:19:57.105 [2024-07-15 20:33:49.449280] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:57.105 [2024-07-15 20:33:49.450877] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:57.105 [2024-07-15 20:33:49.450903] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:19:57.105 BaseBdev2 00:19:57.105 20:33:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:57.105 20:33:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:19:57.364 BaseBdev3_malloc 00:19:57.364 20:33:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:19:57.622 true 00:19:57.622 20:33:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:19:57.882 [2024-07-15 20:33:50.185032] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:19:57.882 [2024-07-15 20:33:50.185079] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:57.882 [2024-07-15 20:33:50.185101] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2148bd0 00:19:57.882 [2024-07-15 20:33:50.185114] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:57.882 [2024-07-15 20:33:50.186691] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:57.882 [2024-07-15 20:33:50.186720] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:19:57.882 BaseBdev3 00:19:57.882 20:33:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:57.882 20:33:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:19:58.141 BaseBdev4_malloc 00:19:58.141 20:33:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:19:58.399 true 00:19:58.399 20:33:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:19:58.658 [2024-07-15 20:33:50.919525] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:19:58.658 [2024-07-15 20:33:50.919569] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:58.658 [2024-07-15 20:33:50.919593] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2149aa0 00:19:58.658 [2024-07-15 20:33:50.919606] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:58.658 [2024-07-15 20:33:50.921207] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:58.658 [2024-07-15 20:33:50.921235] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:19:58.658 BaseBdev4 00:19:58.658 20:33:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 
'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:19:58.916 [2024-07-15 20:33:51.152182] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:58.916 [2024-07-15 20:33:51.153571] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:58.916 [2024-07-15 20:33:51.153640] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:58.916 [2024-07-15 20:33:51.153701] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:19:58.916 [2024-07-15 20:33:51.153942] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2143c20 00:19:58.916 [2024-07-15 20:33:51.153954] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:19:58.916 [2024-07-15 20:33:51.154159] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f98260 00:19:58.916 [2024-07-15 20:33:51.154313] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2143c20 00:19:58.916 [2024-07-15 20:33:51.154323] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2143c20 00:19:58.916 [2024-07-15 20:33:51.154430] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:58.916 20:33:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:19:58.916 20:33:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:58.916 20:33:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:58.916 20:33:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:58.916 20:33:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:58.916 20:33:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=4 00:19:58.916 20:33:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:58.916 20:33:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:58.916 20:33:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:58.916 20:33:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:58.916 20:33:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:58.916 20:33:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:59.174 20:33:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:59.174 "name": "raid_bdev1", 00:19:59.174 "uuid": "1bfede80-e669-4ec8-b747-9401a2224441", 00:19:59.174 "strip_size_kb": 64, 00:19:59.174 "state": "online", 00:19:59.174 "raid_level": "raid0", 00:19:59.174 "superblock": true, 00:19:59.174 "num_base_bdevs": 4, 00:19:59.174 "num_base_bdevs_discovered": 4, 00:19:59.174 "num_base_bdevs_operational": 4, 00:19:59.174 "base_bdevs_list": [ 00:19:59.174 { 00:19:59.174 "name": "BaseBdev1", 00:19:59.174 "uuid": "d42aa26d-4e9f-53dc-bb6c-9c00f1666147", 00:19:59.174 "is_configured": true, 00:19:59.174 "data_offset": 2048, 00:19:59.174 "data_size": 63488 00:19:59.174 }, 00:19:59.174 { 00:19:59.174 "name": "BaseBdev2", 00:19:59.174 "uuid": "0fb44753-4255-527c-9d8e-0253e2d31ab2", 00:19:59.174 "is_configured": true, 00:19:59.174 "data_offset": 2048, 00:19:59.174 "data_size": 63488 00:19:59.174 }, 00:19:59.174 { 00:19:59.174 "name": "BaseBdev3", 00:19:59.174 "uuid": "6774ad0d-acc7-56d5-b71c-385ab971a594", 00:19:59.174 "is_configured": true, 00:19:59.174 "data_offset": 2048, 00:19:59.174 "data_size": 63488 00:19:59.174 }, 00:19:59.174 { 00:19:59.174 "name": "BaseBdev4", 
00:19:59.174 "uuid": "fd327e9d-2aab-53c8-946b-da66201037c0", 00:19:59.174 "is_configured": true, 00:19:59.174 "data_offset": 2048, 00:19:59.174 "data_size": 63488 00:19:59.174 } 00:19:59.174 ] 00:19:59.174 }' 00:19:59.174 20:33:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:59.174 20:33:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:59.741 20:33:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:19:59.741 20:33:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:20:00.000 [2024-07-15 20:33:52.155108] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2135fc0 00:20:00.932 20:33:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:20:00.932 20:33:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:20:00.932 20:33:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:20:00.932 20:33:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:20:00.932 20:33:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:20:00.932 20:33:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:00.932 20:33:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:00.932 20:33:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:20:00.932 20:33:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:00.932 20:33:53 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:00.932 20:33:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:00.932 20:33:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:00.932 20:33:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:00.932 20:33:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:00.932 20:33:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:00.932 20:33:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:01.189 20:33:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:01.189 "name": "raid_bdev1", 00:20:01.189 "uuid": "1bfede80-e669-4ec8-b747-9401a2224441", 00:20:01.189 "strip_size_kb": 64, 00:20:01.189 "state": "online", 00:20:01.189 "raid_level": "raid0", 00:20:01.189 "superblock": true, 00:20:01.189 "num_base_bdevs": 4, 00:20:01.189 "num_base_bdevs_discovered": 4, 00:20:01.189 "num_base_bdevs_operational": 4, 00:20:01.189 "base_bdevs_list": [ 00:20:01.189 { 00:20:01.189 "name": "BaseBdev1", 00:20:01.189 "uuid": "d42aa26d-4e9f-53dc-bb6c-9c00f1666147", 00:20:01.189 "is_configured": true, 00:20:01.189 "data_offset": 2048, 00:20:01.189 "data_size": 63488 00:20:01.189 }, 00:20:01.189 { 00:20:01.189 "name": "BaseBdev2", 00:20:01.189 "uuid": "0fb44753-4255-527c-9d8e-0253e2d31ab2", 00:20:01.189 "is_configured": true, 00:20:01.189 "data_offset": 2048, 00:20:01.189 "data_size": 63488 00:20:01.189 }, 00:20:01.189 { 00:20:01.189 "name": "BaseBdev3", 00:20:01.189 "uuid": "6774ad0d-acc7-56d5-b71c-385ab971a594", 00:20:01.189 "is_configured": true, 00:20:01.189 "data_offset": 2048, 00:20:01.189 "data_size": 63488 
00:20:01.189 }, 00:20:01.189 { 00:20:01.189 "name": "BaseBdev4", 00:20:01.189 "uuid": "fd327e9d-2aab-53c8-946b-da66201037c0", 00:20:01.189 "is_configured": true, 00:20:01.189 "data_offset": 2048, 00:20:01.189 "data_size": 63488 00:20:01.189 } 00:20:01.189 ] 00:20:01.189 }' 00:20:01.189 20:33:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:01.189 20:33:53 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:02.143 20:33:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:20:02.143 [2024-07-15 20:33:54.473279] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:20:02.143 [2024-07-15 20:33:54.473318] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:02.143 [2024-07-15 20:33:54.476487] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:02.143 [2024-07-15 20:33:54.476526] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:02.143 [2024-07-15 20:33:54.476567] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:02.143 [2024-07-15 20:33:54.476579] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2143c20 name raid_bdev1, state offline 00:20:02.143 0 00:20:02.143 20:33:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1426273 00:20:02.143 20:33:54 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 1426273 ']' 00:20:02.143 20:33:54 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 1426273 00:20:02.143 20:33:54 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:20:02.143 20:33:54 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 
00:20:02.143 20:33:54 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1426273 00:20:02.402 20:33:54 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:20:02.402 20:33:54 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:20:02.402 20:33:54 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1426273' 00:20:02.402 killing process with pid 1426273 00:20:02.402 20:33:54 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 1426273 00:20:02.402 [2024-07-15 20:33:54.544769] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:02.402 20:33:54 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 1426273 00:20:02.402 [2024-07-15 20:33:54.579494] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:02.660 20:33:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.6SNcZC825R 00:20:02.660 20:33:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:20:02.660 20:33:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:20:02.660 20:33:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.43 00:20:02.660 20:33:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:20:02.661 20:33:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:20:02.661 20:33:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:20:02.661 20:33:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.43 != \0\.\0\0 ]] 00:20:02.661 00:20:02.661 real 0m7.817s 00:20:02.661 user 0m12.532s 00:20:02.661 sys 0m1.375s 00:20:02.661 20:33:54 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:20:02.661 20:33:54 
bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:02.661 ************************************ 00:20:02.661 END TEST raid_read_error_test 00:20:02.661 ************************************ 00:20:02.661 20:33:54 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:20:02.661 20:33:54 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid0 4 write 00:20:02.661 20:33:54 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:20:02.661 20:33:54 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:20:02.661 20:33:54 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:02.661 ************************************ 00:20:02.661 START TEST raid_write_error_test 00:20:02.661 ************************************ 00:20:02.661 20:33:54 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 4 write 00:20:02.661 20:33:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:20:02.661 20:33:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:20:02.661 20:33:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:20:02.661 20:33:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:20:02.661 20:33:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:02.661 20:33:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:20:02.661 20:33:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:20:02.661 20:33:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:02.661 20:33:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:20:02.661 20:33:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:20:02.661 20:33:54 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:02.661 20:33:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:20:02.661 20:33:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:20:02.661 20:33:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:02.661 20:33:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:20:02.661 20:33:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:20:02.661 20:33:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:02.661 20:33:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:20:02.661 20:33:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:20:02.661 20:33:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:20:02.661 20:33:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:20:02.661 20:33:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:20:02.661 20:33:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:20:02.661 20:33:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:20:02.661 20:33:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:20:02.661 20:33:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:20:02.661 20:33:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:20:02.661 20:33:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:20:02.661 20:33:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.oaWWIYa8Rj 
00:20:02.661 20:33:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1427418 00:20:02.661 20:33:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1427418 /var/tmp/spdk-raid.sock 00:20:02.661 20:33:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:20:02.661 20:33:54 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 1427418 ']' 00:20:02.661 20:33:54 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:02.661 20:33:54 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:20:02.661 20:33:54 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:20:02.661 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:20:02.661 20:33:54 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:20:02.661 20:33:54 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:02.661 [2024-07-15 20:33:55.028158] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
00:20:02.661 [2024-07-15 20:33:55.028303] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1427418 ] 00:20:02.920 [2024-07-15 20:33:55.225484] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:03.177 [2024-07-15 20:33:55.328451] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:20:03.177 [2024-07-15 20:33:55.391982] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:03.177 [2024-07-15 20:33:55.392017] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:03.743 20:33:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:20:03.743 20:33:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:20:03.743 20:33:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:20:03.743 20:33:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:20:04.001 BaseBdev1_malloc 00:20:04.001 20:33:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:20:04.001 true 00:20:04.260 20:33:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:20:04.260 [2024-07-15 20:33:56.618516] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:20:04.260 [2024-07-15 20:33:56.618559] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base 
bdev opened 00:20:04.260 [2024-07-15 20:33:56.618581] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfb20d0 00:20:04.260 [2024-07-15 20:33:56.618594] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:04.260 [2024-07-15 20:33:56.620481] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:04.260 [2024-07-15 20:33:56.620510] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:20:04.260 BaseBdev1 00:20:04.260 20:33:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:20:04.260 20:33:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:20:04.519 BaseBdev2_malloc 00:20:04.519 20:33:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:20:04.777 true 00:20:04.777 20:33:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:20:05.037 [2024-07-15 20:33:57.377011] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:20:05.037 [2024-07-15 20:33:57.377054] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:05.037 [2024-07-15 20:33:57.377076] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfb6910 00:20:05.037 [2024-07-15 20:33:57.377089] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:05.037 [2024-07-15 20:33:57.378683] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:05.037 [2024-07-15 20:33:57.378711] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:20:05.037 BaseBdev2 00:20:05.037 20:33:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:20:05.037 20:33:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:20:05.296 BaseBdev3_malloc 00:20:05.296 20:33:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:20:05.555 true 00:20:05.555 20:33:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:20:05.814 [2024-07-15 20:33:58.111527] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:20:05.814 [2024-07-15 20:33:58.111571] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:05.814 [2024-07-15 20:33:58.111590] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfb8bd0 00:20:05.814 [2024-07-15 20:33:58.111603] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:05.814 [2024-07-15 20:33:58.113175] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:05.814 [2024-07-15 20:33:58.113203] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:20:05.814 BaseBdev3 00:20:05.814 20:33:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:20:05.814 20:33:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:20:06.073 BaseBdev4_malloc 00:20:06.074 20:33:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:20:06.333 true 00:20:06.333 20:33:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:20:06.592 [2024-07-15 20:33:58.855303] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:20:06.592 [2024-07-15 20:33:58.855346] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:06.592 [2024-07-15 20:33:58.855368] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfb9aa0 00:20:06.592 [2024-07-15 20:33:58.855380] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:06.592 [2024-07-15 20:33:58.856979] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:06.592 [2024-07-15 20:33:58.857006] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:20:06.592 BaseBdev4 00:20:06.592 20:33:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:20:06.851 [2024-07-15 20:33:59.099989] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:06.851 [2024-07-15 20:33:59.101363] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:06.851 [2024-07-15 20:33:59.101430] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:06.851 [2024-07-15 20:33:59.101493] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:06.851 [2024-07-15 20:33:59.101725] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xfb3c20 00:20:06.851 [2024-07-15 20:33:59.101737] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:20:06.851 [2024-07-15 20:33:59.101948] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe08260 00:20:06.851 [2024-07-15 20:33:59.102101] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xfb3c20 00:20:06.851 [2024-07-15 20:33:59.102111] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xfb3c20 00:20:06.851 [2024-07-15 20:33:59.102217] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:06.851 20:33:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:20:06.851 20:33:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:06.851 20:33:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:06.851 20:33:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:20:06.851 20:33:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:06.851 20:33:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:06.851 20:33:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:06.851 20:33:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:06.851 20:33:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:06.851 20:33:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:06.851 20:33:59 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:06.851 20:33:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:07.111 20:33:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:07.111 "name": "raid_bdev1", 00:20:07.111 "uuid": "6186db1a-cd04-4a90-83e0-c8190944f5d2", 00:20:07.111 "strip_size_kb": 64, 00:20:07.111 "state": "online", 00:20:07.111 "raid_level": "raid0", 00:20:07.111 "superblock": true, 00:20:07.111 "num_base_bdevs": 4, 00:20:07.111 "num_base_bdevs_discovered": 4, 00:20:07.111 "num_base_bdevs_operational": 4, 00:20:07.111 "base_bdevs_list": [ 00:20:07.111 { 00:20:07.111 "name": "BaseBdev1", 00:20:07.111 "uuid": "3b4374a3-2416-5b74-b444-78304bcc8307", 00:20:07.111 "is_configured": true, 00:20:07.111 "data_offset": 2048, 00:20:07.111 "data_size": 63488 00:20:07.111 }, 00:20:07.111 { 00:20:07.111 "name": "BaseBdev2", 00:20:07.111 "uuid": "af548776-6961-50a7-9158-44447d3b73e5", 00:20:07.111 "is_configured": true, 00:20:07.111 "data_offset": 2048, 00:20:07.111 "data_size": 63488 00:20:07.111 }, 00:20:07.111 { 00:20:07.111 "name": "BaseBdev3", 00:20:07.111 "uuid": "30552200-bcc9-531b-b1c3-c2e8066b87ee", 00:20:07.111 "is_configured": true, 00:20:07.111 "data_offset": 2048, 00:20:07.111 "data_size": 63488 00:20:07.111 }, 00:20:07.111 { 00:20:07.111 "name": "BaseBdev4", 00:20:07.111 "uuid": "adb49b45-11be-5705-8f36-dd7028e3f284", 00:20:07.111 "is_configured": true, 00:20:07.111 "data_offset": 2048, 00:20:07.111 "data_size": 63488 00:20:07.111 } 00:20:07.111 ] 00:20:07.111 }' 00:20:07.111 20:33:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:07.111 20:33:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:07.679 20:34:00 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@824 -- # sleep 1 00:20:07.679 20:34:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:20:07.947 [2024-07-15 20:34:00.155055] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xfa5fc0 00:20:08.886 20:34:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:20:09.145 20:34:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:20:09.145 20:34:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:20:09.145 20:34:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:20:09.145 20:34:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:20:09.145 20:34:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:09.145 20:34:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:09.145 20:34:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:20:09.145 20:34:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:09.145 20:34:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:09.145 20:34:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:09.145 20:34:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:09.145 20:34:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:09.145 20:34:01 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:09.145 20:34:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:09.145 20:34:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:09.404 20:34:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:09.404 "name": "raid_bdev1", 00:20:09.404 "uuid": "6186db1a-cd04-4a90-83e0-c8190944f5d2", 00:20:09.404 "strip_size_kb": 64, 00:20:09.404 "state": "online", 00:20:09.404 "raid_level": "raid0", 00:20:09.404 "superblock": true, 00:20:09.404 "num_base_bdevs": 4, 00:20:09.404 "num_base_bdevs_discovered": 4, 00:20:09.404 "num_base_bdevs_operational": 4, 00:20:09.404 "base_bdevs_list": [ 00:20:09.404 { 00:20:09.404 "name": "BaseBdev1", 00:20:09.404 "uuid": "3b4374a3-2416-5b74-b444-78304bcc8307", 00:20:09.404 "is_configured": true, 00:20:09.404 "data_offset": 2048, 00:20:09.404 "data_size": 63488 00:20:09.404 }, 00:20:09.404 { 00:20:09.404 "name": "BaseBdev2", 00:20:09.404 "uuid": "af548776-6961-50a7-9158-44447d3b73e5", 00:20:09.404 "is_configured": true, 00:20:09.404 "data_offset": 2048, 00:20:09.404 "data_size": 63488 00:20:09.404 }, 00:20:09.404 { 00:20:09.404 "name": "BaseBdev3", 00:20:09.404 "uuid": "30552200-bcc9-531b-b1c3-c2e8066b87ee", 00:20:09.404 "is_configured": true, 00:20:09.404 "data_offset": 2048, 00:20:09.404 "data_size": 63488 00:20:09.404 }, 00:20:09.404 { 00:20:09.404 "name": "BaseBdev4", 00:20:09.404 "uuid": "adb49b45-11be-5705-8f36-dd7028e3f284", 00:20:09.404 "is_configured": true, 00:20:09.404 "data_offset": 2048, 00:20:09.404 "data_size": 63488 00:20:09.404 } 00:20:09.404 ] 00:20:09.404 }' 00:20:09.404 20:34:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:09.404 20:34:01 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@10 -- # set +x 00:20:09.973 20:34:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:20:10.232 [2024-07-15 20:34:02.384751] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:20:10.232 [2024-07-15 20:34:02.384789] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:10.232 [2024-07-15 20:34:02.387947] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:10.233 [2024-07-15 20:34:02.387987] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:10.233 [2024-07-15 20:34:02.388028] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:10.233 [2024-07-15 20:34:02.388040] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xfb3c20 name raid_bdev1, state offline 00:20:10.233 0 00:20:10.233 20:34:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1427418 00:20:10.233 20:34:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 1427418 ']' 00:20:10.233 20:34:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 1427418 00:20:10.233 20:34:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:20:10.233 20:34:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:20:10.233 20:34:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1427418 00:20:10.233 20:34:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:20:10.233 20:34:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:20:10.233 20:34:02 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@966 -- # echo 'killing process with pid 1427418' 00:20:10.233 killing process with pid 1427418 00:20:10.233 20:34:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 1427418 00:20:10.233 [2024-07-15 20:34:02.450942] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:10.233 20:34:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 1427418 00:20:10.233 [2024-07-15 20:34:02.482726] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:10.493 20:34:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.oaWWIYa8Rj 00:20:10.493 20:34:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:20:10.493 20:34:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:20:10.493 20:34:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.45 00:20:10.493 20:34:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:20:10.493 20:34:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:20:10.493 20:34:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:20:10.493 20:34:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.45 != \0\.\0\0 ]] 00:20:10.493 00:20:10.493 real 0m7.818s 00:20:10.493 user 0m12.560s 00:20:10.493 sys 0m1.398s 00:20:10.493 20:34:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:20:10.493 20:34:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:10.493 ************************************ 00:20:10.493 END TEST raid_write_error_test 00:20:10.493 ************************************ 00:20:10.493 20:34:02 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:20:10.493 20:34:02 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:20:10.493 
20:34:02 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test concat 4 false 00:20:10.493 20:34:02 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:20:10.493 20:34:02 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:20:10.493 20:34:02 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:10.493 ************************************ 00:20:10.493 START TEST raid_state_function_test 00:20:10.493 ************************************ 00:20:10.493 20:34:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 4 false 00:20:10.493 20:34:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:20:10.493 20:34:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:20:10.493 20:34:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:20:10.493 20:34:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:20:10.493 20:34:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:20:10.493 20:34:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:10.493 20:34:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:20:10.493 20:34:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:10.493 20:34:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:10.493 20:34:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:20:10.493 20:34:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:10.493 20:34:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:10.493 20:34:02 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:20:10.493 20:34:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:10.493 20:34:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:10.493 20:34:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:20:10.493 20:34:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:10.493 20:34:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:10.493 20:34:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:20:10.493 20:34:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:20:10.493 20:34:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:20:10.493 20:34:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:20:10.493 20:34:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:20:10.493 20:34:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:20:10.493 20:34:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:20:10.493 20:34:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:20:10.493 20:34:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:20:10.493 20:34:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:20:10.493 20:34:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:20:10.493 20:34:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=1428552 00:20:10.493 20:34:02 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1428552' 00:20:10.493 Process raid pid: 1428552 00:20:10.493 20:34:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:20:10.493 20:34:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 1428552 /var/tmp/spdk-raid.sock 00:20:10.493 20:34:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 1428552 ']' 00:20:10.493 20:34:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:10.493 20:34:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:20:10.493 20:34:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:20:10.493 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:20:10.493 20:34:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:20:10.493 20:34:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:10.753 [2024-07-15 20:34:02.886732] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
00:20:10.753 [2024-07-15 20:34:02.886810] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:20:10.753 [2024-07-15 20:34:03.017968] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:10.753 [2024-07-15 20:34:03.122824] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:20:11.012 [2024-07-15 20:34:03.183563] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:11.012 [2024-07-15 20:34:03.183601] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:11.579 20:34:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:20:11.579 20:34:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:20:11.579 20:34:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:11.838 [2024-07-15 20:34:04.036201] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:20:11.838 [2024-07-15 20:34:04.036247] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:20:11.838 [2024-07-15 20:34:04.036258] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:20:11.838 [2024-07-15 20:34:04.036270] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:20:11.838 [2024-07-15 20:34:04.036279] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:20:11.838 [2024-07-15 20:34:04.036289] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 
00:20:11.838 [2024-07-15 20:34:04.036298] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:20:11.838 [2024-07-15 20:34:04.036309] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:20:11.838 20:34:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:11.838 20:34:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:11.838 20:34:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:11.838 20:34:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:11.838 20:34:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:11.838 20:34:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:11.838 20:34:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:11.838 20:34:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:11.838 20:34:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:11.838 20:34:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:11.838 20:34:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:11.838 20:34:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:12.097 20:34:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:12.097 "name": "Existed_Raid", 00:20:12.097 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:12.097 "strip_size_kb": 64, 
00:20:12.097 "state": "configuring", 00:20:12.097 "raid_level": "concat", 00:20:12.097 "superblock": false, 00:20:12.097 "num_base_bdevs": 4, 00:20:12.097 "num_base_bdevs_discovered": 0, 00:20:12.097 "num_base_bdevs_operational": 4, 00:20:12.097 "base_bdevs_list": [ 00:20:12.097 { 00:20:12.097 "name": "BaseBdev1", 00:20:12.097 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:12.097 "is_configured": false, 00:20:12.097 "data_offset": 0, 00:20:12.097 "data_size": 0 00:20:12.097 }, 00:20:12.097 { 00:20:12.097 "name": "BaseBdev2", 00:20:12.097 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:12.097 "is_configured": false, 00:20:12.097 "data_offset": 0, 00:20:12.097 "data_size": 0 00:20:12.097 }, 00:20:12.097 { 00:20:12.097 "name": "BaseBdev3", 00:20:12.097 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:12.097 "is_configured": false, 00:20:12.097 "data_offset": 0, 00:20:12.097 "data_size": 0 00:20:12.097 }, 00:20:12.097 { 00:20:12.097 "name": "BaseBdev4", 00:20:12.097 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:12.097 "is_configured": false, 00:20:12.097 "data_offset": 0, 00:20:12.097 "data_size": 0 00:20:12.097 } 00:20:12.097 ] 00:20:12.097 }' 00:20:12.097 20:34:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:12.097 20:34:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:12.666 20:34:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:20:12.925 [2024-07-15 20:34:05.082847] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:20:12.925 [2024-07-15 20:34:05.082879] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1cb3aa0 name Existed_Raid, state configuring 00:20:12.925 20:34:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:12.925 [2024-07-15 20:34:05.267360] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:20:12.925 [2024-07-15 20:34:05.267389] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:20:12.925 [2024-07-15 20:34:05.267398] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:20:12.925 [2024-07-15 20:34:05.267409] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:20:12.925 [2024-07-15 20:34:05.267418] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:20:12.925 [2024-07-15 20:34:05.267429] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:20:12.925 [2024-07-15 20:34:05.267437] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:20:12.925 [2024-07-15 20:34:05.267448] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:20:12.925 20:34:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:20:13.185 [2024-07-15 20:34:05.538067] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:13.185 BaseBdev1 00:20:13.445 20:34:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:20:13.445 20:34:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:20:13.445 20:34:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:13.445 20:34:05 bdev_raid.raid_state_function_test 
-- common/autotest_common.sh@899 -- # local i 00:20:13.445 20:34:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:13.445 20:34:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:13.445 20:34:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:13.703 20:34:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:20:13.703 [ 00:20:13.703 { 00:20:13.703 "name": "BaseBdev1", 00:20:13.703 "aliases": [ 00:20:13.703 "569ae314-35b0-4a50-bd37-ac2a38f0eaf2" 00:20:13.703 ], 00:20:13.703 "product_name": "Malloc disk", 00:20:13.703 "block_size": 512, 00:20:13.703 "num_blocks": 65536, 00:20:13.703 "uuid": "569ae314-35b0-4a50-bd37-ac2a38f0eaf2", 00:20:13.703 "assigned_rate_limits": { 00:20:13.703 "rw_ios_per_sec": 0, 00:20:13.703 "rw_mbytes_per_sec": 0, 00:20:13.703 "r_mbytes_per_sec": 0, 00:20:13.703 "w_mbytes_per_sec": 0 00:20:13.703 }, 00:20:13.703 "claimed": true, 00:20:13.703 "claim_type": "exclusive_write", 00:20:13.703 "zoned": false, 00:20:13.703 "supported_io_types": { 00:20:13.703 "read": true, 00:20:13.703 "write": true, 00:20:13.703 "unmap": true, 00:20:13.703 "flush": true, 00:20:13.703 "reset": true, 00:20:13.703 "nvme_admin": false, 00:20:13.703 "nvme_io": false, 00:20:13.703 "nvme_io_md": false, 00:20:13.703 "write_zeroes": true, 00:20:13.703 "zcopy": true, 00:20:13.703 "get_zone_info": false, 00:20:13.703 "zone_management": false, 00:20:13.703 "zone_append": false, 00:20:13.703 "compare": false, 00:20:13.703 "compare_and_write": false, 00:20:13.703 "abort": true, 00:20:13.703 "seek_hole": false, 00:20:13.703 "seek_data": false, 00:20:13.703 "copy": true, 00:20:13.703 "nvme_iov_md": 
false 00:20:13.703 }, 00:20:13.703 "memory_domains": [ 00:20:13.703 { 00:20:13.703 "dma_device_id": "system", 00:20:13.703 "dma_device_type": 1 00:20:13.703 }, 00:20:13.703 { 00:20:13.703 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:13.703 "dma_device_type": 2 00:20:13.704 } 00:20:13.704 ], 00:20:13.704 "driver_specific": {} 00:20:13.704 } 00:20:13.704 ] 00:20:13.704 20:34:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:20:13.704 20:34:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:13.704 20:34:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:13.704 20:34:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:13.704 20:34:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:13.704 20:34:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:13.704 20:34:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:13.704 20:34:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:13.704 20:34:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:13.704 20:34:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:13.704 20:34:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:13.704 20:34:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:13.704 20:34:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:13.962 20:34:06 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:13.962 "name": "Existed_Raid", 00:20:13.962 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:13.962 "strip_size_kb": 64, 00:20:13.962 "state": "configuring", 00:20:13.962 "raid_level": "concat", 00:20:13.962 "superblock": false, 00:20:13.962 "num_base_bdevs": 4, 00:20:13.962 "num_base_bdevs_discovered": 1, 00:20:13.962 "num_base_bdevs_operational": 4, 00:20:13.962 "base_bdevs_list": [ 00:20:13.962 { 00:20:13.962 "name": "BaseBdev1", 00:20:13.962 "uuid": "569ae314-35b0-4a50-bd37-ac2a38f0eaf2", 00:20:13.962 "is_configured": true, 00:20:13.962 "data_offset": 0, 00:20:13.962 "data_size": 65536 00:20:13.962 }, 00:20:13.962 { 00:20:13.962 "name": "BaseBdev2", 00:20:13.962 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:13.962 "is_configured": false, 00:20:13.962 "data_offset": 0, 00:20:13.962 "data_size": 0 00:20:13.962 }, 00:20:13.962 { 00:20:13.962 "name": "BaseBdev3", 00:20:13.962 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:13.962 "is_configured": false, 00:20:13.962 "data_offset": 0, 00:20:13.962 "data_size": 0 00:20:13.962 }, 00:20:13.962 { 00:20:13.962 "name": "BaseBdev4", 00:20:13.962 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:13.962 "is_configured": false, 00:20:13.962 "data_offset": 0, 00:20:13.962 "data_size": 0 00:20:13.962 } 00:20:13.962 ] 00:20:13.962 }' 00:20:13.962 20:34:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:13.962 20:34:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:14.529 20:34:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:20:14.788 [2024-07-15 20:34:06.965861] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:20:14.788 [2024-07-15 20:34:06.965906] bdev_raid.c: 
366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1cb3310 name Existed_Raid, state configuring 00:20:14.788 20:34:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:15.047 [2024-07-15 20:34:07.214557] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:15.047 [2024-07-15 20:34:07.216012] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:20:15.047 [2024-07-15 20:34:07.216047] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:20:15.047 [2024-07-15 20:34:07.216057] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:20:15.047 [2024-07-15 20:34:07.216069] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:20:15.047 [2024-07-15 20:34:07.216078] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:20:15.047 [2024-07-15 20:34:07.216089] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:20:15.047 20:34:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:20:15.047 20:34:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:15.047 20:34:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:15.047 20:34:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:15.047 20:34:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:15.047 20:34:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 
00:20:15.047 20:34:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:15.047 20:34:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:15.047 20:34:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:15.047 20:34:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:15.047 20:34:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:15.047 20:34:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:15.047 20:34:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:15.047 20:34:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:15.306 20:34:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:15.306 "name": "Existed_Raid", 00:20:15.306 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:15.306 "strip_size_kb": 64, 00:20:15.306 "state": "configuring", 00:20:15.306 "raid_level": "concat", 00:20:15.306 "superblock": false, 00:20:15.306 "num_base_bdevs": 4, 00:20:15.306 "num_base_bdevs_discovered": 1, 00:20:15.306 "num_base_bdevs_operational": 4, 00:20:15.306 "base_bdevs_list": [ 00:20:15.306 { 00:20:15.306 "name": "BaseBdev1", 00:20:15.306 "uuid": "569ae314-35b0-4a50-bd37-ac2a38f0eaf2", 00:20:15.306 "is_configured": true, 00:20:15.306 "data_offset": 0, 00:20:15.306 "data_size": 65536 00:20:15.306 }, 00:20:15.306 { 00:20:15.306 "name": "BaseBdev2", 00:20:15.306 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:15.306 "is_configured": false, 00:20:15.306 "data_offset": 0, 00:20:15.306 "data_size": 0 00:20:15.306 }, 00:20:15.306 { 00:20:15.306 "name": "BaseBdev3", 
00:20:15.306 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:15.306 "is_configured": false, 00:20:15.306 "data_offset": 0, 00:20:15.306 "data_size": 0 00:20:15.306 }, 00:20:15.306 { 00:20:15.306 "name": "BaseBdev4", 00:20:15.306 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:15.306 "is_configured": false, 00:20:15.306 "data_offset": 0, 00:20:15.306 "data_size": 0 00:20:15.306 } 00:20:15.306 ] 00:20:15.306 }' 00:20:15.306 20:34:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:15.306 20:34:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:15.872 20:34:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:20:16.131 [2024-07-15 20:34:08.326125] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:16.131 BaseBdev2 00:20:16.131 20:34:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:20:16.131 20:34:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:20:16.131 20:34:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:16.131 20:34:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:20:16.131 20:34:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:16.131 20:34:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:16.131 20:34:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:16.700 20:34:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:20:16.959 [ 00:20:16.959 { 00:20:16.959 "name": "BaseBdev2", 00:20:16.959 "aliases": [ 00:20:16.959 "44d78bab-2df7-4d7e-afeb-f7a323b61bc2" 00:20:16.959 ], 00:20:16.959 "product_name": "Malloc disk", 00:20:16.959 "block_size": 512, 00:20:16.959 "num_blocks": 65536, 00:20:16.959 "uuid": "44d78bab-2df7-4d7e-afeb-f7a323b61bc2", 00:20:16.959 "assigned_rate_limits": { 00:20:16.959 "rw_ios_per_sec": 0, 00:20:16.959 "rw_mbytes_per_sec": 0, 00:20:16.959 "r_mbytes_per_sec": 0, 00:20:16.959 "w_mbytes_per_sec": 0 00:20:16.959 }, 00:20:16.959 "claimed": true, 00:20:16.959 "claim_type": "exclusive_write", 00:20:16.959 "zoned": false, 00:20:16.959 "supported_io_types": { 00:20:16.959 "read": true, 00:20:16.959 "write": true, 00:20:16.959 "unmap": true, 00:20:16.959 "flush": true, 00:20:16.959 "reset": true, 00:20:16.959 "nvme_admin": false, 00:20:16.959 "nvme_io": false, 00:20:16.959 "nvme_io_md": false, 00:20:16.959 "write_zeroes": true, 00:20:16.959 "zcopy": true, 00:20:16.959 "get_zone_info": false, 00:20:16.959 "zone_management": false, 00:20:16.959 "zone_append": false, 00:20:16.959 "compare": false, 00:20:16.959 "compare_and_write": false, 00:20:16.959 "abort": true, 00:20:16.959 "seek_hole": false, 00:20:16.959 "seek_data": false, 00:20:16.959 "copy": true, 00:20:16.959 "nvme_iov_md": false 00:20:16.959 }, 00:20:16.959 "memory_domains": [ 00:20:16.959 { 00:20:16.959 "dma_device_id": "system", 00:20:16.959 "dma_device_type": 1 00:20:16.959 }, 00:20:16.959 { 00:20:16.959 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:16.959 "dma_device_type": 2 00:20:16.959 } 00:20:16.959 ], 00:20:16.959 "driver_specific": {} 00:20:16.959 } 00:20:16.959 ] 00:20:16.959 20:34:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:20:16.959 20:34:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 
00:20:16.959 20:34:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:16.959 20:34:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:16.959 20:34:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:16.959 20:34:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:16.959 20:34:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:16.959 20:34:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:16.959 20:34:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:16.959 20:34:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:16.959 20:34:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:16.959 20:34:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:16.959 20:34:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:16.960 20:34:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:16.960 20:34:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:17.219 20:34:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:17.219 "name": "Existed_Raid", 00:20:17.219 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:17.219 "strip_size_kb": 64, 00:20:17.219 "state": "configuring", 00:20:17.219 "raid_level": "concat", 00:20:17.219 "superblock": false, 00:20:17.219 "num_base_bdevs": 4, 00:20:17.219 
"num_base_bdevs_discovered": 2, 00:20:17.219 "num_base_bdevs_operational": 4, 00:20:17.219 "base_bdevs_list": [ 00:20:17.219 { 00:20:17.219 "name": "BaseBdev1", 00:20:17.219 "uuid": "569ae314-35b0-4a50-bd37-ac2a38f0eaf2", 00:20:17.219 "is_configured": true, 00:20:17.219 "data_offset": 0, 00:20:17.219 "data_size": 65536 00:20:17.219 }, 00:20:17.219 { 00:20:17.219 "name": "BaseBdev2", 00:20:17.219 "uuid": "44d78bab-2df7-4d7e-afeb-f7a323b61bc2", 00:20:17.219 "is_configured": true, 00:20:17.219 "data_offset": 0, 00:20:17.219 "data_size": 65536 00:20:17.219 }, 00:20:17.219 { 00:20:17.219 "name": "BaseBdev3", 00:20:17.219 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:17.219 "is_configured": false, 00:20:17.219 "data_offset": 0, 00:20:17.219 "data_size": 0 00:20:17.219 }, 00:20:17.219 { 00:20:17.219 "name": "BaseBdev4", 00:20:17.219 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:17.219 "is_configured": false, 00:20:17.219 "data_offset": 0, 00:20:17.219 "data_size": 0 00:20:17.219 } 00:20:17.219 ] 00:20:17.219 }' 00:20:17.219 20:34:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:17.219 20:34:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:17.787 20:34:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:20:18.046 [2024-07-15 20:34:10.186479] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:18.046 BaseBdev3 00:20:18.046 20:34:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:20:18.046 20:34:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:20:18.046 20:34:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:18.046 20:34:10 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:20:18.046 20:34:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:18.046 20:34:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:18.046 20:34:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:18.305 20:34:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:20:18.305 [ 00:20:18.305 { 00:20:18.305 "name": "BaseBdev3", 00:20:18.305 "aliases": [ 00:20:18.305 "cac1121c-1df0-4d74-887c-c627b82d787f" 00:20:18.305 ], 00:20:18.305 "product_name": "Malloc disk", 00:20:18.305 "block_size": 512, 00:20:18.305 "num_blocks": 65536, 00:20:18.305 "uuid": "cac1121c-1df0-4d74-887c-c627b82d787f", 00:20:18.305 "assigned_rate_limits": { 00:20:18.305 "rw_ios_per_sec": 0, 00:20:18.305 "rw_mbytes_per_sec": 0, 00:20:18.305 "r_mbytes_per_sec": 0, 00:20:18.305 "w_mbytes_per_sec": 0 00:20:18.305 }, 00:20:18.305 "claimed": true, 00:20:18.305 "claim_type": "exclusive_write", 00:20:18.305 "zoned": false, 00:20:18.305 "supported_io_types": { 00:20:18.305 "read": true, 00:20:18.305 "write": true, 00:20:18.305 "unmap": true, 00:20:18.305 "flush": true, 00:20:18.305 "reset": true, 00:20:18.305 "nvme_admin": false, 00:20:18.305 "nvme_io": false, 00:20:18.305 "nvme_io_md": false, 00:20:18.305 "write_zeroes": true, 00:20:18.305 "zcopy": true, 00:20:18.305 "get_zone_info": false, 00:20:18.305 "zone_management": false, 00:20:18.305 "zone_append": false, 00:20:18.305 "compare": false, 00:20:18.305 "compare_and_write": false, 00:20:18.305 "abort": true, 00:20:18.305 "seek_hole": false, 00:20:18.305 "seek_data": false, 00:20:18.305 "copy": 
true, 00:20:18.305 "nvme_iov_md": false 00:20:18.305 }, 00:20:18.305 "memory_domains": [ 00:20:18.305 { 00:20:18.305 "dma_device_id": "system", 00:20:18.305 "dma_device_type": 1 00:20:18.305 }, 00:20:18.305 { 00:20:18.305 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:18.305 "dma_device_type": 2 00:20:18.305 } 00:20:18.305 ], 00:20:18.305 "driver_specific": {} 00:20:18.305 } 00:20:18.305 ] 00:20:18.305 20:34:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:20:18.305 20:34:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:20:18.305 20:34:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:18.305 20:34:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:18.305 20:34:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:18.305 20:34:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:18.305 20:34:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:18.305 20:34:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:18.305 20:34:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:18.305 20:34:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:18.305 20:34:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:18.305 20:34:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:18.305 20:34:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:18.305 20:34:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:18.305 20:34:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:18.564 20:34:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:18.564 "name": "Existed_Raid", 00:20:18.564 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:18.564 "strip_size_kb": 64, 00:20:18.564 "state": "configuring", 00:20:18.564 "raid_level": "concat", 00:20:18.564 "superblock": false, 00:20:18.564 "num_base_bdevs": 4, 00:20:18.564 "num_base_bdevs_discovered": 3, 00:20:18.564 "num_base_bdevs_operational": 4, 00:20:18.564 "base_bdevs_list": [ 00:20:18.564 { 00:20:18.564 "name": "BaseBdev1", 00:20:18.564 "uuid": "569ae314-35b0-4a50-bd37-ac2a38f0eaf2", 00:20:18.564 "is_configured": true, 00:20:18.564 "data_offset": 0, 00:20:18.564 "data_size": 65536 00:20:18.564 }, 00:20:18.564 { 00:20:18.564 "name": "BaseBdev2", 00:20:18.564 "uuid": "44d78bab-2df7-4d7e-afeb-f7a323b61bc2", 00:20:18.564 "is_configured": true, 00:20:18.564 "data_offset": 0, 00:20:18.564 "data_size": 65536 00:20:18.564 }, 00:20:18.564 { 00:20:18.564 "name": "BaseBdev3", 00:20:18.564 "uuid": "cac1121c-1df0-4d74-887c-c627b82d787f", 00:20:18.564 "is_configured": true, 00:20:18.564 "data_offset": 0, 00:20:18.564 "data_size": 65536 00:20:18.564 }, 00:20:18.564 { 00:20:18.564 "name": "BaseBdev4", 00:20:18.564 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:18.564 "is_configured": false, 00:20:18.564 "data_offset": 0, 00:20:18.564 "data_size": 0 00:20:18.564 } 00:20:18.564 ] 00:20:18.564 }' 00:20:18.564 20:34:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:18.564 20:34:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:19.131 20:34:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:20:19.388 [2024-07-15 20:34:11.713943] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:19.388 [2024-07-15 20:34:11.713989] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1cb4350 00:20:19.388 [2024-07-15 20:34:11.713998] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:20:19.388 [2024-07-15 20:34:11.714249] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1cb4020 00:20:19.388 [2024-07-15 20:34:11.714373] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1cb4350 00:20:19.388 [2024-07-15 20:34:11.714383] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1cb4350 00:20:19.388 [2024-07-15 20:34:11.714547] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:19.388 BaseBdev4 00:20:19.388 20:34:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:20:19.389 20:34:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:20:19.389 20:34:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:19.389 20:34:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:20:19.389 20:34:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:19.389 20:34:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:19.389 20:34:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:19.956 20:34:12 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:20:20.216 [ 00:20:20.216 { 00:20:20.216 "name": "BaseBdev4", 00:20:20.216 "aliases": [ 00:20:20.216 "482cb311-637c-4158-a148-bd16720f30ab" 00:20:20.216 ], 00:20:20.216 "product_name": "Malloc disk", 00:20:20.216 "block_size": 512, 00:20:20.216 "num_blocks": 65536, 00:20:20.216 "uuid": "482cb311-637c-4158-a148-bd16720f30ab", 00:20:20.216 "assigned_rate_limits": { 00:20:20.216 "rw_ios_per_sec": 0, 00:20:20.216 "rw_mbytes_per_sec": 0, 00:20:20.216 "r_mbytes_per_sec": 0, 00:20:20.216 "w_mbytes_per_sec": 0 00:20:20.216 }, 00:20:20.216 "claimed": true, 00:20:20.216 "claim_type": "exclusive_write", 00:20:20.216 "zoned": false, 00:20:20.216 "supported_io_types": { 00:20:20.216 "read": true, 00:20:20.216 "write": true, 00:20:20.216 "unmap": true, 00:20:20.216 "flush": true, 00:20:20.216 "reset": true, 00:20:20.216 "nvme_admin": false, 00:20:20.216 "nvme_io": false, 00:20:20.216 "nvme_io_md": false, 00:20:20.216 "write_zeroes": true, 00:20:20.216 "zcopy": true, 00:20:20.216 "get_zone_info": false, 00:20:20.216 "zone_management": false, 00:20:20.216 "zone_append": false, 00:20:20.216 "compare": false, 00:20:20.216 "compare_and_write": false, 00:20:20.216 "abort": true, 00:20:20.216 "seek_hole": false, 00:20:20.216 "seek_data": false, 00:20:20.216 "copy": true, 00:20:20.216 "nvme_iov_md": false 00:20:20.216 }, 00:20:20.216 "memory_domains": [ 00:20:20.216 { 00:20:20.216 "dma_device_id": "system", 00:20:20.216 "dma_device_type": 1 00:20:20.216 }, 00:20:20.216 { 00:20:20.216 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:20.216 "dma_device_type": 2 00:20:20.216 } 00:20:20.216 ], 00:20:20.216 "driver_specific": {} 00:20:20.216 } 00:20:20.216 ] 00:20:20.216 20:34:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:20:20.216 20:34:12 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@265 -- # (( i++ )) 00:20:20.216 20:34:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:20.216 20:34:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:20:20.216 20:34:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:20.216 20:34:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:20.216 20:34:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:20.216 20:34:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:20.216 20:34:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:20.216 20:34:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:20.216 20:34:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:20.216 20:34:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:20.216 20:34:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:20.216 20:34:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:20.216 20:34:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:20.475 20:34:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:20.475 "name": "Existed_Raid", 00:20:20.475 "uuid": "02570e25-f28e-4c52-9536-78344e7bf595", 00:20:20.475 "strip_size_kb": 64, 00:20:20.475 "state": "online", 00:20:20.475 "raid_level": "concat", 00:20:20.475 "superblock": false, 00:20:20.475 
"num_base_bdevs": 4, 00:20:20.475 "num_base_bdevs_discovered": 4, 00:20:20.475 "num_base_bdevs_operational": 4, 00:20:20.475 "base_bdevs_list": [ 00:20:20.475 { 00:20:20.475 "name": "BaseBdev1", 00:20:20.475 "uuid": "569ae314-35b0-4a50-bd37-ac2a38f0eaf2", 00:20:20.475 "is_configured": true, 00:20:20.475 "data_offset": 0, 00:20:20.475 "data_size": 65536 00:20:20.475 }, 00:20:20.475 { 00:20:20.475 "name": "BaseBdev2", 00:20:20.475 "uuid": "44d78bab-2df7-4d7e-afeb-f7a323b61bc2", 00:20:20.475 "is_configured": true, 00:20:20.475 "data_offset": 0, 00:20:20.475 "data_size": 65536 00:20:20.475 }, 00:20:20.475 { 00:20:20.475 "name": "BaseBdev3", 00:20:20.475 "uuid": "cac1121c-1df0-4d74-887c-c627b82d787f", 00:20:20.475 "is_configured": true, 00:20:20.475 "data_offset": 0, 00:20:20.475 "data_size": 65536 00:20:20.475 }, 00:20:20.475 { 00:20:20.475 "name": "BaseBdev4", 00:20:20.475 "uuid": "482cb311-637c-4158-a148-bd16720f30ab", 00:20:20.475 "is_configured": true, 00:20:20.475 "data_offset": 0, 00:20:20.475 "data_size": 65536 00:20:20.475 } 00:20:20.475 ] 00:20:20.475 }' 00:20:20.475 20:34:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:20.475 20:34:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:21.041 20:34:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:20:21.041 20:34:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:20:21.041 20:34:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:20:21.041 20:34:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:20:21.041 20:34:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:20:21.041 20:34:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:20:21.041 20:34:13 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:20:21.041 20:34:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:20:21.300 [2024-07-15 20:34:13.547167] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:21.300 20:34:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:20:21.300 "name": "Existed_Raid", 00:20:21.300 "aliases": [ 00:20:21.300 "02570e25-f28e-4c52-9536-78344e7bf595" 00:20:21.300 ], 00:20:21.300 "product_name": "Raid Volume", 00:20:21.300 "block_size": 512, 00:20:21.300 "num_blocks": 262144, 00:20:21.300 "uuid": "02570e25-f28e-4c52-9536-78344e7bf595", 00:20:21.300 "assigned_rate_limits": { 00:20:21.300 "rw_ios_per_sec": 0, 00:20:21.300 "rw_mbytes_per_sec": 0, 00:20:21.300 "r_mbytes_per_sec": 0, 00:20:21.300 "w_mbytes_per_sec": 0 00:20:21.300 }, 00:20:21.300 "claimed": false, 00:20:21.300 "zoned": false, 00:20:21.300 "supported_io_types": { 00:20:21.300 "read": true, 00:20:21.300 "write": true, 00:20:21.300 "unmap": true, 00:20:21.300 "flush": true, 00:20:21.300 "reset": true, 00:20:21.300 "nvme_admin": false, 00:20:21.300 "nvme_io": false, 00:20:21.300 "nvme_io_md": false, 00:20:21.300 "write_zeroes": true, 00:20:21.300 "zcopy": false, 00:20:21.300 "get_zone_info": false, 00:20:21.300 "zone_management": false, 00:20:21.300 "zone_append": false, 00:20:21.300 "compare": false, 00:20:21.300 "compare_and_write": false, 00:20:21.300 "abort": false, 00:20:21.300 "seek_hole": false, 00:20:21.300 "seek_data": false, 00:20:21.300 "copy": false, 00:20:21.300 "nvme_iov_md": false 00:20:21.300 }, 00:20:21.300 "memory_domains": [ 00:20:21.300 { 00:20:21.300 "dma_device_id": "system", 00:20:21.300 "dma_device_type": 1 00:20:21.300 }, 00:20:21.300 { 00:20:21.300 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:21.300 
"dma_device_type": 2 00:20:21.300 }, 00:20:21.300 { 00:20:21.300 "dma_device_id": "system", 00:20:21.300 "dma_device_type": 1 00:20:21.300 }, 00:20:21.300 { 00:20:21.300 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:21.300 "dma_device_type": 2 00:20:21.300 }, 00:20:21.300 { 00:20:21.300 "dma_device_id": "system", 00:20:21.300 "dma_device_type": 1 00:20:21.300 }, 00:20:21.300 { 00:20:21.300 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:21.300 "dma_device_type": 2 00:20:21.300 }, 00:20:21.300 { 00:20:21.300 "dma_device_id": "system", 00:20:21.300 "dma_device_type": 1 00:20:21.300 }, 00:20:21.300 { 00:20:21.300 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:21.300 "dma_device_type": 2 00:20:21.300 } 00:20:21.300 ], 00:20:21.300 "driver_specific": { 00:20:21.300 "raid": { 00:20:21.300 "uuid": "02570e25-f28e-4c52-9536-78344e7bf595", 00:20:21.300 "strip_size_kb": 64, 00:20:21.300 "state": "online", 00:20:21.300 "raid_level": "concat", 00:20:21.300 "superblock": false, 00:20:21.300 "num_base_bdevs": 4, 00:20:21.300 "num_base_bdevs_discovered": 4, 00:20:21.300 "num_base_bdevs_operational": 4, 00:20:21.300 "base_bdevs_list": [ 00:20:21.300 { 00:20:21.300 "name": "BaseBdev1", 00:20:21.300 "uuid": "569ae314-35b0-4a50-bd37-ac2a38f0eaf2", 00:20:21.300 "is_configured": true, 00:20:21.300 "data_offset": 0, 00:20:21.300 "data_size": 65536 00:20:21.300 }, 00:20:21.300 { 00:20:21.300 "name": "BaseBdev2", 00:20:21.300 "uuid": "44d78bab-2df7-4d7e-afeb-f7a323b61bc2", 00:20:21.300 "is_configured": true, 00:20:21.300 "data_offset": 0, 00:20:21.300 "data_size": 65536 00:20:21.300 }, 00:20:21.300 { 00:20:21.300 "name": "BaseBdev3", 00:20:21.300 "uuid": "cac1121c-1df0-4d74-887c-c627b82d787f", 00:20:21.300 "is_configured": true, 00:20:21.300 "data_offset": 0, 00:20:21.300 "data_size": 65536 00:20:21.300 }, 00:20:21.300 { 00:20:21.300 "name": "BaseBdev4", 00:20:21.300 "uuid": "482cb311-637c-4158-a148-bd16720f30ab", 00:20:21.300 "is_configured": true, 00:20:21.300 "data_offset": 0, 
00:20:21.300 "data_size": 65536 00:20:21.300 } 00:20:21.300 ] 00:20:21.300 } 00:20:21.300 } 00:20:21.300 }' 00:20:21.300 20:34:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:20:21.300 20:34:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:20:21.300 BaseBdev2 00:20:21.300 BaseBdev3 00:20:21.300 BaseBdev4' 00:20:21.300 20:34:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:21.300 20:34:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:20:21.300 20:34:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:21.559 20:34:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:21.559 "name": "BaseBdev1", 00:20:21.559 "aliases": [ 00:20:21.559 "569ae314-35b0-4a50-bd37-ac2a38f0eaf2" 00:20:21.559 ], 00:20:21.559 "product_name": "Malloc disk", 00:20:21.559 "block_size": 512, 00:20:21.559 "num_blocks": 65536, 00:20:21.559 "uuid": "569ae314-35b0-4a50-bd37-ac2a38f0eaf2", 00:20:21.559 "assigned_rate_limits": { 00:20:21.559 "rw_ios_per_sec": 0, 00:20:21.559 "rw_mbytes_per_sec": 0, 00:20:21.559 "r_mbytes_per_sec": 0, 00:20:21.559 "w_mbytes_per_sec": 0 00:20:21.559 }, 00:20:21.559 "claimed": true, 00:20:21.559 "claim_type": "exclusive_write", 00:20:21.559 "zoned": false, 00:20:21.559 "supported_io_types": { 00:20:21.559 "read": true, 00:20:21.559 "write": true, 00:20:21.559 "unmap": true, 00:20:21.559 "flush": true, 00:20:21.559 "reset": true, 00:20:21.559 "nvme_admin": false, 00:20:21.559 "nvme_io": false, 00:20:21.559 "nvme_io_md": false, 00:20:21.559 "write_zeroes": true, 00:20:21.559 "zcopy": true, 00:20:21.559 "get_zone_info": false, 00:20:21.559 "zone_management": 
false, 00:20:21.559 "zone_append": false, 00:20:21.559 "compare": false, 00:20:21.559 "compare_and_write": false, 00:20:21.559 "abort": true, 00:20:21.559 "seek_hole": false, 00:20:21.559 "seek_data": false, 00:20:21.559 "copy": true, 00:20:21.559 "nvme_iov_md": false 00:20:21.559 }, 00:20:21.559 "memory_domains": [ 00:20:21.559 { 00:20:21.559 "dma_device_id": "system", 00:20:21.559 "dma_device_type": 1 00:20:21.559 }, 00:20:21.559 { 00:20:21.559 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:21.559 "dma_device_type": 2 00:20:21.559 } 00:20:21.559 ], 00:20:21.559 "driver_specific": {} 00:20:21.559 }' 00:20:21.559 20:34:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:21.559 20:34:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:21.559 20:34:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:21.559 20:34:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:21.559 20:34:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:21.817 20:34:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:21.817 20:34:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:21.817 20:34:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:21.817 20:34:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:21.817 20:34:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:21.817 20:34:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:21.817 20:34:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:21.817 20:34:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:21.817 20:34:14 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:20:21.817 20:34:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:22.075 20:34:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:22.075 "name": "BaseBdev2", 00:20:22.075 "aliases": [ 00:20:22.075 "44d78bab-2df7-4d7e-afeb-f7a323b61bc2" 00:20:22.075 ], 00:20:22.075 "product_name": "Malloc disk", 00:20:22.075 "block_size": 512, 00:20:22.075 "num_blocks": 65536, 00:20:22.075 "uuid": "44d78bab-2df7-4d7e-afeb-f7a323b61bc2", 00:20:22.075 "assigned_rate_limits": { 00:20:22.075 "rw_ios_per_sec": 0, 00:20:22.075 "rw_mbytes_per_sec": 0, 00:20:22.075 "r_mbytes_per_sec": 0, 00:20:22.075 "w_mbytes_per_sec": 0 00:20:22.075 }, 00:20:22.075 "claimed": true, 00:20:22.075 "claim_type": "exclusive_write", 00:20:22.075 "zoned": false, 00:20:22.075 "supported_io_types": { 00:20:22.075 "read": true, 00:20:22.075 "write": true, 00:20:22.075 "unmap": true, 00:20:22.075 "flush": true, 00:20:22.075 "reset": true, 00:20:22.075 "nvme_admin": false, 00:20:22.075 "nvme_io": false, 00:20:22.075 "nvme_io_md": false, 00:20:22.075 "write_zeroes": true, 00:20:22.075 "zcopy": true, 00:20:22.075 "get_zone_info": false, 00:20:22.075 "zone_management": false, 00:20:22.075 "zone_append": false, 00:20:22.075 "compare": false, 00:20:22.075 "compare_and_write": false, 00:20:22.075 "abort": true, 00:20:22.075 "seek_hole": false, 00:20:22.075 "seek_data": false, 00:20:22.075 "copy": true, 00:20:22.075 "nvme_iov_md": false 00:20:22.076 }, 00:20:22.076 "memory_domains": [ 00:20:22.076 { 00:20:22.076 "dma_device_id": "system", 00:20:22.076 "dma_device_type": 1 00:20:22.076 }, 00:20:22.076 { 00:20:22.076 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:22.076 "dma_device_type": 2 00:20:22.076 } 00:20:22.076 ], 00:20:22.076 "driver_specific": {} 00:20:22.076 
}' 00:20:22.076 20:34:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:22.076 20:34:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:22.333 20:34:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:22.333 20:34:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:22.333 20:34:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:22.333 20:34:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:22.333 20:34:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:22.333 20:34:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:22.333 20:34:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:22.333 20:34:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:22.333 20:34:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:22.592 20:34:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:22.592 20:34:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:22.592 20:34:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:22.592 20:34:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:20:22.851 20:34:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:22.851 "name": "BaseBdev3", 00:20:22.851 "aliases": [ 00:20:22.851 "cac1121c-1df0-4d74-887c-c627b82d787f" 00:20:22.851 ], 00:20:22.851 "product_name": "Malloc disk", 00:20:22.851 "block_size": 512, 00:20:22.851 "num_blocks": 65536, 
00:20:22.851 "uuid": "cac1121c-1df0-4d74-887c-c627b82d787f", 00:20:22.851 "assigned_rate_limits": { 00:20:22.851 "rw_ios_per_sec": 0, 00:20:22.851 "rw_mbytes_per_sec": 0, 00:20:22.851 "r_mbytes_per_sec": 0, 00:20:22.851 "w_mbytes_per_sec": 0 00:20:22.851 }, 00:20:22.851 "claimed": true, 00:20:22.851 "claim_type": "exclusive_write", 00:20:22.851 "zoned": false, 00:20:22.851 "supported_io_types": { 00:20:22.851 "read": true, 00:20:22.851 "write": true, 00:20:22.851 "unmap": true, 00:20:22.851 "flush": true, 00:20:22.851 "reset": true, 00:20:22.851 "nvme_admin": false, 00:20:22.851 "nvme_io": false, 00:20:22.851 "nvme_io_md": false, 00:20:22.851 "write_zeroes": true, 00:20:22.851 "zcopy": true, 00:20:22.851 "get_zone_info": false, 00:20:22.851 "zone_management": false, 00:20:22.851 "zone_append": false, 00:20:22.851 "compare": false, 00:20:22.851 "compare_and_write": false, 00:20:22.851 "abort": true, 00:20:22.851 "seek_hole": false, 00:20:22.851 "seek_data": false, 00:20:22.851 "copy": true, 00:20:22.851 "nvme_iov_md": false 00:20:22.851 }, 00:20:22.851 "memory_domains": [ 00:20:22.851 { 00:20:22.851 "dma_device_id": "system", 00:20:22.851 "dma_device_type": 1 00:20:22.851 }, 00:20:22.851 { 00:20:22.851 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:22.851 "dma_device_type": 2 00:20:22.851 } 00:20:22.851 ], 00:20:22.851 "driver_specific": {} 00:20:22.851 }' 00:20:22.851 20:34:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:22.851 20:34:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:22.851 20:34:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:22.851 20:34:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:22.851 20:34:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:22.851 20:34:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 
00:20:22.851 20:34:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:22.851 20:34:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:23.108 20:34:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:23.108 20:34:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:23.108 20:34:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:23.108 20:34:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:23.108 20:34:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:23.108 20:34:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:20:23.108 20:34:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:23.365 20:34:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:23.365 "name": "BaseBdev4", 00:20:23.365 "aliases": [ 00:20:23.365 "482cb311-637c-4158-a148-bd16720f30ab" 00:20:23.365 ], 00:20:23.365 "product_name": "Malloc disk", 00:20:23.365 "block_size": 512, 00:20:23.365 "num_blocks": 65536, 00:20:23.365 "uuid": "482cb311-637c-4158-a148-bd16720f30ab", 00:20:23.365 "assigned_rate_limits": { 00:20:23.365 "rw_ios_per_sec": 0, 00:20:23.365 "rw_mbytes_per_sec": 0, 00:20:23.365 "r_mbytes_per_sec": 0, 00:20:23.365 "w_mbytes_per_sec": 0 00:20:23.365 }, 00:20:23.365 "claimed": true, 00:20:23.365 "claim_type": "exclusive_write", 00:20:23.365 "zoned": false, 00:20:23.365 "supported_io_types": { 00:20:23.365 "read": true, 00:20:23.365 "write": true, 00:20:23.365 "unmap": true, 00:20:23.365 "flush": true, 00:20:23.365 "reset": true, 00:20:23.365 "nvme_admin": false, 00:20:23.365 "nvme_io": false, 00:20:23.365 
"nvme_io_md": false, 00:20:23.365 "write_zeroes": true, 00:20:23.365 "zcopy": true, 00:20:23.365 "get_zone_info": false, 00:20:23.365 "zone_management": false, 00:20:23.365 "zone_append": false, 00:20:23.365 "compare": false, 00:20:23.365 "compare_and_write": false, 00:20:23.365 "abort": true, 00:20:23.365 "seek_hole": false, 00:20:23.365 "seek_data": false, 00:20:23.365 "copy": true, 00:20:23.365 "nvme_iov_md": false 00:20:23.365 }, 00:20:23.365 "memory_domains": [ 00:20:23.365 { 00:20:23.365 "dma_device_id": "system", 00:20:23.365 "dma_device_type": 1 00:20:23.365 }, 00:20:23.365 { 00:20:23.365 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:23.365 "dma_device_type": 2 00:20:23.365 } 00:20:23.365 ], 00:20:23.365 "driver_specific": {} 00:20:23.365 }' 00:20:23.365 20:34:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:23.365 20:34:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:23.365 20:34:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:23.365 20:34:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:23.365 20:34:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:23.365 20:34:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:23.365 20:34:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:23.365 20:34:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:23.623 20:34:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:23.623 20:34:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:23.623 20:34:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:23.623 20:34:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 
00:20:23.623 20:34:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:20:23.881 [2024-07-15 20:34:16.049579] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:20:23.881 [2024-07-15 20:34:16.049609] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:23.881 [2024-07-15 20:34:16.049658] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:23.881 20:34:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:20:23.881 20:34:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:20:23.881 20:34:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:20:23.881 20:34:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:20:23.881 20:34:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:20:23.881 20:34:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 3 00:20:23.881 20:34:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:23.881 20:34:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:20:23.881 20:34:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:23.881 20:34:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:23.881 20:34:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:20:23.881 20:34:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:23.881 20:34:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # 
local num_base_bdevs 00:20:23.881 20:34:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:23.881 20:34:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:23.881 20:34:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:23.881 20:34:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:24.140 20:34:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:24.140 "name": "Existed_Raid", 00:20:24.140 "uuid": "02570e25-f28e-4c52-9536-78344e7bf595", 00:20:24.140 "strip_size_kb": 64, 00:20:24.140 "state": "offline", 00:20:24.140 "raid_level": "concat", 00:20:24.140 "superblock": false, 00:20:24.140 "num_base_bdevs": 4, 00:20:24.140 "num_base_bdevs_discovered": 3, 00:20:24.140 "num_base_bdevs_operational": 3, 00:20:24.140 "base_bdevs_list": [ 00:20:24.140 { 00:20:24.140 "name": null, 00:20:24.140 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:24.140 "is_configured": false, 00:20:24.140 "data_offset": 0, 00:20:24.140 "data_size": 65536 00:20:24.140 }, 00:20:24.140 { 00:20:24.140 "name": "BaseBdev2", 00:20:24.140 "uuid": "44d78bab-2df7-4d7e-afeb-f7a323b61bc2", 00:20:24.140 "is_configured": true, 00:20:24.140 "data_offset": 0, 00:20:24.140 "data_size": 65536 00:20:24.140 }, 00:20:24.140 { 00:20:24.140 "name": "BaseBdev3", 00:20:24.140 "uuid": "cac1121c-1df0-4d74-887c-c627b82d787f", 00:20:24.140 "is_configured": true, 00:20:24.140 "data_offset": 0, 00:20:24.140 "data_size": 65536 00:20:24.140 }, 00:20:24.140 { 00:20:24.140 "name": "BaseBdev4", 00:20:24.140 "uuid": "482cb311-637c-4158-a148-bd16720f30ab", 00:20:24.140 "is_configured": true, 00:20:24.140 "data_offset": 0, 00:20:24.140 "data_size": 65536 00:20:24.140 } 00:20:24.140 ] 00:20:24.140 }' 
00:20:24.140 20:34:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:24.140 20:34:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:24.704 20:34:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:20:24.704 20:34:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:24.704 20:34:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:20:24.704 20:34:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:24.962 20:34:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:20:24.962 20:34:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:20:24.962 20:34:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:20:24.962 [2024-07-15 20:34:17.334023] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:20:25.221 20:34:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:20:25.221 20:34:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:25.221 20:34:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:25.221 20:34:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:20:25.479 20:34:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:20:25.479 20:34:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' 
Existed_Raid '!=' Existed_Raid ']' 00:20:25.479 20:34:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:20:26.046 [2024-07-15 20:34:18.122905] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:20:26.046 20:34:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:20:26.046 20:34:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:26.046 20:34:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:26.046 20:34:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:20:26.046 20:34:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:20:26.046 20:34:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:20:26.046 20:34:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:20:26.305 [2024-07-15 20:34:18.636615] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:20:26.305 [2024-07-15 20:34:18.636669] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1cb4350 name Existed_Raid, state offline 00:20:26.305 20:34:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:20:26.305 20:34:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:26.305 20:34:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:26.305 20:34:18 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:20:26.564 20:34:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:20:26.564 20:34:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:20:26.564 20:34:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:20:26.564 20:34:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:20:26.564 20:34:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:26.564 20:34:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:20:26.822 BaseBdev2 00:20:26.822 20:34:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:20:26.822 20:34:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:20:26.822 20:34:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:26.822 20:34:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:20:26.822 20:34:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:26.822 20:34:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:26.822 20:34:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:27.080 20:34:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:20:27.339 [ 00:20:27.339 { 00:20:27.339 "name": 
"BaseBdev2", 00:20:27.339 "aliases": [ 00:20:27.339 "28e36554-a60f-4315-af4f-452f3c60d6bf" 00:20:27.339 ], 00:20:27.339 "product_name": "Malloc disk", 00:20:27.339 "block_size": 512, 00:20:27.339 "num_blocks": 65536, 00:20:27.339 "uuid": "28e36554-a60f-4315-af4f-452f3c60d6bf", 00:20:27.339 "assigned_rate_limits": { 00:20:27.339 "rw_ios_per_sec": 0, 00:20:27.339 "rw_mbytes_per_sec": 0, 00:20:27.339 "r_mbytes_per_sec": 0, 00:20:27.339 "w_mbytes_per_sec": 0 00:20:27.339 }, 00:20:27.339 "claimed": false, 00:20:27.339 "zoned": false, 00:20:27.339 "supported_io_types": { 00:20:27.339 "read": true, 00:20:27.339 "write": true, 00:20:27.339 "unmap": true, 00:20:27.339 "flush": true, 00:20:27.339 "reset": true, 00:20:27.339 "nvme_admin": false, 00:20:27.339 "nvme_io": false, 00:20:27.339 "nvme_io_md": false, 00:20:27.339 "write_zeroes": true, 00:20:27.339 "zcopy": true, 00:20:27.339 "get_zone_info": false, 00:20:27.339 "zone_management": false, 00:20:27.339 "zone_append": false, 00:20:27.339 "compare": false, 00:20:27.339 "compare_and_write": false, 00:20:27.339 "abort": true, 00:20:27.339 "seek_hole": false, 00:20:27.339 "seek_data": false, 00:20:27.339 "copy": true, 00:20:27.339 "nvme_iov_md": false 00:20:27.339 }, 00:20:27.339 "memory_domains": [ 00:20:27.339 { 00:20:27.339 "dma_device_id": "system", 00:20:27.339 "dma_device_type": 1 00:20:27.339 }, 00:20:27.339 { 00:20:27.339 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:27.339 "dma_device_type": 2 00:20:27.339 } 00:20:27.339 ], 00:20:27.339 "driver_specific": {} 00:20:27.339 } 00:20:27.339 ] 00:20:27.339 20:34:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:20:27.339 20:34:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:20:27.339 20:34:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:27.339 20:34:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:20:27.596 BaseBdev3 00:20:27.596 20:34:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:20:27.596 20:34:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:20:27.596 20:34:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:27.596 20:34:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:20:27.596 20:34:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:27.596 20:34:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:27.596 20:34:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:27.854 20:34:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:20:28.112 [ 00:20:28.112 { 00:20:28.112 "name": "BaseBdev3", 00:20:28.112 "aliases": [ 00:20:28.112 "0298f55f-9100-4f52-8058-2c547ef3a067" 00:20:28.112 ], 00:20:28.112 "product_name": "Malloc disk", 00:20:28.112 "block_size": 512, 00:20:28.112 "num_blocks": 65536, 00:20:28.112 "uuid": "0298f55f-9100-4f52-8058-2c547ef3a067", 00:20:28.112 "assigned_rate_limits": { 00:20:28.112 "rw_ios_per_sec": 0, 00:20:28.112 "rw_mbytes_per_sec": 0, 00:20:28.112 "r_mbytes_per_sec": 0, 00:20:28.112 "w_mbytes_per_sec": 0 00:20:28.112 }, 00:20:28.112 "claimed": false, 00:20:28.112 "zoned": false, 00:20:28.112 "supported_io_types": { 00:20:28.112 "read": true, 00:20:28.112 "write": true, 00:20:28.112 "unmap": true, 00:20:28.112 "flush": true, 00:20:28.112 
"reset": true, 00:20:28.112 "nvme_admin": false, 00:20:28.112 "nvme_io": false, 00:20:28.112 "nvme_io_md": false, 00:20:28.112 "write_zeroes": true, 00:20:28.112 "zcopy": true, 00:20:28.112 "get_zone_info": false, 00:20:28.112 "zone_management": false, 00:20:28.112 "zone_append": false, 00:20:28.112 "compare": false, 00:20:28.112 "compare_and_write": false, 00:20:28.112 "abort": true, 00:20:28.112 "seek_hole": false, 00:20:28.112 "seek_data": false, 00:20:28.112 "copy": true, 00:20:28.112 "nvme_iov_md": false 00:20:28.112 }, 00:20:28.112 "memory_domains": [ 00:20:28.112 { 00:20:28.112 "dma_device_id": "system", 00:20:28.112 "dma_device_type": 1 00:20:28.112 }, 00:20:28.112 { 00:20:28.112 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:28.112 "dma_device_type": 2 00:20:28.112 } 00:20:28.112 ], 00:20:28.112 "driver_specific": {} 00:20:28.112 } 00:20:28.112 ] 00:20:28.112 20:34:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:20:28.112 20:34:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:20:28.112 20:34:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:28.112 20:34:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:20:28.371 BaseBdev4 00:20:28.371 20:34:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:20:28.371 20:34:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:20:28.371 20:34:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:28.371 20:34:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:20:28.371 20:34:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:28.371 20:34:20 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:28.371 20:34:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:28.629 20:34:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:20:28.629 [ 00:20:28.629 { 00:20:28.629 "name": "BaseBdev4", 00:20:28.629 "aliases": [ 00:20:28.629 "4b975e4b-3a57-4ab7-9621-d4998be37dbe" 00:20:28.629 ], 00:20:28.629 "product_name": "Malloc disk", 00:20:28.629 "block_size": 512, 00:20:28.629 "num_blocks": 65536, 00:20:28.629 "uuid": "4b975e4b-3a57-4ab7-9621-d4998be37dbe", 00:20:28.629 "assigned_rate_limits": { 00:20:28.629 "rw_ios_per_sec": 0, 00:20:28.629 "rw_mbytes_per_sec": 0, 00:20:28.629 "r_mbytes_per_sec": 0, 00:20:28.629 "w_mbytes_per_sec": 0 00:20:28.629 }, 00:20:28.629 "claimed": false, 00:20:28.629 "zoned": false, 00:20:28.629 "supported_io_types": { 00:20:28.629 "read": true, 00:20:28.629 "write": true, 00:20:28.629 "unmap": true, 00:20:28.629 "flush": true, 00:20:28.629 "reset": true, 00:20:28.629 "nvme_admin": false, 00:20:28.629 "nvme_io": false, 00:20:28.629 "nvme_io_md": false, 00:20:28.629 "write_zeroes": true, 00:20:28.629 "zcopy": true, 00:20:28.629 "get_zone_info": false, 00:20:28.629 "zone_management": false, 00:20:28.629 "zone_append": false, 00:20:28.629 "compare": false, 00:20:28.629 "compare_and_write": false, 00:20:28.629 "abort": true, 00:20:28.629 "seek_hole": false, 00:20:28.629 "seek_data": false, 00:20:28.629 "copy": true, 00:20:28.629 "nvme_iov_md": false 00:20:28.629 }, 00:20:28.629 "memory_domains": [ 00:20:28.629 { 00:20:28.629 "dma_device_id": "system", 00:20:28.629 "dma_device_type": 1 00:20:28.629 }, 00:20:28.629 { 00:20:28.629 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:20:28.629 "dma_device_type": 2 00:20:28.629 } 00:20:28.629 ], 00:20:28.629 "driver_specific": {} 00:20:28.629 } 00:20:28.629 ] 00:20:28.629 20:34:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:20:28.629 20:34:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:20:28.629 20:34:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:28.629 20:34:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:28.888 [2024-07-15 20:34:21.214666] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:20:28.888 [2024-07-15 20:34:21.214713] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:20:28.888 [2024-07-15 20:34:21.214735] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:28.888 [2024-07-15 20:34:21.216074] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:28.888 [2024-07-15 20:34:21.216118] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:28.888 20:34:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:28.888 20:34:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:28.888 20:34:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:28.888 20:34:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:28.888 20:34:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:28.888 
20:34:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:28.888 20:34:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:28.888 20:34:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:28.888 20:34:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:28.888 20:34:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:28.888 20:34:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:28.888 20:34:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:29.146 20:34:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:29.146 "name": "Existed_Raid", 00:20:29.146 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:29.146 "strip_size_kb": 64, 00:20:29.146 "state": "configuring", 00:20:29.146 "raid_level": "concat", 00:20:29.146 "superblock": false, 00:20:29.146 "num_base_bdevs": 4, 00:20:29.146 "num_base_bdevs_discovered": 3, 00:20:29.146 "num_base_bdevs_operational": 4, 00:20:29.146 "base_bdevs_list": [ 00:20:29.146 { 00:20:29.146 "name": "BaseBdev1", 00:20:29.146 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:29.146 "is_configured": false, 00:20:29.146 "data_offset": 0, 00:20:29.146 "data_size": 0 00:20:29.146 }, 00:20:29.146 { 00:20:29.146 "name": "BaseBdev2", 00:20:29.146 "uuid": "28e36554-a60f-4315-af4f-452f3c60d6bf", 00:20:29.146 "is_configured": true, 00:20:29.146 "data_offset": 0, 00:20:29.146 "data_size": 65536 00:20:29.146 }, 00:20:29.146 { 00:20:29.146 "name": "BaseBdev3", 00:20:29.146 "uuid": "0298f55f-9100-4f52-8058-2c547ef3a067", 00:20:29.146 "is_configured": true, 00:20:29.146 "data_offset": 
0, 00:20:29.146 "data_size": 65536 00:20:29.146 }, 00:20:29.146 { 00:20:29.146 "name": "BaseBdev4", 00:20:29.146 "uuid": "4b975e4b-3a57-4ab7-9621-d4998be37dbe", 00:20:29.146 "is_configured": true, 00:20:29.146 "data_offset": 0, 00:20:29.146 "data_size": 65536 00:20:29.146 } 00:20:29.146 ] 00:20:29.146 }' 00:20:29.146 20:34:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:29.146 20:34:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:29.712 20:34:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:20:29.971 [2024-07-15 20:34:22.301512] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:20:29.971 20:34:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:29.971 20:34:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:29.971 20:34:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:29.971 20:34:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:29.971 20:34:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:29.971 20:34:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:29.971 20:34:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:29.971 20:34:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:29.971 20:34:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:29.971 20:34:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 
00:20:29.971 20:34:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:29.971 20:34:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:30.232 20:34:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:30.232 "name": "Existed_Raid", 00:20:30.232 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:30.232 "strip_size_kb": 64, 00:20:30.232 "state": "configuring", 00:20:30.232 "raid_level": "concat", 00:20:30.232 "superblock": false, 00:20:30.232 "num_base_bdevs": 4, 00:20:30.232 "num_base_bdevs_discovered": 2, 00:20:30.232 "num_base_bdevs_operational": 4, 00:20:30.232 "base_bdevs_list": [ 00:20:30.232 { 00:20:30.232 "name": "BaseBdev1", 00:20:30.232 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:30.232 "is_configured": false, 00:20:30.232 "data_offset": 0, 00:20:30.232 "data_size": 0 00:20:30.232 }, 00:20:30.232 { 00:20:30.232 "name": null, 00:20:30.232 "uuid": "28e36554-a60f-4315-af4f-452f3c60d6bf", 00:20:30.232 "is_configured": false, 00:20:30.232 "data_offset": 0, 00:20:30.232 "data_size": 65536 00:20:30.232 }, 00:20:30.232 { 00:20:30.232 "name": "BaseBdev3", 00:20:30.232 "uuid": "0298f55f-9100-4f52-8058-2c547ef3a067", 00:20:30.232 "is_configured": true, 00:20:30.232 "data_offset": 0, 00:20:30.232 "data_size": 65536 00:20:30.232 }, 00:20:30.232 { 00:20:30.232 "name": "BaseBdev4", 00:20:30.232 "uuid": "4b975e4b-3a57-4ab7-9621-d4998be37dbe", 00:20:30.232 "is_configured": true, 00:20:30.232 "data_offset": 0, 00:20:30.232 "data_size": 65536 00:20:30.232 } 00:20:30.232 ] 00:20:30.232 }' 00:20:30.232 20:34:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:30.232 20:34:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:30.833 20:34:23 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:30.833 20:34:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:20:31.092 20:34:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:20:31.092 20:34:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:20:31.356 [2024-07-15 20:34:23.589495] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:31.356 BaseBdev1 00:20:31.356 20:34:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:20:31.356 20:34:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:20:31.356 20:34:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:31.356 20:34:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:20:31.356 20:34:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:31.356 20:34:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:31.356 20:34:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:31.621 20:34:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:20:31.880 [ 00:20:31.880 { 00:20:31.880 "name": "BaseBdev1", 00:20:31.880 "aliases": [ 00:20:31.880 
"2d1cf2c8-b1c0-44a7-a9e0-f7b918073957" 00:20:31.880 ], 00:20:31.880 "product_name": "Malloc disk", 00:20:31.880 "block_size": 512, 00:20:31.880 "num_blocks": 65536, 00:20:31.880 "uuid": "2d1cf2c8-b1c0-44a7-a9e0-f7b918073957", 00:20:31.880 "assigned_rate_limits": { 00:20:31.880 "rw_ios_per_sec": 0, 00:20:31.880 "rw_mbytes_per_sec": 0, 00:20:31.880 "r_mbytes_per_sec": 0, 00:20:31.880 "w_mbytes_per_sec": 0 00:20:31.880 }, 00:20:31.880 "claimed": true, 00:20:31.880 "claim_type": "exclusive_write", 00:20:31.880 "zoned": false, 00:20:31.880 "supported_io_types": { 00:20:31.880 "read": true, 00:20:31.880 "write": true, 00:20:31.880 "unmap": true, 00:20:31.880 "flush": true, 00:20:31.880 "reset": true, 00:20:31.880 "nvme_admin": false, 00:20:31.880 "nvme_io": false, 00:20:31.880 "nvme_io_md": false, 00:20:31.880 "write_zeroes": true, 00:20:31.880 "zcopy": true, 00:20:31.880 "get_zone_info": false, 00:20:31.880 "zone_management": false, 00:20:31.880 "zone_append": false, 00:20:31.880 "compare": false, 00:20:31.880 "compare_and_write": false, 00:20:31.880 "abort": true, 00:20:31.880 "seek_hole": false, 00:20:31.880 "seek_data": false, 00:20:31.880 "copy": true, 00:20:31.880 "nvme_iov_md": false 00:20:31.880 }, 00:20:31.880 "memory_domains": [ 00:20:31.880 { 00:20:31.880 "dma_device_id": "system", 00:20:31.880 "dma_device_type": 1 00:20:31.880 }, 00:20:31.880 { 00:20:31.880 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:31.880 "dma_device_type": 2 00:20:31.880 } 00:20:31.880 ], 00:20:31.880 "driver_specific": {} 00:20:31.880 } 00:20:31.880 ] 00:20:31.880 20:34:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:20:31.880 20:34:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:31.880 20:34:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:31.880 20:34:24 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:31.880 20:34:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:31.880 20:34:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:31.880 20:34:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:31.880 20:34:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:31.880 20:34:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:31.880 20:34:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:31.880 20:34:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:31.880 20:34:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:31.880 20:34:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:32.139 20:34:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:32.139 "name": "Existed_Raid", 00:20:32.139 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:32.139 "strip_size_kb": 64, 00:20:32.139 "state": "configuring", 00:20:32.139 "raid_level": "concat", 00:20:32.139 "superblock": false, 00:20:32.139 "num_base_bdevs": 4, 00:20:32.139 "num_base_bdevs_discovered": 3, 00:20:32.139 "num_base_bdevs_operational": 4, 00:20:32.139 "base_bdevs_list": [ 00:20:32.139 { 00:20:32.139 "name": "BaseBdev1", 00:20:32.139 "uuid": "2d1cf2c8-b1c0-44a7-a9e0-f7b918073957", 00:20:32.139 "is_configured": true, 00:20:32.139 "data_offset": 0, 00:20:32.139 "data_size": 65536 00:20:32.139 }, 00:20:32.139 { 00:20:32.139 "name": null, 00:20:32.139 "uuid": "28e36554-a60f-4315-af4f-452f3c60d6bf", 
00:20:32.139 "is_configured": false, 00:20:32.139 "data_offset": 0, 00:20:32.139 "data_size": 65536 00:20:32.139 }, 00:20:32.139 { 00:20:32.139 "name": "BaseBdev3", 00:20:32.139 "uuid": "0298f55f-9100-4f52-8058-2c547ef3a067", 00:20:32.139 "is_configured": true, 00:20:32.139 "data_offset": 0, 00:20:32.139 "data_size": 65536 00:20:32.139 }, 00:20:32.139 { 00:20:32.139 "name": "BaseBdev4", 00:20:32.139 "uuid": "4b975e4b-3a57-4ab7-9621-d4998be37dbe", 00:20:32.139 "is_configured": true, 00:20:32.139 "data_offset": 0, 00:20:32.139 "data_size": 65536 00:20:32.139 } 00:20:32.139 ] 00:20:32.139 }' 00:20:32.139 20:34:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:32.139 20:34:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:32.707 20:34:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:32.707 20:34:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:20:32.966 20:34:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:20:32.966 20:34:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:20:33.535 [2024-07-15 20:34:25.747274] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:20:33.535 20:34:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:33.535 20:34:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:33.535 20:34:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:33.535 20:34:25 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:33.535 20:34:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:33.535 20:34:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:33.535 20:34:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:33.535 20:34:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:33.535 20:34:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:33.535 20:34:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:33.535 20:34:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:33.535 20:34:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:34.103 20:34:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:34.103 "name": "Existed_Raid", 00:20:34.103 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:34.103 "strip_size_kb": 64, 00:20:34.103 "state": "configuring", 00:20:34.103 "raid_level": "concat", 00:20:34.103 "superblock": false, 00:20:34.103 "num_base_bdevs": 4, 00:20:34.103 "num_base_bdevs_discovered": 2, 00:20:34.103 "num_base_bdevs_operational": 4, 00:20:34.104 "base_bdevs_list": [ 00:20:34.104 { 00:20:34.104 "name": "BaseBdev1", 00:20:34.104 "uuid": "2d1cf2c8-b1c0-44a7-a9e0-f7b918073957", 00:20:34.104 "is_configured": true, 00:20:34.104 "data_offset": 0, 00:20:34.104 "data_size": 65536 00:20:34.104 }, 00:20:34.104 { 00:20:34.104 "name": null, 00:20:34.104 "uuid": "28e36554-a60f-4315-af4f-452f3c60d6bf", 00:20:34.104 "is_configured": false, 00:20:34.104 "data_offset": 0, 00:20:34.104 
"data_size": 65536 00:20:34.104 }, 00:20:34.104 { 00:20:34.104 "name": null, 00:20:34.104 "uuid": "0298f55f-9100-4f52-8058-2c547ef3a067", 00:20:34.104 "is_configured": false, 00:20:34.104 "data_offset": 0, 00:20:34.104 "data_size": 65536 00:20:34.104 }, 00:20:34.104 { 00:20:34.104 "name": "BaseBdev4", 00:20:34.104 "uuid": "4b975e4b-3a57-4ab7-9621-d4998be37dbe", 00:20:34.104 "is_configured": true, 00:20:34.104 "data_offset": 0, 00:20:34.104 "data_size": 65536 00:20:34.104 } 00:20:34.104 ] 00:20:34.104 }' 00:20:34.104 20:34:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:34.104 20:34:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:34.671 20:34:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:34.671 20:34:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:20:34.671 20:34:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:20:34.671 20:34:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:20:34.931 [2024-07-15 20:34:27.134981] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:34.931 20:34:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:34.931 20:34:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:34.931 20:34:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:34.931 20:34:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local 
raid_level=concat 00:20:34.931 20:34:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:34.931 20:34:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:34.931 20:34:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:34.931 20:34:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:34.931 20:34:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:34.931 20:34:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:34.931 20:34:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:34.931 20:34:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:35.189 20:34:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:35.189 "name": "Existed_Raid", 00:20:35.189 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:35.189 "strip_size_kb": 64, 00:20:35.189 "state": "configuring", 00:20:35.189 "raid_level": "concat", 00:20:35.189 "superblock": false, 00:20:35.189 "num_base_bdevs": 4, 00:20:35.189 "num_base_bdevs_discovered": 3, 00:20:35.189 "num_base_bdevs_operational": 4, 00:20:35.189 "base_bdevs_list": [ 00:20:35.189 { 00:20:35.189 "name": "BaseBdev1", 00:20:35.189 "uuid": "2d1cf2c8-b1c0-44a7-a9e0-f7b918073957", 00:20:35.189 "is_configured": true, 00:20:35.189 "data_offset": 0, 00:20:35.189 "data_size": 65536 00:20:35.189 }, 00:20:35.189 { 00:20:35.189 "name": null, 00:20:35.189 "uuid": "28e36554-a60f-4315-af4f-452f3c60d6bf", 00:20:35.189 "is_configured": false, 00:20:35.189 "data_offset": 0, 00:20:35.189 "data_size": 65536 00:20:35.189 }, 00:20:35.189 { 00:20:35.189 "name": 
"BaseBdev3", 00:20:35.189 "uuid": "0298f55f-9100-4f52-8058-2c547ef3a067", 00:20:35.189 "is_configured": true, 00:20:35.189 "data_offset": 0, 00:20:35.189 "data_size": 65536 00:20:35.189 }, 00:20:35.189 { 00:20:35.189 "name": "BaseBdev4", 00:20:35.189 "uuid": "4b975e4b-3a57-4ab7-9621-d4998be37dbe", 00:20:35.189 "is_configured": true, 00:20:35.189 "data_offset": 0, 00:20:35.189 "data_size": 65536 00:20:35.189 } 00:20:35.189 ] 00:20:35.189 }' 00:20:35.189 20:34:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:35.189 20:34:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:35.756 20:34:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:20:35.756 20:34:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:36.015 20:34:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:20:36.015 20:34:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:20:36.274 [2024-07-15 20:34:28.418378] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:20:36.274 20:34:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:36.274 20:34:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:36.274 20:34:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:36.274 20:34:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:36.274 20:34:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local 
strip_size=64 00:20:36.274 20:34:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:36.274 20:34:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:36.274 20:34:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:36.274 20:34:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:36.274 20:34:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:36.274 20:34:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:36.274 20:34:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:36.533 20:34:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:36.533 "name": "Existed_Raid", 00:20:36.533 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:36.533 "strip_size_kb": 64, 00:20:36.533 "state": "configuring", 00:20:36.533 "raid_level": "concat", 00:20:36.533 "superblock": false, 00:20:36.533 "num_base_bdevs": 4, 00:20:36.533 "num_base_bdevs_discovered": 2, 00:20:36.533 "num_base_bdevs_operational": 4, 00:20:36.533 "base_bdevs_list": [ 00:20:36.533 { 00:20:36.533 "name": null, 00:20:36.533 "uuid": "2d1cf2c8-b1c0-44a7-a9e0-f7b918073957", 00:20:36.533 "is_configured": false, 00:20:36.533 "data_offset": 0, 00:20:36.533 "data_size": 65536 00:20:36.533 }, 00:20:36.533 { 00:20:36.533 "name": null, 00:20:36.533 "uuid": "28e36554-a60f-4315-af4f-452f3c60d6bf", 00:20:36.533 "is_configured": false, 00:20:36.533 "data_offset": 0, 00:20:36.533 "data_size": 65536 00:20:36.533 }, 00:20:36.533 { 00:20:36.533 "name": "BaseBdev3", 00:20:36.533 "uuid": "0298f55f-9100-4f52-8058-2c547ef3a067", 00:20:36.533 "is_configured": true, 
00:20:36.533 "data_offset": 0, 00:20:36.533 "data_size": 65536 00:20:36.533 }, 00:20:36.533 { 00:20:36.533 "name": "BaseBdev4", 00:20:36.533 "uuid": "4b975e4b-3a57-4ab7-9621-d4998be37dbe", 00:20:36.533 "is_configured": true, 00:20:36.533 "data_offset": 0, 00:20:36.533 "data_size": 65536 00:20:36.533 } 00:20:36.533 ] 00:20:36.533 }' 00:20:36.533 20:34:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:36.533 20:34:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:37.101 20:34:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:20:37.101 20:34:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:37.361 20:34:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:20:37.361 20:34:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:20:37.620 [2024-07-15 20:34:29.754313] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:37.620 20:34:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:37.620 20:34:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:37.620 20:34:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:37.620 20:34:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:37.620 20:34:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:37.620 20:34:29 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:37.620 20:34:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:37.620 20:34:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:37.620 20:34:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:37.620 20:34:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:37.620 20:34:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:37.620 20:34:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:37.879 20:34:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:37.879 "name": "Existed_Raid", 00:20:37.879 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:37.879 "strip_size_kb": 64, 00:20:37.879 "state": "configuring", 00:20:37.879 "raid_level": "concat", 00:20:37.879 "superblock": false, 00:20:37.879 "num_base_bdevs": 4, 00:20:37.879 "num_base_bdevs_discovered": 3, 00:20:37.879 "num_base_bdevs_operational": 4, 00:20:37.879 "base_bdevs_list": [ 00:20:37.879 { 00:20:37.879 "name": null, 00:20:37.879 "uuid": "2d1cf2c8-b1c0-44a7-a9e0-f7b918073957", 00:20:37.879 "is_configured": false, 00:20:37.879 "data_offset": 0, 00:20:37.879 "data_size": 65536 00:20:37.879 }, 00:20:37.879 { 00:20:37.879 "name": "BaseBdev2", 00:20:37.879 "uuid": "28e36554-a60f-4315-af4f-452f3c60d6bf", 00:20:37.879 "is_configured": true, 00:20:37.879 "data_offset": 0, 00:20:37.879 "data_size": 65536 00:20:37.879 }, 00:20:37.879 { 00:20:37.879 "name": "BaseBdev3", 00:20:37.879 "uuid": "0298f55f-9100-4f52-8058-2c547ef3a067", 00:20:37.879 "is_configured": true, 00:20:37.879 "data_offset": 0, 00:20:37.879 "data_size": 65536 00:20:37.879 
}, 00:20:37.879 { 00:20:37.879 "name": "BaseBdev4", 00:20:37.879 "uuid": "4b975e4b-3a57-4ab7-9621-d4998be37dbe", 00:20:37.879 "is_configured": true, 00:20:37.879 "data_offset": 0, 00:20:37.879 "data_size": 65536 00:20:37.879 } 00:20:37.879 ] 00:20:37.879 }' 00:20:37.879 20:34:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:37.879 20:34:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:38.449 20:34:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:20:38.449 20:34:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:38.709 20:34:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:20:38.709 20:34:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:38.709 20:34:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:20:38.968 20:34:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 2d1cf2c8-b1c0-44a7-a9e0-f7b918073957 00:20:39.228 [2024-07-15 20:34:31.350206] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:20:39.228 [2024-07-15 20:34:31.350245] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1cb8040 00:20:39.228 [2024-07-15 20:34:31.350254] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:20:39.228 [2024-07-15 20:34:31.350448] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1cb3a70 00:20:39.228 
[2024-07-15 20:34:31.350567] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1cb8040 00:20:39.228 [2024-07-15 20:34:31.350577] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1cb8040 00:20:39.228 [2024-07-15 20:34:31.350741] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:39.228 NewBaseBdev 00:20:39.228 20:34:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:20:39.228 20:34:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:20:39.228 20:34:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:39.228 20:34:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:20:39.228 20:34:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:39.228 20:34:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:39.228 20:34:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:39.487 20:34:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:20:39.487 [ 00:20:39.487 { 00:20:39.487 "name": "NewBaseBdev", 00:20:39.487 "aliases": [ 00:20:39.487 "2d1cf2c8-b1c0-44a7-a9e0-f7b918073957" 00:20:39.487 ], 00:20:39.487 "product_name": "Malloc disk", 00:20:39.487 "block_size": 512, 00:20:39.487 "num_blocks": 65536, 00:20:39.487 "uuid": "2d1cf2c8-b1c0-44a7-a9e0-f7b918073957", 00:20:39.487 "assigned_rate_limits": { 00:20:39.487 "rw_ios_per_sec": 0, 00:20:39.487 "rw_mbytes_per_sec": 0, 00:20:39.487 "r_mbytes_per_sec": 0, 00:20:39.487 
"w_mbytes_per_sec": 0 00:20:39.487 }, 00:20:39.487 "claimed": true, 00:20:39.487 "claim_type": "exclusive_write", 00:20:39.487 "zoned": false, 00:20:39.487 "supported_io_types": { 00:20:39.487 "read": true, 00:20:39.487 "write": true, 00:20:39.487 "unmap": true, 00:20:39.487 "flush": true, 00:20:39.487 "reset": true, 00:20:39.487 "nvme_admin": false, 00:20:39.487 "nvme_io": false, 00:20:39.487 "nvme_io_md": false, 00:20:39.487 "write_zeroes": true, 00:20:39.487 "zcopy": true, 00:20:39.488 "get_zone_info": false, 00:20:39.488 "zone_management": false, 00:20:39.488 "zone_append": false, 00:20:39.488 "compare": false, 00:20:39.488 "compare_and_write": false, 00:20:39.488 "abort": true, 00:20:39.488 "seek_hole": false, 00:20:39.488 "seek_data": false, 00:20:39.488 "copy": true, 00:20:39.488 "nvme_iov_md": false 00:20:39.488 }, 00:20:39.488 "memory_domains": [ 00:20:39.488 { 00:20:39.488 "dma_device_id": "system", 00:20:39.488 "dma_device_type": 1 00:20:39.488 }, 00:20:39.488 { 00:20:39.488 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:39.488 "dma_device_type": 2 00:20:39.488 } 00:20:39.488 ], 00:20:39.488 "driver_specific": {} 00:20:39.488 } 00:20:39.488 ] 00:20:39.488 20:34:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:20:39.488 20:34:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:20:39.488 20:34:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:39.488 20:34:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:39.488 20:34:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:39.488 20:34:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:39.488 20:34:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 
00:20:39.488 20:34:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:39.488 20:34:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:39.488 20:34:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:39.488 20:34:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:39.488 20:34:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:39.488 20:34:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:39.747 20:34:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:39.747 "name": "Existed_Raid", 00:20:39.747 "uuid": "50b136be-8ebd-4171-ac8a-7ea9cf0ea912", 00:20:39.747 "strip_size_kb": 64, 00:20:39.747 "state": "online", 00:20:39.747 "raid_level": "concat", 00:20:39.747 "superblock": false, 00:20:39.747 "num_base_bdevs": 4, 00:20:39.747 "num_base_bdevs_discovered": 4, 00:20:39.747 "num_base_bdevs_operational": 4, 00:20:39.747 "base_bdevs_list": [ 00:20:39.747 { 00:20:39.747 "name": "NewBaseBdev", 00:20:39.747 "uuid": "2d1cf2c8-b1c0-44a7-a9e0-f7b918073957", 00:20:39.747 "is_configured": true, 00:20:39.747 "data_offset": 0, 00:20:39.747 "data_size": 65536 00:20:39.747 }, 00:20:39.747 { 00:20:39.747 "name": "BaseBdev2", 00:20:39.747 "uuid": "28e36554-a60f-4315-af4f-452f3c60d6bf", 00:20:39.747 "is_configured": true, 00:20:39.747 "data_offset": 0, 00:20:39.747 "data_size": 65536 00:20:39.747 }, 00:20:39.747 { 00:20:39.747 "name": "BaseBdev3", 00:20:39.747 "uuid": "0298f55f-9100-4f52-8058-2c547ef3a067", 00:20:39.747 "is_configured": true, 00:20:39.747 "data_offset": 0, 00:20:39.747 "data_size": 65536 00:20:39.747 }, 00:20:39.747 { 00:20:39.747 "name": "BaseBdev4", 
00:20:39.747 "uuid": "4b975e4b-3a57-4ab7-9621-d4998be37dbe", 00:20:39.747 "is_configured": true, 00:20:39.747 "data_offset": 0, 00:20:39.747 "data_size": 65536 00:20:39.747 } 00:20:39.747 ] 00:20:39.747 }' 00:20:39.747 20:34:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:39.747 20:34:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:40.685 20:34:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:20:40.685 20:34:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:20:40.685 20:34:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:20:40.685 20:34:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:20:40.685 20:34:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:20:40.685 20:34:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:20:40.685 20:34:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:20:40.685 20:34:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:20:40.943 [2024-07-15 20:34:33.235696] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:40.943 20:34:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:20:40.943 "name": "Existed_Raid", 00:20:40.943 "aliases": [ 00:20:40.943 "50b136be-8ebd-4171-ac8a-7ea9cf0ea912" 00:20:40.943 ], 00:20:40.943 "product_name": "Raid Volume", 00:20:40.943 "block_size": 512, 00:20:40.943 "num_blocks": 262144, 00:20:40.943 "uuid": "50b136be-8ebd-4171-ac8a-7ea9cf0ea912", 00:20:40.943 "assigned_rate_limits": { 00:20:40.943 "rw_ios_per_sec": 0, 00:20:40.943 
"rw_mbytes_per_sec": 0, 00:20:40.943 "r_mbytes_per_sec": 0, 00:20:40.943 "w_mbytes_per_sec": 0 00:20:40.943 }, 00:20:40.943 "claimed": false, 00:20:40.943 "zoned": false, 00:20:40.943 "supported_io_types": { 00:20:40.943 "read": true, 00:20:40.943 "write": true, 00:20:40.943 "unmap": true, 00:20:40.943 "flush": true, 00:20:40.943 "reset": true, 00:20:40.943 "nvme_admin": false, 00:20:40.943 "nvme_io": false, 00:20:40.943 "nvme_io_md": false, 00:20:40.943 "write_zeroes": true, 00:20:40.943 "zcopy": false, 00:20:40.943 "get_zone_info": false, 00:20:40.943 "zone_management": false, 00:20:40.943 "zone_append": false, 00:20:40.943 "compare": false, 00:20:40.943 "compare_and_write": false, 00:20:40.943 "abort": false, 00:20:40.943 "seek_hole": false, 00:20:40.943 "seek_data": false, 00:20:40.943 "copy": false, 00:20:40.943 "nvme_iov_md": false 00:20:40.943 }, 00:20:40.943 "memory_domains": [ 00:20:40.943 { 00:20:40.943 "dma_device_id": "system", 00:20:40.943 "dma_device_type": 1 00:20:40.943 }, 00:20:40.943 { 00:20:40.943 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:40.943 "dma_device_type": 2 00:20:40.943 }, 00:20:40.943 { 00:20:40.943 "dma_device_id": "system", 00:20:40.943 "dma_device_type": 1 00:20:40.943 }, 00:20:40.943 { 00:20:40.943 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:40.943 "dma_device_type": 2 00:20:40.944 }, 00:20:40.944 { 00:20:40.944 "dma_device_id": "system", 00:20:40.944 "dma_device_type": 1 00:20:40.944 }, 00:20:40.944 { 00:20:40.944 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:40.944 "dma_device_type": 2 00:20:40.944 }, 00:20:40.944 { 00:20:40.944 "dma_device_id": "system", 00:20:40.944 "dma_device_type": 1 00:20:40.944 }, 00:20:40.944 { 00:20:40.944 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:40.944 "dma_device_type": 2 00:20:40.944 } 00:20:40.944 ], 00:20:40.944 "driver_specific": { 00:20:40.944 "raid": { 00:20:40.944 "uuid": "50b136be-8ebd-4171-ac8a-7ea9cf0ea912", 00:20:40.944 "strip_size_kb": 64, 00:20:40.944 "state": "online", 
00:20:40.944 "raid_level": "concat", 00:20:40.944 "superblock": false, 00:20:40.944 "num_base_bdevs": 4, 00:20:40.944 "num_base_bdevs_discovered": 4, 00:20:40.944 "num_base_bdevs_operational": 4, 00:20:40.944 "base_bdevs_list": [ 00:20:40.944 { 00:20:40.944 "name": "NewBaseBdev", 00:20:40.944 "uuid": "2d1cf2c8-b1c0-44a7-a9e0-f7b918073957", 00:20:40.944 "is_configured": true, 00:20:40.944 "data_offset": 0, 00:20:40.944 "data_size": 65536 00:20:40.944 }, 00:20:40.944 { 00:20:40.944 "name": "BaseBdev2", 00:20:40.944 "uuid": "28e36554-a60f-4315-af4f-452f3c60d6bf", 00:20:40.944 "is_configured": true, 00:20:40.944 "data_offset": 0, 00:20:40.944 "data_size": 65536 00:20:40.944 }, 00:20:40.944 { 00:20:40.944 "name": "BaseBdev3", 00:20:40.944 "uuid": "0298f55f-9100-4f52-8058-2c547ef3a067", 00:20:40.944 "is_configured": true, 00:20:40.944 "data_offset": 0, 00:20:40.944 "data_size": 65536 00:20:40.944 }, 00:20:40.944 { 00:20:40.944 "name": "BaseBdev4", 00:20:40.944 "uuid": "4b975e4b-3a57-4ab7-9621-d4998be37dbe", 00:20:40.944 "is_configured": true, 00:20:40.944 "data_offset": 0, 00:20:40.944 "data_size": 65536 00:20:40.944 } 00:20:40.944 ] 00:20:40.944 } 00:20:40.944 } 00:20:40.944 }' 00:20:40.944 20:34:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:20:40.944 20:34:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:20:40.944 BaseBdev2 00:20:40.944 BaseBdev3 00:20:40.944 BaseBdev4' 00:20:40.944 20:34:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:40.944 20:34:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:20:40.944 20:34:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:41.201 20:34:33 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:41.201 "name": "NewBaseBdev", 00:20:41.201 "aliases": [ 00:20:41.201 "2d1cf2c8-b1c0-44a7-a9e0-f7b918073957" 00:20:41.201 ], 00:20:41.201 "product_name": "Malloc disk", 00:20:41.201 "block_size": 512, 00:20:41.201 "num_blocks": 65536, 00:20:41.201 "uuid": "2d1cf2c8-b1c0-44a7-a9e0-f7b918073957", 00:20:41.201 "assigned_rate_limits": { 00:20:41.201 "rw_ios_per_sec": 0, 00:20:41.201 "rw_mbytes_per_sec": 0, 00:20:41.201 "r_mbytes_per_sec": 0, 00:20:41.201 "w_mbytes_per_sec": 0 00:20:41.201 }, 00:20:41.201 "claimed": true, 00:20:41.201 "claim_type": "exclusive_write", 00:20:41.201 "zoned": false, 00:20:41.201 "supported_io_types": { 00:20:41.201 "read": true, 00:20:41.201 "write": true, 00:20:41.201 "unmap": true, 00:20:41.201 "flush": true, 00:20:41.201 "reset": true, 00:20:41.201 "nvme_admin": false, 00:20:41.201 "nvme_io": false, 00:20:41.201 "nvme_io_md": false, 00:20:41.201 "write_zeroes": true, 00:20:41.201 "zcopy": true, 00:20:41.201 "get_zone_info": false, 00:20:41.201 "zone_management": false, 00:20:41.201 "zone_append": false, 00:20:41.201 "compare": false, 00:20:41.201 "compare_and_write": false, 00:20:41.201 "abort": true, 00:20:41.201 "seek_hole": false, 00:20:41.201 "seek_data": false, 00:20:41.201 "copy": true, 00:20:41.201 "nvme_iov_md": false 00:20:41.201 }, 00:20:41.201 "memory_domains": [ 00:20:41.201 { 00:20:41.201 "dma_device_id": "system", 00:20:41.201 "dma_device_type": 1 00:20:41.201 }, 00:20:41.201 { 00:20:41.201 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:41.201 "dma_device_type": 2 00:20:41.201 } 00:20:41.201 ], 00:20:41.201 "driver_specific": {} 00:20:41.201 }' 00:20:41.201 20:34:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:41.459 20:34:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:41.459 20:34:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 
== 512 ]] 00:20:41.459 20:34:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:41.459 20:34:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:41.459 20:34:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:41.459 20:34:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:41.459 20:34:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:41.717 20:34:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:41.717 20:34:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:41.717 20:34:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:41.717 20:34:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:41.717 20:34:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:41.717 20:34:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:20:41.717 20:34:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:41.976 20:34:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:41.976 "name": "BaseBdev2", 00:20:41.976 "aliases": [ 00:20:41.976 "28e36554-a60f-4315-af4f-452f3c60d6bf" 00:20:41.976 ], 00:20:41.976 "product_name": "Malloc disk", 00:20:41.976 "block_size": 512, 00:20:41.976 "num_blocks": 65536, 00:20:41.976 "uuid": "28e36554-a60f-4315-af4f-452f3c60d6bf", 00:20:41.976 "assigned_rate_limits": { 00:20:41.976 "rw_ios_per_sec": 0, 00:20:41.976 "rw_mbytes_per_sec": 0, 00:20:41.976 "r_mbytes_per_sec": 0, 00:20:41.976 "w_mbytes_per_sec": 0 00:20:41.976 }, 00:20:41.976 "claimed": true, 00:20:41.976 
"claim_type": "exclusive_write", 00:20:41.976 "zoned": false, 00:20:41.976 "supported_io_types": { 00:20:41.976 "read": true, 00:20:41.976 "write": true, 00:20:41.976 "unmap": true, 00:20:41.976 "flush": true, 00:20:41.976 "reset": true, 00:20:41.976 "nvme_admin": false, 00:20:41.976 "nvme_io": false, 00:20:41.976 "nvme_io_md": false, 00:20:41.976 "write_zeroes": true, 00:20:41.976 "zcopy": true, 00:20:41.976 "get_zone_info": false, 00:20:41.976 "zone_management": false, 00:20:41.976 "zone_append": false, 00:20:41.976 "compare": false, 00:20:41.976 "compare_and_write": false, 00:20:41.976 "abort": true, 00:20:41.976 "seek_hole": false, 00:20:41.976 "seek_data": false, 00:20:41.976 "copy": true, 00:20:41.976 "nvme_iov_md": false 00:20:41.976 }, 00:20:41.976 "memory_domains": [ 00:20:41.976 { 00:20:41.976 "dma_device_id": "system", 00:20:41.976 "dma_device_type": 1 00:20:41.976 }, 00:20:41.976 { 00:20:41.976 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:41.976 "dma_device_type": 2 00:20:41.976 } 00:20:41.976 ], 00:20:41.976 "driver_specific": {} 00:20:41.976 }' 00:20:41.976 20:34:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:41.976 20:34:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:41.976 20:34:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:41.976 20:34:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:42.234 20:34:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:42.234 20:34:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:42.235 20:34:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:42.235 20:34:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:42.235 20:34:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == 
null ]] 00:20:42.235 20:34:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:42.235 20:34:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:42.235 20:34:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:42.235 20:34:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:42.235 20:34:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:20:42.235 20:34:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:42.549 20:34:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:42.549 "name": "BaseBdev3", 00:20:42.549 "aliases": [ 00:20:42.549 "0298f55f-9100-4f52-8058-2c547ef3a067" 00:20:42.549 ], 00:20:42.549 "product_name": "Malloc disk", 00:20:42.549 "block_size": 512, 00:20:42.549 "num_blocks": 65536, 00:20:42.549 "uuid": "0298f55f-9100-4f52-8058-2c547ef3a067", 00:20:42.549 "assigned_rate_limits": { 00:20:42.549 "rw_ios_per_sec": 0, 00:20:42.549 "rw_mbytes_per_sec": 0, 00:20:42.549 "r_mbytes_per_sec": 0, 00:20:42.549 "w_mbytes_per_sec": 0 00:20:42.549 }, 00:20:42.549 "claimed": true, 00:20:42.549 "claim_type": "exclusive_write", 00:20:42.549 "zoned": false, 00:20:42.549 "supported_io_types": { 00:20:42.549 "read": true, 00:20:42.549 "write": true, 00:20:42.549 "unmap": true, 00:20:42.549 "flush": true, 00:20:42.549 "reset": true, 00:20:42.549 "nvme_admin": false, 00:20:42.549 "nvme_io": false, 00:20:42.549 "nvme_io_md": false, 00:20:42.549 "write_zeroes": true, 00:20:42.549 "zcopy": true, 00:20:42.549 "get_zone_info": false, 00:20:42.549 "zone_management": false, 00:20:42.549 "zone_append": false, 00:20:42.549 "compare": false, 00:20:42.549 "compare_and_write": false, 00:20:42.549 "abort": true, 00:20:42.549 
"seek_hole": false, 00:20:42.549 "seek_data": false, 00:20:42.549 "copy": true, 00:20:42.549 "nvme_iov_md": false 00:20:42.549 }, 00:20:42.549 "memory_domains": [ 00:20:42.549 { 00:20:42.549 "dma_device_id": "system", 00:20:42.549 "dma_device_type": 1 00:20:42.549 }, 00:20:42.549 { 00:20:42.549 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:42.549 "dma_device_type": 2 00:20:42.549 } 00:20:42.549 ], 00:20:42.549 "driver_specific": {} 00:20:42.549 }' 00:20:42.549 20:34:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:42.807 20:34:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:42.807 20:34:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:42.807 20:34:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:42.807 20:34:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:42.807 20:34:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:42.807 20:34:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:42.807 20:34:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:42.807 20:34:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:42.807 20:34:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:42.807 20:34:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:43.065 20:34:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:43.065 20:34:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:43.065 20:34:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b 
BaseBdev4 00:20:43.065 20:34:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:43.065 20:34:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:43.065 "name": "BaseBdev4", 00:20:43.065 "aliases": [ 00:20:43.065 "4b975e4b-3a57-4ab7-9621-d4998be37dbe" 00:20:43.065 ], 00:20:43.065 "product_name": "Malloc disk", 00:20:43.065 "block_size": 512, 00:20:43.065 "num_blocks": 65536, 00:20:43.065 "uuid": "4b975e4b-3a57-4ab7-9621-d4998be37dbe", 00:20:43.065 "assigned_rate_limits": { 00:20:43.065 "rw_ios_per_sec": 0, 00:20:43.065 "rw_mbytes_per_sec": 0, 00:20:43.065 "r_mbytes_per_sec": 0, 00:20:43.065 "w_mbytes_per_sec": 0 00:20:43.065 }, 00:20:43.065 "claimed": true, 00:20:43.065 "claim_type": "exclusive_write", 00:20:43.065 "zoned": false, 00:20:43.065 "supported_io_types": { 00:20:43.065 "read": true, 00:20:43.065 "write": true, 00:20:43.065 "unmap": true, 00:20:43.065 "flush": true, 00:20:43.065 "reset": true, 00:20:43.065 "nvme_admin": false, 00:20:43.065 "nvme_io": false, 00:20:43.065 "nvme_io_md": false, 00:20:43.065 "write_zeroes": true, 00:20:43.065 "zcopy": true, 00:20:43.065 "get_zone_info": false, 00:20:43.065 "zone_management": false, 00:20:43.065 "zone_append": false, 00:20:43.065 "compare": false, 00:20:43.065 "compare_and_write": false, 00:20:43.065 "abort": true, 00:20:43.065 "seek_hole": false, 00:20:43.065 "seek_data": false, 00:20:43.065 "copy": true, 00:20:43.065 "nvme_iov_md": false 00:20:43.065 }, 00:20:43.065 "memory_domains": [ 00:20:43.065 { 00:20:43.065 "dma_device_id": "system", 00:20:43.065 "dma_device_type": 1 00:20:43.065 }, 00:20:43.065 { 00:20:43.065 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:43.065 "dma_device_type": 2 00:20:43.065 } 00:20:43.065 ], 00:20:43.065 "driver_specific": {} 00:20:43.065 }' 00:20:43.065 20:34:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:43.323 20:34:35 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:43.323 20:34:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:43.323 20:34:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:43.323 20:34:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:43.323 20:34:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:43.323 20:34:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:43.582 20:34:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:43.582 20:34:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:43.582 20:34:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:43.582 20:34:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:43.582 20:34:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:43.582 20:34:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:20:43.841 [2024-07-15 20:34:36.074949] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:20:43.841 [2024-07-15 20:34:36.074977] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:43.841 [2024-07-15 20:34:36.075029] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:43.841 [2024-07-15 20:34:36.075091] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:43.841 [2024-07-15 20:34:36.075104] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1cb8040 name Existed_Raid, state offline 00:20:43.841 20:34:36 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 1428552 00:20:43.841 20:34:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 1428552 ']' 00:20:43.841 20:34:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 1428552 00:20:43.841 20:34:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:20:43.841 20:34:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:20:43.841 20:34:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1428552 00:20:43.841 20:34:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:20:43.841 20:34:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:20:43.841 20:34:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1428552' 00:20:43.841 killing process with pid 1428552 00:20:43.841 20:34:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 1428552 00:20:43.841 [2024-07-15 20:34:36.149731] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:43.841 20:34:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 1428552 00:20:43.841 [2024-07-15 20:34:36.187189] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:44.100 20:34:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:20:44.100 00:20:44.100 real 0m33.601s 00:20:44.100 user 1m1.643s 00:20:44.100 sys 0m5.982s 00:20:44.100 20:34:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:20:44.100 20:34:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:44.100 ************************************ 00:20:44.100 END TEST raid_state_function_test 
00:20:44.100 ************************************ 00:20:44.100 20:34:36 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:20:44.100 20:34:36 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test concat 4 true 00:20:44.100 20:34:36 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:20:44.100 20:34:36 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:20:44.100 20:34:36 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:44.360 ************************************ 00:20:44.360 START TEST raid_state_function_test_sb 00:20:44.360 ************************************ 00:20:44.360 20:34:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 4 true 00:20:44.360 20:34:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:20:44.360 20:34:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:20:44.360 20:34:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:20:44.360 20:34:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:20:44.360 20:34:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:20:44.360 20:34:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:44.360 20:34:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:20:44.360 20:34:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:44.360 20:34:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:44.360 20:34:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:20:44.360 20:34:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 
00:20:44.360 20:34:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:44.360 20:34:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:20:44.360 20:34:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:44.360 20:34:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:44.360 20:34:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:20:44.360 20:34:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:44.360 20:34:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:44.360 20:34:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:20:44.360 20:34:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:20:44.360 20:34:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:20:44.360 20:34:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:20:44.360 20:34:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:20:44.360 20:34:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:20:44.360 20:34:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:20:44.360 20:34:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:20:44.360 20:34:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:20:44.360 20:34:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:20:44.360 20:34:36 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:20:44.360 20:34:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=1433449 00:20:44.360 20:34:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1433449' 00:20:44.360 Process raid pid: 1433449 00:20:44.360 20:34:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:20:44.360 20:34:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 1433449 /var/tmp/spdk-raid.sock 00:20:44.360 20:34:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 1433449 ']' 00:20:44.360 20:34:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:44.360 20:34:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:20:44.360 20:34:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:20:44.360 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:20:44.360 20:34:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:20:44.360 20:34:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:44.360 [2024-07-15 20:34:36.577264] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
00:20:44.360 [2024-07-15 20:34:36.577336] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:20:44.360 [2024-07-15 20:34:36.712392] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:44.619 [2024-07-15 20:34:36.815603] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:20:44.619 [2024-07-15 20:34:36.882583] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:44.619 [2024-07-15 20:34:36.882623] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:45.222 20:34:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:20:45.222 20:34:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:20:45.222 20:34:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:45.485 [2024-07-15 20:34:37.738332] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:20:45.486 [2024-07-15 20:34:37.738377] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:20:45.486 [2024-07-15 20:34:37.738388] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:20:45.486 [2024-07-15 20:34:37.738400] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:20:45.486 [2024-07-15 20:34:37.738408] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:20:45.486 [2024-07-15 20:34:37.738419] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist 
now 00:20:45.486 [2024-07-15 20:34:37.738428] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:20:45.486 [2024-07-15 20:34:37.738439] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:20:45.486 20:34:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:45.486 20:34:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:45.486 20:34:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:45.486 20:34:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:45.486 20:34:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:45.486 20:34:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:45.486 20:34:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:45.486 20:34:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:45.486 20:34:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:45.486 20:34:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:45.486 20:34:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:45.486 20:34:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:45.747 20:34:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:45.747 "name": "Existed_Raid", 00:20:45.747 "uuid": 
"6f7d901a-0be5-4658-8d2c-cf83c925fdb9", 00:20:45.747 "strip_size_kb": 64, 00:20:45.747 "state": "configuring", 00:20:45.747 "raid_level": "concat", 00:20:45.747 "superblock": true, 00:20:45.747 "num_base_bdevs": 4, 00:20:45.747 "num_base_bdevs_discovered": 0, 00:20:45.747 "num_base_bdevs_operational": 4, 00:20:45.747 "base_bdevs_list": [ 00:20:45.747 { 00:20:45.747 "name": "BaseBdev1", 00:20:45.747 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:45.747 "is_configured": false, 00:20:45.747 "data_offset": 0, 00:20:45.747 "data_size": 0 00:20:45.747 }, 00:20:45.747 { 00:20:45.747 "name": "BaseBdev2", 00:20:45.747 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:45.747 "is_configured": false, 00:20:45.747 "data_offset": 0, 00:20:45.747 "data_size": 0 00:20:45.747 }, 00:20:45.747 { 00:20:45.747 "name": "BaseBdev3", 00:20:45.747 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:45.747 "is_configured": false, 00:20:45.747 "data_offset": 0, 00:20:45.747 "data_size": 0 00:20:45.747 }, 00:20:45.747 { 00:20:45.747 "name": "BaseBdev4", 00:20:45.748 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:45.748 "is_configured": false, 00:20:45.748 "data_offset": 0, 00:20:45.748 "data_size": 0 00:20:45.748 } 00:20:45.748 ] 00:20:45.748 }' 00:20:45.748 20:34:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:45.748 20:34:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:46.315 20:34:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:20:46.574 [2024-07-15 20:34:38.845103] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:20:46.574 [2024-07-15 20:34:38.845135] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x10b6aa0 name Existed_Raid, state configuring 00:20:46.574 20:34:38 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:46.833 [2024-07-15 20:34:39.093789] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:20:46.833 [2024-07-15 20:34:39.093818] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:20:46.833 [2024-07-15 20:34:39.093828] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:20:46.833 [2024-07-15 20:34:39.093839] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:20:46.833 [2024-07-15 20:34:39.093848] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:20:46.833 [2024-07-15 20:34:39.093859] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:20:46.833 [2024-07-15 20:34:39.093867] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:20:46.833 [2024-07-15 20:34:39.093878] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:20:46.833 20:34:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:20:47.093 [2024-07-15 20:34:39.352246] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:47.093 BaseBdev1 00:20:47.093 20:34:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:20:47.093 20:34:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:20:47.093 20:34:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 
-- # local bdev_timeout= 00:20:47.093 20:34:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:20:47.093 20:34:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:47.093 20:34:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:47.093 20:34:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:47.352 20:34:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:20:47.611 [ 00:20:47.611 { 00:20:47.611 "name": "BaseBdev1", 00:20:47.611 "aliases": [ 00:20:47.611 "e441f4a6-5a9d-413a-a742-efb079441b66" 00:20:47.611 ], 00:20:47.611 "product_name": "Malloc disk", 00:20:47.611 "block_size": 512, 00:20:47.611 "num_blocks": 65536, 00:20:47.611 "uuid": "e441f4a6-5a9d-413a-a742-efb079441b66", 00:20:47.611 "assigned_rate_limits": { 00:20:47.611 "rw_ios_per_sec": 0, 00:20:47.611 "rw_mbytes_per_sec": 0, 00:20:47.611 "r_mbytes_per_sec": 0, 00:20:47.611 "w_mbytes_per_sec": 0 00:20:47.611 }, 00:20:47.611 "claimed": true, 00:20:47.611 "claim_type": "exclusive_write", 00:20:47.611 "zoned": false, 00:20:47.611 "supported_io_types": { 00:20:47.611 "read": true, 00:20:47.611 "write": true, 00:20:47.611 "unmap": true, 00:20:47.611 "flush": true, 00:20:47.611 "reset": true, 00:20:47.611 "nvme_admin": false, 00:20:47.611 "nvme_io": false, 00:20:47.611 "nvme_io_md": false, 00:20:47.611 "write_zeroes": true, 00:20:47.611 "zcopy": true, 00:20:47.611 "get_zone_info": false, 00:20:47.611 "zone_management": false, 00:20:47.611 "zone_append": false, 00:20:47.611 "compare": false, 00:20:47.611 "compare_and_write": false, 00:20:47.611 "abort": true, 00:20:47.611 "seek_hole": 
false, 00:20:47.611 "seek_data": false, 00:20:47.611 "copy": true, 00:20:47.611 "nvme_iov_md": false 00:20:47.611 }, 00:20:47.611 "memory_domains": [ 00:20:47.611 { 00:20:47.611 "dma_device_id": "system", 00:20:47.611 "dma_device_type": 1 00:20:47.611 }, 00:20:47.611 { 00:20:47.611 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:47.611 "dma_device_type": 2 00:20:47.611 } 00:20:47.611 ], 00:20:47.611 "driver_specific": {} 00:20:47.611 } 00:20:47.611 ] 00:20:47.611 20:34:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:20:47.611 20:34:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:47.611 20:34:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:47.611 20:34:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:47.611 20:34:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:47.611 20:34:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:47.611 20:34:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:47.611 20:34:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:47.611 20:34:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:47.611 20:34:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:47.611 20:34:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:47.611 20:34:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:47.611 20:34:39 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:47.871 20:34:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:47.871 "name": "Existed_Raid", 00:20:47.871 "uuid": "9155bb22-af28-4d74-a89b-72c36ecca09e", 00:20:47.871 "strip_size_kb": 64, 00:20:47.871 "state": "configuring", 00:20:47.871 "raid_level": "concat", 00:20:47.871 "superblock": true, 00:20:47.871 "num_base_bdevs": 4, 00:20:47.871 "num_base_bdevs_discovered": 1, 00:20:47.871 "num_base_bdevs_operational": 4, 00:20:47.871 "base_bdevs_list": [ 00:20:47.871 { 00:20:47.871 "name": "BaseBdev1", 00:20:47.871 "uuid": "e441f4a6-5a9d-413a-a742-efb079441b66", 00:20:47.871 "is_configured": true, 00:20:47.871 "data_offset": 2048, 00:20:47.871 "data_size": 63488 00:20:47.871 }, 00:20:47.871 { 00:20:47.871 "name": "BaseBdev2", 00:20:47.871 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:47.871 "is_configured": false, 00:20:47.871 "data_offset": 0, 00:20:47.871 "data_size": 0 00:20:47.871 }, 00:20:47.871 { 00:20:47.871 "name": "BaseBdev3", 00:20:47.871 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:47.871 "is_configured": false, 00:20:47.871 "data_offset": 0, 00:20:47.871 "data_size": 0 00:20:47.871 }, 00:20:47.871 { 00:20:47.871 "name": "BaseBdev4", 00:20:47.871 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:47.871 "is_configured": false, 00:20:47.871 "data_offset": 0, 00:20:47.871 "data_size": 0 00:20:47.871 } 00:20:47.871 ] 00:20:47.871 }' 00:20:47.871 20:34:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:47.871 20:34:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:48.807 20:34:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:20:48.807 [2024-07-15 
20:34:41.149022] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:20:48.807 [2024-07-15 20:34:41.149071] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x10b6310 name Existed_Raid, state configuring 00:20:48.807 20:34:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:49.067 [2024-07-15 20:34:41.393721] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:49.067 [2024-07-15 20:34:41.395236] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:20:49.067 [2024-07-15 20:34:41.395272] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:20:49.067 [2024-07-15 20:34:41.395283] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:20:49.067 [2024-07-15 20:34:41.395295] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:20:49.067 [2024-07-15 20:34:41.395304] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:20:49.067 [2024-07-15 20:34:41.395315] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:20:49.067 20:34:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:20:49.067 20:34:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:49.067 20:34:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:49.067 20:34:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:49.067 20:34:41 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:49.067 20:34:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:49.067 20:34:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:49.067 20:34:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:49.067 20:34:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:49.067 20:34:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:49.067 20:34:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:49.067 20:34:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:49.067 20:34:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:49.067 20:34:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:49.635 20:34:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:49.635 "name": "Existed_Raid", 00:20:49.635 "uuid": "06cc5dae-185f-4539-bfc7-195309dd9058", 00:20:49.635 "strip_size_kb": 64, 00:20:49.635 "state": "configuring", 00:20:49.635 "raid_level": "concat", 00:20:49.635 "superblock": true, 00:20:49.635 "num_base_bdevs": 4, 00:20:49.635 "num_base_bdevs_discovered": 1, 00:20:49.635 "num_base_bdevs_operational": 4, 00:20:49.635 "base_bdevs_list": [ 00:20:49.635 { 00:20:49.635 "name": "BaseBdev1", 00:20:49.635 "uuid": "e441f4a6-5a9d-413a-a742-efb079441b66", 00:20:49.635 "is_configured": true, 00:20:49.635 "data_offset": 2048, 00:20:49.635 "data_size": 63488 00:20:49.635 }, 00:20:49.635 { 00:20:49.635 "name": "BaseBdev2", 00:20:49.635 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:20:49.635 "is_configured": false, 00:20:49.635 "data_offset": 0, 00:20:49.635 "data_size": 0 00:20:49.635 }, 00:20:49.635 { 00:20:49.635 "name": "BaseBdev3", 00:20:49.635 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:49.635 "is_configured": false, 00:20:49.635 "data_offset": 0, 00:20:49.635 "data_size": 0 00:20:49.635 }, 00:20:49.635 { 00:20:49.635 "name": "BaseBdev4", 00:20:49.635 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:49.635 "is_configured": false, 00:20:49.635 "data_offset": 0, 00:20:49.635 "data_size": 0 00:20:49.635 } 00:20:49.635 ] 00:20:49.635 }' 00:20:49.635 20:34:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:49.635 20:34:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:50.573 20:34:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:20:50.573 [2024-07-15 20:34:42.902469] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:50.573 BaseBdev2 00:20:50.573 20:34:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:20:50.573 20:34:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:20:50.573 20:34:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:50.573 20:34:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:20:50.573 20:34:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:50.573 20:34:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:50.573 20:34:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:50.832 20:34:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:20:51.094 [ 00:20:51.094 { 00:20:51.094 "name": "BaseBdev2", 00:20:51.094 "aliases": [ 00:20:51.094 "8170c51b-c25e-41d3-8b85-ac3f5350a2a8" 00:20:51.094 ], 00:20:51.094 "product_name": "Malloc disk", 00:20:51.094 "block_size": 512, 00:20:51.094 "num_blocks": 65536, 00:20:51.094 "uuid": "8170c51b-c25e-41d3-8b85-ac3f5350a2a8", 00:20:51.094 "assigned_rate_limits": { 00:20:51.094 "rw_ios_per_sec": 0, 00:20:51.094 "rw_mbytes_per_sec": 0, 00:20:51.094 "r_mbytes_per_sec": 0, 00:20:51.094 "w_mbytes_per_sec": 0 00:20:51.094 }, 00:20:51.094 "claimed": true, 00:20:51.094 "claim_type": "exclusive_write", 00:20:51.094 "zoned": false, 00:20:51.094 "supported_io_types": { 00:20:51.094 "read": true, 00:20:51.094 "write": true, 00:20:51.094 "unmap": true, 00:20:51.094 "flush": true, 00:20:51.094 "reset": true, 00:20:51.094 "nvme_admin": false, 00:20:51.094 "nvme_io": false, 00:20:51.094 "nvme_io_md": false, 00:20:51.094 "write_zeroes": true, 00:20:51.094 "zcopy": true, 00:20:51.094 "get_zone_info": false, 00:20:51.094 "zone_management": false, 00:20:51.094 "zone_append": false, 00:20:51.094 "compare": false, 00:20:51.094 "compare_and_write": false, 00:20:51.094 "abort": true, 00:20:51.094 "seek_hole": false, 00:20:51.094 "seek_data": false, 00:20:51.094 "copy": true, 00:20:51.094 "nvme_iov_md": false 00:20:51.094 }, 00:20:51.094 "memory_domains": [ 00:20:51.094 { 00:20:51.094 "dma_device_id": "system", 00:20:51.094 "dma_device_type": 1 00:20:51.094 }, 00:20:51.094 { 00:20:51.094 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:51.094 "dma_device_type": 2 00:20:51.094 } 00:20:51.094 ], 00:20:51.094 "driver_specific": {} 00:20:51.094 } 00:20:51.094 ] 
00:20:51.094 20:34:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:20:51.094 20:34:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:20:51.094 20:34:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:51.094 20:34:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:51.094 20:34:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:51.094 20:34:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:51.094 20:34:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:51.094 20:34:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:51.094 20:34:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:51.094 20:34:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:51.094 20:34:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:51.094 20:34:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:51.094 20:34:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:51.094 20:34:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:51.094 20:34:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:51.354 20:34:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:51.354 "name": "Existed_Raid", 
00:20:51.354 "uuid": "06cc5dae-185f-4539-bfc7-195309dd9058", 00:20:51.354 "strip_size_kb": 64, 00:20:51.354 "state": "configuring", 00:20:51.354 "raid_level": "concat", 00:20:51.354 "superblock": true, 00:20:51.354 "num_base_bdevs": 4, 00:20:51.354 "num_base_bdevs_discovered": 2, 00:20:51.354 "num_base_bdevs_operational": 4, 00:20:51.354 "base_bdevs_list": [ 00:20:51.354 { 00:20:51.354 "name": "BaseBdev1", 00:20:51.354 "uuid": "e441f4a6-5a9d-413a-a742-efb079441b66", 00:20:51.354 "is_configured": true, 00:20:51.354 "data_offset": 2048, 00:20:51.354 "data_size": 63488 00:20:51.354 }, 00:20:51.354 { 00:20:51.354 "name": "BaseBdev2", 00:20:51.354 "uuid": "8170c51b-c25e-41d3-8b85-ac3f5350a2a8", 00:20:51.354 "is_configured": true, 00:20:51.354 "data_offset": 2048, 00:20:51.354 "data_size": 63488 00:20:51.354 }, 00:20:51.354 { 00:20:51.354 "name": "BaseBdev3", 00:20:51.354 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:51.354 "is_configured": false, 00:20:51.354 "data_offset": 0, 00:20:51.354 "data_size": 0 00:20:51.354 }, 00:20:51.354 { 00:20:51.354 "name": "BaseBdev4", 00:20:51.354 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:51.354 "is_configured": false, 00:20:51.354 "data_offset": 0, 00:20:51.354 "data_size": 0 00:20:51.354 } 00:20:51.354 ] 00:20:51.354 }' 00:20:51.354 20:34:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:51.354 20:34:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:52.290 20:34:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:20:52.549 [2024-07-15 20:34:44.766818] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:52.549 BaseBdev3 00:20:52.549 20:34:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:20:52.549 
20:34:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:20:52.549 20:34:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:52.549 20:34:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:20:52.549 20:34:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:52.549 20:34:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:52.549 20:34:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:52.809 20:34:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:20:53.068 [ 00:20:53.068 { 00:20:53.068 "name": "BaseBdev3", 00:20:53.068 "aliases": [ 00:20:53.068 "61b29014-f552-402c-82b2-a5e9c906a19f" 00:20:53.068 ], 00:20:53.068 "product_name": "Malloc disk", 00:20:53.068 "block_size": 512, 00:20:53.068 "num_blocks": 65536, 00:20:53.068 "uuid": "61b29014-f552-402c-82b2-a5e9c906a19f", 00:20:53.068 "assigned_rate_limits": { 00:20:53.068 "rw_ios_per_sec": 0, 00:20:53.068 "rw_mbytes_per_sec": 0, 00:20:53.068 "r_mbytes_per_sec": 0, 00:20:53.068 "w_mbytes_per_sec": 0 00:20:53.068 }, 00:20:53.068 "claimed": true, 00:20:53.068 "claim_type": "exclusive_write", 00:20:53.068 "zoned": false, 00:20:53.068 "supported_io_types": { 00:20:53.068 "read": true, 00:20:53.068 "write": true, 00:20:53.068 "unmap": true, 00:20:53.068 "flush": true, 00:20:53.068 "reset": true, 00:20:53.068 "nvme_admin": false, 00:20:53.068 "nvme_io": false, 00:20:53.068 "nvme_io_md": false, 00:20:53.068 "write_zeroes": true, 00:20:53.068 "zcopy": true, 00:20:53.068 "get_zone_info": 
false, 00:20:53.068 "zone_management": false, 00:20:53.068 "zone_append": false, 00:20:53.068 "compare": false, 00:20:53.068 "compare_and_write": false, 00:20:53.068 "abort": true, 00:20:53.068 "seek_hole": false, 00:20:53.068 "seek_data": false, 00:20:53.068 "copy": true, 00:20:53.068 "nvme_iov_md": false 00:20:53.068 }, 00:20:53.068 "memory_domains": [ 00:20:53.068 { 00:20:53.068 "dma_device_id": "system", 00:20:53.068 "dma_device_type": 1 00:20:53.068 }, 00:20:53.068 { 00:20:53.068 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:53.068 "dma_device_type": 2 00:20:53.068 } 00:20:53.068 ], 00:20:53.068 "driver_specific": {} 00:20:53.068 } 00:20:53.068 ] 00:20:53.068 20:34:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:20:53.068 20:34:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:20:53.068 20:34:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:53.068 20:34:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:53.068 20:34:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:53.068 20:34:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:53.068 20:34:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:53.068 20:34:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:53.068 20:34:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:53.068 20:34:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:53.068 20:34:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:53.069 20:34:45 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:53.069 20:34:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:53.069 20:34:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:53.069 20:34:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:53.328 20:34:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:53.328 "name": "Existed_Raid", 00:20:53.328 "uuid": "06cc5dae-185f-4539-bfc7-195309dd9058", 00:20:53.328 "strip_size_kb": 64, 00:20:53.328 "state": "configuring", 00:20:53.328 "raid_level": "concat", 00:20:53.328 "superblock": true, 00:20:53.328 "num_base_bdevs": 4, 00:20:53.328 "num_base_bdevs_discovered": 3, 00:20:53.328 "num_base_bdevs_operational": 4, 00:20:53.328 "base_bdevs_list": [ 00:20:53.328 { 00:20:53.328 "name": "BaseBdev1", 00:20:53.328 "uuid": "e441f4a6-5a9d-413a-a742-efb079441b66", 00:20:53.328 "is_configured": true, 00:20:53.328 "data_offset": 2048, 00:20:53.328 "data_size": 63488 00:20:53.328 }, 00:20:53.328 { 00:20:53.328 "name": "BaseBdev2", 00:20:53.328 "uuid": "8170c51b-c25e-41d3-8b85-ac3f5350a2a8", 00:20:53.328 "is_configured": true, 00:20:53.328 "data_offset": 2048, 00:20:53.328 "data_size": 63488 00:20:53.328 }, 00:20:53.328 { 00:20:53.328 "name": "BaseBdev3", 00:20:53.328 "uuid": "61b29014-f552-402c-82b2-a5e9c906a19f", 00:20:53.328 "is_configured": true, 00:20:53.328 "data_offset": 2048, 00:20:53.328 "data_size": 63488 00:20:53.328 }, 00:20:53.328 { 00:20:53.328 "name": "BaseBdev4", 00:20:53.328 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:53.328 "is_configured": false, 00:20:53.328 "data_offset": 0, 00:20:53.328 "data_size": 0 00:20:53.328 } 00:20:53.328 ] 00:20:53.328 }' 00:20:53.328 
20:34:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:53.328 20:34:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:53.896 20:34:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:20:54.463 [2024-07-15 20:34:46.623207] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:54.463 [2024-07-15 20:34:46.623391] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x10b7350 00:20:54.463 [2024-07-15 20:34:46.623406] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:20:54.463 [2024-07-15 20:34:46.623585] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x10b7020 00:20:54.463 [2024-07-15 20:34:46.623703] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x10b7350 00:20:54.463 [2024-07-15 20:34:46.623715] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x10b7350 00:20:54.463 [2024-07-15 20:34:46.623810] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:54.463 BaseBdev4 00:20:54.463 20:34:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:20:54.463 20:34:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:20:54.463 20:34:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:54.463 20:34:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:20:54.463 20:34:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:54.463 20:34:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # 
bdev_timeout=2000 00:20:54.463 20:34:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:55.031 20:34:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:20:55.290 [ 00:20:55.290 { 00:20:55.290 "name": "BaseBdev4", 00:20:55.290 "aliases": [ 00:20:55.290 "87d46065-4281-493a-8c9e-229f34a7aab2" 00:20:55.290 ], 00:20:55.290 "product_name": "Malloc disk", 00:20:55.290 "block_size": 512, 00:20:55.290 "num_blocks": 65536, 00:20:55.290 "uuid": "87d46065-4281-493a-8c9e-229f34a7aab2", 00:20:55.290 "assigned_rate_limits": { 00:20:55.290 "rw_ios_per_sec": 0, 00:20:55.290 "rw_mbytes_per_sec": 0, 00:20:55.290 "r_mbytes_per_sec": 0, 00:20:55.290 "w_mbytes_per_sec": 0 00:20:55.290 }, 00:20:55.290 "claimed": true, 00:20:55.290 "claim_type": "exclusive_write", 00:20:55.290 "zoned": false, 00:20:55.290 "supported_io_types": { 00:20:55.290 "read": true, 00:20:55.290 "write": true, 00:20:55.290 "unmap": true, 00:20:55.290 "flush": true, 00:20:55.290 "reset": true, 00:20:55.290 "nvme_admin": false, 00:20:55.290 "nvme_io": false, 00:20:55.290 "nvme_io_md": false, 00:20:55.290 "write_zeroes": true, 00:20:55.290 "zcopy": true, 00:20:55.290 "get_zone_info": false, 00:20:55.290 "zone_management": false, 00:20:55.290 "zone_append": false, 00:20:55.290 "compare": false, 00:20:55.290 "compare_and_write": false, 00:20:55.290 "abort": true, 00:20:55.290 "seek_hole": false, 00:20:55.290 "seek_data": false, 00:20:55.290 "copy": true, 00:20:55.290 "nvme_iov_md": false 00:20:55.291 }, 00:20:55.291 "memory_domains": [ 00:20:55.291 { 00:20:55.291 "dma_device_id": "system", 00:20:55.291 "dma_device_type": 1 00:20:55.291 }, 00:20:55.291 { 00:20:55.291 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:55.291 
"dma_device_type": 2 00:20:55.291 } 00:20:55.291 ], 00:20:55.291 "driver_specific": {} 00:20:55.291 } 00:20:55.291 ] 00:20:55.550 20:34:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:20:55.550 20:34:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:20:55.550 20:34:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:55.550 20:34:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:20:55.550 20:34:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:55.550 20:34:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:55.550 20:34:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:55.550 20:34:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:55.550 20:34:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:55.550 20:34:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:55.550 20:34:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:55.550 20:34:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:55.550 20:34:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:55.550 20:34:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:55.550 20:34:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:55.810 20:34:47 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:55.810 "name": "Existed_Raid", 00:20:55.810 "uuid": "06cc5dae-185f-4539-bfc7-195309dd9058", 00:20:55.810 "strip_size_kb": 64, 00:20:55.810 "state": "online", 00:20:55.810 "raid_level": "concat", 00:20:55.810 "superblock": true, 00:20:55.810 "num_base_bdevs": 4, 00:20:55.810 "num_base_bdevs_discovered": 4, 00:20:55.810 "num_base_bdevs_operational": 4, 00:20:55.810 "base_bdevs_list": [ 00:20:55.810 { 00:20:55.810 "name": "BaseBdev1", 00:20:55.810 "uuid": "e441f4a6-5a9d-413a-a742-efb079441b66", 00:20:55.810 "is_configured": true, 00:20:55.810 "data_offset": 2048, 00:20:55.810 "data_size": 63488 00:20:55.810 }, 00:20:55.810 { 00:20:55.810 "name": "BaseBdev2", 00:20:55.810 "uuid": "8170c51b-c25e-41d3-8b85-ac3f5350a2a8", 00:20:55.810 "is_configured": true, 00:20:55.810 "data_offset": 2048, 00:20:55.810 "data_size": 63488 00:20:55.810 }, 00:20:55.810 { 00:20:55.810 "name": "BaseBdev3", 00:20:55.810 "uuid": "61b29014-f552-402c-82b2-a5e9c906a19f", 00:20:55.810 "is_configured": true, 00:20:55.810 "data_offset": 2048, 00:20:55.810 "data_size": 63488 00:20:55.810 }, 00:20:55.810 { 00:20:55.810 "name": "BaseBdev4", 00:20:55.810 "uuid": "87d46065-4281-493a-8c9e-229f34a7aab2", 00:20:55.810 "is_configured": true, 00:20:55.810 "data_offset": 2048, 00:20:55.810 "data_size": 63488 00:20:55.810 } 00:20:55.810 ] 00:20:55.810 }' 00:20:55.810 20:34:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:55.810 20:34:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:56.748 20:34:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:20:56.748 20:34:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:20:56.748 20:34:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 
00:20:56.748 20:34:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:20:56.748 20:34:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:20:56.748 20:34:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:20:56.748 20:34:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:20:56.748 20:34:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:20:56.748 [2024-07-15 20:34:48.929653] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:56.748 20:34:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:20:56.748 "name": "Existed_Raid", 00:20:56.748 "aliases": [ 00:20:56.748 "06cc5dae-185f-4539-bfc7-195309dd9058" 00:20:56.748 ], 00:20:56.748 "product_name": "Raid Volume", 00:20:56.748 "block_size": 512, 00:20:56.748 "num_blocks": 253952, 00:20:56.748 "uuid": "06cc5dae-185f-4539-bfc7-195309dd9058", 00:20:56.748 "assigned_rate_limits": { 00:20:56.748 "rw_ios_per_sec": 0, 00:20:56.748 "rw_mbytes_per_sec": 0, 00:20:56.748 "r_mbytes_per_sec": 0, 00:20:56.748 "w_mbytes_per_sec": 0 00:20:56.748 }, 00:20:56.748 "claimed": false, 00:20:56.748 "zoned": false, 00:20:56.748 "supported_io_types": { 00:20:56.748 "read": true, 00:20:56.748 "write": true, 00:20:56.748 "unmap": true, 00:20:56.748 "flush": true, 00:20:56.748 "reset": true, 00:20:56.748 "nvme_admin": false, 00:20:56.748 "nvme_io": false, 00:20:56.748 "nvme_io_md": false, 00:20:56.748 "write_zeroes": true, 00:20:56.748 "zcopy": false, 00:20:56.748 "get_zone_info": false, 00:20:56.748 "zone_management": false, 00:20:56.748 "zone_append": false, 00:20:56.748 "compare": false, 00:20:56.748 "compare_and_write": false, 00:20:56.748 "abort": false, 00:20:56.748 "seek_hole": 
false, 00:20:56.748 "seek_data": false, 00:20:56.748 "copy": false, 00:20:56.748 "nvme_iov_md": false 00:20:56.748 }, 00:20:56.748 "memory_domains": [ 00:20:56.748 { 00:20:56.748 "dma_device_id": "system", 00:20:56.748 "dma_device_type": 1 00:20:56.748 }, 00:20:56.748 { 00:20:56.748 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:56.748 "dma_device_type": 2 00:20:56.748 }, 00:20:56.748 { 00:20:56.748 "dma_device_id": "system", 00:20:56.748 "dma_device_type": 1 00:20:56.748 }, 00:20:56.748 { 00:20:56.748 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:56.748 "dma_device_type": 2 00:20:56.748 }, 00:20:56.748 { 00:20:56.748 "dma_device_id": "system", 00:20:56.748 "dma_device_type": 1 00:20:56.748 }, 00:20:56.748 { 00:20:56.748 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:56.748 "dma_device_type": 2 00:20:56.748 }, 00:20:56.748 { 00:20:56.748 "dma_device_id": "system", 00:20:56.748 "dma_device_type": 1 00:20:56.748 }, 00:20:56.748 { 00:20:56.748 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:56.748 "dma_device_type": 2 00:20:56.748 } 00:20:56.748 ], 00:20:56.748 "driver_specific": { 00:20:56.748 "raid": { 00:20:56.748 "uuid": "06cc5dae-185f-4539-bfc7-195309dd9058", 00:20:56.748 "strip_size_kb": 64, 00:20:56.748 "state": "online", 00:20:56.748 "raid_level": "concat", 00:20:56.748 "superblock": true, 00:20:56.748 "num_base_bdevs": 4, 00:20:56.748 "num_base_bdevs_discovered": 4, 00:20:56.748 "num_base_bdevs_operational": 4, 00:20:56.748 "base_bdevs_list": [ 00:20:56.748 { 00:20:56.748 "name": "BaseBdev1", 00:20:56.748 "uuid": "e441f4a6-5a9d-413a-a742-efb079441b66", 00:20:56.748 "is_configured": true, 00:20:56.748 "data_offset": 2048, 00:20:56.748 "data_size": 63488 00:20:56.748 }, 00:20:56.748 { 00:20:56.748 "name": "BaseBdev2", 00:20:56.748 "uuid": "8170c51b-c25e-41d3-8b85-ac3f5350a2a8", 00:20:56.748 "is_configured": true, 00:20:56.748 "data_offset": 2048, 00:20:56.748 "data_size": 63488 00:20:56.748 }, 00:20:56.748 { 00:20:56.748 "name": "BaseBdev3", 00:20:56.748 
"uuid": "61b29014-f552-402c-82b2-a5e9c906a19f", 00:20:56.748 "is_configured": true, 00:20:56.748 "data_offset": 2048, 00:20:56.748 "data_size": 63488 00:20:56.748 }, 00:20:56.748 { 00:20:56.748 "name": "BaseBdev4", 00:20:56.748 "uuid": "87d46065-4281-493a-8c9e-229f34a7aab2", 00:20:56.748 "is_configured": true, 00:20:56.748 "data_offset": 2048, 00:20:56.748 "data_size": 63488 00:20:56.748 } 00:20:56.748 ] 00:20:56.748 } 00:20:56.748 } 00:20:56.748 }' 00:20:56.748 20:34:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:20:56.748 20:34:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:20:56.748 BaseBdev2 00:20:56.748 BaseBdev3 00:20:56.748 BaseBdev4' 00:20:56.748 20:34:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:56.748 20:34:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:56.748 20:34:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:20:57.317 20:34:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:57.317 "name": "BaseBdev1", 00:20:57.317 "aliases": [ 00:20:57.317 "e441f4a6-5a9d-413a-a742-efb079441b66" 00:20:57.317 ], 00:20:57.317 "product_name": "Malloc disk", 00:20:57.317 "block_size": 512, 00:20:57.317 "num_blocks": 65536, 00:20:57.317 "uuid": "e441f4a6-5a9d-413a-a742-efb079441b66", 00:20:57.317 "assigned_rate_limits": { 00:20:57.317 "rw_ios_per_sec": 0, 00:20:57.317 "rw_mbytes_per_sec": 0, 00:20:57.317 "r_mbytes_per_sec": 0, 00:20:57.317 "w_mbytes_per_sec": 0 00:20:57.317 }, 00:20:57.317 "claimed": true, 00:20:57.317 "claim_type": "exclusive_write", 00:20:57.317 "zoned": false, 00:20:57.317 "supported_io_types": { 
00:20:57.317 "read": true, 00:20:57.317 "write": true, 00:20:57.317 "unmap": true, 00:20:57.317 "flush": true, 00:20:57.317 "reset": true, 00:20:57.317 "nvme_admin": false, 00:20:57.317 "nvme_io": false, 00:20:57.317 "nvme_io_md": false, 00:20:57.317 "write_zeroes": true, 00:20:57.317 "zcopy": true, 00:20:57.317 "get_zone_info": false, 00:20:57.317 "zone_management": false, 00:20:57.317 "zone_append": false, 00:20:57.317 "compare": false, 00:20:57.317 "compare_and_write": false, 00:20:57.317 "abort": true, 00:20:57.317 "seek_hole": false, 00:20:57.317 "seek_data": false, 00:20:57.317 "copy": true, 00:20:57.317 "nvme_iov_md": false 00:20:57.317 }, 00:20:57.317 "memory_domains": [ 00:20:57.317 { 00:20:57.317 "dma_device_id": "system", 00:20:57.317 "dma_device_type": 1 00:20:57.317 }, 00:20:57.317 { 00:20:57.317 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:57.317 "dma_device_type": 2 00:20:57.317 } 00:20:57.317 ], 00:20:57.317 "driver_specific": {} 00:20:57.317 }' 00:20:57.317 20:34:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:57.317 20:34:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:57.317 20:34:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:57.317 20:34:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:57.576 20:34:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:57.576 20:34:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:57.576 20:34:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:57.576 20:34:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:57.835 20:34:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:57.835 20:34:49 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:57.835 20:34:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:57.835 20:34:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:57.835 20:34:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:57.835 20:34:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:20:57.835 20:34:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:58.094 20:34:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:58.094 "name": "BaseBdev2", 00:20:58.094 "aliases": [ 00:20:58.094 "8170c51b-c25e-41d3-8b85-ac3f5350a2a8" 00:20:58.094 ], 00:20:58.094 "product_name": "Malloc disk", 00:20:58.094 "block_size": 512, 00:20:58.094 "num_blocks": 65536, 00:20:58.094 "uuid": "8170c51b-c25e-41d3-8b85-ac3f5350a2a8", 00:20:58.094 "assigned_rate_limits": { 00:20:58.094 "rw_ios_per_sec": 0, 00:20:58.094 "rw_mbytes_per_sec": 0, 00:20:58.094 "r_mbytes_per_sec": 0, 00:20:58.094 "w_mbytes_per_sec": 0 00:20:58.094 }, 00:20:58.094 "claimed": true, 00:20:58.094 "claim_type": "exclusive_write", 00:20:58.094 "zoned": false, 00:20:58.094 "supported_io_types": { 00:20:58.094 "read": true, 00:20:58.094 "write": true, 00:20:58.094 "unmap": true, 00:20:58.094 "flush": true, 00:20:58.094 "reset": true, 00:20:58.094 "nvme_admin": false, 00:20:58.094 "nvme_io": false, 00:20:58.094 "nvme_io_md": false, 00:20:58.094 "write_zeroes": true, 00:20:58.094 "zcopy": true, 00:20:58.094 "get_zone_info": false, 00:20:58.094 "zone_management": false, 00:20:58.094 "zone_append": false, 00:20:58.094 "compare": false, 00:20:58.094 "compare_and_write": false, 00:20:58.094 "abort": true, 00:20:58.094 "seek_hole": false, 00:20:58.094 "seek_data": 
false, 00:20:58.094 "copy": true, 00:20:58.094 "nvme_iov_md": false 00:20:58.094 }, 00:20:58.094 "memory_domains": [ 00:20:58.094 { 00:20:58.094 "dma_device_id": "system", 00:20:58.094 "dma_device_type": 1 00:20:58.094 }, 00:20:58.094 { 00:20:58.094 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:58.094 "dma_device_type": 2 00:20:58.094 } 00:20:58.094 ], 00:20:58.094 "driver_specific": {} 00:20:58.094 }' 00:20:58.094 20:34:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:58.094 20:34:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:58.094 20:34:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:58.094 20:34:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:58.353 20:34:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:58.353 20:34:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:58.353 20:34:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:58.353 20:34:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:58.353 20:34:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:58.353 20:34:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:58.611 20:34:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:58.612 20:34:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:58.612 20:34:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:58.612 20:34:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b 
BaseBdev3 00:20:58.612 20:34:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:58.870 20:34:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:58.870 "name": "BaseBdev3", 00:20:58.870 "aliases": [ 00:20:58.870 "61b29014-f552-402c-82b2-a5e9c906a19f" 00:20:58.870 ], 00:20:58.870 "product_name": "Malloc disk", 00:20:58.870 "block_size": 512, 00:20:58.870 "num_blocks": 65536, 00:20:58.870 "uuid": "61b29014-f552-402c-82b2-a5e9c906a19f", 00:20:58.870 "assigned_rate_limits": { 00:20:58.870 "rw_ios_per_sec": 0, 00:20:58.870 "rw_mbytes_per_sec": 0, 00:20:58.870 "r_mbytes_per_sec": 0, 00:20:58.870 "w_mbytes_per_sec": 0 00:20:58.870 }, 00:20:58.870 "claimed": true, 00:20:58.870 "claim_type": "exclusive_write", 00:20:58.870 "zoned": false, 00:20:58.870 "supported_io_types": { 00:20:58.870 "read": true, 00:20:58.870 "write": true, 00:20:58.870 "unmap": true, 00:20:58.870 "flush": true, 00:20:58.870 "reset": true, 00:20:58.870 "nvme_admin": false, 00:20:58.870 "nvme_io": false, 00:20:58.870 "nvme_io_md": false, 00:20:58.870 "write_zeroes": true, 00:20:58.870 "zcopy": true, 00:20:58.871 "get_zone_info": false, 00:20:58.871 "zone_management": false, 00:20:58.871 "zone_append": false, 00:20:58.871 "compare": false, 00:20:58.871 "compare_and_write": false, 00:20:58.871 "abort": true, 00:20:58.871 "seek_hole": false, 00:20:58.871 "seek_data": false, 00:20:58.871 "copy": true, 00:20:58.871 "nvme_iov_md": false 00:20:58.871 }, 00:20:58.871 "memory_domains": [ 00:20:58.871 { 00:20:58.871 "dma_device_id": "system", 00:20:58.871 "dma_device_type": 1 00:20:58.871 }, 00:20:58.871 { 00:20:58.871 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:58.871 "dma_device_type": 2 00:20:58.871 } 00:20:58.871 ], 00:20:58.871 "driver_specific": {} 00:20:58.871 }' 00:20:58.871 20:34:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:58.871 20:34:51 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:58.871 20:34:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:58.871 20:34:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:58.871 20:34:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:59.129 20:34:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:59.129 20:34:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:59.129 20:34:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:59.129 20:34:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:59.129 20:34:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:59.129 20:34:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:59.388 20:34:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:59.388 20:34:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:59.388 20:34:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:20:59.388 20:34:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:59.715 20:34:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:59.715 "name": "BaseBdev4", 00:20:59.715 "aliases": [ 00:20:59.715 "87d46065-4281-493a-8c9e-229f34a7aab2" 00:20:59.715 ], 00:20:59.715 "product_name": "Malloc disk", 00:20:59.715 "block_size": 512, 00:20:59.715 "num_blocks": 65536, 00:20:59.715 "uuid": "87d46065-4281-493a-8c9e-229f34a7aab2", 00:20:59.715 "assigned_rate_limits": { 00:20:59.715 
"rw_ios_per_sec": 0, 00:20:59.715 "rw_mbytes_per_sec": 0, 00:20:59.715 "r_mbytes_per_sec": 0, 00:20:59.715 "w_mbytes_per_sec": 0 00:20:59.715 }, 00:20:59.715 "claimed": true, 00:20:59.715 "claim_type": "exclusive_write", 00:20:59.715 "zoned": false, 00:20:59.715 "supported_io_types": { 00:20:59.715 "read": true, 00:20:59.715 "write": true, 00:20:59.715 "unmap": true, 00:20:59.715 "flush": true, 00:20:59.715 "reset": true, 00:20:59.715 "nvme_admin": false, 00:20:59.715 "nvme_io": false, 00:20:59.715 "nvme_io_md": false, 00:20:59.715 "write_zeroes": true, 00:20:59.715 "zcopy": true, 00:20:59.715 "get_zone_info": false, 00:20:59.715 "zone_management": false, 00:20:59.715 "zone_append": false, 00:20:59.715 "compare": false, 00:20:59.715 "compare_and_write": false, 00:20:59.715 "abort": true, 00:20:59.715 "seek_hole": false, 00:20:59.715 "seek_data": false, 00:20:59.715 "copy": true, 00:20:59.715 "nvme_iov_md": false 00:20:59.715 }, 00:20:59.715 "memory_domains": [ 00:20:59.715 { 00:20:59.715 "dma_device_id": "system", 00:20:59.715 "dma_device_type": 1 00:20:59.715 }, 00:20:59.715 { 00:20:59.715 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:59.715 "dma_device_type": 2 00:20:59.715 } 00:20:59.715 ], 00:20:59.715 "driver_specific": {} 00:20:59.715 }' 00:20:59.715 20:34:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:59.974 20:34:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:59.974 20:34:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:59.974 20:34:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:59.974 20:34:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:00.233 20:34:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:00.233 20:34:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq 
.md_interleave 00:21:00.233 20:34:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:00.233 20:34:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:00.233 20:34:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:00.233 20:34:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:00.233 20:34:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:00.233 20:34:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:21:00.494 [2024-07-15 20:34:52.867863] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:21:00.494 [2024-07-15 20:34:52.867898] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:00.494 [2024-07-15 20:34:52.867961] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:00.753 20:34:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:21:00.753 20:34:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:21:00.753 20:34:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:21:00.753 20:34:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:21:00.753 20:34:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:21:00.753 20:34:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 3 00:21:00.753 20:34:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:00.753 20:34:52 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:21:00.753 20:34:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:00.753 20:34:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:00.753 20:34:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:21:00.753 20:34:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:00.753 20:34:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:00.753 20:34:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:00.753 20:34:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:00.753 20:34:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:00.753 20:34:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:01.321 20:34:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:01.321 "name": "Existed_Raid", 00:21:01.321 "uuid": "06cc5dae-185f-4539-bfc7-195309dd9058", 00:21:01.321 "strip_size_kb": 64, 00:21:01.321 "state": "offline", 00:21:01.321 "raid_level": "concat", 00:21:01.321 "superblock": true, 00:21:01.321 "num_base_bdevs": 4, 00:21:01.321 "num_base_bdevs_discovered": 3, 00:21:01.321 "num_base_bdevs_operational": 3, 00:21:01.321 "base_bdevs_list": [ 00:21:01.321 { 00:21:01.321 "name": null, 00:21:01.321 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:01.321 "is_configured": false, 00:21:01.321 "data_offset": 2048, 00:21:01.321 "data_size": 63488 00:21:01.321 }, 00:21:01.321 { 00:21:01.321 "name": "BaseBdev2", 00:21:01.321 "uuid": 
"8170c51b-c25e-41d3-8b85-ac3f5350a2a8", 00:21:01.321 "is_configured": true, 00:21:01.321 "data_offset": 2048, 00:21:01.321 "data_size": 63488 00:21:01.321 }, 00:21:01.321 { 00:21:01.321 "name": "BaseBdev3", 00:21:01.321 "uuid": "61b29014-f552-402c-82b2-a5e9c906a19f", 00:21:01.321 "is_configured": true, 00:21:01.321 "data_offset": 2048, 00:21:01.321 "data_size": 63488 00:21:01.322 }, 00:21:01.322 { 00:21:01.322 "name": "BaseBdev4", 00:21:01.322 "uuid": "87d46065-4281-493a-8c9e-229f34a7aab2", 00:21:01.322 "is_configured": true, 00:21:01.322 "data_offset": 2048, 00:21:01.322 "data_size": 63488 00:21:01.322 } 00:21:01.322 ] 00:21:01.322 }' 00:21:01.322 20:34:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:01.322 20:34:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:01.890 20:34:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:21:01.890 20:34:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:01.890 20:34:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:21:01.890 20:34:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:01.890 20:34:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:21:01.890 20:34:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:21:01.890 20:34:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:21:02.459 [2024-07-15 20:34:54.738794] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:21:02.459 20:34:54 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:21:02.459 20:34:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:02.459 20:34:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:02.459 20:34:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:21:02.718 20:34:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:21:02.718 20:34:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:21:02.718 20:34:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:21:03.284 [2024-07-15 20:34:55.505295] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:21:03.284 20:34:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:21:03.284 20:34:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:03.284 20:34:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:03.284 20:34:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:21:03.541 20:34:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:21:03.541 20:34:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:21:03.542 20:34:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:21:04.107 [2024-07-15 20:34:56.274381] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:21:04.107 [2024-07-15 20:34:56.274428] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x10b7350 name Existed_Raid, state offline 00:21:04.107 20:34:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:21:04.107 20:34:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:04.107 20:34:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:04.107 20:34:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:21:04.674 20:34:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:21:04.674 20:34:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:21:04.674 20:34:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:21:04.674 20:34:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:21:04.674 20:34:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:21:04.674 20:34:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:21:04.933 BaseBdev2 00:21:04.933 20:34:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:21:04.933 20:34:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:21:04.933 20:34:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:04.933 20:34:57 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:21:04.933 20:34:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:04.933 20:34:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:04.933 20:34:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:05.500 20:34:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:21:05.758 [ 00:21:05.758 { 00:21:05.758 "name": "BaseBdev2", 00:21:05.758 "aliases": [ 00:21:05.758 "21457fc9-265a-4e87-8f5e-5a0a17f80ee9" 00:21:05.758 ], 00:21:05.758 "product_name": "Malloc disk", 00:21:05.758 "block_size": 512, 00:21:05.758 "num_blocks": 65536, 00:21:05.758 "uuid": "21457fc9-265a-4e87-8f5e-5a0a17f80ee9", 00:21:05.758 "assigned_rate_limits": { 00:21:05.758 "rw_ios_per_sec": 0, 00:21:05.758 "rw_mbytes_per_sec": 0, 00:21:05.758 "r_mbytes_per_sec": 0, 00:21:05.758 "w_mbytes_per_sec": 0 00:21:05.758 }, 00:21:05.758 "claimed": false, 00:21:05.758 "zoned": false, 00:21:05.758 "supported_io_types": { 00:21:05.758 "read": true, 00:21:05.758 "write": true, 00:21:05.758 "unmap": true, 00:21:05.758 "flush": true, 00:21:05.758 "reset": true, 00:21:05.758 "nvme_admin": false, 00:21:05.758 "nvme_io": false, 00:21:05.758 "nvme_io_md": false, 00:21:05.758 "write_zeroes": true, 00:21:05.758 "zcopy": true, 00:21:05.758 "get_zone_info": false, 00:21:05.758 "zone_management": false, 00:21:05.758 "zone_append": false, 00:21:05.758 "compare": false, 00:21:05.758 "compare_and_write": false, 00:21:05.758 "abort": true, 00:21:05.758 "seek_hole": false, 00:21:05.758 "seek_data": false, 00:21:05.758 "copy": true, 00:21:05.758 "nvme_iov_md": 
false 00:21:05.758 }, 00:21:05.758 "memory_domains": [ 00:21:05.758 { 00:21:05.758 "dma_device_id": "system", 00:21:05.758 "dma_device_type": 1 00:21:05.758 }, 00:21:05.758 { 00:21:05.758 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:05.758 "dma_device_type": 2 00:21:05.758 } 00:21:05.758 ], 00:21:05.758 "driver_specific": {} 00:21:05.758 } 00:21:05.758 ] 00:21:05.758 20:34:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:21:05.758 20:34:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:21:05.758 20:34:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:21:05.758 20:34:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:21:06.324 BaseBdev3 00:21:06.324 20:34:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:21:06.324 20:34:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:21:06.324 20:34:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:06.324 20:34:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:21:06.324 20:34:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:06.324 20:34:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:06.324 20:34:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:06.891 20:34:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:21:07.458 [ 00:21:07.458 { 00:21:07.458 "name": "BaseBdev3", 00:21:07.458 "aliases": [ 00:21:07.458 "20cc09ae-b1df-439c-ae92-bc7b8a70ba0c" 00:21:07.458 ], 00:21:07.458 "product_name": "Malloc disk", 00:21:07.458 "block_size": 512, 00:21:07.458 "num_blocks": 65536, 00:21:07.458 "uuid": "20cc09ae-b1df-439c-ae92-bc7b8a70ba0c", 00:21:07.458 "assigned_rate_limits": { 00:21:07.458 "rw_ios_per_sec": 0, 00:21:07.458 "rw_mbytes_per_sec": 0, 00:21:07.458 "r_mbytes_per_sec": 0, 00:21:07.458 "w_mbytes_per_sec": 0 00:21:07.458 }, 00:21:07.458 "claimed": false, 00:21:07.458 "zoned": false, 00:21:07.458 "supported_io_types": { 00:21:07.458 "read": true, 00:21:07.458 "write": true, 00:21:07.458 "unmap": true, 00:21:07.458 "flush": true, 00:21:07.458 "reset": true, 00:21:07.458 "nvme_admin": false, 00:21:07.458 "nvme_io": false, 00:21:07.458 "nvme_io_md": false, 00:21:07.458 "write_zeroes": true, 00:21:07.458 "zcopy": true, 00:21:07.458 "get_zone_info": false, 00:21:07.458 "zone_management": false, 00:21:07.458 "zone_append": false, 00:21:07.458 "compare": false, 00:21:07.458 "compare_and_write": false, 00:21:07.458 "abort": true, 00:21:07.458 "seek_hole": false, 00:21:07.458 "seek_data": false, 00:21:07.458 "copy": true, 00:21:07.458 "nvme_iov_md": false 00:21:07.458 }, 00:21:07.458 "memory_domains": [ 00:21:07.458 { 00:21:07.458 "dma_device_id": "system", 00:21:07.458 "dma_device_type": 1 00:21:07.458 }, 00:21:07.458 { 00:21:07.458 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:07.458 "dma_device_type": 2 00:21:07.458 } 00:21:07.458 ], 00:21:07.458 "driver_specific": {} 00:21:07.458 } 00:21:07.458 ] 00:21:07.458 20:34:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:21:07.458 20:34:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:21:07.458 20:34:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 
00:21:07.458 20:34:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:21:07.717 BaseBdev4 00:21:07.717 20:34:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:21:07.717 20:34:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:21:07.717 20:34:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:07.717 20:34:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:21:07.717 20:34:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:07.717 20:34:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:07.717 20:34:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:07.976 20:35:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:21:08.234 [ 00:21:08.234 { 00:21:08.234 "name": "BaseBdev4", 00:21:08.234 "aliases": [ 00:21:08.234 "9b7395b4-c606-4935-8fb9-ae16acdf20f0" 00:21:08.234 ], 00:21:08.234 "product_name": "Malloc disk", 00:21:08.234 "block_size": 512, 00:21:08.234 "num_blocks": 65536, 00:21:08.234 "uuid": "9b7395b4-c606-4935-8fb9-ae16acdf20f0", 00:21:08.234 "assigned_rate_limits": { 00:21:08.234 "rw_ios_per_sec": 0, 00:21:08.234 "rw_mbytes_per_sec": 0, 00:21:08.234 "r_mbytes_per_sec": 0, 00:21:08.234 "w_mbytes_per_sec": 0 00:21:08.234 }, 00:21:08.234 "claimed": false, 00:21:08.234 "zoned": false, 00:21:08.234 "supported_io_types": { 00:21:08.234 
"read": true, 00:21:08.234 "write": true, 00:21:08.234 "unmap": true, 00:21:08.234 "flush": true, 00:21:08.234 "reset": true, 00:21:08.234 "nvme_admin": false, 00:21:08.234 "nvme_io": false, 00:21:08.234 "nvme_io_md": false, 00:21:08.234 "write_zeroes": true, 00:21:08.234 "zcopy": true, 00:21:08.234 "get_zone_info": false, 00:21:08.234 "zone_management": false, 00:21:08.234 "zone_append": false, 00:21:08.234 "compare": false, 00:21:08.234 "compare_and_write": false, 00:21:08.234 "abort": true, 00:21:08.234 "seek_hole": false, 00:21:08.234 "seek_data": false, 00:21:08.234 "copy": true, 00:21:08.234 "nvme_iov_md": false 00:21:08.234 }, 00:21:08.234 "memory_domains": [ 00:21:08.234 { 00:21:08.234 "dma_device_id": "system", 00:21:08.234 "dma_device_type": 1 00:21:08.234 }, 00:21:08.234 { 00:21:08.234 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:08.234 "dma_device_type": 2 00:21:08.234 } 00:21:08.234 ], 00:21:08.234 "driver_specific": {} 00:21:08.234 } 00:21:08.234 ] 00:21:08.234 20:35:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:21:08.234 20:35:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:21:08.234 20:35:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:21:08.234 20:35:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:21:08.492 [2024-07-15 20:35:00.779716] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:21:08.493 [2024-07-15 20:35:00.779761] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:21:08.493 [2024-07-15 20:35:00.779782] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:08.493 [2024-07-15 
20:35:00.781175] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:08.493 [2024-07-15 20:35:00.781220] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:21:08.493 20:35:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:21:08.493 20:35:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:08.493 20:35:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:08.493 20:35:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:08.493 20:35:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:08.493 20:35:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:08.493 20:35:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:08.493 20:35:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:08.493 20:35:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:08.493 20:35:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:08.493 20:35:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:08.493 20:35:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:08.751 20:35:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:08.751 "name": "Existed_Raid", 00:21:08.751 "uuid": "eb2b80b7-44d1-466b-8111-d30b041c543e", 00:21:08.751 "strip_size_kb": 64, 
00:21:08.751 "state": "configuring", 00:21:08.751 "raid_level": "concat", 00:21:08.751 "superblock": true, 00:21:08.751 "num_base_bdevs": 4, 00:21:08.751 "num_base_bdevs_discovered": 3, 00:21:08.751 "num_base_bdevs_operational": 4, 00:21:08.751 "base_bdevs_list": [ 00:21:08.751 { 00:21:08.751 "name": "BaseBdev1", 00:21:08.751 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:08.751 "is_configured": false, 00:21:08.751 "data_offset": 0, 00:21:08.751 "data_size": 0 00:21:08.751 }, 00:21:08.751 { 00:21:08.751 "name": "BaseBdev2", 00:21:08.751 "uuid": "21457fc9-265a-4e87-8f5e-5a0a17f80ee9", 00:21:08.751 "is_configured": true, 00:21:08.751 "data_offset": 2048, 00:21:08.751 "data_size": 63488 00:21:08.751 }, 00:21:08.751 { 00:21:08.751 "name": "BaseBdev3", 00:21:08.751 "uuid": "20cc09ae-b1df-439c-ae92-bc7b8a70ba0c", 00:21:08.751 "is_configured": true, 00:21:08.751 "data_offset": 2048, 00:21:08.751 "data_size": 63488 00:21:08.751 }, 00:21:08.751 { 00:21:08.751 "name": "BaseBdev4", 00:21:08.751 "uuid": "9b7395b4-c606-4935-8fb9-ae16acdf20f0", 00:21:08.751 "is_configured": true, 00:21:08.751 "data_offset": 2048, 00:21:08.751 "data_size": 63488 00:21:08.751 } 00:21:08.751 ] 00:21:08.751 }' 00:21:08.751 20:35:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:08.751 20:35:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:09.318 20:35:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:21:09.577 [2024-07-15 20:35:01.930832] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:21:09.577 20:35:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:21:09.577 20:35:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local 
raid_bdev_name=Existed_Raid 00:21:09.577 20:35:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:09.577 20:35:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:09.577 20:35:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:09.577 20:35:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:09.577 20:35:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:09.577 20:35:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:09.577 20:35:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:09.577 20:35:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:09.839 20:35:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:09.839 20:35:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:09.839 20:35:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:09.839 "name": "Existed_Raid", 00:21:09.839 "uuid": "eb2b80b7-44d1-466b-8111-d30b041c543e", 00:21:09.839 "strip_size_kb": 64, 00:21:09.839 "state": "configuring", 00:21:09.839 "raid_level": "concat", 00:21:09.839 "superblock": true, 00:21:09.839 "num_base_bdevs": 4, 00:21:09.839 "num_base_bdevs_discovered": 2, 00:21:09.839 "num_base_bdevs_operational": 4, 00:21:09.839 "base_bdevs_list": [ 00:21:09.839 { 00:21:09.839 "name": "BaseBdev1", 00:21:09.839 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:09.839 "is_configured": false, 00:21:09.839 "data_offset": 0, 00:21:09.839 "data_size": 0 
00:21:09.839 }, 00:21:09.839 { 00:21:09.839 "name": null, 00:21:09.839 "uuid": "21457fc9-265a-4e87-8f5e-5a0a17f80ee9", 00:21:09.839 "is_configured": false, 00:21:09.839 "data_offset": 2048, 00:21:09.839 "data_size": 63488 00:21:09.839 }, 00:21:09.839 { 00:21:09.839 "name": "BaseBdev3", 00:21:09.839 "uuid": "20cc09ae-b1df-439c-ae92-bc7b8a70ba0c", 00:21:09.839 "is_configured": true, 00:21:09.839 "data_offset": 2048, 00:21:09.839 "data_size": 63488 00:21:09.839 }, 00:21:09.839 { 00:21:09.839 "name": "BaseBdev4", 00:21:09.839 "uuid": "9b7395b4-c606-4935-8fb9-ae16acdf20f0", 00:21:09.839 "is_configured": true, 00:21:09.839 "data_offset": 2048, 00:21:09.839 "data_size": 63488 00:21:09.839 } 00:21:09.839 ] 00:21:09.839 }' 00:21:09.839 20:35:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:09.839 20:35:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:10.779 20:35:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:10.779 20:35:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:21:11.038 20:35:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:21:11.038 20:35:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:21:11.298 [2024-07-15 20:35:03.586647] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:11.298 BaseBdev1 00:21:11.298 20:35:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:21:11.298 20:35:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 
00:21:11.298 20:35:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:11.298 20:35:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:21:11.298 20:35:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:11.298 20:35:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:11.298 20:35:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:11.866 20:35:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:21:12.127 [ 00:21:12.127 { 00:21:12.127 "name": "BaseBdev1", 00:21:12.127 "aliases": [ 00:21:12.127 "51b5ddef-6a62-42ba-9fd3-8b52ce24cbd4" 00:21:12.127 ], 00:21:12.127 "product_name": "Malloc disk", 00:21:12.127 "block_size": 512, 00:21:12.127 "num_blocks": 65536, 00:21:12.127 "uuid": "51b5ddef-6a62-42ba-9fd3-8b52ce24cbd4", 00:21:12.127 "assigned_rate_limits": { 00:21:12.127 "rw_ios_per_sec": 0, 00:21:12.127 "rw_mbytes_per_sec": 0, 00:21:12.127 "r_mbytes_per_sec": 0, 00:21:12.127 "w_mbytes_per_sec": 0 00:21:12.127 }, 00:21:12.127 "claimed": true, 00:21:12.127 "claim_type": "exclusive_write", 00:21:12.127 "zoned": false, 00:21:12.127 "supported_io_types": { 00:21:12.127 "read": true, 00:21:12.127 "write": true, 00:21:12.127 "unmap": true, 00:21:12.127 "flush": true, 00:21:12.127 "reset": true, 00:21:12.127 "nvme_admin": false, 00:21:12.127 "nvme_io": false, 00:21:12.127 "nvme_io_md": false, 00:21:12.127 "write_zeroes": true, 00:21:12.127 "zcopy": true, 00:21:12.127 "get_zone_info": false, 00:21:12.127 "zone_management": false, 00:21:12.127 "zone_append": false, 00:21:12.127 "compare": false, 
00:21:12.127 "compare_and_write": false, 00:21:12.127 "abort": true, 00:21:12.127 "seek_hole": false, 00:21:12.127 "seek_data": false, 00:21:12.127 "copy": true, 00:21:12.127 "nvme_iov_md": false 00:21:12.127 }, 00:21:12.127 "memory_domains": [ 00:21:12.127 { 00:21:12.127 "dma_device_id": "system", 00:21:12.127 "dma_device_type": 1 00:21:12.127 }, 00:21:12.127 { 00:21:12.127 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:12.127 "dma_device_type": 2 00:21:12.127 } 00:21:12.127 ], 00:21:12.127 "driver_specific": {} 00:21:12.127 } 00:21:12.127 ] 00:21:12.127 20:35:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:21:12.127 20:35:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:21:12.127 20:35:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:12.127 20:35:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:12.127 20:35:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:12.127 20:35:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:12.127 20:35:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:12.127 20:35:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:12.127 20:35:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:12.127 20:35:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:12.127 20:35:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:12.127 20:35:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:12.127 20:35:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:12.696 20:35:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:12.696 "name": "Existed_Raid", 00:21:12.696 "uuid": "eb2b80b7-44d1-466b-8111-d30b041c543e", 00:21:12.696 "strip_size_kb": 64, 00:21:12.696 "state": "configuring", 00:21:12.696 "raid_level": "concat", 00:21:12.696 "superblock": true, 00:21:12.696 "num_base_bdevs": 4, 00:21:12.696 "num_base_bdevs_discovered": 3, 00:21:12.696 "num_base_bdevs_operational": 4, 00:21:12.696 "base_bdevs_list": [ 00:21:12.696 { 00:21:12.696 "name": "BaseBdev1", 00:21:12.696 "uuid": "51b5ddef-6a62-42ba-9fd3-8b52ce24cbd4", 00:21:12.696 "is_configured": true, 00:21:12.696 "data_offset": 2048, 00:21:12.696 "data_size": 63488 00:21:12.696 }, 00:21:12.696 { 00:21:12.696 "name": null, 00:21:12.696 "uuid": "21457fc9-265a-4e87-8f5e-5a0a17f80ee9", 00:21:12.696 "is_configured": false, 00:21:12.696 "data_offset": 2048, 00:21:12.696 "data_size": 63488 00:21:12.696 }, 00:21:12.696 { 00:21:12.696 "name": "BaseBdev3", 00:21:12.696 "uuid": "20cc09ae-b1df-439c-ae92-bc7b8a70ba0c", 00:21:12.696 "is_configured": true, 00:21:12.696 "data_offset": 2048, 00:21:12.696 "data_size": 63488 00:21:12.696 }, 00:21:12.696 { 00:21:12.696 "name": "BaseBdev4", 00:21:12.696 "uuid": "9b7395b4-c606-4935-8fb9-ae16acdf20f0", 00:21:12.696 "is_configured": true, 00:21:12.696 "data_offset": 2048, 00:21:12.696 "data_size": 63488 00:21:12.696 } 00:21:12.696 ] 00:21:12.696 }' 00:21:12.696 20:35:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:12.696 20:35:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:13.264 20:35:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:13.264 20:35:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:21:13.523 20:35:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:21:13.523 20:35:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:21:13.782 [2024-07-15 20:35:06.005321] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:21:13.782 20:35:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:21:13.782 20:35:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:13.782 20:35:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:13.782 20:35:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:13.782 20:35:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:13.782 20:35:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:13.782 20:35:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:13.782 20:35:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:13.782 20:35:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:13.782 20:35:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:13.782 20:35:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:21:13.782 20:35:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:14.041 20:35:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:14.041 "name": "Existed_Raid", 00:21:14.041 "uuid": "eb2b80b7-44d1-466b-8111-d30b041c543e", 00:21:14.041 "strip_size_kb": 64, 00:21:14.041 "state": "configuring", 00:21:14.041 "raid_level": "concat", 00:21:14.041 "superblock": true, 00:21:14.041 "num_base_bdevs": 4, 00:21:14.041 "num_base_bdevs_discovered": 2, 00:21:14.041 "num_base_bdevs_operational": 4, 00:21:14.041 "base_bdevs_list": [ 00:21:14.041 { 00:21:14.041 "name": "BaseBdev1", 00:21:14.041 "uuid": "51b5ddef-6a62-42ba-9fd3-8b52ce24cbd4", 00:21:14.041 "is_configured": true, 00:21:14.041 "data_offset": 2048, 00:21:14.041 "data_size": 63488 00:21:14.041 }, 00:21:14.041 { 00:21:14.041 "name": null, 00:21:14.041 "uuid": "21457fc9-265a-4e87-8f5e-5a0a17f80ee9", 00:21:14.041 "is_configured": false, 00:21:14.041 "data_offset": 2048, 00:21:14.041 "data_size": 63488 00:21:14.041 }, 00:21:14.041 { 00:21:14.041 "name": null, 00:21:14.041 "uuid": "20cc09ae-b1df-439c-ae92-bc7b8a70ba0c", 00:21:14.041 "is_configured": false, 00:21:14.041 "data_offset": 2048, 00:21:14.041 "data_size": 63488 00:21:14.041 }, 00:21:14.041 { 00:21:14.041 "name": "BaseBdev4", 00:21:14.041 "uuid": "9b7395b4-c606-4935-8fb9-ae16acdf20f0", 00:21:14.041 "is_configured": true, 00:21:14.041 "data_offset": 2048, 00:21:14.041 "data_size": 63488 00:21:14.041 } 00:21:14.041 ] 00:21:14.041 }' 00:21:14.041 20:35:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:14.041 20:35:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:14.617 20:35:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:21:14.617 20:35:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:21:14.877 20:35:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:21:14.877 20:35:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:21:14.877 [2024-07-15 20:35:07.240619] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:15.136 20:35:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:21:15.136 20:35:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:15.136 20:35:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:15.136 20:35:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:15.136 20:35:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:15.136 20:35:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:15.136 20:35:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:15.136 20:35:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:15.136 20:35:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:15.136 20:35:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:15.136 20:35:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:21:15.136 20:35:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:15.136 20:35:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:15.136 "name": "Existed_Raid", 00:21:15.136 "uuid": "eb2b80b7-44d1-466b-8111-d30b041c543e", 00:21:15.136 "strip_size_kb": 64, 00:21:15.136 "state": "configuring", 00:21:15.136 "raid_level": "concat", 00:21:15.136 "superblock": true, 00:21:15.136 "num_base_bdevs": 4, 00:21:15.136 "num_base_bdevs_discovered": 3, 00:21:15.136 "num_base_bdevs_operational": 4, 00:21:15.136 "base_bdevs_list": [ 00:21:15.136 { 00:21:15.136 "name": "BaseBdev1", 00:21:15.136 "uuid": "51b5ddef-6a62-42ba-9fd3-8b52ce24cbd4", 00:21:15.136 "is_configured": true, 00:21:15.136 "data_offset": 2048, 00:21:15.136 "data_size": 63488 00:21:15.136 }, 00:21:15.136 { 00:21:15.136 "name": null, 00:21:15.136 "uuid": "21457fc9-265a-4e87-8f5e-5a0a17f80ee9", 00:21:15.136 "is_configured": false, 00:21:15.136 "data_offset": 2048, 00:21:15.136 "data_size": 63488 00:21:15.136 }, 00:21:15.136 { 00:21:15.136 "name": "BaseBdev3", 00:21:15.136 "uuid": "20cc09ae-b1df-439c-ae92-bc7b8a70ba0c", 00:21:15.136 "is_configured": true, 00:21:15.137 "data_offset": 2048, 00:21:15.137 "data_size": 63488 00:21:15.137 }, 00:21:15.137 { 00:21:15.137 "name": "BaseBdev4", 00:21:15.137 "uuid": "9b7395b4-c606-4935-8fb9-ae16acdf20f0", 00:21:15.137 "is_configured": true, 00:21:15.137 "data_offset": 2048, 00:21:15.137 "data_size": 63488 00:21:15.137 } 00:21:15.137 ] 00:21:15.137 }' 00:21:15.137 20:35:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:15.137 20:35:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:16.075 20:35:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:21:16.075 20:35:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:21:16.075 20:35:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:21:16.075 20:35:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:21:16.643 [2024-07-15 20:35:08.836878] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:21:16.643 20:35:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:21:16.643 20:35:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:16.643 20:35:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:16.643 20:35:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:16.643 20:35:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:16.643 20:35:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:16.643 20:35:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:16.644 20:35:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:16.644 20:35:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:16.644 20:35:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:16.644 20:35:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:16.644 
20:35:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:16.903 20:35:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:16.903 "name": "Existed_Raid", 00:21:16.903 "uuid": "eb2b80b7-44d1-466b-8111-d30b041c543e", 00:21:16.903 "strip_size_kb": 64, 00:21:16.903 "state": "configuring", 00:21:16.903 "raid_level": "concat", 00:21:16.903 "superblock": true, 00:21:16.903 "num_base_bdevs": 4, 00:21:16.903 "num_base_bdevs_discovered": 2, 00:21:16.903 "num_base_bdevs_operational": 4, 00:21:16.903 "base_bdevs_list": [ 00:21:16.903 { 00:21:16.903 "name": null, 00:21:16.903 "uuid": "51b5ddef-6a62-42ba-9fd3-8b52ce24cbd4", 00:21:16.903 "is_configured": false, 00:21:16.903 "data_offset": 2048, 00:21:16.903 "data_size": 63488 00:21:16.903 }, 00:21:16.903 { 00:21:16.903 "name": null, 00:21:16.903 "uuid": "21457fc9-265a-4e87-8f5e-5a0a17f80ee9", 00:21:16.903 "is_configured": false, 00:21:16.903 "data_offset": 2048, 00:21:16.903 "data_size": 63488 00:21:16.903 }, 00:21:16.903 { 00:21:16.903 "name": "BaseBdev3", 00:21:16.903 "uuid": "20cc09ae-b1df-439c-ae92-bc7b8a70ba0c", 00:21:16.903 "is_configured": true, 00:21:16.903 "data_offset": 2048, 00:21:16.903 "data_size": 63488 00:21:16.903 }, 00:21:16.903 { 00:21:16.903 "name": "BaseBdev4", 00:21:16.903 "uuid": "9b7395b4-c606-4935-8fb9-ae16acdf20f0", 00:21:16.903 "is_configured": true, 00:21:16.903 "data_offset": 2048, 00:21:16.903 "data_size": 63488 00:21:16.903 } 00:21:16.903 ] 00:21:16.903 }' 00:21:16.903 20:35:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:16.903 20:35:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:17.471 20:35:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:17.471 20:35:09 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:21:17.729 20:35:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:21:17.730 20:35:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:21:17.989 [2024-07-15 20:35:10.232982] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:17.989 20:35:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:21:17.989 20:35:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:17.989 20:35:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:17.989 20:35:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:17.989 20:35:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:17.989 20:35:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:17.989 20:35:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:17.989 20:35:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:17.989 20:35:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:17.989 20:35:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:17.989 20:35:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:17.989 20:35:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:18.248 20:35:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:18.248 "name": "Existed_Raid", 00:21:18.248 "uuid": "eb2b80b7-44d1-466b-8111-d30b041c543e", 00:21:18.248 "strip_size_kb": 64, 00:21:18.248 "state": "configuring", 00:21:18.248 "raid_level": "concat", 00:21:18.248 "superblock": true, 00:21:18.248 "num_base_bdevs": 4, 00:21:18.248 "num_base_bdevs_discovered": 3, 00:21:18.248 "num_base_bdevs_operational": 4, 00:21:18.248 "base_bdevs_list": [ 00:21:18.248 { 00:21:18.248 "name": null, 00:21:18.248 "uuid": "51b5ddef-6a62-42ba-9fd3-8b52ce24cbd4", 00:21:18.248 "is_configured": false, 00:21:18.248 "data_offset": 2048, 00:21:18.248 "data_size": 63488 00:21:18.248 }, 00:21:18.248 { 00:21:18.248 "name": "BaseBdev2", 00:21:18.248 "uuid": "21457fc9-265a-4e87-8f5e-5a0a17f80ee9", 00:21:18.248 "is_configured": true, 00:21:18.248 "data_offset": 2048, 00:21:18.248 "data_size": 63488 00:21:18.248 }, 00:21:18.248 { 00:21:18.248 "name": "BaseBdev3", 00:21:18.248 "uuid": "20cc09ae-b1df-439c-ae92-bc7b8a70ba0c", 00:21:18.248 "is_configured": true, 00:21:18.248 "data_offset": 2048, 00:21:18.248 "data_size": 63488 00:21:18.248 }, 00:21:18.248 { 00:21:18.248 "name": "BaseBdev4", 00:21:18.248 "uuid": "9b7395b4-c606-4935-8fb9-ae16acdf20f0", 00:21:18.248 "is_configured": true, 00:21:18.248 "data_offset": 2048, 00:21:18.248 "data_size": 63488 00:21:18.248 } 00:21:18.248 ] 00:21:18.248 }' 00:21:18.248 20:35:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:18.248 20:35:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:18.815 20:35:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:21:18.815 20:35:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:19.074 20:35:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:21:19.074 20:35:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:19.074 20:35:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:21:19.074 20:35:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 51b5ddef-6a62-42ba-9fd3-8b52ce24cbd4 00:21:19.333 [2024-07-15 20:35:11.608077] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:21:19.333 [2024-07-15 20:35:11.608243] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x10b9850 00:21:19.333 [2024-07-15 20:35:11.608257] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:21:19.333 [2024-07-15 20:35:11.608433] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x10afd80 00:21:19.333 [2024-07-15 20:35:11.608549] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x10b9850 00:21:19.333 [2024-07-15 20:35:11.608559] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x10b9850 00:21:19.333 [2024-07-15 20:35:11.608648] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:19.333 NewBaseBdev 00:21:19.333 20:35:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:21:19.333 20:35:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:21:19.333 20:35:11 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:19.333 20:35:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:21:19.333 20:35:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:19.333 20:35:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:19.333 20:35:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:19.592 20:35:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:21:19.851 [ 00:21:19.851 { 00:21:19.851 "name": "NewBaseBdev", 00:21:19.851 "aliases": [ 00:21:19.851 "51b5ddef-6a62-42ba-9fd3-8b52ce24cbd4" 00:21:19.851 ], 00:21:19.851 "product_name": "Malloc disk", 00:21:19.851 "block_size": 512, 00:21:19.851 "num_blocks": 65536, 00:21:19.851 "uuid": "51b5ddef-6a62-42ba-9fd3-8b52ce24cbd4", 00:21:19.851 "assigned_rate_limits": { 00:21:19.851 "rw_ios_per_sec": 0, 00:21:19.851 "rw_mbytes_per_sec": 0, 00:21:19.851 "r_mbytes_per_sec": 0, 00:21:19.851 "w_mbytes_per_sec": 0 00:21:19.851 }, 00:21:19.851 "claimed": true, 00:21:19.851 "claim_type": "exclusive_write", 00:21:19.851 "zoned": false, 00:21:19.851 "supported_io_types": { 00:21:19.851 "read": true, 00:21:19.851 "write": true, 00:21:19.851 "unmap": true, 00:21:19.851 "flush": true, 00:21:19.851 "reset": true, 00:21:19.851 "nvme_admin": false, 00:21:19.851 "nvme_io": false, 00:21:19.851 "nvme_io_md": false, 00:21:19.851 "write_zeroes": true, 00:21:19.851 "zcopy": true, 00:21:19.851 "get_zone_info": false, 00:21:19.851 "zone_management": false, 00:21:19.851 "zone_append": false, 00:21:19.851 "compare": false, 00:21:19.851 
"compare_and_write": false, 00:21:19.851 "abort": true, 00:21:19.851 "seek_hole": false, 00:21:19.851 "seek_data": false, 00:21:19.851 "copy": true, 00:21:19.851 "nvme_iov_md": false 00:21:19.851 }, 00:21:19.851 "memory_domains": [ 00:21:19.851 { 00:21:19.851 "dma_device_id": "system", 00:21:19.851 "dma_device_type": 1 00:21:19.851 }, 00:21:19.851 { 00:21:19.851 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:19.851 "dma_device_type": 2 00:21:19.851 } 00:21:19.851 ], 00:21:19.851 "driver_specific": {} 00:21:19.851 } 00:21:19.851 ] 00:21:19.851 20:35:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:21:19.851 20:35:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:21:19.851 20:35:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:19.851 20:35:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:19.851 20:35:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:19.851 20:35:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:19.851 20:35:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:19.851 20:35:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:19.851 20:35:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:19.851 20:35:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:19.851 20:35:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:19.851 20:35:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:19.851 20:35:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:20.110 20:35:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:20.111 "name": "Existed_Raid", 00:21:20.111 "uuid": "eb2b80b7-44d1-466b-8111-d30b041c543e", 00:21:20.111 "strip_size_kb": 64, 00:21:20.111 "state": "online", 00:21:20.111 "raid_level": "concat", 00:21:20.111 "superblock": true, 00:21:20.111 "num_base_bdevs": 4, 00:21:20.111 "num_base_bdevs_discovered": 4, 00:21:20.111 "num_base_bdevs_operational": 4, 00:21:20.111 "base_bdevs_list": [ 00:21:20.111 { 00:21:20.111 "name": "NewBaseBdev", 00:21:20.111 "uuid": "51b5ddef-6a62-42ba-9fd3-8b52ce24cbd4", 00:21:20.111 "is_configured": true, 00:21:20.111 "data_offset": 2048, 00:21:20.111 "data_size": 63488 00:21:20.111 }, 00:21:20.111 { 00:21:20.111 "name": "BaseBdev2", 00:21:20.111 "uuid": "21457fc9-265a-4e87-8f5e-5a0a17f80ee9", 00:21:20.111 "is_configured": true, 00:21:20.111 "data_offset": 2048, 00:21:20.111 "data_size": 63488 00:21:20.111 }, 00:21:20.111 { 00:21:20.111 "name": "BaseBdev3", 00:21:20.111 "uuid": "20cc09ae-b1df-439c-ae92-bc7b8a70ba0c", 00:21:20.111 "is_configured": true, 00:21:20.111 "data_offset": 2048, 00:21:20.111 "data_size": 63488 00:21:20.111 }, 00:21:20.111 { 00:21:20.111 "name": "BaseBdev4", 00:21:20.111 "uuid": "9b7395b4-c606-4935-8fb9-ae16acdf20f0", 00:21:20.111 "is_configured": true, 00:21:20.111 "data_offset": 2048, 00:21:20.111 "data_size": 63488 00:21:20.111 } 00:21:20.111 ] 00:21:20.111 }' 00:21:20.111 20:35:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:20.111 20:35:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:21.047 20:35:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:21:21.047 20:35:13 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:21:21.047 20:35:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:21:21.047 20:35:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:21.047 20:35:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:21:21.047 20:35:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:21:21.047 20:35:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:21:21.047 20:35:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:21:21.047 [2024-07-15 20:35:13.421217] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:21.306 20:35:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:21:21.306 "name": "Existed_Raid", 00:21:21.306 "aliases": [ 00:21:21.306 "eb2b80b7-44d1-466b-8111-d30b041c543e" 00:21:21.306 ], 00:21:21.306 "product_name": "Raid Volume", 00:21:21.306 "block_size": 512, 00:21:21.306 "num_blocks": 253952, 00:21:21.306 "uuid": "eb2b80b7-44d1-466b-8111-d30b041c543e", 00:21:21.306 "assigned_rate_limits": { 00:21:21.306 "rw_ios_per_sec": 0, 00:21:21.306 "rw_mbytes_per_sec": 0, 00:21:21.306 "r_mbytes_per_sec": 0, 00:21:21.306 "w_mbytes_per_sec": 0 00:21:21.306 }, 00:21:21.306 "claimed": false, 00:21:21.306 "zoned": false, 00:21:21.306 "supported_io_types": { 00:21:21.306 "read": true, 00:21:21.306 "write": true, 00:21:21.306 "unmap": true, 00:21:21.306 "flush": true, 00:21:21.306 "reset": true, 00:21:21.306 "nvme_admin": false, 00:21:21.306 "nvme_io": false, 00:21:21.306 "nvme_io_md": false, 00:21:21.306 "write_zeroes": true, 00:21:21.306 "zcopy": false, 00:21:21.306 
"get_zone_info": false, 00:21:21.306 "zone_management": false, 00:21:21.306 "zone_append": false, 00:21:21.306 "compare": false, 00:21:21.306 "compare_and_write": false, 00:21:21.306 "abort": false, 00:21:21.306 "seek_hole": false, 00:21:21.306 "seek_data": false, 00:21:21.306 "copy": false, 00:21:21.306 "nvme_iov_md": false 00:21:21.306 }, 00:21:21.306 "memory_domains": [ 00:21:21.306 { 00:21:21.306 "dma_device_id": "system", 00:21:21.306 "dma_device_type": 1 00:21:21.306 }, 00:21:21.306 { 00:21:21.306 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:21.306 "dma_device_type": 2 00:21:21.306 }, 00:21:21.306 { 00:21:21.306 "dma_device_id": "system", 00:21:21.306 "dma_device_type": 1 00:21:21.306 }, 00:21:21.306 { 00:21:21.306 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:21.306 "dma_device_type": 2 00:21:21.306 }, 00:21:21.306 { 00:21:21.306 "dma_device_id": "system", 00:21:21.306 "dma_device_type": 1 00:21:21.306 }, 00:21:21.306 { 00:21:21.306 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:21.306 "dma_device_type": 2 00:21:21.306 }, 00:21:21.306 { 00:21:21.306 "dma_device_id": "system", 00:21:21.306 "dma_device_type": 1 00:21:21.306 }, 00:21:21.306 { 00:21:21.306 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:21.306 "dma_device_type": 2 00:21:21.306 } 00:21:21.306 ], 00:21:21.306 "driver_specific": { 00:21:21.306 "raid": { 00:21:21.306 "uuid": "eb2b80b7-44d1-466b-8111-d30b041c543e", 00:21:21.306 "strip_size_kb": 64, 00:21:21.306 "state": "online", 00:21:21.306 "raid_level": "concat", 00:21:21.306 "superblock": true, 00:21:21.306 "num_base_bdevs": 4, 00:21:21.306 "num_base_bdevs_discovered": 4, 00:21:21.306 "num_base_bdevs_operational": 4, 00:21:21.306 "base_bdevs_list": [ 00:21:21.306 { 00:21:21.306 "name": "NewBaseBdev", 00:21:21.306 "uuid": "51b5ddef-6a62-42ba-9fd3-8b52ce24cbd4", 00:21:21.306 "is_configured": true, 00:21:21.306 "data_offset": 2048, 00:21:21.306 "data_size": 63488 00:21:21.306 }, 00:21:21.306 { 00:21:21.306 "name": "BaseBdev2", 00:21:21.306 
"uuid": "21457fc9-265a-4e87-8f5e-5a0a17f80ee9", 00:21:21.306 "is_configured": true, 00:21:21.306 "data_offset": 2048, 00:21:21.306 "data_size": 63488 00:21:21.306 }, 00:21:21.306 { 00:21:21.306 "name": "BaseBdev3", 00:21:21.306 "uuid": "20cc09ae-b1df-439c-ae92-bc7b8a70ba0c", 00:21:21.306 "is_configured": true, 00:21:21.306 "data_offset": 2048, 00:21:21.306 "data_size": 63488 00:21:21.306 }, 00:21:21.306 { 00:21:21.306 "name": "BaseBdev4", 00:21:21.306 "uuid": "9b7395b4-c606-4935-8fb9-ae16acdf20f0", 00:21:21.306 "is_configured": true, 00:21:21.306 "data_offset": 2048, 00:21:21.306 "data_size": 63488 00:21:21.306 } 00:21:21.306 ] 00:21:21.306 } 00:21:21.306 } 00:21:21.306 }' 00:21:21.306 20:35:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:21.306 20:35:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:21:21.306 BaseBdev2 00:21:21.306 BaseBdev3 00:21:21.306 BaseBdev4' 00:21:21.306 20:35:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:21.306 20:35:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:21.306 20:35:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:21:21.565 20:35:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:21.565 "name": "NewBaseBdev", 00:21:21.565 "aliases": [ 00:21:21.565 "51b5ddef-6a62-42ba-9fd3-8b52ce24cbd4" 00:21:21.565 ], 00:21:21.565 "product_name": "Malloc disk", 00:21:21.565 "block_size": 512, 00:21:21.565 "num_blocks": 65536, 00:21:21.565 "uuid": "51b5ddef-6a62-42ba-9fd3-8b52ce24cbd4", 00:21:21.565 "assigned_rate_limits": { 00:21:21.565 "rw_ios_per_sec": 0, 00:21:21.565 "rw_mbytes_per_sec": 0, 
00:21:21.565 "r_mbytes_per_sec": 0, 00:21:21.565 "w_mbytes_per_sec": 0 00:21:21.565 }, 00:21:21.565 "claimed": true, 00:21:21.565 "claim_type": "exclusive_write", 00:21:21.565 "zoned": false, 00:21:21.565 "supported_io_types": { 00:21:21.565 "read": true, 00:21:21.565 "write": true, 00:21:21.565 "unmap": true, 00:21:21.565 "flush": true, 00:21:21.565 "reset": true, 00:21:21.565 "nvme_admin": false, 00:21:21.565 "nvme_io": false, 00:21:21.565 "nvme_io_md": false, 00:21:21.565 "write_zeroes": true, 00:21:21.565 "zcopy": true, 00:21:21.565 "get_zone_info": false, 00:21:21.565 "zone_management": false, 00:21:21.565 "zone_append": false, 00:21:21.565 "compare": false, 00:21:21.565 "compare_and_write": false, 00:21:21.565 "abort": true, 00:21:21.565 "seek_hole": false, 00:21:21.565 "seek_data": false, 00:21:21.565 "copy": true, 00:21:21.565 "nvme_iov_md": false 00:21:21.565 }, 00:21:21.565 "memory_domains": [ 00:21:21.565 { 00:21:21.565 "dma_device_id": "system", 00:21:21.565 "dma_device_type": 1 00:21:21.565 }, 00:21:21.565 { 00:21:21.565 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:21.565 "dma_device_type": 2 00:21:21.565 } 00:21:21.565 ], 00:21:21.565 "driver_specific": {} 00:21:21.565 }' 00:21:21.565 20:35:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:21.565 20:35:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:21.565 20:35:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:21.565 20:35:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:21.565 20:35:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:21.565 20:35:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:21.565 20:35:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:21.823 20:35:13 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:21.823 20:35:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:21.823 20:35:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:21.823 20:35:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:21.823 20:35:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:21.823 20:35:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:21.823 20:35:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:21:21.823 20:35:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:22.081 20:35:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:22.081 "name": "BaseBdev2", 00:21:22.081 "aliases": [ 00:21:22.081 "21457fc9-265a-4e87-8f5e-5a0a17f80ee9" 00:21:22.081 ], 00:21:22.081 "product_name": "Malloc disk", 00:21:22.081 "block_size": 512, 00:21:22.081 "num_blocks": 65536, 00:21:22.081 "uuid": "21457fc9-265a-4e87-8f5e-5a0a17f80ee9", 00:21:22.081 "assigned_rate_limits": { 00:21:22.081 "rw_ios_per_sec": 0, 00:21:22.081 "rw_mbytes_per_sec": 0, 00:21:22.081 "r_mbytes_per_sec": 0, 00:21:22.081 "w_mbytes_per_sec": 0 00:21:22.081 }, 00:21:22.081 "claimed": true, 00:21:22.081 "claim_type": "exclusive_write", 00:21:22.081 "zoned": false, 00:21:22.081 "supported_io_types": { 00:21:22.081 "read": true, 00:21:22.081 "write": true, 00:21:22.081 "unmap": true, 00:21:22.081 "flush": true, 00:21:22.081 "reset": true, 00:21:22.081 "nvme_admin": false, 00:21:22.081 "nvme_io": false, 00:21:22.081 "nvme_io_md": false, 00:21:22.081 "write_zeroes": true, 00:21:22.081 "zcopy": true, 00:21:22.081 
"get_zone_info": false, 00:21:22.081 "zone_management": false, 00:21:22.081 "zone_append": false, 00:21:22.081 "compare": false, 00:21:22.081 "compare_and_write": false, 00:21:22.081 "abort": true, 00:21:22.081 "seek_hole": false, 00:21:22.081 "seek_data": false, 00:21:22.081 "copy": true, 00:21:22.081 "nvme_iov_md": false 00:21:22.081 }, 00:21:22.081 "memory_domains": [ 00:21:22.081 { 00:21:22.081 "dma_device_id": "system", 00:21:22.081 "dma_device_type": 1 00:21:22.081 }, 00:21:22.081 { 00:21:22.081 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:22.081 "dma_device_type": 2 00:21:22.081 } 00:21:22.081 ], 00:21:22.081 "driver_specific": {} 00:21:22.081 }' 00:21:22.081 20:35:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:22.081 20:35:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:22.081 20:35:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:22.081 20:35:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:22.081 20:35:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:22.339 20:35:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:22.339 20:35:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:22.339 20:35:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:22.339 20:35:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:22.339 20:35:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:22.339 20:35:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:22.339 20:35:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:22.339 20:35:14 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:22.339 20:35:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:21:22.339 20:35:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:22.906 20:35:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:22.906 "name": "BaseBdev3", 00:21:22.906 "aliases": [ 00:21:22.906 "20cc09ae-b1df-439c-ae92-bc7b8a70ba0c" 00:21:22.906 ], 00:21:22.906 "product_name": "Malloc disk", 00:21:22.906 "block_size": 512, 00:21:22.906 "num_blocks": 65536, 00:21:22.906 "uuid": "20cc09ae-b1df-439c-ae92-bc7b8a70ba0c", 00:21:22.906 "assigned_rate_limits": { 00:21:22.906 "rw_ios_per_sec": 0, 00:21:22.906 "rw_mbytes_per_sec": 0, 00:21:22.906 "r_mbytes_per_sec": 0, 00:21:22.906 "w_mbytes_per_sec": 0 00:21:22.906 }, 00:21:22.906 "claimed": true, 00:21:22.906 "claim_type": "exclusive_write", 00:21:22.906 "zoned": false, 00:21:22.906 "supported_io_types": { 00:21:22.906 "read": true, 00:21:22.906 "write": true, 00:21:22.906 "unmap": true, 00:21:22.906 "flush": true, 00:21:22.906 "reset": true, 00:21:22.906 "nvme_admin": false, 00:21:22.906 "nvme_io": false, 00:21:22.906 "nvme_io_md": false, 00:21:22.906 "write_zeroes": true, 00:21:22.906 "zcopy": true, 00:21:22.906 "get_zone_info": false, 00:21:22.906 "zone_management": false, 00:21:22.906 "zone_append": false, 00:21:22.906 "compare": false, 00:21:22.906 "compare_and_write": false, 00:21:22.906 "abort": true, 00:21:22.906 "seek_hole": false, 00:21:22.906 "seek_data": false, 00:21:22.906 "copy": true, 00:21:22.906 "nvme_iov_md": false 00:21:22.906 }, 00:21:22.906 "memory_domains": [ 00:21:22.906 { 00:21:22.906 "dma_device_id": "system", 00:21:22.906 "dma_device_type": 1 00:21:22.906 }, 00:21:22.906 { 00:21:22.906 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:22.906 
"dma_device_type": 2 00:21:22.906 } 00:21:22.906 ], 00:21:22.906 "driver_specific": {} 00:21:22.906 }' 00:21:22.906 20:35:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:22.906 20:35:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:22.906 20:35:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:22.906 20:35:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:22.906 20:35:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:22.906 20:35:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:22.906 20:35:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:22.906 20:35:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:23.165 20:35:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:23.165 20:35:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:23.165 20:35:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:23.165 20:35:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:23.165 20:35:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:23.165 20:35:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:21:23.165 20:35:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:23.423 20:35:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:23.423 "name": "BaseBdev4", 00:21:23.423 "aliases": [ 00:21:23.423 
"9b7395b4-c606-4935-8fb9-ae16acdf20f0" 00:21:23.423 ], 00:21:23.423 "product_name": "Malloc disk", 00:21:23.423 "block_size": 512, 00:21:23.423 "num_blocks": 65536, 00:21:23.423 "uuid": "9b7395b4-c606-4935-8fb9-ae16acdf20f0", 00:21:23.423 "assigned_rate_limits": { 00:21:23.423 "rw_ios_per_sec": 0, 00:21:23.423 "rw_mbytes_per_sec": 0, 00:21:23.423 "r_mbytes_per_sec": 0, 00:21:23.423 "w_mbytes_per_sec": 0 00:21:23.423 }, 00:21:23.423 "claimed": true, 00:21:23.423 "claim_type": "exclusive_write", 00:21:23.423 "zoned": false, 00:21:23.423 "supported_io_types": { 00:21:23.423 "read": true, 00:21:23.423 "write": true, 00:21:23.423 "unmap": true, 00:21:23.423 "flush": true, 00:21:23.423 "reset": true, 00:21:23.423 "nvme_admin": false, 00:21:23.423 "nvme_io": false, 00:21:23.423 "nvme_io_md": false, 00:21:23.423 "write_zeroes": true, 00:21:23.423 "zcopy": true, 00:21:23.423 "get_zone_info": false, 00:21:23.423 "zone_management": false, 00:21:23.423 "zone_append": false, 00:21:23.423 "compare": false, 00:21:23.423 "compare_and_write": false, 00:21:23.423 "abort": true, 00:21:23.423 "seek_hole": false, 00:21:23.423 "seek_data": false, 00:21:23.423 "copy": true, 00:21:23.423 "nvme_iov_md": false 00:21:23.423 }, 00:21:23.423 "memory_domains": [ 00:21:23.423 { 00:21:23.423 "dma_device_id": "system", 00:21:23.423 "dma_device_type": 1 00:21:23.423 }, 00:21:23.423 { 00:21:23.423 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:23.423 "dma_device_type": 2 00:21:23.423 } 00:21:23.423 ], 00:21:23.423 "driver_specific": {} 00:21:23.423 }' 00:21:23.423 20:35:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:23.423 20:35:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:23.423 20:35:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:23.423 20:35:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:23.423 20:35:15 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:23.681 20:35:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:23.681 20:35:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:23.681 20:35:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:23.681 20:35:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:23.681 20:35:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:23.681 20:35:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:23.681 20:35:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:23.681 20:35:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:21:24.246 [2024-07-15 20:35:16.525164] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:21:24.246 [2024-07-15 20:35:16.525197] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:24.246 [2024-07-15 20:35:16.525256] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:24.246 [2024-07-15 20:35:16.525321] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:24.246 [2024-07-15 20:35:16.525333] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x10b9850 name Existed_Raid, state offline 00:21:24.246 20:35:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 1433449 00:21:24.246 20:35:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 1433449 ']' 00:21:24.246 20:35:16 bdev_raid.raid_state_function_test_sb 
-- common/autotest_common.sh@952 -- # kill -0 1433449 00:21:24.246 20:35:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:21:24.246 20:35:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:21:24.246 20:35:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1433449 00:21:24.246 20:35:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:21:24.246 20:35:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:21:24.246 20:35:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1433449' 00:21:24.246 killing process with pid 1433449 00:21:24.246 20:35:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 1433449 00:21:24.246 [2024-07-15 20:35:16.604762] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:24.246 20:35:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 1433449 00:21:24.504 [2024-07-15 20:35:16.647447] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:24.504 20:35:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:21:24.504 00:21:24.504 real 0m40.369s 00:21:24.504 user 1m14.516s 00:21:24.504 sys 0m6.623s 00:21:24.504 20:35:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:21:24.504 20:35:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:24.504 ************************************ 00:21:24.504 END TEST raid_state_function_test_sb 00:21:24.504 ************************************ 00:21:24.763 20:35:16 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:21:24.763 20:35:16 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test 
raid_superblock_test raid_superblock_test concat 4 00:21:24.763 20:35:16 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:21:24.763 20:35:16 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:21:24.763 20:35:16 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:24.763 ************************************ 00:21:24.763 START TEST raid_superblock_test 00:21:24.763 ************************************ 00:21:24.763 20:35:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test concat 4 00:21:24.763 20:35:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=concat 00:21:24.763 20:35:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=4 00:21:24.763 20:35:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:21:24.763 20:35:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:21:24.763 20:35:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:21:24.763 20:35:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:21:24.763 20:35:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:21:24.763 20:35:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:21:24.763 20:35:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:21:24.763 20:35:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:21:24.763 20:35:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:21:24.763 20:35:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:21:24.763 20:35:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:21:24.763 20:35:16 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@403 -- # '[' concat '!=' raid1 ']' 00:21:24.763 20:35:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:21:24.763 20:35:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:21:24.763 20:35:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=1439364 00:21:24.763 20:35:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 1439364 /var/tmp/spdk-raid.sock 00:21:24.763 20:35:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 1439364 ']' 00:21:24.763 20:35:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:24.763 20:35:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:21:24.763 20:35:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:21:24.763 20:35:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:24.763 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:21:24.763 20:35:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:21:24.763 20:35:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:24.763 [2024-07-15 20:35:17.015897] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
00:21:24.763 [2024-07-15 20:35:17.015978] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1439364 ] 00:21:25.022 [2024-07-15 20:35:17.145624] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:25.022 [2024-07-15 20:35:17.247564] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:21:25.022 [2024-07-15 20:35:17.302602] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:25.022 [2024-07-15 20:35:17.302633] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:25.960 20:35:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:21:25.960 20:35:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:21:25.960 20:35:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:21:25.960 20:35:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:21:25.960 20:35:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:21:25.960 20:35:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:21:25.960 20:35:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:21:25.960 20:35:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:21:25.960 20:35:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:21:25.960 20:35:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:21:25.960 20:35:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:21:26.528 malloc1 00:21:26.528 20:35:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:21:26.798 [2024-07-15 20:35:19.035101] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:21:26.798 [2024-07-15 20:35:19.035151] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:26.798 [2024-07-15 20:35:19.035173] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc4a570 00:21:26.798 [2024-07-15 20:35:19.035186] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:26.798 [2024-07-15 20:35:19.036939] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:26.798 [2024-07-15 20:35:19.036969] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:21:26.798 pt1 00:21:26.798 20:35:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:21:26.798 20:35:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:21:26.798 20:35:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:21:26.798 20:35:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:21:26.798 20:35:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:21:26.798 20:35:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:21:26.798 20:35:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:21:26.798 20:35:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:21:26.798 20:35:19 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:21:27.057 malloc2 00:21:27.057 20:35:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:21:27.316 [2024-07-15 20:35:19.541180] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:21:27.316 [2024-07-15 20:35:19.541223] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:27.316 [2024-07-15 20:35:19.541246] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc4b970 00:21:27.316 [2024-07-15 20:35:19.541259] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:27.316 [2024-07-15 20:35:19.542908] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:27.316 [2024-07-15 20:35:19.542942] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:21:27.316 pt2 00:21:27.316 20:35:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:21:27.316 20:35:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:21:27.316 20:35:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:21:27.316 20:35:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:21:27.316 20:35:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:21:27.316 20:35:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:21:27.316 20:35:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:21:27.316 20:35:19 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:21:27.316 20:35:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:21:27.885 malloc3 00:21:27.885 20:35:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:21:28.454 [2024-07-15 20:35:20.564834] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:21:28.454 [2024-07-15 20:35:20.564883] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:28.454 [2024-07-15 20:35:20.564901] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xde2340 00:21:28.454 [2024-07-15 20:35:20.564914] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:28.454 [2024-07-15 20:35:20.566555] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:28.454 [2024-07-15 20:35:20.566583] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:21:28.454 pt3 00:21:28.454 20:35:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:21:28.454 20:35:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:21:28.454 20:35:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc4 00:21:28.454 20:35:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt4 00:21:28.454 20:35:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:21:28.454 20:35:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:21:28.454 
20:35:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:21:28.454 20:35:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:21:28.454 20:35:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:21:28.712 malloc4 00:21:29.009 20:35:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:21:29.269 [2024-07-15 20:35:21.592134] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:21:29.269 [2024-07-15 20:35:21.592180] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:29.269 [2024-07-15 20:35:21.592201] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xde4c60 00:21:29.269 [2024-07-15 20:35:21.592214] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:29.269 [2024-07-15 20:35:21.593795] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:29.269 [2024-07-15 20:35:21.593828] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:21:29.269 pt4 00:21:29.269 20:35:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:21:29.269 20:35:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:21:29.269 20:35:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:21:29.838 [2024-07-15 20:35:22.105494] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 
00:21:29.838 [2024-07-15 20:35:22.106858] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:21:29.838 [2024-07-15 20:35:22.106914] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:21:29.838 [2024-07-15 20:35:22.106969] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:21:29.838 [2024-07-15 20:35:22.107143] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xc42530 00:21:29.838 [2024-07-15 20:35:22.107155] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:21:29.838 [2024-07-15 20:35:22.107356] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc40770 00:21:29.838 [2024-07-15 20:35:22.107504] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xc42530 00:21:29.838 [2024-07-15 20:35:22.107514] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xc42530 00:21:29.838 [2024-07-15 20:35:22.107612] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:29.838 20:35:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:21:29.838 20:35:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:29.839 20:35:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:29.839 20:35:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:29.839 20:35:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:29.839 20:35:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:29.839 20:35:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:29.839 20:35:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local 
num_base_bdevs 00:21:29.839 20:35:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:29.839 20:35:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:29.839 20:35:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:29.839 20:35:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:30.407 20:35:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:30.407 "name": "raid_bdev1", 00:21:30.407 "uuid": "b2a33878-823f-4ea0-ba8a-aee172d9e652", 00:21:30.407 "strip_size_kb": 64, 00:21:30.407 "state": "online", 00:21:30.407 "raid_level": "concat", 00:21:30.407 "superblock": true, 00:21:30.407 "num_base_bdevs": 4, 00:21:30.407 "num_base_bdevs_discovered": 4, 00:21:30.407 "num_base_bdevs_operational": 4, 00:21:30.407 "base_bdevs_list": [ 00:21:30.407 { 00:21:30.407 "name": "pt1", 00:21:30.407 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:30.407 "is_configured": true, 00:21:30.407 "data_offset": 2048, 00:21:30.407 "data_size": 63488 00:21:30.407 }, 00:21:30.407 { 00:21:30.407 "name": "pt2", 00:21:30.407 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:30.407 "is_configured": true, 00:21:30.407 "data_offset": 2048, 00:21:30.407 "data_size": 63488 00:21:30.407 }, 00:21:30.407 { 00:21:30.407 "name": "pt3", 00:21:30.407 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:30.407 "is_configured": true, 00:21:30.407 "data_offset": 2048, 00:21:30.407 "data_size": 63488 00:21:30.407 }, 00:21:30.407 { 00:21:30.407 "name": "pt4", 00:21:30.407 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:30.407 "is_configured": true, 00:21:30.407 "data_offset": 2048, 00:21:30.407 "data_size": 63488 00:21:30.407 } 00:21:30.407 ] 00:21:30.407 }' 00:21:30.407 20:35:22 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:30.407 20:35:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:30.975 20:35:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:21:30.975 20:35:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:21:30.975 20:35:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:21:30.975 20:35:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:30.975 20:35:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:21:30.975 20:35:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:21:30.975 20:35:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:30.975 20:35:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:21:31.542 [2024-07-15 20:35:23.798270] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:31.542 20:35:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:21:31.542 "name": "raid_bdev1", 00:21:31.542 "aliases": [ 00:21:31.542 "b2a33878-823f-4ea0-ba8a-aee172d9e652" 00:21:31.542 ], 00:21:31.542 "product_name": "Raid Volume", 00:21:31.542 "block_size": 512, 00:21:31.542 "num_blocks": 253952, 00:21:31.542 "uuid": "b2a33878-823f-4ea0-ba8a-aee172d9e652", 00:21:31.542 "assigned_rate_limits": { 00:21:31.542 "rw_ios_per_sec": 0, 00:21:31.542 "rw_mbytes_per_sec": 0, 00:21:31.542 "r_mbytes_per_sec": 0, 00:21:31.542 "w_mbytes_per_sec": 0 00:21:31.542 }, 00:21:31.542 "claimed": false, 00:21:31.542 "zoned": false, 00:21:31.542 "supported_io_types": { 00:21:31.542 "read": true, 00:21:31.542 "write": true, 00:21:31.542 
"unmap": true, 00:21:31.542 "flush": true, 00:21:31.542 "reset": true, 00:21:31.542 "nvme_admin": false, 00:21:31.542 "nvme_io": false, 00:21:31.542 "nvme_io_md": false, 00:21:31.542 "write_zeroes": true, 00:21:31.542 "zcopy": false, 00:21:31.542 "get_zone_info": false, 00:21:31.542 "zone_management": false, 00:21:31.542 "zone_append": false, 00:21:31.542 "compare": false, 00:21:31.542 "compare_and_write": false, 00:21:31.542 "abort": false, 00:21:31.542 "seek_hole": false, 00:21:31.542 "seek_data": false, 00:21:31.542 "copy": false, 00:21:31.542 "nvme_iov_md": false 00:21:31.542 }, 00:21:31.542 "memory_domains": [ 00:21:31.542 { 00:21:31.542 "dma_device_id": "system", 00:21:31.542 "dma_device_type": 1 00:21:31.542 }, 00:21:31.542 { 00:21:31.542 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:31.542 "dma_device_type": 2 00:21:31.542 }, 00:21:31.542 { 00:21:31.542 "dma_device_id": "system", 00:21:31.542 "dma_device_type": 1 00:21:31.542 }, 00:21:31.542 { 00:21:31.542 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:31.542 "dma_device_type": 2 00:21:31.542 }, 00:21:31.542 { 00:21:31.542 "dma_device_id": "system", 00:21:31.542 "dma_device_type": 1 00:21:31.542 }, 00:21:31.542 { 00:21:31.542 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:31.542 "dma_device_type": 2 00:21:31.542 }, 00:21:31.542 { 00:21:31.542 "dma_device_id": "system", 00:21:31.542 "dma_device_type": 1 00:21:31.542 }, 00:21:31.542 { 00:21:31.542 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:31.542 "dma_device_type": 2 00:21:31.542 } 00:21:31.542 ], 00:21:31.542 "driver_specific": { 00:21:31.542 "raid": { 00:21:31.542 "uuid": "b2a33878-823f-4ea0-ba8a-aee172d9e652", 00:21:31.542 "strip_size_kb": 64, 00:21:31.542 "state": "online", 00:21:31.542 "raid_level": "concat", 00:21:31.542 "superblock": true, 00:21:31.542 "num_base_bdevs": 4, 00:21:31.542 "num_base_bdevs_discovered": 4, 00:21:31.542 "num_base_bdevs_operational": 4, 00:21:31.542 "base_bdevs_list": [ 00:21:31.542 { 00:21:31.542 "name": "pt1", 
00:21:31.542 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:31.542 "is_configured": true, 00:21:31.542 "data_offset": 2048, 00:21:31.542 "data_size": 63488 00:21:31.542 }, 00:21:31.542 { 00:21:31.542 "name": "pt2", 00:21:31.542 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:31.542 "is_configured": true, 00:21:31.542 "data_offset": 2048, 00:21:31.542 "data_size": 63488 00:21:31.542 }, 00:21:31.542 { 00:21:31.542 "name": "pt3", 00:21:31.542 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:31.542 "is_configured": true, 00:21:31.542 "data_offset": 2048, 00:21:31.542 "data_size": 63488 00:21:31.542 }, 00:21:31.542 { 00:21:31.542 "name": "pt4", 00:21:31.542 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:31.542 "is_configured": true, 00:21:31.542 "data_offset": 2048, 00:21:31.542 "data_size": 63488 00:21:31.542 } 00:21:31.542 ] 00:21:31.542 } 00:21:31.542 } 00:21:31.542 }' 00:21:31.542 20:35:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:31.542 20:35:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:21:31.542 pt2 00:21:31.542 pt3 00:21:31.542 pt4' 00:21:31.542 20:35:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:31.542 20:35:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:21:31.542 20:35:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:32.110 20:35:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:32.110 "name": "pt1", 00:21:32.110 "aliases": [ 00:21:32.110 "00000000-0000-0000-0000-000000000001" 00:21:32.110 ], 00:21:32.110 "product_name": "passthru", 00:21:32.110 "block_size": 512, 00:21:32.110 "num_blocks": 65536, 00:21:32.110 "uuid": 
"00000000-0000-0000-0000-000000000001", 00:21:32.110 "assigned_rate_limits": { 00:21:32.110 "rw_ios_per_sec": 0, 00:21:32.110 "rw_mbytes_per_sec": 0, 00:21:32.110 "r_mbytes_per_sec": 0, 00:21:32.110 "w_mbytes_per_sec": 0 00:21:32.110 }, 00:21:32.110 "claimed": true, 00:21:32.110 "claim_type": "exclusive_write", 00:21:32.110 "zoned": false, 00:21:32.110 "supported_io_types": { 00:21:32.110 "read": true, 00:21:32.110 "write": true, 00:21:32.110 "unmap": true, 00:21:32.110 "flush": true, 00:21:32.110 "reset": true, 00:21:32.110 "nvme_admin": false, 00:21:32.110 "nvme_io": false, 00:21:32.110 "nvme_io_md": false, 00:21:32.110 "write_zeroes": true, 00:21:32.110 "zcopy": true, 00:21:32.110 "get_zone_info": false, 00:21:32.110 "zone_management": false, 00:21:32.110 "zone_append": false, 00:21:32.110 "compare": false, 00:21:32.110 "compare_and_write": false, 00:21:32.110 "abort": true, 00:21:32.110 "seek_hole": false, 00:21:32.110 "seek_data": false, 00:21:32.110 "copy": true, 00:21:32.110 "nvme_iov_md": false 00:21:32.110 }, 00:21:32.110 "memory_domains": [ 00:21:32.110 { 00:21:32.110 "dma_device_id": "system", 00:21:32.110 "dma_device_type": 1 00:21:32.110 }, 00:21:32.110 { 00:21:32.110 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:32.110 "dma_device_type": 2 00:21:32.110 } 00:21:32.110 ], 00:21:32.110 "driver_specific": { 00:21:32.110 "passthru": { 00:21:32.110 "name": "pt1", 00:21:32.110 "base_bdev_name": "malloc1" 00:21:32.110 } 00:21:32.110 } 00:21:32.110 }' 00:21:32.110 20:35:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:32.110 20:35:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:32.370 20:35:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:32.370 20:35:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:32.370 20:35:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:32.370 20:35:24 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:32.370 20:35:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:32.370 20:35:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:32.629 20:35:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:32.629 20:35:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:32.629 20:35:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:32.629 20:35:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:32.629 20:35:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:32.629 20:35:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:21:32.629 20:35:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:32.889 20:35:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:32.889 "name": "pt2", 00:21:32.889 "aliases": [ 00:21:32.889 "00000000-0000-0000-0000-000000000002" 00:21:32.889 ], 00:21:32.889 "product_name": "passthru", 00:21:32.889 "block_size": 512, 00:21:32.889 "num_blocks": 65536, 00:21:32.889 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:32.889 "assigned_rate_limits": { 00:21:32.889 "rw_ios_per_sec": 0, 00:21:32.889 "rw_mbytes_per_sec": 0, 00:21:32.889 "r_mbytes_per_sec": 0, 00:21:32.889 "w_mbytes_per_sec": 0 00:21:32.889 }, 00:21:32.889 "claimed": true, 00:21:32.889 "claim_type": "exclusive_write", 00:21:32.889 "zoned": false, 00:21:32.889 "supported_io_types": { 00:21:32.889 "read": true, 00:21:32.889 "write": true, 00:21:32.889 "unmap": true, 00:21:32.889 "flush": true, 00:21:32.889 "reset": true, 00:21:32.889 "nvme_admin": false, 00:21:32.889 
"nvme_io": false, 00:21:32.889 "nvme_io_md": false, 00:21:32.889 "write_zeroes": true, 00:21:32.889 "zcopy": true, 00:21:32.889 "get_zone_info": false, 00:21:32.889 "zone_management": false, 00:21:32.889 "zone_append": false, 00:21:32.889 "compare": false, 00:21:32.889 "compare_and_write": false, 00:21:32.889 "abort": true, 00:21:32.889 "seek_hole": false, 00:21:32.889 "seek_data": false, 00:21:32.889 "copy": true, 00:21:32.889 "nvme_iov_md": false 00:21:32.889 }, 00:21:32.889 "memory_domains": [ 00:21:32.889 { 00:21:32.889 "dma_device_id": "system", 00:21:32.889 "dma_device_type": 1 00:21:32.889 }, 00:21:32.889 { 00:21:32.889 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:32.889 "dma_device_type": 2 00:21:32.889 } 00:21:32.889 ], 00:21:32.889 "driver_specific": { 00:21:32.889 "passthru": { 00:21:32.889 "name": "pt2", 00:21:32.889 "base_bdev_name": "malloc2" 00:21:32.889 } 00:21:32.889 } 00:21:32.889 }' 00:21:32.889 20:35:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:32.889 20:35:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:32.889 20:35:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:32.889 20:35:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:32.889 20:35:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:33.147 20:35:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:33.147 20:35:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:33.147 20:35:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:33.147 20:35:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:33.148 20:35:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:33.148 20:35:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:21:33.148 20:35:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:33.148 20:35:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:33.148 20:35:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:21:33.148 20:35:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:33.406 20:35:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:33.406 "name": "pt3", 00:21:33.406 "aliases": [ 00:21:33.406 "00000000-0000-0000-0000-000000000003" 00:21:33.406 ], 00:21:33.406 "product_name": "passthru", 00:21:33.406 "block_size": 512, 00:21:33.406 "num_blocks": 65536, 00:21:33.406 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:33.406 "assigned_rate_limits": { 00:21:33.406 "rw_ios_per_sec": 0, 00:21:33.406 "rw_mbytes_per_sec": 0, 00:21:33.406 "r_mbytes_per_sec": 0, 00:21:33.406 "w_mbytes_per_sec": 0 00:21:33.406 }, 00:21:33.406 "claimed": true, 00:21:33.406 "claim_type": "exclusive_write", 00:21:33.406 "zoned": false, 00:21:33.406 "supported_io_types": { 00:21:33.406 "read": true, 00:21:33.406 "write": true, 00:21:33.406 "unmap": true, 00:21:33.406 "flush": true, 00:21:33.406 "reset": true, 00:21:33.406 "nvme_admin": false, 00:21:33.406 "nvme_io": false, 00:21:33.406 "nvme_io_md": false, 00:21:33.406 "write_zeroes": true, 00:21:33.406 "zcopy": true, 00:21:33.406 "get_zone_info": false, 00:21:33.406 "zone_management": false, 00:21:33.406 "zone_append": false, 00:21:33.406 "compare": false, 00:21:33.406 "compare_and_write": false, 00:21:33.406 "abort": true, 00:21:33.406 "seek_hole": false, 00:21:33.406 "seek_data": false, 00:21:33.406 "copy": true, 00:21:33.406 "nvme_iov_md": false 00:21:33.406 }, 00:21:33.406 "memory_domains": [ 00:21:33.406 { 00:21:33.406 "dma_device_id": "system", 00:21:33.406 
"dma_device_type": 1 00:21:33.406 }, 00:21:33.406 { 00:21:33.406 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:33.406 "dma_device_type": 2 00:21:33.406 } 00:21:33.406 ], 00:21:33.406 "driver_specific": { 00:21:33.406 "passthru": { 00:21:33.406 "name": "pt3", 00:21:33.406 "base_bdev_name": "malloc3" 00:21:33.406 } 00:21:33.406 } 00:21:33.406 }' 00:21:33.406 20:35:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:33.406 20:35:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:33.406 20:35:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:33.665 20:35:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:33.665 20:35:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:33.665 20:35:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:33.665 20:35:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:33.665 20:35:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:33.665 20:35:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:33.665 20:35:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:33.665 20:35:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:33.665 20:35:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:33.665 20:35:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:33.665 20:35:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:21:33.665 20:35:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:33.924 20:35:26 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:33.924 "name": "pt4", 00:21:33.924 "aliases": [ 00:21:33.924 "00000000-0000-0000-0000-000000000004" 00:21:33.924 ], 00:21:33.924 "product_name": "passthru", 00:21:33.924 "block_size": 512, 00:21:33.924 "num_blocks": 65536, 00:21:33.924 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:33.924 "assigned_rate_limits": { 00:21:33.924 "rw_ios_per_sec": 0, 00:21:33.924 "rw_mbytes_per_sec": 0, 00:21:33.924 "r_mbytes_per_sec": 0, 00:21:33.924 "w_mbytes_per_sec": 0 00:21:33.924 }, 00:21:33.924 "claimed": true, 00:21:33.924 "claim_type": "exclusive_write", 00:21:33.924 "zoned": false, 00:21:33.924 "supported_io_types": { 00:21:33.924 "read": true, 00:21:33.924 "write": true, 00:21:33.924 "unmap": true, 00:21:33.924 "flush": true, 00:21:33.924 "reset": true, 00:21:33.924 "nvme_admin": false, 00:21:33.924 "nvme_io": false, 00:21:33.924 "nvme_io_md": false, 00:21:33.924 "write_zeroes": true, 00:21:33.924 "zcopy": true, 00:21:33.924 "get_zone_info": false, 00:21:33.924 "zone_management": false, 00:21:33.924 "zone_append": false, 00:21:33.924 "compare": false, 00:21:33.924 "compare_and_write": false, 00:21:33.924 "abort": true, 00:21:33.924 "seek_hole": false, 00:21:33.924 "seek_data": false, 00:21:33.924 "copy": true, 00:21:33.924 "nvme_iov_md": false 00:21:33.924 }, 00:21:33.924 "memory_domains": [ 00:21:33.924 { 00:21:33.924 "dma_device_id": "system", 00:21:33.924 "dma_device_type": 1 00:21:33.924 }, 00:21:33.924 { 00:21:33.924 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:33.924 "dma_device_type": 2 00:21:33.924 } 00:21:33.924 ], 00:21:33.924 "driver_specific": { 00:21:33.924 "passthru": { 00:21:33.924 "name": "pt4", 00:21:33.924 "base_bdev_name": "malloc4" 00:21:33.924 } 00:21:33.924 } 00:21:33.924 }' 00:21:33.924 20:35:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:34.183 20:35:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:34.183 20:35:26 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:34.183 20:35:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:34.183 20:35:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:34.183 20:35:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:34.183 20:35:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:34.183 20:35:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:34.183 20:35:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:34.183 20:35:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:34.442 20:35:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:34.442 20:35:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:34.442 20:35:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:34.442 20:35:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:21:34.702 [2024-07-15 20:35:26.870409] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:34.702 20:35:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=b2a33878-823f-4ea0-ba8a-aee172d9e652 00:21:34.702 20:35:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z b2a33878-823f-4ea0-ba8a-aee172d9e652 ']' 00:21:34.702 20:35:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:21:34.961 [2024-07-15 20:35:27.114768] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:34.961 
[2024-07-15 20:35:27.114789] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:34.961 [2024-07-15 20:35:27.114835] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:34.961 [2024-07-15 20:35:27.114898] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:34.961 [2024-07-15 20:35:27.114910] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xc42530 name raid_bdev1, state offline 00:21:34.961 20:35:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:34.961 20:35:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:21:35.220 20:35:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:21:35.220 20:35:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:21:35.220 20:35:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:21:35.220 20:35:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:21:35.479 20:35:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:21:35.479 20:35:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:21:35.479 20:35:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:21:35.479 20:35:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:21:35.737 20:35:28 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:21:35.737 20:35:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:21:35.996 20:35:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:21:35.996 20:35:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:21:36.255 20:35:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:21:36.255 20:35:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:21:36.255 20:35:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:21:36.255 20:35:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:21:36.255 20:35:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:36.255 20:35:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:21:36.255 20:35:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:36.255 20:35:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:21:36.256 20:35:28 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:36.256 20:35:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:21:36.256 20:35:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:36.256 20:35:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:21:36.256 20:35:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:21:36.514 [2024-07-15 20:35:28.851275] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:21:36.514 [2024-07-15 20:35:28.852615] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:21:36.514 [2024-07-15 20:35:28.852657] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:21:36.514 [2024-07-15 20:35:28.852698] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:21:36.514 [2024-07-15 20:35:28.852743] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:21:36.514 [2024-07-15 20:35:28.852780] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:21:36.514 [2024-07-15 20:35:28.852803] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:21:36.514 [2024-07-15 20:35:28.852825] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:21:36.514 
[2024-07-15 20:35:28.852842] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:36.514 [2024-07-15 20:35:28.852853] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xdedff0 name raid_bdev1, state configuring 00:21:36.514 request: 00:21:36.514 { 00:21:36.514 "name": "raid_bdev1", 00:21:36.514 "raid_level": "concat", 00:21:36.514 "base_bdevs": [ 00:21:36.514 "malloc1", 00:21:36.514 "malloc2", 00:21:36.514 "malloc3", 00:21:36.514 "malloc4" 00:21:36.514 ], 00:21:36.514 "strip_size_kb": 64, 00:21:36.514 "superblock": false, 00:21:36.514 "method": "bdev_raid_create", 00:21:36.514 "req_id": 1 00:21:36.514 } 00:21:36.514 Got JSON-RPC error response 00:21:36.514 response: 00:21:36.514 { 00:21:36.514 "code": -17, 00:21:36.514 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:21:36.514 } 00:21:36.514 20:35:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:21:36.514 20:35:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:21:36.514 20:35:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:21:36.514 20:35:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:21:36.514 20:35:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:36.514 20:35:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:21:36.773 20:35:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:21:36.773 20:35:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:21:36.773 20:35:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 
00:21:37.031 [2024-07-15 20:35:29.336497] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:21:37.031 [2024-07-15 20:35:29.336544] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:37.031 [2024-07-15 20:35:29.336566] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc4a7a0 00:21:37.031 [2024-07-15 20:35:29.336579] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:37.031 [2024-07-15 20:35:29.338212] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:37.031 [2024-07-15 20:35:29.338241] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:21:37.031 [2024-07-15 20:35:29.338307] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:21:37.031 [2024-07-15 20:35:29.338335] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:21:37.031 pt1 00:21:37.031 20:35:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 4 00:21:37.031 20:35:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:37.031 20:35:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:37.031 20:35:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:37.031 20:35:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:37.031 20:35:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:37.031 20:35:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:37.031 20:35:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:37.031 20:35:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:21:37.031 20:35:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:37.031 20:35:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:37.031 20:35:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:37.289 20:35:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:37.289 "name": "raid_bdev1", 00:21:37.289 "uuid": "b2a33878-823f-4ea0-ba8a-aee172d9e652", 00:21:37.289 "strip_size_kb": 64, 00:21:37.289 "state": "configuring", 00:21:37.289 "raid_level": "concat", 00:21:37.289 "superblock": true, 00:21:37.289 "num_base_bdevs": 4, 00:21:37.289 "num_base_bdevs_discovered": 1, 00:21:37.289 "num_base_bdevs_operational": 4, 00:21:37.289 "base_bdevs_list": [ 00:21:37.289 { 00:21:37.289 "name": "pt1", 00:21:37.289 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:37.289 "is_configured": true, 00:21:37.289 "data_offset": 2048, 00:21:37.289 "data_size": 63488 00:21:37.289 }, 00:21:37.289 { 00:21:37.289 "name": null, 00:21:37.289 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:37.289 "is_configured": false, 00:21:37.289 "data_offset": 2048, 00:21:37.289 "data_size": 63488 00:21:37.289 }, 00:21:37.289 { 00:21:37.289 "name": null, 00:21:37.289 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:37.289 "is_configured": false, 00:21:37.289 "data_offset": 2048, 00:21:37.289 "data_size": 63488 00:21:37.289 }, 00:21:37.289 { 00:21:37.289 "name": null, 00:21:37.289 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:37.289 "is_configured": false, 00:21:37.289 "data_offset": 2048, 00:21:37.289 "data_size": 63488 00:21:37.289 } 00:21:37.289 ] 00:21:37.289 }' 00:21:37.289 20:35:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:37.289 20:35:29 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:37.857 20:35:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 4 -gt 2 ']' 00:21:37.857 20:35:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:21:38.115 [2024-07-15 20:35:30.455495] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:21:38.115 [2024-07-15 20:35:30.455548] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:38.116 [2024-07-15 20:35:30.455568] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc41ea0 00:21:38.116 [2024-07-15 20:35:30.455581] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:38.116 [2024-07-15 20:35:30.455947] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:38.116 [2024-07-15 20:35:30.455966] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:21:38.116 [2024-07-15 20:35:30.456033] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:21:38.116 [2024-07-15 20:35:30.456053] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:21:38.116 pt2 00:21:38.116 20:35:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:21:38.374 [2024-07-15 20:35:30.700137] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:21:38.374 20:35:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 4 00:21:38.374 20:35:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:38.374 20:35:30 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:38.374 20:35:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:38.374 20:35:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:38.374 20:35:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:38.374 20:35:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:38.375 20:35:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:38.375 20:35:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:38.375 20:35:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:38.375 20:35:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:38.375 20:35:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:38.634 20:35:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:38.634 "name": "raid_bdev1", 00:21:38.634 "uuid": "b2a33878-823f-4ea0-ba8a-aee172d9e652", 00:21:38.634 "strip_size_kb": 64, 00:21:38.634 "state": "configuring", 00:21:38.634 "raid_level": "concat", 00:21:38.634 "superblock": true, 00:21:38.634 "num_base_bdevs": 4, 00:21:38.634 "num_base_bdevs_discovered": 1, 00:21:38.634 "num_base_bdevs_operational": 4, 00:21:38.634 "base_bdevs_list": [ 00:21:38.634 { 00:21:38.634 "name": "pt1", 00:21:38.634 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:38.634 "is_configured": true, 00:21:38.634 "data_offset": 2048, 00:21:38.634 "data_size": 63488 00:21:38.634 }, 00:21:38.634 { 00:21:38.634 "name": null, 00:21:38.634 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:38.634 
"is_configured": false, 00:21:38.634 "data_offset": 2048, 00:21:38.634 "data_size": 63488 00:21:38.634 }, 00:21:38.634 { 00:21:38.634 "name": null, 00:21:38.634 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:38.634 "is_configured": false, 00:21:38.634 "data_offset": 2048, 00:21:38.634 "data_size": 63488 00:21:38.634 }, 00:21:38.634 { 00:21:38.634 "name": null, 00:21:38.634 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:38.634 "is_configured": false, 00:21:38.634 "data_offset": 2048, 00:21:38.634 "data_size": 63488 00:21:38.634 } 00:21:38.634 ] 00:21:38.634 }' 00:21:38.634 20:35:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:38.634 20:35:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:39.201 20:35:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:21:39.201 20:35:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:21:39.201 20:35:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:21:39.460 [2024-07-15 20:35:31.791022] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:21:39.460 [2024-07-15 20:35:31.791073] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:39.460 [2024-07-15 20:35:31.791092] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc40ec0 00:21:39.460 [2024-07-15 20:35:31.791105] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:39.460 [2024-07-15 20:35:31.791444] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:39.460 [2024-07-15 20:35:31.791462] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:21:39.460 [2024-07-15 20:35:31.791524] 
bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:21:39.460 [2024-07-15 20:35:31.791544] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:21:39.460 pt2 00:21:39.460 20:35:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:21:39.460 20:35:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:21:39.460 20:35:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:21:39.719 [2024-07-15 20:35:32.031659] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:21:39.719 [2024-07-15 20:35:32.031698] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:39.719 [2024-07-15 20:35:32.031715] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc410f0 00:21:39.719 [2024-07-15 20:35:32.031727] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:39.719 [2024-07-15 20:35:32.032047] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:39.719 [2024-07-15 20:35:32.032066] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:21:39.719 [2024-07-15 20:35:32.032119] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:21:39.719 [2024-07-15 20:35:32.032137] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:21:39.719 pt3 00:21:39.719 20:35:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:21:39.719 20:35:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:21:39.719 20:35:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:21:39.979 [2024-07-15 20:35:32.272310] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:21:39.979 [2024-07-15 20:35:32.272354] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:39.979 [2024-07-15 20:35:32.272372] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc49af0 00:21:39.979 [2024-07-15 20:35:32.272384] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:39.979 [2024-07-15 20:35:32.272712] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:39.979 [2024-07-15 20:35:32.272729] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:21:39.979 [2024-07-15 20:35:32.272788] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:21:39.979 [2024-07-15 20:35:32.272807] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:21:39.979 [2024-07-15 20:35:32.272942] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xc438f0 00:21:39.979 [2024-07-15 20:35:32.272953] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:21:39.979 [2024-07-15 20:35:32.273121] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc43150 00:21:39.979 [2024-07-15 20:35:32.273251] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xc438f0 00:21:39.979 [2024-07-15 20:35:32.273260] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xc438f0 00:21:39.979 [2024-07-15 20:35:32.273357] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:39.979 pt4 00:21:39.979 20:35:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:21:39.979 20:35:32 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:21:39.979 20:35:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:21:39.979 20:35:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:39.979 20:35:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:39.979 20:35:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:39.979 20:35:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:39.979 20:35:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:39.979 20:35:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:39.979 20:35:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:39.979 20:35:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:39.979 20:35:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:39.979 20:35:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:39.979 20:35:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:40.239 20:35:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:40.239 "name": "raid_bdev1", 00:21:40.239 "uuid": "b2a33878-823f-4ea0-ba8a-aee172d9e652", 00:21:40.239 "strip_size_kb": 64, 00:21:40.239 "state": "online", 00:21:40.239 "raid_level": "concat", 00:21:40.239 "superblock": true, 00:21:40.239 "num_base_bdevs": 4, 00:21:40.239 "num_base_bdevs_discovered": 4, 00:21:40.239 "num_base_bdevs_operational": 4, 00:21:40.239 "base_bdevs_list": [ 00:21:40.239 { 
00:21:40.239 "name": "pt1", 00:21:40.239 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:40.239 "is_configured": true, 00:21:40.239 "data_offset": 2048, 00:21:40.239 "data_size": 63488 00:21:40.239 }, 00:21:40.239 { 00:21:40.239 "name": "pt2", 00:21:40.239 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:40.239 "is_configured": true, 00:21:40.239 "data_offset": 2048, 00:21:40.239 "data_size": 63488 00:21:40.239 }, 00:21:40.239 { 00:21:40.239 "name": "pt3", 00:21:40.239 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:40.239 "is_configured": true, 00:21:40.239 "data_offset": 2048, 00:21:40.239 "data_size": 63488 00:21:40.239 }, 00:21:40.239 { 00:21:40.239 "name": "pt4", 00:21:40.239 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:40.239 "is_configured": true, 00:21:40.239 "data_offset": 2048, 00:21:40.239 "data_size": 63488 00:21:40.239 } 00:21:40.239 ] 00:21:40.239 }' 00:21:40.239 20:35:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:40.239 20:35:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:40.818 20:35:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:21:40.818 20:35:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:21:40.818 20:35:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:21:40.818 20:35:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:40.818 20:35:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:21:40.818 20:35:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:21:40.818 20:35:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:40.818 20:35:33 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@200 -- # jq '.[]' 00:21:41.081 [2024-07-15 20:35:33.371538] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:41.081 20:35:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:21:41.081 "name": "raid_bdev1", 00:21:41.081 "aliases": [ 00:21:41.081 "b2a33878-823f-4ea0-ba8a-aee172d9e652" 00:21:41.081 ], 00:21:41.081 "product_name": "Raid Volume", 00:21:41.081 "block_size": 512, 00:21:41.081 "num_blocks": 253952, 00:21:41.081 "uuid": "b2a33878-823f-4ea0-ba8a-aee172d9e652", 00:21:41.081 "assigned_rate_limits": { 00:21:41.081 "rw_ios_per_sec": 0, 00:21:41.081 "rw_mbytes_per_sec": 0, 00:21:41.082 "r_mbytes_per_sec": 0, 00:21:41.082 "w_mbytes_per_sec": 0 00:21:41.082 }, 00:21:41.082 "claimed": false, 00:21:41.082 "zoned": false, 00:21:41.082 "supported_io_types": { 00:21:41.082 "read": true, 00:21:41.082 "write": true, 00:21:41.082 "unmap": true, 00:21:41.082 "flush": true, 00:21:41.082 "reset": true, 00:21:41.082 "nvme_admin": false, 00:21:41.082 "nvme_io": false, 00:21:41.082 "nvme_io_md": false, 00:21:41.082 "write_zeroes": true, 00:21:41.082 "zcopy": false, 00:21:41.082 "get_zone_info": false, 00:21:41.082 "zone_management": false, 00:21:41.082 "zone_append": false, 00:21:41.082 "compare": false, 00:21:41.082 "compare_and_write": false, 00:21:41.082 "abort": false, 00:21:41.082 "seek_hole": false, 00:21:41.082 "seek_data": false, 00:21:41.082 "copy": false, 00:21:41.082 "nvme_iov_md": false 00:21:41.082 }, 00:21:41.082 "memory_domains": [ 00:21:41.082 { 00:21:41.082 "dma_device_id": "system", 00:21:41.082 "dma_device_type": 1 00:21:41.082 }, 00:21:41.082 { 00:21:41.082 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:41.082 "dma_device_type": 2 00:21:41.082 }, 00:21:41.082 { 00:21:41.082 "dma_device_id": "system", 00:21:41.082 "dma_device_type": 1 00:21:41.082 }, 00:21:41.082 { 00:21:41.082 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:41.082 "dma_device_type": 2 00:21:41.082 }, 
00:21:41.082 { 00:21:41.082 "dma_device_id": "system", 00:21:41.082 "dma_device_type": 1 00:21:41.082 }, 00:21:41.082 { 00:21:41.082 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:41.082 "dma_device_type": 2 00:21:41.082 }, 00:21:41.082 { 00:21:41.082 "dma_device_id": "system", 00:21:41.082 "dma_device_type": 1 00:21:41.082 }, 00:21:41.082 { 00:21:41.082 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:41.082 "dma_device_type": 2 00:21:41.082 } 00:21:41.082 ], 00:21:41.082 "driver_specific": { 00:21:41.082 "raid": { 00:21:41.082 "uuid": "b2a33878-823f-4ea0-ba8a-aee172d9e652", 00:21:41.082 "strip_size_kb": 64, 00:21:41.082 "state": "online", 00:21:41.082 "raid_level": "concat", 00:21:41.082 "superblock": true, 00:21:41.082 "num_base_bdevs": 4, 00:21:41.082 "num_base_bdevs_discovered": 4, 00:21:41.082 "num_base_bdevs_operational": 4, 00:21:41.082 "base_bdevs_list": [ 00:21:41.082 { 00:21:41.082 "name": "pt1", 00:21:41.082 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:41.082 "is_configured": true, 00:21:41.082 "data_offset": 2048, 00:21:41.082 "data_size": 63488 00:21:41.082 }, 00:21:41.082 { 00:21:41.082 "name": "pt2", 00:21:41.082 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:41.082 "is_configured": true, 00:21:41.082 "data_offset": 2048, 00:21:41.082 "data_size": 63488 00:21:41.082 }, 00:21:41.082 { 00:21:41.082 "name": "pt3", 00:21:41.082 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:41.082 "is_configured": true, 00:21:41.082 "data_offset": 2048, 00:21:41.082 "data_size": 63488 00:21:41.082 }, 00:21:41.082 { 00:21:41.082 "name": "pt4", 00:21:41.082 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:41.082 "is_configured": true, 00:21:41.082 "data_offset": 2048, 00:21:41.082 "data_size": 63488 00:21:41.082 } 00:21:41.082 ] 00:21:41.082 } 00:21:41.082 } 00:21:41.082 }' 00:21:41.082 20:35:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 
00:21:41.082 20:35:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:21:41.082 pt2 00:21:41.082 pt3 00:21:41.082 pt4' 00:21:41.082 20:35:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:41.082 20:35:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:21:41.082 20:35:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:41.340 20:35:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:41.340 "name": "pt1", 00:21:41.340 "aliases": [ 00:21:41.340 "00000000-0000-0000-0000-000000000001" 00:21:41.340 ], 00:21:41.340 "product_name": "passthru", 00:21:41.340 "block_size": 512, 00:21:41.340 "num_blocks": 65536, 00:21:41.340 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:41.340 "assigned_rate_limits": { 00:21:41.340 "rw_ios_per_sec": 0, 00:21:41.340 "rw_mbytes_per_sec": 0, 00:21:41.340 "r_mbytes_per_sec": 0, 00:21:41.340 "w_mbytes_per_sec": 0 00:21:41.340 }, 00:21:41.340 "claimed": true, 00:21:41.340 "claim_type": "exclusive_write", 00:21:41.340 "zoned": false, 00:21:41.340 "supported_io_types": { 00:21:41.340 "read": true, 00:21:41.340 "write": true, 00:21:41.340 "unmap": true, 00:21:41.340 "flush": true, 00:21:41.340 "reset": true, 00:21:41.340 "nvme_admin": false, 00:21:41.340 "nvme_io": false, 00:21:41.340 "nvme_io_md": false, 00:21:41.340 "write_zeroes": true, 00:21:41.340 "zcopy": true, 00:21:41.340 "get_zone_info": false, 00:21:41.340 "zone_management": false, 00:21:41.340 "zone_append": false, 00:21:41.340 "compare": false, 00:21:41.340 "compare_and_write": false, 00:21:41.340 "abort": true, 00:21:41.340 "seek_hole": false, 00:21:41.340 "seek_data": false, 00:21:41.340 "copy": true, 00:21:41.340 "nvme_iov_md": false 00:21:41.340 }, 00:21:41.340 "memory_domains": [ 00:21:41.340 { 
00:21:41.340 "dma_device_id": "system", 00:21:41.340 "dma_device_type": 1 00:21:41.340 }, 00:21:41.340 { 00:21:41.340 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:41.340 "dma_device_type": 2 00:21:41.340 } 00:21:41.340 ], 00:21:41.340 "driver_specific": { 00:21:41.340 "passthru": { 00:21:41.340 "name": "pt1", 00:21:41.341 "base_bdev_name": "malloc1" 00:21:41.341 } 00:21:41.341 } 00:21:41.341 }' 00:21:41.341 20:35:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:41.599 20:35:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:41.599 20:35:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:41.599 20:35:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:41.599 20:35:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:41.599 20:35:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:41.599 20:35:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:41.599 20:35:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:41.599 20:35:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:41.599 20:35:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:41.857 20:35:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:41.857 20:35:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:41.857 20:35:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:41.857 20:35:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:21:41.857 20:35:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:42.115 
20:35:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:42.115 "name": "pt2", 00:21:42.115 "aliases": [ 00:21:42.115 "00000000-0000-0000-0000-000000000002" 00:21:42.115 ], 00:21:42.115 "product_name": "passthru", 00:21:42.115 "block_size": 512, 00:21:42.115 "num_blocks": 65536, 00:21:42.115 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:42.115 "assigned_rate_limits": { 00:21:42.115 "rw_ios_per_sec": 0, 00:21:42.115 "rw_mbytes_per_sec": 0, 00:21:42.115 "r_mbytes_per_sec": 0, 00:21:42.115 "w_mbytes_per_sec": 0 00:21:42.115 }, 00:21:42.115 "claimed": true, 00:21:42.115 "claim_type": "exclusive_write", 00:21:42.115 "zoned": false, 00:21:42.115 "supported_io_types": { 00:21:42.115 "read": true, 00:21:42.115 "write": true, 00:21:42.115 "unmap": true, 00:21:42.115 "flush": true, 00:21:42.115 "reset": true, 00:21:42.115 "nvme_admin": false, 00:21:42.115 "nvme_io": false, 00:21:42.115 "nvme_io_md": false, 00:21:42.115 "write_zeroes": true, 00:21:42.115 "zcopy": true, 00:21:42.115 "get_zone_info": false, 00:21:42.115 "zone_management": false, 00:21:42.115 "zone_append": false, 00:21:42.115 "compare": false, 00:21:42.115 "compare_and_write": false, 00:21:42.115 "abort": true, 00:21:42.115 "seek_hole": false, 00:21:42.115 "seek_data": false, 00:21:42.116 "copy": true, 00:21:42.116 "nvme_iov_md": false 00:21:42.116 }, 00:21:42.116 "memory_domains": [ 00:21:42.116 { 00:21:42.116 "dma_device_id": "system", 00:21:42.116 "dma_device_type": 1 00:21:42.116 }, 00:21:42.116 { 00:21:42.116 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:42.116 "dma_device_type": 2 00:21:42.116 } 00:21:42.116 ], 00:21:42.116 "driver_specific": { 00:21:42.116 "passthru": { 00:21:42.116 "name": "pt2", 00:21:42.116 "base_bdev_name": "malloc2" 00:21:42.116 } 00:21:42.116 } 00:21:42.116 }' 00:21:42.116 20:35:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:42.116 20:35:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- 
# jq .block_size 00:21:42.116 20:35:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:42.116 20:35:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:42.116 20:35:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:42.116 20:35:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:42.116 20:35:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:42.374 20:35:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:42.374 20:35:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:42.374 20:35:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:42.374 20:35:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:42.374 20:35:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:42.374 20:35:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:42.374 20:35:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:21:42.374 20:35:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:42.632 20:35:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:42.632 "name": "pt3", 00:21:42.632 "aliases": [ 00:21:42.632 "00000000-0000-0000-0000-000000000003" 00:21:42.632 ], 00:21:42.632 "product_name": "passthru", 00:21:42.632 "block_size": 512, 00:21:42.632 "num_blocks": 65536, 00:21:42.632 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:42.632 "assigned_rate_limits": { 00:21:42.632 "rw_ios_per_sec": 0, 00:21:42.632 "rw_mbytes_per_sec": 0, 00:21:42.632 "r_mbytes_per_sec": 0, 00:21:42.632 "w_mbytes_per_sec": 0 00:21:42.632 }, 
00:21:42.632 "claimed": true, 00:21:42.632 "claim_type": "exclusive_write", 00:21:42.632 "zoned": false, 00:21:42.632 "supported_io_types": { 00:21:42.632 "read": true, 00:21:42.632 "write": true, 00:21:42.632 "unmap": true, 00:21:42.632 "flush": true, 00:21:42.632 "reset": true, 00:21:42.632 "nvme_admin": false, 00:21:42.632 "nvme_io": false, 00:21:42.632 "nvme_io_md": false, 00:21:42.633 "write_zeroes": true, 00:21:42.633 "zcopy": true, 00:21:42.633 "get_zone_info": false, 00:21:42.633 "zone_management": false, 00:21:42.633 "zone_append": false, 00:21:42.633 "compare": false, 00:21:42.633 "compare_and_write": false, 00:21:42.633 "abort": true, 00:21:42.633 "seek_hole": false, 00:21:42.633 "seek_data": false, 00:21:42.633 "copy": true, 00:21:42.633 "nvme_iov_md": false 00:21:42.633 }, 00:21:42.633 "memory_domains": [ 00:21:42.633 { 00:21:42.633 "dma_device_id": "system", 00:21:42.633 "dma_device_type": 1 00:21:42.633 }, 00:21:42.633 { 00:21:42.633 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:42.633 "dma_device_type": 2 00:21:42.633 } 00:21:42.633 ], 00:21:42.633 "driver_specific": { 00:21:42.633 "passthru": { 00:21:42.633 "name": "pt3", 00:21:42.633 "base_bdev_name": "malloc3" 00:21:42.633 } 00:21:42.633 } 00:21:42.633 }' 00:21:42.633 20:35:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:42.633 20:35:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:42.633 20:35:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:42.633 20:35:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:42.889 20:35:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:42.889 20:35:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:42.889 20:35:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:42.889 20:35:35 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:42.889 20:35:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:42.889 20:35:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:42.889 20:35:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:42.889 20:35:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:42.889 20:35:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:42.889 20:35:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:21:42.889 20:35:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:43.172 20:35:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:43.172 "name": "pt4", 00:21:43.172 "aliases": [ 00:21:43.172 "00000000-0000-0000-0000-000000000004" 00:21:43.172 ], 00:21:43.172 "product_name": "passthru", 00:21:43.172 "block_size": 512, 00:21:43.172 "num_blocks": 65536, 00:21:43.172 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:43.172 "assigned_rate_limits": { 00:21:43.172 "rw_ios_per_sec": 0, 00:21:43.172 "rw_mbytes_per_sec": 0, 00:21:43.172 "r_mbytes_per_sec": 0, 00:21:43.172 "w_mbytes_per_sec": 0 00:21:43.172 }, 00:21:43.172 "claimed": true, 00:21:43.172 "claim_type": "exclusive_write", 00:21:43.172 "zoned": false, 00:21:43.172 "supported_io_types": { 00:21:43.172 "read": true, 00:21:43.172 "write": true, 00:21:43.172 "unmap": true, 00:21:43.172 "flush": true, 00:21:43.172 "reset": true, 00:21:43.172 "nvme_admin": false, 00:21:43.172 "nvme_io": false, 00:21:43.172 "nvme_io_md": false, 00:21:43.172 "write_zeroes": true, 00:21:43.172 "zcopy": true, 00:21:43.172 "get_zone_info": false, 00:21:43.172 "zone_management": false, 00:21:43.172 "zone_append": false, 00:21:43.172 
"compare": false, 00:21:43.172 "compare_and_write": false, 00:21:43.172 "abort": true, 00:21:43.172 "seek_hole": false, 00:21:43.172 "seek_data": false, 00:21:43.172 "copy": true, 00:21:43.172 "nvme_iov_md": false 00:21:43.172 }, 00:21:43.172 "memory_domains": [ 00:21:43.172 { 00:21:43.172 "dma_device_id": "system", 00:21:43.172 "dma_device_type": 1 00:21:43.172 }, 00:21:43.172 { 00:21:43.172 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:43.172 "dma_device_type": 2 00:21:43.172 } 00:21:43.172 ], 00:21:43.172 "driver_specific": { 00:21:43.172 "passthru": { 00:21:43.172 "name": "pt4", 00:21:43.172 "base_bdev_name": "malloc4" 00:21:43.172 } 00:21:43.172 } 00:21:43.172 }' 00:21:43.172 20:35:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:43.172 20:35:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:43.436 20:35:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:43.436 20:35:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:43.436 20:35:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:43.436 20:35:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:43.436 20:35:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:43.436 20:35:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:43.436 20:35:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:43.436 20:35:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:43.436 20:35:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:43.695 20:35:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:43.695 20:35:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:43.695 20:35:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:21:43.695 [2024-07-15 20:35:36.050645] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:43.954 20:35:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' b2a33878-823f-4ea0-ba8a-aee172d9e652 '!=' b2a33878-823f-4ea0-ba8a-aee172d9e652 ']' 00:21:43.954 20:35:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy concat 00:21:43.954 20:35:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:21:43.954 20:35:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:21:43.954 20:35:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 1439364 00:21:43.954 20:35:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 1439364 ']' 00:21:43.954 20:35:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 1439364 00:21:43.954 20:35:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:21:43.954 20:35:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:21:43.954 20:35:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1439364 00:21:43.954 20:35:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:21:43.954 20:35:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:21:43.954 20:35:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1439364' 00:21:43.954 killing process with pid 1439364 00:21:43.954 20:35:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 1439364 00:21:43.954 [2024-07-15 
20:35:36.126160] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:43.954 [2024-07-15 20:35:36.126223] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:43.954 [2024-07-15 20:35:36.126283] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:43.954 [2024-07-15 20:35:36.126296] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xc438f0 name raid_bdev1, state offline 00:21:43.954 20:35:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 1439364 00:21:43.954 [2024-07-15 20:35:36.165226] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:44.214 20:35:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:21:44.214 00:21:44.214 real 0m19.426s 00:21:44.214 user 0m35.272s 00:21:44.214 sys 0m3.310s 00:21:44.214 20:35:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:21:44.214 20:35:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:44.214 ************************************ 00:21:44.214 END TEST raid_superblock_test 00:21:44.214 ************************************ 00:21:44.214 20:35:36 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:21:44.214 20:35:36 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test concat 4 read 00:21:44.214 20:35:36 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:21:44.214 20:35:36 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:21:44.214 20:35:36 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:44.214 ************************************ 00:21:44.214 START TEST raid_read_error_test 00:21:44.214 ************************************ 00:21:44.214 20:35:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 4 read 00:21:44.214 20:35:36 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:21:44.214 20:35:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:21:44.214 20:35:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:21:44.214 20:35:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:21:44.214 20:35:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:21:44.214 20:35:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:21:44.214 20:35:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:21:44.214 20:35:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:21:44.214 20:35:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:21:44.214 20:35:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:21:44.214 20:35:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:21:44.214 20:35:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:21:44.214 20:35:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:21:44.214 20:35:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:21:44.214 20:35:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:21:44.214 20:35:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:21:44.214 20:35:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:21:44.214 20:35:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:21:44.214 20:35:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:21:44.214 20:35:36 bdev_raid.raid_read_error_test 
-- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:21:44.214 20:35:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:21:44.214 20:35:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:21:44.214 20:35:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:21:44.214 20:35:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:21:44.214 20:35:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:21:44.214 20:35:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:21:44.214 20:35:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:21:44.214 20:35:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:21:44.214 20:35:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.BO6Pvv5DKh 00:21:44.214 20:35:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1442143 00:21:44.214 20:35:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1442143 /var/tmp/spdk-raid.sock 00:21:44.214 20:35:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:21:44.214 20:35:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 1442143 ']' 00:21:44.214 20:35:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:44.214 20:35:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:21:44.214 20:35:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket 
/var/tmp/spdk-raid.sock...' 00:21:44.214 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:21:44.214 20:35:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:21:44.214 20:35:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:21:44.214 [2024-07-15 20:35:36.547630] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 00:21:44.214 [2024-07-15 20:35:36.547695] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1442143 ] 00:21:44.473 [2024-07-15 20:35:36.666708] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:44.473 [2024-07-15 20:35:36.771605] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:21:44.473 [2024-07-15 20:35:36.830817] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:44.473 [2024-07-15 20:35:36.830854] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:44.732 20:35:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:21:44.732 20:35:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:21:44.732 20:35:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:21:44.732 20:35:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:21:44.991 BaseBdev1_malloc 00:21:44.991 20:35:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:21:45.249 true 00:21:45.249 
20:35:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:21:45.509 [2024-07-15 20:35:37.728888] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:21:45.509 [2024-07-15 20:35:37.728944] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:45.509 [2024-07-15 20:35:37.728966] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x224a0d0 00:21:45.509 [2024-07-15 20:35:37.728978] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:45.509 [2024-07-15 20:35:37.730857] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:45.509 [2024-07-15 20:35:37.730886] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:21:45.509 BaseBdev1 00:21:45.509 20:35:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:21:45.509 20:35:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:21:45.769 BaseBdev2_malloc 00:21:45.769 20:35:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:21:46.028 true 00:21:46.028 20:35:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:21:46.288 [2024-07-15 20:35:38.464651] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:21:46.288 [2024-07-15 20:35:38.464695] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:46.288 [2024-07-15 20:35:38.464716] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x224e910 00:21:46.288 [2024-07-15 20:35:38.464728] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:46.288 [2024-07-15 20:35:38.466349] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:46.288 [2024-07-15 20:35:38.466378] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:21:46.288 BaseBdev2 00:21:46.288 20:35:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:21:46.288 20:35:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:21:46.548 BaseBdev3_malloc 00:21:46.548 20:35:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:21:46.807 true 00:21:46.807 20:35:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:21:47.067 [2024-07-15 20:35:39.203156] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:21:47.067 [2024-07-15 20:35:39.203198] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:47.067 [2024-07-15 20:35:39.203218] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2250bd0 00:21:47.067 [2024-07-15 20:35:39.203230] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:47.067 [2024-07-15 20:35:39.204788] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 
00:21:47.067 [2024-07-15 20:35:39.204816] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:21:47.067 BaseBdev3 00:21:47.067 20:35:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:21:47.067 20:35:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:21:47.328 BaseBdev4_malloc 00:21:47.328 20:35:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:21:47.328 true 00:21:47.587 20:35:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:21:47.587 [2024-07-15 20:35:39.937754] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:21:47.587 [2024-07-15 20:35:39.937801] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:47.587 [2024-07-15 20:35:39.937822] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2251aa0 00:21:47.587 [2024-07-15 20:35:39.937834] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:47.587 [2024-07-15 20:35:39.939457] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:47.587 [2024-07-15 20:35:39.939485] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:21:47.587 BaseBdev4 00:21:47.587 20:35:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 
00:21:47.846 [2024-07-15 20:35:40.182445] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:47.846 [2024-07-15 20:35:40.183863] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:47.846 [2024-07-15 20:35:40.183939] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:47.846 [2024-07-15 20:35:40.184001] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:21:47.846 [2024-07-15 20:35:40.184240] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x224bc20 00:21:47.846 [2024-07-15 20:35:40.184252] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:21:47.846 [2024-07-15 20:35:40.184463] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20a0260 00:21:47.846 [2024-07-15 20:35:40.184618] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x224bc20 00:21:47.846 [2024-07-15 20:35:40.184628] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x224bc20 00:21:47.846 [2024-07-15 20:35:40.184739] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:47.846 20:35:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:21:47.846 20:35:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:47.846 20:35:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:47.846 20:35:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:47.846 20:35:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:47.846 20:35:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:47.846 20:35:40 bdev_raid.raid_read_error_test 
-- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:47.846 20:35:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:47.846 20:35:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:47.846 20:35:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:47.846 20:35:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:47.846 20:35:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:48.105 20:35:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:48.105 "name": "raid_bdev1", 00:21:48.105 "uuid": "c04edf60-d674-4af1-8981-f2fdf1121e87", 00:21:48.105 "strip_size_kb": 64, 00:21:48.105 "state": "online", 00:21:48.105 "raid_level": "concat", 00:21:48.105 "superblock": true, 00:21:48.105 "num_base_bdevs": 4, 00:21:48.105 "num_base_bdevs_discovered": 4, 00:21:48.105 "num_base_bdevs_operational": 4, 00:21:48.105 "base_bdevs_list": [ 00:21:48.105 { 00:21:48.105 "name": "BaseBdev1", 00:21:48.105 "uuid": "34bebfef-0fc5-5a9a-9cc0-d36b11d6a585", 00:21:48.105 "is_configured": true, 00:21:48.105 "data_offset": 2048, 00:21:48.105 "data_size": 63488 00:21:48.105 }, 00:21:48.105 { 00:21:48.105 "name": "BaseBdev2", 00:21:48.105 "uuid": "ed2935ff-4d6c-575a-a1aa-62e816a9ccf6", 00:21:48.105 "is_configured": true, 00:21:48.105 "data_offset": 2048, 00:21:48.105 "data_size": 63488 00:21:48.105 }, 00:21:48.105 { 00:21:48.105 "name": "BaseBdev3", 00:21:48.105 "uuid": "d4deb920-2652-5b7d-96ea-a437f26d6f0d", 00:21:48.105 "is_configured": true, 00:21:48.105 "data_offset": 2048, 00:21:48.105 "data_size": 63488 00:21:48.105 }, 00:21:48.105 { 00:21:48.105 "name": "BaseBdev4", 00:21:48.105 "uuid": "42ea28be-dc7c-506e-ab7a-9f4620de2aa2", 00:21:48.105 
"is_configured": true, 00:21:48.105 "data_offset": 2048, 00:21:48.105 "data_size": 63488 00:21:48.105 } 00:21:48.105 ] 00:21:48.105 }' 00:21:48.105 20:35:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:48.105 20:35:40 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:21:49.041 20:35:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:21:49.041 20:35:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:21:49.041 [2024-07-15 20:35:41.189368] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x223dfc0 00:21:49.978 20:35:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:21:49.978 20:35:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:21:49.978 20:35:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:21:49.978 20:35:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:21:49.978 20:35:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:21:49.978 20:35:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:49.978 20:35:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:49.978 20:35:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:49.978 20:35:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:49.978 20:35:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 
00:21:49.978 20:35:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:49.978 20:35:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:49.978 20:35:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:49.978 20:35:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:49.978 20:35:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:49.978 20:35:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:50.237 20:35:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:50.238 "name": "raid_bdev1", 00:21:50.238 "uuid": "c04edf60-d674-4af1-8981-f2fdf1121e87", 00:21:50.238 "strip_size_kb": 64, 00:21:50.238 "state": "online", 00:21:50.238 "raid_level": "concat", 00:21:50.238 "superblock": true, 00:21:50.238 "num_base_bdevs": 4, 00:21:50.238 "num_base_bdevs_discovered": 4, 00:21:50.238 "num_base_bdevs_operational": 4, 00:21:50.238 "base_bdevs_list": [ 00:21:50.238 { 00:21:50.238 "name": "BaseBdev1", 00:21:50.238 "uuid": "34bebfef-0fc5-5a9a-9cc0-d36b11d6a585", 00:21:50.238 "is_configured": true, 00:21:50.238 "data_offset": 2048, 00:21:50.238 "data_size": 63488 00:21:50.238 }, 00:21:50.238 { 00:21:50.238 "name": "BaseBdev2", 00:21:50.238 "uuid": "ed2935ff-4d6c-575a-a1aa-62e816a9ccf6", 00:21:50.238 "is_configured": true, 00:21:50.238 "data_offset": 2048, 00:21:50.238 "data_size": 63488 00:21:50.238 }, 00:21:50.238 { 00:21:50.238 "name": "BaseBdev3", 00:21:50.238 "uuid": "d4deb920-2652-5b7d-96ea-a437f26d6f0d", 00:21:50.238 "is_configured": true, 00:21:50.238 "data_offset": 2048, 00:21:50.238 "data_size": 63488 00:21:50.238 }, 00:21:50.238 { 00:21:50.238 "name": "BaseBdev4", 00:21:50.238 "uuid": 
"42ea28be-dc7c-506e-ab7a-9f4620de2aa2", 00:21:50.238 "is_configured": true, 00:21:50.238 "data_offset": 2048, 00:21:50.238 "data_size": 63488 00:21:50.238 } 00:21:50.238 ] 00:21:50.238 }' 00:21:50.238 20:35:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:50.238 20:35:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:21:51.175 20:35:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:21:51.433 [2024-07-15 20:35:43.726703] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:51.433 [2024-07-15 20:35:43.726740] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:51.433 [2024-07-15 20:35:43.729901] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:51.433 [2024-07-15 20:35:43.729951] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:51.433 [2024-07-15 20:35:43.729992] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:51.433 [2024-07-15 20:35:43.730003] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x224bc20 name raid_bdev1, state offline 00:21:51.433 0 00:21:51.433 20:35:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1442143 00:21:51.433 20:35:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 1442143 ']' 00:21:51.433 20:35:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 1442143 00:21:51.433 20:35:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:21:51.433 20:35:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:21:51.433 20:35:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # 
ps --no-headers -o comm= 1442143 00:21:51.433 20:35:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:21:51.433 20:35:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:21:51.433 20:35:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1442143' 00:21:51.433 killing process with pid 1442143 00:21:51.433 20:35:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 1442143 00:21:51.433 [2024-07-15 20:35:43.799839] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:51.433 20:35:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 1442143 00:21:51.692 [2024-07-15 20:35:43.836242] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:51.972 20:35:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.BO6Pvv5DKh 00:21:51.972 20:35:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:21:51.972 20:35:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:21:51.972 20:35:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.40 00:21:51.972 20:35:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:21:51.972 20:35:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:21:51.972 20:35:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:21:51.972 20:35:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.40 != \0\.\0\0 ]] 00:21:51.972 00:21:51.972 real 0m7.616s 00:21:51.972 user 0m12.588s 00:21:51.972 sys 0m1.421s 00:21:51.972 20:35:44 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:21:51.972 20:35:44 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:21:51.972 
************************************ 00:21:51.972 END TEST raid_read_error_test 00:21:51.972 ************************************ 00:21:51.972 20:35:44 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:21:51.972 20:35:44 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test concat 4 write 00:21:51.972 20:35:44 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:21:51.972 20:35:44 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:21:51.972 20:35:44 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:51.972 ************************************ 00:21:51.972 START TEST raid_write_error_test 00:21:51.972 ************************************ 00:21:51.972 20:35:44 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 4 write 00:21:51.972 20:35:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:21:51.972 20:35:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:21:51.972 20:35:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:21:51.972 20:35:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:21:51.972 20:35:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:21:51.972 20:35:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:21:51.972 20:35:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:21:51.972 20:35:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:21:51.972 20:35:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:21:51.972 20:35:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:21:51.972 20:35:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 
00:21:51.972 20:35:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:21:51.972 20:35:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:21:51.972 20:35:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:21:51.972 20:35:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:21:51.973 20:35:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:21:51.973 20:35:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:21:51.973 20:35:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:21:51.973 20:35:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:21:51.973 20:35:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:21:51.973 20:35:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:21:51.973 20:35:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:21:51.973 20:35:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:21:51.973 20:35:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:21:51.973 20:35:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:21:51.973 20:35:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:21:51.973 20:35:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:21:51.973 20:35:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:21:51.973 20:35:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.yOyoBPmFw1 00:21:51.973 20:35:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # 
raid_pid=1443284 00:21:51.973 20:35:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1443284 /var/tmp/spdk-raid.sock 00:21:51.973 20:35:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:21:51.973 20:35:44 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 1443284 ']' 00:21:51.973 20:35:44 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:51.973 20:35:44 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:21:51.973 20:35:44 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:51.973 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:21:51.973 20:35:44 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:21:51.973 20:35:44 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:21:51.973 [2024-07-15 20:35:44.257386] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
00:21:51.973 [2024-07-15 20:35:44.257465] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1443284 ] 00:21:52.232 [2024-07-15 20:35:44.390998] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:52.232 [2024-07-15 20:35:44.493168] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:21:52.232 [2024-07-15 20:35:44.554568] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:52.232 [2024-07-15 20:35:44.554603] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:52.799 20:35:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:21:52.799 20:35:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:21:52.799 20:35:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:21:52.799 20:35:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:21:53.058 BaseBdev1_malloc 00:21:53.058 20:35:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:21:53.317 true 00:21:53.317 20:35:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:21:53.576 [2024-07-15 20:35:45.902751] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:21:53.576 [2024-07-15 20:35:45.902798] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base 
bdev opened 00:21:53.576 [2024-07-15 20:35:45.902823] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16160d0 00:21:53.576 [2024-07-15 20:35:45.902835] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:53.576 [2024-07-15 20:35:45.904583] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:53.576 [2024-07-15 20:35:45.904613] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:21:53.576 BaseBdev1 00:21:53.576 20:35:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:21:53.576 20:35:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:21:53.835 BaseBdev2_malloc 00:21:53.835 20:35:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:21:54.094 true 00:21:54.094 20:35:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:21:54.353 [2024-07-15 20:35:46.701506] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:21:54.353 [2024-07-15 20:35:46.701554] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:54.353 [2024-07-15 20:35:46.701583] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x161a910 00:21:54.353 [2024-07-15 20:35:46.701596] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:54.353 [2024-07-15 20:35:46.703214] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:54.353 [2024-07-15 20:35:46.703244] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:21:54.353 BaseBdev2 00:21:54.353 20:35:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:21:54.353 20:35:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:21:54.612 BaseBdev3_malloc 00:21:54.612 20:35:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:21:54.871 true 00:21:54.871 20:35:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:21:55.130 [2024-07-15 20:35:47.448063] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:21:55.130 [2024-07-15 20:35:47.448113] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:55.130 [2024-07-15 20:35:47.448136] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x161cbd0 00:21:55.130 [2024-07-15 20:35:47.448149] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:55.130 [2024-07-15 20:35:47.449588] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:55.130 [2024-07-15 20:35:47.449616] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:21:55.130 BaseBdev3 00:21:55.130 20:35:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:21:55.130 20:35:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:21:55.389 BaseBdev4_malloc 00:21:55.389 20:35:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:21:55.647 true 00:21:55.647 20:35:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:21:55.906 [2024-07-15 20:35:48.194582] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:21:55.906 [2024-07-15 20:35:48.194627] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:55.906 [2024-07-15 20:35:48.194652] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x161daa0 00:21:55.906 [2024-07-15 20:35:48.194665] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:55.906 [2024-07-15 20:35:48.196145] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:55.906 [2024-07-15 20:35:48.196174] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:21:55.906 BaseBdev4 00:21:55.906 20:35:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:21:56.165 [2024-07-15 20:35:48.443283] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:56.165 [2024-07-15 20:35:48.444540] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:56.165 [2024-07-15 20:35:48.444609] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:56.165 [2024-07-15 20:35:48.444671] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:21:56.165 [2024-07-15 20:35:48.444901] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1617c20 00:21:56.165 [2024-07-15 20:35:48.444912] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:21:56.165 [2024-07-15 20:35:48.445120] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x146c260 00:21:56.165 [2024-07-15 20:35:48.445269] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1617c20 00:21:56.165 [2024-07-15 20:35:48.445279] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1617c20 00:21:56.165 [2024-07-15 20:35:48.445384] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:56.165 20:35:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:21:56.165 20:35:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:56.165 20:35:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:56.165 20:35:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:56.165 20:35:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:56.165 20:35:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:56.165 20:35:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:56.165 20:35:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:56.165 20:35:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:56.165 20:35:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:56.165 20:35:48 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:56.165 20:35:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:56.424 20:35:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:56.424 "name": "raid_bdev1", 00:21:56.424 "uuid": "213b1d74-f49a-4161-98c2-17f70918a9bb", 00:21:56.424 "strip_size_kb": 64, 00:21:56.424 "state": "online", 00:21:56.424 "raid_level": "concat", 00:21:56.424 "superblock": true, 00:21:56.424 "num_base_bdevs": 4, 00:21:56.424 "num_base_bdevs_discovered": 4, 00:21:56.424 "num_base_bdevs_operational": 4, 00:21:56.424 "base_bdevs_list": [ 00:21:56.424 { 00:21:56.424 "name": "BaseBdev1", 00:21:56.424 "uuid": "0a000af3-cbc2-500e-b119-db3130f9bbaf", 00:21:56.424 "is_configured": true, 00:21:56.424 "data_offset": 2048, 00:21:56.424 "data_size": 63488 00:21:56.424 }, 00:21:56.424 { 00:21:56.424 "name": "BaseBdev2", 00:21:56.424 "uuid": "4bcd08d8-f53a-5f0f-ae77-112df2a7a9b2", 00:21:56.424 "is_configured": true, 00:21:56.424 "data_offset": 2048, 00:21:56.424 "data_size": 63488 00:21:56.424 }, 00:21:56.424 { 00:21:56.424 "name": "BaseBdev3", 00:21:56.424 "uuid": "ee376268-1556-5ffc-9713-5f1c03778230", 00:21:56.424 "is_configured": true, 00:21:56.424 "data_offset": 2048, 00:21:56.424 "data_size": 63488 00:21:56.424 }, 00:21:56.424 { 00:21:56.424 "name": "BaseBdev4", 00:21:56.424 "uuid": "d114184c-1360-5a80-8345-950a4a32598d", 00:21:56.424 "is_configured": true, 00:21:56.424 "data_offset": 2048, 00:21:56.424 "data_size": 63488 00:21:56.424 } 00:21:56.424 ] 00:21:56.424 }' 00:21:56.424 20:35:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:56.424 20:35:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:21:56.992 20:35:49 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@824 -- # sleep 1 00:21:56.992 20:35:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:21:57.250 [2024-07-15 20:35:49.426187] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1609fc0 00:21:58.224 20:35:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:21:58.224 20:35:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:21:58.224 20:35:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:21:58.224 20:35:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:21:58.224 20:35:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:21:58.224 20:35:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:58.224 20:35:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:58.224 20:35:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:58.224 20:35:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:58.224 20:35:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:58.224 20:35:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:58.224 20:35:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:58.224 20:35:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:58.224 20:35:50 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:58.224 20:35:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:58.224 20:35:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:58.483 20:35:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:58.483 "name": "raid_bdev1", 00:21:58.483 "uuid": "213b1d74-f49a-4161-98c2-17f70918a9bb", 00:21:58.483 "strip_size_kb": 64, 00:21:58.483 "state": "online", 00:21:58.483 "raid_level": "concat", 00:21:58.483 "superblock": true, 00:21:58.483 "num_base_bdevs": 4, 00:21:58.483 "num_base_bdevs_discovered": 4, 00:21:58.483 "num_base_bdevs_operational": 4, 00:21:58.483 "base_bdevs_list": [ 00:21:58.483 { 00:21:58.483 "name": "BaseBdev1", 00:21:58.483 "uuid": "0a000af3-cbc2-500e-b119-db3130f9bbaf", 00:21:58.483 "is_configured": true, 00:21:58.483 "data_offset": 2048, 00:21:58.483 "data_size": 63488 00:21:58.483 }, 00:21:58.483 { 00:21:58.483 "name": "BaseBdev2", 00:21:58.483 "uuid": "4bcd08d8-f53a-5f0f-ae77-112df2a7a9b2", 00:21:58.483 "is_configured": true, 00:21:58.483 "data_offset": 2048, 00:21:58.483 "data_size": 63488 00:21:58.483 }, 00:21:58.483 { 00:21:58.483 "name": "BaseBdev3", 00:21:58.483 "uuid": "ee376268-1556-5ffc-9713-5f1c03778230", 00:21:58.483 "is_configured": true, 00:21:58.483 "data_offset": 2048, 00:21:58.483 "data_size": 63488 00:21:58.483 }, 00:21:58.483 { 00:21:58.483 "name": "BaseBdev4", 00:21:58.483 "uuid": "d114184c-1360-5a80-8345-950a4a32598d", 00:21:58.483 "is_configured": true, 00:21:58.483 "data_offset": 2048, 00:21:58.483 "data_size": 63488 00:21:58.483 } 00:21:58.483 ] 00:21:58.483 }' 00:21:58.483 20:35:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:58.483 20:35:50 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@10 -- # set +x 00:21:59.420 20:35:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:21:59.420 [2024-07-15 20:35:51.672769] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:59.420 [2024-07-15 20:35:51.672809] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:59.420 [2024-07-15 20:35:51.675992] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:59.420 [2024-07-15 20:35:51.676030] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:59.420 [2024-07-15 20:35:51.676071] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:59.420 [2024-07-15 20:35:51.676082] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1617c20 name raid_bdev1, state offline 00:21:59.420 0 00:21:59.420 20:35:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1443284 00:21:59.420 20:35:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 1443284 ']' 00:21:59.420 20:35:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 1443284 00:21:59.420 20:35:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:21:59.420 20:35:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:21:59.420 20:35:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1443284 00:21:59.420 20:35:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:21:59.420 20:35:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:21:59.420 20:35:51 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@966 -- # echo 'killing process with pid 1443284' 00:21:59.420 killing process with pid 1443284 00:21:59.420 20:35:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 1443284 00:21:59.420 [2024-07-15 20:35:51.756879] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:59.420 20:35:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 1443284 00:21:59.420 [2024-07-15 20:35:51.788958] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:59.679 20:35:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.yOyoBPmFw1 00:21:59.679 20:35:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:21:59.679 20:35:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:21:59.680 20:35:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.45 00:21:59.680 20:35:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:21:59.680 20:35:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:21:59.680 20:35:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:21:59.680 20:35:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.45 != \0\.\0\0 ]] 00:21:59.680 00:21:59.680 real 0m7.858s 00:21:59.680 user 0m12.650s 00:21:59.680 sys 0m1.377s 00:21:59.680 20:35:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:21:59.680 20:35:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:21:59.680 ************************************ 00:21:59.680 END TEST raid_write_error_test 00:21:59.680 ************************************ 00:21:59.953 20:35:52 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:21:59.953 20:35:52 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:21:59.953 
20:35:52 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid1 4 false 00:21:59.953 20:35:52 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:21:59.953 20:35:52 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:21:59.953 20:35:52 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:59.953 ************************************ 00:21:59.953 START TEST raid_state_function_test 00:21:59.953 ************************************ 00:21:59.953 20:35:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 4 false 00:21:59.953 20:35:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:21:59.953 20:35:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:21:59.953 20:35:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:21:59.953 20:35:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:21:59.953 20:35:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:21:59.953 20:35:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:59.953 20:35:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:21:59.953 20:35:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:59.953 20:35:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:59.953 20:35:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:21:59.953 20:35:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:59.953 20:35:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:59.953 20:35:52 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:21:59.953 20:35:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:59.953 20:35:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:59.953 20:35:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:21:59.953 20:35:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:59.953 20:35:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:59.953 20:35:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:21:59.953 20:35:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:21:59.953 20:35:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:21:59.953 20:35:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:21:59.953 20:35:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:21:59.953 20:35:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:21:59.953 20:35:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:21:59.953 20:35:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:21:59.953 20:35:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:21:59.953 20:35:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:21:59.953 20:35:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=1444398 00:21:59.953 20:35:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1444398' 00:21:59.953 Process raid pid: 1444398 00:21:59.953 20:35:52 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:21:59.953 20:35:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 1444398 /var/tmp/spdk-raid.sock 00:21:59.953 20:35:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 1444398 ']' 00:21:59.953 20:35:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:59.953 20:35:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:21:59.953 20:35:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:59.953 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:21:59.953 20:35:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:21:59.953 20:35:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:59.953 [2024-07-15 20:35:52.199802] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
00:21:59.953 [2024-07-15 20:35:52.199875] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:22:00.216 [2024-07-15 20:35:52.331554] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:00.216 [2024-07-15 20:35:52.439722] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:22:00.216 [2024-07-15 20:35:52.497493] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:00.216 [2024-07-15 20:35:52.497528] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:00.784 20:35:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:22:00.784 20:35:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:22:00.784 20:35:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:22:01.043 [2024-07-15 20:35:53.293703] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:22:01.043 [2024-07-15 20:35:53.293747] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:22:01.043 [2024-07-15 20:35:53.293758] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:22:01.043 [2024-07-15 20:35:53.293771] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:22:01.043 [2024-07-15 20:35:53.293780] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:22:01.043 [2024-07-15 20:35:53.293791] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:22:01.043 
[2024-07-15 20:35:53.293800] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:22:01.043 [2024-07-15 20:35:53.293811] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:22:01.043 20:35:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:01.043 20:35:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:01.043 20:35:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:01.043 20:35:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:01.043 20:35:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:01.043 20:35:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:01.043 20:35:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:01.043 20:35:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:01.043 20:35:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:01.043 20:35:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:01.043 20:35:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:01.043 20:35:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:01.301 20:35:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:01.301 "name": "Existed_Raid", 00:22:01.301 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:01.301 "strip_size_kb": 0, 00:22:01.301 "state": 
"configuring", 00:22:01.301 "raid_level": "raid1", 00:22:01.301 "superblock": false, 00:22:01.301 "num_base_bdevs": 4, 00:22:01.301 "num_base_bdevs_discovered": 0, 00:22:01.301 "num_base_bdevs_operational": 4, 00:22:01.301 "base_bdevs_list": [ 00:22:01.301 { 00:22:01.301 "name": "BaseBdev1", 00:22:01.301 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:01.301 "is_configured": false, 00:22:01.301 "data_offset": 0, 00:22:01.301 "data_size": 0 00:22:01.301 }, 00:22:01.301 { 00:22:01.301 "name": "BaseBdev2", 00:22:01.301 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:01.301 "is_configured": false, 00:22:01.301 "data_offset": 0, 00:22:01.301 "data_size": 0 00:22:01.301 }, 00:22:01.301 { 00:22:01.301 "name": "BaseBdev3", 00:22:01.301 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:01.301 "is_configured": false, 00:22:01.301 "data_offset": 0, 00:22:01.301 "data_size": 0 00:22:01.301 }, 00:22:01.301 { 00:22:01.301 "name": "BaseBdev4", 00:22:01.301 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:01.301 "is_configured": false, 00:22:01.301 "data_offset": 0, 00:22:01.301 "data_size": 0 00:22:01.301 } 00:22:01.301 ] 00:22:01.301 }' 00:22:01.301 20:35:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:01.301 20:35:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:01.866 20:35:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:22:02.124 [2024-07-15 20:35:54.380429] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:22:02.124 [2024-07-15 20:35:54.380464] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f0aaa0 name Existed_Raid, state configuring 00:22:02.124 20:35:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:22:02.381 [2024-07-15 20:35:54.629107] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:22:02.381 [2024-07-15 20:35:54.629144] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:22:02.381 [2024-07-15 20:35:54.629154] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:22:02.381 [2024-07-15 20:35:54.629166] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:22:02.381 [2024-07-15 20:35:54.629175] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:22:02.381 [2024-07-15 20:35:54.629186] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:22:02.381 [2024-07-15 20:35:54.629195] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:22:02.381 [2024-07-15 20:35:54.629206] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:22:02.381 20:35:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:22:02.653 [2024-07-15 20:35:54.887768] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:02.653 BaseBdev1 00:22:02.653 20:35:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:22:02.653 20:35:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:22:02.653 20:35:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:22:02.653 20:35:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:22:02.653 20:35:54 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:22:02.653 20:35:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:22:02.653 20:35:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:02.910 20:35:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:22:03.169 [ 00:22:03.169 { 00:22:03.169 "name": "BaseBdev1", 00:22:03.169 "aliases": [ 00:22:03.169 "7cd33056-bf79-416f-923d-82527c0dab97" 00:22:03.169 ], 00:22:03.169 "product_name": "Malloc disk", 00:22:03.169 "block_size": 512, 00:22:03.169 "num_blocks": 65536, 00:22:03.169 "uuid": "7cd33056-bf79-416f-923d-82527c0dab97", 00:22:03.169 "assigned_rate_limits": { 00:22:03.169 "rw_ios_per_sec": 0, 00:22:03.169 "rw_mbytes_per_sec": 0, 00:22:03.169 "r_mbytes_per_sec": 0, 00:22:03.169 "w_mbytes_per_sec": 0 00:22:03.169 }, 00:22:03.169 "claimed": true, 00:22:03.169 "claim_type": "exclusive_write", 00:22:03.169 "zoned": false, 00:22:03.169 "supported_io_types": { 00:22:03.169 "read": true, 00:22:03.169 "write": true, 00:22:03.169 "unmap": true, 00:22:03.169 "flush": true, 00:22:03.169 "reset": true, 00:22:03.169 "nvme_admin": false, 00:22:03.169 "nvme_io": false, 00:22:03.169 "nvme_io_md": false, 00:22:03.169 "write_zeroes": true, 00:22:03.169 "zcopy": true, 00:22:03.169 "get_zone_info": false, 00:22:03.169 "zone_management": false, 00:22:03.169 "zone_append": false, 00:22:03.169 "compare": false, 00:22:03.169 "compare_and_write": false, 00:22:03.169 "abort": true, 00:22:03.169 "seek_hole": false, 00:22:03.169 "seek_data": false, 00:22:03.169 "copy": true, 00:22:03.169 "nvme_iov_md": false 00:22:03.169 }, 00:22:03.169 "memory_domains": [ 00:22:03.169 { 
00:22:03.169 "dma_device_id": "system", 00:22:03.169 "dma_device_type": 1 00:22:03.169 }, 00:22:03.169 { 00:22:03.169 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:03.169 "dma_device_type": 2 00:22:03.169 } 00:22:03.169 ], 00:22:03.169 "driver_specific": {} 00:22:03.169 } 00:22:03.169 ] 00:22:03.169 20:35:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:22:03.169 20:35:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:03.169 20:35:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:03.169 20:35:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:03.169 20:35:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:03.169 20:35:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:03.169 20:35:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:03.169 20:35:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:03.169 20:35:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:03.169 20:35:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:03.169 20:35:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:03.169 20:35:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:03.169 20:35:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:03.427 20:35:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 
00:22:03.427 "name": "Existed_Raid", 00:22:03.427 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:03.427 "strip_size_kb": 0, 00:22:03.427 "state": "configuring", 00:22:03.427 "raid_level": "raid1", 00:22:03.427 "superblock": false, 00:22:03.427 "num_base_bdevs": 4, 00:22:03.427 "num_base_bdevs_discovered": 1, 00:22:03.427 "num_base_bdevs_operational": 4, 00:22:03.427 "base_bdevs_list": [ 00:22:03.427 { 00:22:03.427 "name": "BaseBdev1", 00:22:03.427 "uuid": "7cd33056-bf79-416f-923d-82527c0dab97", 00:22:03.427 "is_configured": true, 00:22:03.427 "data_offset": 0, 00:22:03.427 "data_size": 65536 00:22:03.427 }, 00:22:03.427 { 00:22:03.427 "name": "BaseBdev2", 00:22:03.427 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:03.427 "is_configured": false, 00:22:03.427 "data_offset": 0, 00:22:03.427 "data_size": 0 00:22:03.427 }, 00:22:03.427 { 00:22:03.427 "name": "BaseBdev3", 00:22:03.427 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:03.427 "is_configured": false, 00:22:03.427 "data_offset": 0, 00:22:03.427 "data_size": 0 00:22:03.427 }, 00:22:03.427 { 00:22:03.427 "name": "BaseBdev4", 00:22:03.427 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:03.427 "is_configured": false, 00:22:03.427 "data_offset": 0, 00:22:03.427 "data_size": 0 00:22:03.427 } 00:22:03.427 ] 00:22:03.427 }' 00:22:03.427 20:35:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:03.427 20:35:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:03.992 20:35:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:22:04.251 [2024-07-15 20:35:56.471986] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:22:04.251 [2024-07-15 20:35:56.472032] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f0a310 name Existed_Raid, state configuring 
00:22:04.251 20:35:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:22:04.509 [2024-07-15 20:35:56.712646] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:04.509 [2024-07-15 20:35:56.714117] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:22:04.509 [2024-07-15 20:35:56.714154] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:22:04.509 [2024-07-15 20:35:56.714165] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:22:04.509 [2024-07-15 20:35:56.714177] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:22:04.509 [2024-07-15 20:35:56.714186] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:22:04.509 [2024-07-15 20:35:56.714198] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:22:04.509 20:35:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:22:04.509 20:35:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:22:04.509 20:35:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:04.509 20:35:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:04.509 20:35:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:04.509 20:35:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:04.509 20:35:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 
00:22:04.509 20:35:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:04.509 20:35:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:04.509 20:35:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:04.509 20:35:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:04.509 20:35:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:04.509 20:35:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:04.509 20:35:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:04.769 20:35:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:04.769 "name": "Existed_Raid", 00:22:04.769 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:04.769 "strip_size_kb": 0, 00:22:04.769 "state": "configuring", 00:22:04.769 "raid_level": "raid1", 00:22:04.769 "superblock": false, 00:22:04.769 "num_base_bdevs": 4, 00:22:04.769 "num_base_bdevs_discovered": 1, 00:22:04.769 "num_base_bdevs_operational": 4, 00:22:04.769 "base_bdevs_list": [ 00:22:04.769 { 00:22:04.769 "name": "BaseBdev1", 00:22:04.769 "uuid": "7cd33056-bf79-416f-923d-82527c0dab97", 00:22:04.769 "is_configured": true, 00:22:04.769 "data_offset": 0, 00:22:04.769 "data_size": 65536 00:22:04.769 }, 00:22:04.769 { 00:22:04.769 "name": "BaseBdev2", 00:22:04.769 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:04.769 "is_configured": false, 00:22:04.769 "data_offset": 0, 00:22:04.769 "data_size": 0 00:22:04.769 }, 00:22:04.769 { 00:22:04.769 "name": "BaseBdev3", 00:22:04.769 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:04.769 "is_configured": false, 00:22:04.769 
"data_offset": 0, 00:22:04.769 "data_size": 0 00:22:04.769 }, 00:22:04.769 { 00:22:04.769 "name": "BaseBdev4", 00:22:04.769 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:04.769 "is_configured": false, 00:22:04.769 "data_offset": 0, 00:22:04.769 "data_size": 0 00:22:04.769 } 00:22:04.769 ] 00:22:04.769 }' 00:22:04.769 20:35:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:04.769 20:35:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:05.338 20:35:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:22:05.596 [2024-07-15 20:35:57.879205] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:05.596 BaseBdev2 00:22:05.596 20:35:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:22:05.596 20:35:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:22:05.596 20:35:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:22:05.596 20:35:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:22:05.596 20:35:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:22:05.596 20:35:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:22:05.597 20:35:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:05.855 20:35:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:22:06.113 [ 
00:22:06.113 { 00:22:06.113 "name": "BaseBdev2", 00:22:06.113 "aliases": [ 00:22:06.113 "08173bd5-f052-41ef-b69b-f526b90f41d7" 00:22:06.113 ], 00:22:06.113 "product_name": "Malloc disk", 00:22:06.113 "block_size": 512, 00:22:06.113 "num_blocks": 65536, 00:22:06.113 "uuid": "08173bd5-f052-41ef-b69b-f526b90f41d7", 00:22:06.113 "assigned_rate_limits": { 00:22:06.113 "rw_ios_per_sec": 0, 00:22:06.113 "rw_mbytes_per_sec": 0, 00:22:06.113 "r_mbytes_per_sec": 0, 00:22:06.113 "w_mbytes_per_sec": 0 00:22:06.113 }, 00:22:06.113 "claimed": true, 00:22:06.113 "claim_type": "exclusive_write", 00:22:06.113 "zoned": false, 00:22:06.113 "supported_io_types": { 00:22:06.113 "read": true, 00:22:06.113 "write": true, 00:22:06.113 "unmap": true, 00:22:06.113 "flush": true, 00:22:06.113 "reset": true, 00:22:06.113 "nvme_admin": false, 00:22:06.113 "nvme_io": false, 00:22:06.113 "nvme_io_md": false, 00:22:06.113 "write_zeroes": true, 00:22:06.113 "zcopy": true, 00:22:06.113 "get_zone_info": false, 00:22:06.113 "zone_management": false, 00:22:06.113 "zone_append": false, 00:22:06.113 "compare": false, 00:22:06.113 "compare_and_write": false, 00:22:06.113 "abort": true, 00:22:06.113 "seek_hole": false, 00:22:06.113 "seek_data": false, 00:22:06.113 "copy": true, 00:22:06.113 "nvme_iov_md": false 00:22:06.113 }, 00:22:06.113 "memory_domains": [ 00:22:06.113 { 00:22:06.113 "dma_device_id": "system", 00:22:06.113 "dma_device_type": 1 00:22:06.113 }, 00:22:06.113 { 00:22:06.113 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:06.113 "dma_device_type": 2 00:22:06.113 } 00:22:06.113 ], 00:22:06.113 "driver_specific": {} 00:22:06.113 } 00:22:06.113 ] 00:22:06.113 20:35:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:22:06.113 20:35:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:22:06.113 20:35:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:22:06.113 20:35:58 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:06.113 20:35:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:06.113 20:35:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:06.113 20:35:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:06.113 20:35:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:06.113 20:35:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:06.113 20:35:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:06.113 20:35:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:06.113 20:35:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:06.113 20:35:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:06.113 20:35:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:06.113 20:35:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:06.372 20:35:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:06.372 "name": "Existed_Raid", 00:22:06.372 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:06.372 "strip_size_kb": 0, 00:22:06.372 "state": "configuring", 00:22:06.372 "raid_level": "raid1", 00:22:06.372 "superblock": false, 00:22:06.372 "num_base_bdevs": 4, 00:22:06.372 "num_base_bdevs_discovered": 2, 00:22:06.372 "num_base_bdevs_operational": 4, 00:22:06.372 "base_bdevs_list": [ 00:22:06.372 { 00:22:06.372 
"name": "BaseBdev1", 00:22:06.372 "uuid": "7cd33056-bf79-416f-923d-82527c0dab97", 00:22:06.372 "is_configured": true, 00:22:06.372 "data_offset": 0, 00:22:06.372 "data_size": 65536 00:22:06.372 }, 00:22:06.372 { 00:22:06.372 "name": "BaseBdev2", 00:22:06.372 "uuid": "08173bd5-f052-41ef-b69b-f526b90f41d7", 00:22:06.372 "is_configured": true, 00:22:06.372 "data_offset": 0, 00:22:06.372 "data_size": 65536 00:22:06.372 }, 00:22:06.372 { 00:22:06.372 "name": "BaseBdev3", 00:22:06.372 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:06.372 "is_configured": false, 00:22:06.372 "data_offset": 0, 00:22:06.372 "data_size": 0 00:22:06.372 }, 00:22:06.372 { 00:22:06.372 "name": "BaseBdev4", 00:22:06.372 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:06.372 "is_configured": false, 00:22:06.372 "data_offset": 0, 00:22:06.372 "data_size": 0 00:22:06.372 } 00:22:06.372 ] 00:22:06.372 }' 00:22:06.372 20:35:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:06.372 20:35:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:06.940 20:35:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:22:07.199 [2024-07-15 20:35:59.410924] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:22:07.199 BaseBdev3 00:22:07.199 20:35:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:22:07.199 20:35:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:22:07.199 20:35:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:22:07.199 20:35:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:22:07.199 20:35:59 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@900 -- # [[ -z '' ]] 00:22:07.199 20:35:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:22:07.199 20:35:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:07.457 20:35:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:22:07.716 [ 00:22:07.716 { 00:22:07.716 "name": "BaseBdev3", 00:22:07.716 "aliases": [ 00:22:07.716 "89d7d1b9-07d9-400d-80b3-e9e1fae1a2fb" 00:22:07.716 ], 00:22:07.716 "product_name": "Malloc disk", 00:22:07.716 "block_size": 512, 00:22:07.716 "num_blocks": 65536, 00:22:07.716 "uuid": "89d7d1b9-07d9-400d-80b3-e9e1fae1a2fb", 00:22:07.716 "assigned_rate_limits": { 00:22:07.716 "rw_ios_per_sec": 0, 00:22:07.716 "rw_mbytes_per_sec": 0, 00:22:07.716 "r_mbytes_per_sec": 0, 00:22:07.716 "w_mbytes_per_sec": 0 00:22:07.716 }, 00:22:07.716 "claimed": true, 00:22:07.716 "claim_type": "exclusive_write", 00:22:07.716 "zoned": false, 00:22:07.716 "supported_io_types": { 00:22:07.716 "read": true, 00:22:07.716 "write": true, 00:22:07.716 "unmap": true, 00:22:07.716 "flush": true, 00:22:07.716 "reset": true, 00:22:07.716 "nvme_admin": false, 00:22:07.716 "nvme_io": false, 00:22:07.716 "nvme_io_md": false, 00:22:07.716 "write_zeroes": true, 00:22:07.716 "zcopy": true, 00:22:07.716 "get_zone_info": false, 00:22:07.716 "zone_management": false, 00:22:07.716 "zone_append": false, 00:22:07.716 "compare": false, 00:22:07.716 "compare_and_write": false, 00:22:07.716 "abort": true, 00:22:07.716 "seek_hole": false, 00:22:07.716 "seek_data": false, 00:22:07.716 "copy": true, 00:22:07.716 "nvme_iov_md": false 00:22:07.716 }, 00:22:07.716 "memory_domains": [ 00:22:07.716 { 00:22:07.716 "dma_device_id": "system", 
00:22:07.716 "dma_device_type": 1 00:22:07.716 }, 00:22:07.716 { 00:22:07.716 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:07.716 "dma_device_type": 2 00:22:07.716 } 00:22:07.716 ], 00:22:07.716 "driver_specific": {} 00:22:07.716 } 00:22:07.716 ] 00:22:07.716 20:35:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:22:07.716 20:35:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:22:07.716 20:35:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:22:07.716 20:35:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:07.716 20:35:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:07.716 20:35:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:07.716 20:35:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:07.716 20:35:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:07.716 20:35:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:07.716 20:35:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:07.716 20:35:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:07.716 20:35:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:07.716 20:35:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:07.716 20:35:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:07.717 20:35:59 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:07.975 20:36:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:07.975 "name": "Existed_Raid", 00:22:07.975 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:07.975 "strip_size_kb": 0, 00:22:07.975 "state": "configuring", 00:22:07.975 "raid_level": "raid1", 00:22:07.975 "superblock": false, 00:22:07.975 "num_base_bdevs": 4, 00:22:07.976 "num_base_bdevs_discovered": 3, 00:22:07.976 "num_base_bdevs_operational": 4, 00:22:07.976 "base_bdevs_list": [ 00:22:07.976 { 00:22:07.976 "name": "BaseBdev1", 00:22:07.976 "uuid": "7cd33056-bf79-416f-923d-82527c0dab97", 00:22:07.976 "is_configured": true, 00:22:07.976 "data_offset": 0, 00:22:07.976 "data_size": 65536 00:22:07.976 }, 00:22:07.976 { 00:22:07.976 "name": "BaseBdev2", 00:22:07.976 "uuid": "08173bd5-f052-41ef-b69b-f526b90f41d7", 00:22:07.976 "is_configured": true, 00:22:07.976 "data_offset": 0, 00:22:07.976 "data_size": 65536 00:22:07.976 }, 00:22:07.976 { 00:22:07.976 "name": "BaseBdev3", 00:22:07.976 "uuid": "89d7d1b9-07d9-400d-80b3-e9e1fae1a2fb", 00:22:07.976 "is_configured": true, 00:22:07.976 "data_offset": 0, 00:22:07.976 "data_size": 65536 00:22:07.976 }, 00:22:07.976 { 00:22:07.976 "name": "BaseBdev4", 00:22:07.976 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:07.976 "is_configured": false, 00:22:07.976 "data_offset": 0, 00:22:07.976 "data_size": 0 00:22:07.976 } 00:22:07.976 ] 00:22:07.976 }' 00:22:07.976 20:36:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:07.976 20:36:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:08.543 20:36:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:22:08.802 [2024-07-15 20:36:01.054702] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:22:08.802 [2024-07-15 20:36:01.054756] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1f0b350 00:22:08.802 [2024-07-15 20:36:01.054765] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:22:08.802 [2024-07-15 20:36:01.055026] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f0b020 00:22:08.802 [2024-07-15 20:36:01.055156] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1f0b350 00:22:08.802 [2024-07-15 20:36:01.055171] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1f0b350 00:22:08.802 [2024-07-15 20:36:01.055342] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:08.802 BaseBdev4 00:22:08.802 20:36:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:22:08.802 20:36:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:22:08.802 20:36:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:22:08.802 20:36:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:22:08.802 20:36:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:22:08.802 20:36:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:22:08.802 20:36:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:09.369 20:36:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:22:09.627 [ 00:22:09.627 { 
00:22:09.627 "name": "BaseBdev4", 00:22:09.627 "aliases": [ 00:22:09.627 "7f5a9615-2270-4fe2-8c03-6f0eb9bd024d" 00:22:09.627 ], 00:22:09.627 "product_name": "Malloc disk", 00:22:09.627 "block_size": 512, 00:22:09.627 "num_blocks": 65536, 00:22:09.627 "uuid": "7f5a9615-2270-4fe2-8c03-6f0eb9bd024d", 00:22:09.627 "assigned_rate_limits": { 00:22:09.627 "rw_ios_per_sec": 0, 00:22:09.627 "rw_mbytes_per_sec": 0, 00:22:09.627 "r_mbytes_per_sec": 0, 00:22:09.627 "w_mbytes_per_sec": 0 00:22:09.627 }, 00:22:09.627 "claimed": true, 00:22:09.627 "claim_type": "exclusive_write", 00:22:09.627 "zoned": false, 00:22:09.627 "supported_io_types": { 00:22:09.627 "read": true, 00:22:09.627 "write": true, 00:22:09.627 "unmap": true, 00:22:09.627 "flush": true, 00:22:09.627 "reset": true, 00:22:09.627 "nvme_admin": false, 00:22:09.627 "nvme_io": false, 00:22:09.627 "nvme_io_md": false, 00:22:09.627 "write_zeroes": true, 00:22:09.627 "zcopy": true, 00:22:09.627 "get_zone_info": false, 00:22:09.627 "zone_management": false, 00:22:09.627 "zone_append": false, 00:22:09.627 "compare": false, 00:22:09.627 "compare_and_write": false, 00:22:09.627 "abort": true, 00:22:09.627 "seek_hole": false, 00:22:09.627 "seek_data": false, 00:22:09.627 "copy": true, 00:22:09.627 "nvme_iov_md": false 00:22:09.627 }, 00:22:09.627 "memory_domains": [ 00:22:09.627 { 00:22:09.627 "dma_device_id": "system", 00:22:09.627 "dma_device_type": 1 00:22:09.627 }, 00:22:09.627 { 00:22:09.627 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:09.627 "dma_device_type": 2 00:22:09.627 } 00:22:09.627 ], 00:22:09.627 "driver_specific": {} 00:22:09.627 } 00:22:09.627 ] 00:22:09.627 20:36:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:22:09.627 20:36:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:22:09.627 20:36:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:22:09.628 20:36:01 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:22:09.628 20:36:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:09.628 20:36:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:09.628 20:36:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:09.628 20:36:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:09.628 20:36:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:09.628 20:36:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:09.628 20:36:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:09.628 20:36:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:09.628 20:36:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:09.628 20:36:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:09.628 20:36:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:09.886 20:36:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:09.886 "name": "Existed_Raid", 00:22:09.886 "uuid": "3c40e5d8-64c0-4049-a84f-97a120456a3a", 00:22:09.886 "strip_size_kb": 0, 00:22:09.886 "state": "online", 00:22:09.886 "raid_level": "raid1", 00:22:09.886 "superblock": false, 00:22:09.886 "num_base_bdevs": 4, 00:22:09.886 "num_base_bdevs_discovered": 4, 00:22:09.886 "num_base_bdevs_operational": 4, 00:22:09.886 "base_bdevs_list": [ 00:22:09.886 { 00:22:09.886 "name": 
"BaseBdev1", 00:22:09.886 "uuid": "7cd33056-bf79-416f-923d-82527c0dab97", 00:22:09.886 "is_configured": true, 00:22:09.886 "data_offset": 0, 00:22:09.886 "data_size": 65536 00:22:09.886 }, 00:22:09.886 { 00:22:09.886 "name": "BaseBdev2", 00:22:09.886 "uuid": "08173bd5-f052-41ef-b69b-f526b90f41d7", 00:22:09.886 "is_configured": true, 00:22:09.886 "data_offset": 0, 00:22:09.886 "data_size": 65536 00:22:09.886 }, 00:22:09.886 { 00:22:09.886 "name": "BaseBdev3", 00:22:09.886 "uuid": "89d7d1b9-07d9-400d-80b3-e9e1fae1a2fb", 00:22:09.886 "is_configured": true, 00:22:09.886 "data_offset": 0, 00:22:09.886 "data_size": 65536 00:22:09.886 }, 00:22:09.886 { 00:22:09.886 "name": "BaseBdev4", 00:22:09.886 "uuid": "7f5a9615-2270-4fe2-8c03-6f0eb9bd024d", 00:22:09.886 "is_configured": true, 00:22:09.886 "data_offset": 0, 00:22:09.886 "data_size": 65536 00:22:09.886 } 00:22:09.886 ] 00:22:09.886 }' 00:22:09.886 20:36:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:09.886 20:36:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:10.452 20:36:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:22:10.452 20:36:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:22:10.452 20:36:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:22:10.452 20:36:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:22:10.452 20:36:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:22:10.452 20:36:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:22:10.452 20:36:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:22:10.452 20:36:02 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:22:10.711 [2024-07-15 20:36:02.911979] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:10.711 20:36:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:22:10.711 "name": "Existed_Raid", 00:22:10.711 "aliases": [ 00:22:10.711 "3c40e5d8-64c0-4049-a84f-97a120456a3a" 00:22:10.711 ], 00:22:10.711 "product_name": "Raid Volume", 00:22:10.711 "block_size": 512, 00:22:10.711 "num_blocks": 65536, 00:22:10.711 "uuid": "3c40e5d8-64c0-4049-a84f-97a120456a3a", 00:22:10.711 "assigned_rate_limits": { 00:22:10.711 "rw_ios_per_sec": 0, 00:22:10.711 "rw_mbytes_per_sec": 0, 00:22:10.711 "r_mbytes_per_sec": 0, 00:22:10.711 "w_mbytes_per_sec": 0 00:22:10.711 }, 00:22:10.711 "claimed": false, 00:22:10.711 "zoned": false, 00:22:10.711 "supported_io_types": { 00:22:10.711 "read": true, 00:22:10.711 "write": true, 00:22:10.711 "unmap": false, 00:22:10.711 "flush": false, 00:22:10.711 "reset": true, 00:22:10.711 "nvme_admin": false, 00:22:10.711 "nvme_io": false, 00:22:10.711 "nvme_io_md": false, 00:22:10.711 "write_zeroes": true, 00:22:10.711 "zcopy": false, 00:22:10.711 "get_zone_info": false, 00:22:10.711 "zone_management": false, 00:22:10.711 "zone_append": false, 00:22:10.711 "compare": false, 00:22:10.711 "compare_and_write": false, 00:22:10.711 "abort": false, 00:22:10.711 "seek_hole": false, 00:22:10.711 "seek_data": false, 00:22:10.711 "copy": false, 00:22:10.711 "nvme_iov_md": false 00:22:10.711 }, 00:22:10.711 "memory_domains": [ 00:22:10.711 { 00:22:10.711 "dma_device_id": "system", 00:22:10.711 "dma_device_type": 1 00:22:10.711 }, 00:22:10.711 { 00:22:10.711 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:10.711 "dma_device_type": 2 00:22:10.711 }, 00:22:10.711 { 00:22:10.711 "dma_device_id": "system", 00:22:10.711 "dma_device_type": 1 00:22:10.711 }, 00:22:10.711 { 00:22:10.712 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 
00:22:10.712 "dma_device_type": 2 00:22:10.712 }, 00:22:10.712 { 00:22:10.712 "dma_device_id": "system", 00:22:10.712 "dma_device_type": 1 00:22:10.712 }, 00:22:10.712 { 00:22:10.712 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:10.712 "dma_device_type": 2 00:22:10.712 }, 00:22:10.712 { 00:22:10.712 "dma_device_id": "system", 00:22:10.712 "dma_device_type": 1 00:22:10.712 }, 00:22:10.712 { 00:22:10.712 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:10.712 "dma_device_type": 2 00:22:10.712 } 00:22:10.712 ], 00:22:10.712 "driver_specific": { 00:22:10.712 "raid": { 00:22:10.712 "uuid": "3c40e5d8-64c0-4049-a84f-97a120456a3a", 00:22:10.712 "strip_size_kb": 0, 00:22:10.712 "state": "online", 00:22:10.712 "raid_level": "raid1", 00:22:10.712 "superblock": false, 00:22:10.712 "num_base_bdevs": 4, 00:22:10.712 "num_base_bdevs_discovered": 4, 00:22:10.712 "num_base_bdevs_operational": 4, 00:22:10.712 "base_bdevs_list": [ 00:22:10.712 { 00:22:10.712 "name": "BaseBdev1", 00:22:10.712 "uuid": "7cd33056-bf79-416f-923d-82527c0dab97", 00:22:10.712 "is_configured": true, 00:22:10.712 "data_offset": 0, 00:22:10.712 "data_size": 65536 00:22:10.712 }, 00:22:10.712 { 00:22:10.712 "name": "BaseBdev2", 00:22:10.712 "uuid": "08173bd5-f052-41ef-b69b-f526b90f41d7", 00:22:10.712 "is_configured": true, 00:22:10.712 "data_offset": 0, 00:22:10.712 "data_size": 65536 00:22:10.712 }, 00:22:10.712 { 00:22:10.712 "name": "BaseBdev3", 00:22:10.712 "uuid": "89d7d1b9-07d9-400d-80b3-e9e1fae1a2fb", 00:22:10.712 "is_configured": true, 00:22:10.712 "data_offset": 0, 00:22:10.712 "data_size": 65536 00:22:10.712 }, 00:22:10.712 { 00:22:10.712 "name": "BaseBdev4", 00:22:10.712 "uuid": "7f5a9615-2270-4fe2-8c03-6f0eb9bd024d", 00:22:10.712 "is_configured": true, 00:22:10.712 "data_offset": 0, 00:22:10.712 "data_size": 65536 00:22:10.712 } 00:22:10.712 ] 00:22:10.712 } 00:22:10.712 } 00:22:10.712 }' 00:22:10.712 20:36:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r 
'.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:22:10.712 20:36:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:22:10.712 BaseBdev2 00:22:10.712 BaseBdev3 00:22:10.712 BaseBdev4' 00:22:10.712 20:36:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:10.712 20:36:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:22:10.712 20:36:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:10.971 20:36:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:10.971 "name": "BaseBdev1", 00:22:10.971 "aliases": [ 00:22:10.971 "7cd33056-bf79-416f-923d-82527c0dab97" 00:22:10.971 ], 00:22:10.971 "product_name": "Malloc disk", 00:22:10.971 "block_size": 512, 00:22:10.971 "num_blocks": 65536, 00:22:10.971 "uuid": "7cd33056-bf79-416f-923d-82527c0dab97", 00:22:10.971 "assigned_rate_limits": { 00:22:10.971 "rw_ios_per_sec": 0, 00:22:10.971 "rw_mbytes_per_sec": 0, 00:22:10.971 "r_mbytes_per_sec": 0, 00:22:10.971 "w_mbytes_per_sec": 0 00:22:10.971 }, 00:22:10.971 "claimed": true, 00:22:10.971 "claim_type": "exclusive_write", 00:22:10.971 "zoned": false, 00:22:10.971 "supported_io_types": { 00:22:10.971 "read": true, 00:22:10.971 "write": true, 00:22:10.971 "unmap": true, 00:22:10.971 "flush": true, 00:22:10.971 "reset": true, 00:22:10.971 "nvme_admin": false, 00:22:10.971 "nvme_io": false, 00:22:10.971 "nvme_io_md": false, 00:22:10.971 "write_zeroes": true, 00:22:10.971 "zcopy": true, 00:22:10.971 "get_zone_info": false, 00:22:10.971 "zone_management": false, 00:22:10.971 "zone_append": false, 00:22:10.971 "compare": false, 00:22:10.971 "compare_and_write": false, 00:22:10.971 "abort": true, 00:22:10.971 "seek_hole": false, 00:22:10.971 "seek_data": 
false, 00:22:10.971 "copy": true, 00:22:10.971 "nvme_iov_md": false 00:22:10.971 }, 00:22:10.971 "memory_domains": [ 00:22:10.971 { 00:22:10.971 "dma_device_id": "system", 00:22:10.971 "dma_device_type": 1 00:22:10.971 }, 00:22:10.971 { 00:22:10.971 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:10.971 "dma_device_type": 2 00:22:10.971 } 00:22:10.971 ], 00:22:10.971 "driver_specific": {} 00:22:10.971 }' 00:22:10.971 20:36:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:10.971 20:36:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:10.971 20:36:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:10.971 20:36:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:11.230 20:36:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:11.230 20:36:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:11.230 20:36:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:11.230 20:36:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:11.230 20:36:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:11.230 20:36:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:11.230 20:36:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:11.230 20:36:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:11.230 20:36:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:11.230 20:36:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:22:11.230 20:36:03 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:11.488 20:36:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:11.488 "name": "BaseBdev2", 00:22:11.488 "aliases": [ 00:22:11.488 "08173bd5-f052-41ef-b69b-f526b90f41d7" 00:22:11.488 ], 00:22:11.488 "product_name": "Malloc disk", 00:22:11.488 "block_size": 512, 00:22:11.488 "num_blocks": 65536, 00:22:11.488 "uuid": "08173bd5-f052-41ef-b69b-f526b90f41d7", 00:22:11.488 "assigned_rate_limits": { 00:22:11.488 "rw_ios_per_sec": 0, 00:22:11.488 "rw_mbytes_per_sec": 0, 00:22:11.488 "r_mbytes_per_sec": 0, 00:22:11.488 "w_mbytes_per_sec": 0 00:22:11.488 }, 00:22:11.488 "claimed": true, 00:22:11.488 "claim_type": "exclusive_write", 00:22:11.488 "zoned": false, 00:22:11.488 "supported_io_types": { 00:22:11.488 "read": true, 00:22:11.488 "write": true, 00:22:11.488 "unmap": true, 00:22:11.488 "flush": true, 00:22:11.488 "reset": true, 00:22:11.488 "nvme_admin": false, 00:22:11.488 "nvme_io": false, 00:22:11.488 "nvme_io_md": false, 00:22:11.488 "write_zeroes": true, 00:22:11.488 "zcopy": true, 00:22:11.488 "get_zone_info": false, 00:22:11.488 "zone_management": false, 00:22:11.488 "zone_append": false, 00:22:11.488 "compare": false, 00:22:11.488 "compare_and_write": false, 00:22:11.488 "abort": true, 00:22:11.488 "seek_hole": false, 00:22:11.488 "seek_data": false, 00:22:11.488 "copy": true, 00:22:11.488 "nvme_iov_md": false 00:22:11.488 }, 00:22:11.488 "memory_domains": [ 00:22:11.488 { 00:22:11.488 "dma_device_id": "system", 00:22:11.488 "dma_device_type": 1 00:22:11.488 }, 00:22:11.488 { 00:22:11.488 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:11.488 "dma_device_type": 2 00:22:11.488 } 00:22:11.488 ], 00:22:11.488 "driver_specific": {} 00:22:11.488 }' 00:22:11.488 20:36:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:11.744 20:36:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 
00:22:11.744 20:36:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:11.744 20:36:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:11.744 20:36:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:11.744 20:36:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:11.744 20:36:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:11.744 20:36:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:11.744 20:36:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:11.744 20:36:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:12.007 20:36:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:12.007 20:36:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:12.007 20:36:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:12.007 20:36:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:22:12.007 20:36:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:12.272 20:36:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:12.272 "name": "BaseBdev3", 00:22:12.272 "aliases": [ 00:22:12.272 "89d7d1b9-07d9-400d-80b3-e9e1fae1a2fb" 00:22:12.272 ], 00:22:12.272 "product_name": "Malloc disk", 00:22:12.272 "block_size": 512, 00:22:12.272 "num_blocks": 65536, 00:22:12.272 "uuid": "89d7d1b9-07d9-400d-80b3-e9e1fae1a2fb", 00:22:12.272 "assigned_rate_limits": { 00:22:12.272 "rw_ios_per_sec": 0, 00:22:12.272 "rw_mbytes_per_sec": 0, 00:22:12.272 "r_mbytes_per_sec": 0, 
00:22:12.272 "w_mbytes_per_sec": 0 00:22:12.272 }, 00:22:12.272 "claimed": true, 00:22:12.272 "claim_type": "exclusive_write", 00:22:12.272 "zoned": false, 00:22:12.272 "supported_io_types": { 00:22:12.272 "read": true, 00:22:12.272 "write": true, 00:22:12.272 "unmap": true, 00:22:12.272 "flush": true, 00:22:12.272 "reset": true, 00:22:12.272 "nvme_admin": false, 00:22:12.272 "nvme_io": false, 00:22:12.272 "nvme_io_md": false, 00:22:12.272 "write_zeroes": true, 00:22:12.272 "zcopy": true, 00:22:12.272 "get_zone_info": false, 00:22:12.272 "zone_management": false, 00:22:12.272 "zone_append": false, 00:22:12.272 "compare": false, 00:22:12.272 "compare_and_write": false, 00:22:12.272 "abort": true, 00:22:12.272 "seek_hole": false, 00:22:12.272 "seek_data": false, 00:22:12.272 "copy": true, 00:22:12.272 "nvme_iov_md": false 00:22:12.272 }, 00:22:12.272 "memory_domains": [ 00:22:12.272 { 00:22:12.272 "dma_device_id": "system", 00:22:12.272 "dma_device_type": 1 00:22:12.272 }, 00:22:12.272 { 00:22:12.272 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:12.272 "dma_device_type": 2 00:22:12.272 } 00:22:12.272 ], 00:22:12.272 "driver_specific": {} 00:22:12.272 }' 00:22:12.272 20:36:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:12.272 20:36:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:12.272 20:36:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:12.272 20:36:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:12.272 20:36:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:12.272 20:36:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:12.272 20:36:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:12.530 20:36:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 
00:22:12.530 20:36:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:12.530 20:36:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:12.530 20:36:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:12.530 20:36:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:12.530 20:36:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:12.530 20:36:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:22:12.530 20:36:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:12.788 20:36:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:12.788 "name": "BaseBdev4", 00:22:12.788 "aliases": [ 00:22:12.788 "7f5a9615-2270-4fe2-8c03-6f0eb9bd024d" 00:22:12.788 ], 00:22:12.788 "product_name": "Malloc disk", 00:22:12.788 "block_size": 512, 00:22:12.788 "num_blocks": 65536, 00:22:12.788 "uuid": "7f5a9615-2270-4fe2-8c03-6f0eb9bd024d", 00:22:12.788 "assigned_rate_limits": { 00:22:12.788 "rw_ios_per_sec": 0, 00:22:12.788 "rw_mbytes_per_sec": 0, 00:22:12.788 "r_mbytes_per_sec": 0, 00:22:12.788 "w_mbytes_per_sec": 0 00:22:12.788 }, 00:22:12.788 "claimed": true, 00:22:12.788 "claim_type": "exclusive_write", 00:22:12.788 "zoned": false, 00:22:12.788 "supported_io_types": { 00:22:12.788 "read": true, 00:22:12.788 "write": true, 00:22:12.788 "unmap": true, 00:22:12.788 "flush": true, 00:22:12.788 "reset": true, 00:22:12.788 "nvme_admin": false, 00:22:12.788 "nvme_io": false, 00:22:12.788 "nvme_io_md": false, 00:22:12.788 "write_zeroes": true, 00:22:12.788 "zcopy": true, 00:22:12.788 "get_zone_info": false, 00:22:12.788 "zone_management": false, 00:22:12.788 "zone_append": false, 00:22:12.789 
"compare": false, 00:22:12.789 "compare_and_write": false, 00:22:12.789 "abort": true, 00:22:12.789 "seek_hole": false, 00:22:12.789 "seek_data": false, 00:22:12.789 "copy": true, 00:22:12.789 "nvme_iov_md": false 00:22:12.789 }, 00:22:12.789 "memory_domains": [ 00:22:12.789 { 00:22:12.789 "dma_device_id": "system", 00:22:12.789 "dma_device_type": 1 00:22:12.789 }, 00:22:12.789 { 00:22:12.789 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:12.789 "dma_device_type": 2 00:22:12.789 } 00:22:12.789 ], 00:22:12.789 "driver_specific": {} 00:22:12.789 }' 00:22:12.789 20:36:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:12.789 20:36:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:12.789 20:36:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:12.789 20:36:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:12.789 20:36:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:13.047 20:36:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:13.047 20:36:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:13.047 20:36:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:13.047 20:36:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:13.047 20:36:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:13.047 20:36:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:13.047 20:36:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:13.047 20:36:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 
00:22:13.305 [2024-07-15 20:36:05.610972] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:22:13.305 20:36:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:22:13.305 20:36:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:22:13.305 20:36:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:22:13.305 20:36:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:22:13.305 20:36:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:22:13.305 20:36:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:22:13.305 20:36:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:13.305 20:36:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:13.305 20:36:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:13.305 20:36:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:13.305 20:36:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:22:13.305 20:36:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:13.305 20:36:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:13.305 20:36:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:13.305 20:36:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:13.305 20:36:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:13.305 20:36:05 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:13.563 20:36:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:13.563 "name": "Existed_Raid", 00:22:13.563 "uuid": "3c40e5d8-64c0-4049-a84f-97a120456a3a", 00:22:13.563 "strip_size_kb": 0, 00:22:13.563 "state": "online", 00:22:13.563 "raid_level": "raid1", 00:22:13.563 "superblock": false, 00:22:13.563 "num_base_bdevs": 4, 00:22:13.563 "num_base_bdevs_discovered": 3, 00:22:13.563 "num_base_bdevs_operational": 3, 00:22:13.563 "base_bdevs_list": [ 00:22:13.563 { 00:22:13.563 "name": null, 00:22:13.563 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:13.563 "is_configured": false, 00:22:13.563 "data_offset": 0, 00:22:13.563 "data_size": 65536 00:22:13.563 }, 00:22:13.563 { 00:22:13.563 "name": "BaseBdev2", 00:22:13.563 "uuid": "08173bd5-f052-41ef-b69b-f526b90f41d7", 00:22:13.563 "is_configured": true, 00:22:13.563 "data_offset": 0, 00:22:13.563 "data_size": 65536 00:22:13.563 }, 00:22:13.563 { 00:22:13.563 "name": "BaseBdev3", 00:22:13.563 "uuid": "89d7d1b9-07d9-400d-80b3-e9e1fae1a2fb", 00:22:13.563 "is_configured": true, 00:22:13.563 "data_offset": 0, 00:22:13.563 "data_size": 65536 00:22:13.563 }, 00:22:13.563 { 00:22:13.563 "name": "BaseBdev4", 00:22:13.563 "uuid": "7f5a9615-2270-4fe2-8c03-6f0eb9bd024d", 00:22:13.563 "is_configured": true, 00:22:13.563 "data_offset": 0, 00:22:13.563 "data_size": 65536 00:22:13.563 } 00:22:13.563 ] 00:22:13.563 }' 00:22:13.563 20:36:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:13.563 20:36:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:14.496 20:36:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:22:14.496 20:36:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:22:14.496 20:36:06 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:14.496 20:36:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:22:14.496 20:36:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:22:14.496 20:36:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:22:14.496 20:36:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:22:14.754 [2024-07-15 20:36:06.999691] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:22:14.754 20:36:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:22:14.754 20:36:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:22:14.754 20:36:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:14.754 20:36:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:22:15.011 20:36:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:22:15.011 20:36:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:22:15.011 20:36:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:22:15.269 [2024-07-15 20:36:07.499337] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:22:15.269 20:36:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 
00:22:15.269 20:36:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:22:15.269 20:36:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:15.269 20:36:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:22:15.528 20:36:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:22:15.528 20:36:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:22:15.528 20:36:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:22:15.786 [2024-07-15 20:36:08.005079] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:22:15.786 [2024-07-15 20:36:08.005163] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:15.786 [2024-07-15 20:36:08.017823] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:15.786 [2024-07-15 20:36:08.017861] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:15.786 [2024-07-15 20:36:08.017873] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f0b350 name Existed_Raid, state offline 00:22:15.786 20:36:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:22:15.786 20:36:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:22:15.786 20:36:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:15.786 20:36:08 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:22:16.043 20:36:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:22:16.043 20:36:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:22:16.043 20:36:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:22:16.043 20:36:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:22:16.043 20:36:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:22:16.043 20:36:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:22:16.301 BaseBdev2 00:22:16.301 20:36:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:22:16.301 20:36:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:22:16.301 20:36:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:22:16.301 20:36:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:22:16.301 20:36:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:22:16.301 20:36:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:22:16.301 20:36:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:16.560 20:36:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:22:16.818 [ 00:22:16.818 { 00:22:16.818 "name": "BaseBdev2", 00:22:16.818 "aliases": [ 
00:22:16.818 "ffa1012d-3d6b-4c1c-9d73-1689ac9efe86" 00:22:16.818 ], 00:22:16.818 "product_name": "Malloc disk", 00:22:16.818 "block_size": 512, 00:22:16.818 "num_blocks": 65536, 00:22:16.818 "uuid": "ffa1012d-3d6b-4c1c-9d73-1689ac9efe86", 00:22:16.818 "assigned_rate_limits": { 00:22:16.818 "rw_ios_per_sec": 0, 00:22:16.818 "rw_mbytes_per_sec": 0, 00:22:16.818 "r_mbytes_per_sec": 0, 00:22:16.818 "w_mbytes_per_sec": 0 00:22:16.818 }, 00:22:16.818 "claimed": false, 00:22:16.818 "zoned": false, 00:22:16.818 "supported_io_types": { 00:22:16.818 "read": true, 00:22:16.818 "write": true, 00:22:16.818 "unmap": true, 00:22:16.818 "flush": true, 00:22:16.818 "reset": true, 00:22:16.818 "nvme_admin": false, 00:22:16.818 "nvme_io": false, 00:22:16.818 "nvme_io_md": false, 00:22:16.818 "write_zeroes": true, 00:22:16.818 "zcopy": true, 00:22:16.818 "get_zone_info": false, 00:22:16.818 "zone_management": false, 00:22:16.818 "zone_append": false, 00:22:16.818 "compare": false, 00:22:16.818 "compare_and_write": false, 00:22:16.818 "abort": true, 00:22:16.818 "seek_hole": false, 00:22:16.818 "seek_data": false, 00:22:16.818 "copy": true, 00:22:16.818 "nvme_iov_md": false 00:22:16.818 }, 00:22:16.818 "memory_domains": [ 00:22:16.818 { 00:22:16.818 "dma_device_id": "system", 00:22:16.818 "dma_device_type": 1 00:22:16.818 }, 00:22:16.818 { 00:22:16.818 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:16.818 "dma_device_type": 2 00:22:16.818 } 00:22:16.818 ], 00:22:16.818 "driver_specific": {} 00:22:16.818 } 00:22:16.818 ] 00:22:16.818 20:36:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:22:16.818 20:36:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:22:16.818 20:36:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:22:16.818 20:36:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:22:17.076 BaseBdev3 00:22:17.076 20:36:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:22:17.076 20:36:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:22:17.076 20:36:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:22:17.076 20:36:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:22:17.076 20:36:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:22:17.076 20:36:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:22:17.076 20:36:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:17.333 20:36:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:22:17.590 [ 00:22:17.590 { 00:22:17.590 "name": "BaseBdev3", 00:22:17.590 "aliases": [ 00:22:17.590 "8cd0eff0-165c-4b15-8ce2-02d43cfbabd7" 00:22:17.590 ], 00:22:17.590 "product_name": "Malloc disk", 00:22:17.590 "block_size": 512, 00:22:17.590 "num_blocks": 65536, 00:22:17.590 "uuid": "8cd0eff0-165c-4b15-8ce2-02d43cfbabd7", 00:22:17.590 "assigned_rate_limits": { 00:22:17.590 "rw_ios_per_sec": 0, 00:22:17.590 "rw_mbytes_per_sec": 0, 00:22:17.590 "r_mbytes_per_sec": 0, 00:22:17.590 "w_mbytes_per_sec": 0 00:22:17.590 }, 00:22:17.590 "claimed": false, 00:22:17.590 "zoned": false, 00:22:17.590 "supported_io_types": { 00:22:17.590 "read": true, 00:22:17.590 "write": true, 00:22:17.590 "unmap": true, 00:22:17.590 "flush": true, 00:22:17.590 "reset": true, 00:22:17.590 "nvme_admin": false, 00:22:17.590 
"nvme_io": false, 00:22:17.590 "nvme_io_md": false, 00:22:17.590 "write_zeroes": true, 00:22:17.590 "zcopy": true, 00:22:17.590 "get_zone_info": false, 00:22:17.590 "zone_management": false, 00:22:17.590 "zone_append": false, 00:22:17.590 "compare": false, 00:22:17.590 "compare_and_write": false, 00:22:17.590 "abort": true, 00:22:17.590 "seek_hole": false, 00:22:17.590 "seek_data": false, 00:22:17.590 "copy": true, 00:22:17.590 "nvme_iov_md": false 00:22:17.590 }, 00:22:17.590 "memory_domains": [ 00:22:17.590 { 00:22:17.590 "dma_device_id": "system", 00:22:17.590 "dma_device_type": 1 00:22:17.590 }, 00:22:17.590 { 00:22:17.590 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:17.590 "dma_device_type": 2 00:22:17.590 } 00:22:17.590 ], 00:22:17.590 "driver_specific": {} 00:22:17.590 } 00:22:17.590 ] 00:22:17.590 20:36:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:22:17.590 20:36:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:22:17.590 20:36:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:22:17.590 20:36:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:22:17.590 BaseBdev4 00:22:17.848 20:36:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:22:17.848 20:36:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:22:17.848 20:36:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:22:17.848 20:36:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:22:17.848 20:36:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:22:17.848 20:36:09 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@900 -- # bdev_timeout=2000 00:22:17.848 20:36:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:18.105 20:36:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:22:18.105 [ 00:22:18.105 { 00:22:18.105 "name": "BaseBdev4", 00:22:18.105 "aliases": [ 00:22:18.105 "f3778a71-4036-47f8-aec1-5a75539de6ee" 00:22:18.105 ], 00:22:18.105 "product_name": "Malloc disk", 00:22:18.105 "block_size": 512, 00:22:18.105 "num_blocks": 65536, 00:22:18.105 "uuid": "f3778a71-4036-47f8-aec1-5a75539de6ee", 00:22:18.105 "assigned_rate_limits": { 00:22:18.105 "rw_ios_per_sec": 0, 00:22:18.105 "rw_mbytes_per_sec": 0, 00:22:18.105 "r_mbytes_per_sec": 0, 00:22:18.105 "w_mbytes_per_sec": 0 00:22:18.105 }, 00:22:18.105 "claimed": false, 00:22:18.105 "zoned": false, 00:22:18.105 "supported_io_types": { 00:22:18.105 "read": true, 00:22:18.105 "write": true, 00:22:18.105 "unmap": true, 00:22:18.105 "flush": true, 00:22:18.105 "reset": true, 00:22:18.105 "nvme_admin": false, 00:22:18.105 "nvme_io": false, 00:22:18.105 "nvme_io_md": false, 00:22:18.105 "write_zeroes": true, 00:22:18.105 "zcopy": true, 00:22:18.105 "get_zone_info": false, 00:22:18.105 "zone_management": false, 00:22:18.105 "zone_append": false, 00:22:18.105 "compare": false, 00:22:18.105 "compare_and_write": false, 00:22:18.105 "abort": true, 00:22:18.105 "seek_hole": false, 00:22:18.105 "seek_data": false, 00:22:18.105 "copy": true, 00:22:18.105 "nvme_iov_md": false 00:22:18.105 }, 00:22:18.105 "memory_domains": [ 00:22:18.105 { 00:22:18.105 "dma_device_id": "system", 00:22:18.105 "dma_device_type": 1 00:22:18.105 }, 00:22:18.105 { 00:22:18.105 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:18.105 "dma_device_type": 
2 00:22:18.105 } 00:22:18.105 ], 00:22:18.105 "driver_specific": {} 00:22:18.105 } 00:22:18.105 ] 00:22:18.105 20:36:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:22:18.105 20:36:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:22:18.105 20:36:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:22:18.105 20:36:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:22:18.363 [2024-07-15 20:36:10.700341] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:22:18.363 [2024-07-15 20:36:10.700383] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:22:18.364 [2024-07-15 20:36:10.700402] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:18.364 [2024-07-15 20:36:10.701735] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:22:18.364 [2024-07-15 20:36:10.701779] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:22:18.364 20:36:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:18.364 20:36:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:18.364 20:36:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:18.364 20:36:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:18.364 20:36:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:18.364 20:36:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- 
# local num_base_bdevs_operational=4 00:22:18.364 20:36:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:18.364 20:36:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:18.364 20:36:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:18.364 20:36:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:18.364 20:36:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:18.364 20:36:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:18.622 20:36:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:18.622 "name": "Existed_Raid", 00:22:18.622 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:18.622 "strip_size_kb": 0, 00:22:18.622 "state": "configuring", 00:22:18.622 "raid_level": "raid1", 00:22:18.622 "superblock": false, 00:22:18.622 "num_base_bdevs": 4, 00:22:18.622 "num_base_bdevs_discovered": 3, 00:22:18.622 "num_base_bdevs_operational": 4, 00:22:18.622 "base_bdevs_list": [ 00:22:18.622 { 00:22:18.622 "name": "BaseBdev1", 00:22:18.622 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:18.622 "is_configured": false, 00:22:18.622 "data_offset": 0, 00:22:18.622 "data_size": 0 00:22:18.622 }, 00:22:18.622 { 00:22:18.622 "name": "BaseBdev2", 00:22:18.622 "uuid": "ffa1012d-3d6b-4c1c-9d73-1689ac9efe86", 00:22:18.622 "is_configured": true, 00:22:18.622 "data_offset": 0, 00:22:18.622 "data_size": 65536 00:22:18.622 }, 00:22:18.622 { 00:22:18.622 "name": "BaseBdev3", 00:22:18.622 "uuid": "8cd0eff0-165c-4b15-8ce2-02d43cfbabd7", 00:22:18.622 "is_configured": true, 00:22:18.622 "data_offset": 0, 00:22:18.622 "data_size": 65536 00:22:18.622 }, 00:22:18.622 { 
00:22:18.622 "name": "BaseBdev4", 00:22:18.622 "uuid": "f3778a71-4036-47f8-aec1-5a75539de6ee", 00:22:18.622 "is_configured": true, 00:22:18.622 "data_offset": 0, 00:22:18.622 "data_size": 65536 00:22:18.622 } 00:22:18.622 ] 00:22:18.622 }' 00:22:18.622 20:36:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:18.622 20:36:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:19.555 20:36:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:22:19.555 [2024-07-15 20:36:11.827306] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:22:19.555 20:36:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:19.555 20:36:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:19.555 20:36:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:19.555 20:36:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:19.555 20:36:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:19.555 20:36:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:19.555 20:36:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:19.555 20:36:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:19.555 20:36:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:19.555 20:36:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:19.555 20:36:11 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:19.555 20:36:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:19.812 20:36:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:19.812 "name": "Existed_Raid", 00:22:19.812 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:19.812 "strip_size_kb": 0, 00:22:19.812 "state": "configuring", 00:22:19.812 "raid_level": "raid1", 00:22:19.812 "superblock": false, 00:22:19.812 "num_base_bdevs": 4, 00:22:19.812 "num_base_bdevs_discovered": 2, 00:22:19.812 "num_base_bdevs_operational": 4, 00:22:19.812 "base_bdevs_list": [ 00:22:19.812 { 00:22:19.812 "name": "BaseBdev1", 00:22:19.812 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:19.812 "is_configured": false, 00:22:19.812 "data_offset": 0, 00:22:19.812 "data_size": 0 00:22:19.812 }, 00:22:19.812 { 00:22:19.812 "name": null, 00:22:19.812 "uuid": "ffa1012d-3d6b-4c1c-9d73-1689ac9efe86", 00:22:19.812 "is_configured": false, 00:22:19.812 "data_offset": 0, 00:22:19.812 "data_size": 65536 00:22:19.812 }, 00:22:19.812 { 00:22:19.812 "name": "BaseBdev3", 00:22:19.812 "uuid": "8cd0eff0-165c-4b15-8ce2-02d43cfbabd7", 00:22:19.812 "is_configured": true, 00:22:19.812 "data_offset": 0, 00:22:19.812 "data_size": 65536 00:22:19.812 }, 00:22:19.812 { 00:22:19.812 "name": "BaseBdev4", 00:22:19.812 "uuid": "f3778a71-4036-47f8-aec1-5a75539de6ee", 00:22:19.812 "is_configured": true, 00:22:19.812 "data_offset": 0, 00:22:19.812 "data_size": 65536 00:22:19.812 } 00:22:19.812 ] 00:22:19.812 }' 00:22:19.812 20:36:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:19.812 20:36:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:20.748 20:36:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:20.748 20:36:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:22:20.748 20:36:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:22:20.748 20:36:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:22:21.007 [2024-07-15 20:36:13.271737] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:21.007 BaseBdev1 00:22:21.007 20:36:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:22:21.007 20:36:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:22:21.007 20:36:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:22:21.007 20:36:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:22:21.007 20:36:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:22:21.007 20:36:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:22:21.007 20:36:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:21.266 20:36:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:22:21.526 [ 00:22:21.526 { 00:22:21.526 "name": "BaseBdev1", 00:22:21.526 "aliases": [ 00:22:21.526 "e34a8775-d7e5-49e3-84bc-08d84fd26330" 00:22:21.526 ], 00:22:21.526 
"product_name": "Malloc disk", 00:22:21.526 "block_size": 512, 00:22:21.526 "num_blocks": 65536, 00:22:21.526 "uuid": "e34a8775-d7e5-49e3-84bc-08d84fd26330", 00:22:21.526 "assigned_rate_limits": { 00:22:21.526 "rw_ios_per_sec": 0, 00:22:21.526 "rw_mbytes_per_sec": 0, 00:22:21.526 "r_mbytes_per_sec": 0, 00:22:21.526 "w_mbytes_per_sec": 0 00:22:21.526 }, 00:22:21.526 "claimed": true, 00:22:21.526 "claim_type": "exclusive_write", 00:22:21.526 "zoned": false, 00:22:21.526 "supported_io_types": { 00:22:21.526 "read": true, 00:22:21.526 "write": true, 00:22:21.526 "unmap": true, 00:22:21.526 "flush": true, 00:22:21.526 "reset": true, 00:22:21.526 "nvme_admin": false, 00:22:21.526 "nvme_io": false, 00:22:21.526 "nvme_io_md": false, 00:22:21.526 "write_zeroes": true, 00:22:21.526 "zcopy": true, 00:22:21.526 "get_zone_info": false, 00:22:21.526 "zone_management": false, 00:22:21.526 "zone_append": false, 00:22:21.526 "compare": false, 00:22:21.526 "compare_and_write": false, 00:22:21.526 "abort": true, 00:22:21.526 "seek_hole": false, 00:22:21.526 "seek_data": false, 00:22:21.526 "copy": true, 00:22:21.526 "nvme_iov_md": false 00:22:21.526 }, 00:22:21.526 "memory_domains": [ 00:22:21.526 { 00:22:21.526 "dma_device_id": "system", 00:22:21.526 "dma_device_type": 1 00:22:21.526 }, 00:22:21.526 { 00:22:21.526 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:21.526 "dma_device_type": 2 00:22:21.526 } 00:22:21.526 ], 00:22:21.526 "driver_specific": {} 00:22:21.526 } 00:22:21.526 ] 00:22:21.526 20:36:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:22:21.526 20:36:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:21.526 20:36:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:21.526 20:36:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:21.526 
20:36:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:21.526 20:36:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:21.526 20:36:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:21.526 20:36:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:21.526 20:36:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:21.526 20:36:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:21.526 20:36:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:21.526 20:36:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:21.526 20:36:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:21.786 20:36:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:21.786 "name": "Existed_Raid", 00:22:21.786 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:21.786 "strip_size_kb": 0, 00:22:21.786 "state": "configuring", 00:22:21.786 "raid_level": "raid1", 00:22:21.786 "superblock": false, 00:22:21.786 "num_base_bdevs": 4, 00:22:21.786 "num_base_bdevs_discovered": 3, 00:22:21.786 "num_base_bdevs_operational": 4, 00:22:21.786 "base_bdevs_list": [ 00:22:21.786 { 00:22:21.786 "name": "BaseBdev1", 00:22:21.786 "uuid": "e34a8775-d7e5-49e3-84bc-08d84fd26330", 00:22:21.786 "is_configured": true, 00:22:21.786 "data_offset": 0, 00:22:21.786 "data_size": 65536 00:22:21.786 }, 00:22:21.786 { 00:22:21.786 "name": null, 00:22:21.786 "uuid": "ffa1012d-3d6b-4c1c-9d73-1689ac9efe86", 00:22:21.786 "is_configured": false, 00:22:21.786 "data_offset": 0, 
00:22:21.786 "data_size": 65536 00:22:21.786 }, 00:22:21.786 { 00:22:21.786 "name": "BaseBdev3", 00:22:21.786 "uuid": "8cd0eff0-165c-4b15-8ce2-02d43cfbabd7", 00:22:21.786 "is_configured": true, 00:22:21.786 "data_offset": 0, 00:22:21.786 "data_size": 65536 00:22:21.786 }, 00:22:21.786 { 00:22:21.786 "name": "BaseBdev4", 00:22:21.786 "uuid": "f3778a71-4036-47f8-aec1-5a75539de6ee", 00:22:21.786 "is_configured": true, 00:22:21.786 "data_offset": 0, 00:22:21.786 "data_size": 65536 00:22:21.786 } 00:22:21.786 ] 00:22:21.786 }' 00:22:21.786 20:36:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:21.786 20:36:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:22.353 20:36:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:22.353 20:36:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:22:22.921 20:36:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:22:22.921 20:36:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:22:23.181 [2024-07-15 20:36:15.445510] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:22:23.181 20:36:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:23.181 20:36:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:23.181 20:36:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:23.181 20:36:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 
00:22:23.181 20:36:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:23.181 20:36:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:23.181 20:36:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:23.181 20:36:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:23.181 20:36:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:23.181 20:36:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:23.181 20:36:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:23.181 20:36:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:23.440 20:36:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:23.440 "name": "Existed_Raid", 00:22:23.440 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:23.440 "strip_size_kb": 0, 00:22:23.440 "state": "configuring", 00:22:23.440 "raid_level": "raid1", 00:22:23.440 "superblock": false, 00:22:23.440 "num_base_bdevs": 4, 00:22:23.440 "num_base_bdevs_discovered": 2, 00:22:23.440 "num_base_bdevs_operational": 4, 00:22:23.440 "base_bdevs_list": [ 00:22:23.440 { 00:22:23.440 "name": "BaseBdev1", 00:22:23.440 "uuid": "e34a8775-d7e5-49e3-84bc-08d84fd26330", 00:22:23.440 "is_configured": true, 00:22:23.440 "data_offset": 0, 00:22:23.440 "data_size": 65536 00:22:23.440 }, 00:22:23.440 { 00:22:23.440 "name": null, 00:22:23.440 "uuid": "ffa1012d-3d6b-4c1c-9d73-1689ac9efe86", 00:22:23.440 "is_configured": false, 00:22:23.440 "data_offset": 0, 00:22:23.440 "data_size": 65536 00:22:23.440 }, 00:22:23.440 { 00:22:23.440 "name": null, 00:22:23.440 
"uuid": "8cd0eff0-165c-4b15-8ce2-02d43cfbabd7", 00:22:23.440 "is_configured": false, 00:22:23.440 "data_offset": 0, 00:22:23.440 "data_size": 65536 00:22:23.440 }, 00:22:23.440 { 00:22:23.440 "name": "BaseBdev4", 00:22:23.440 "uuid": "f3778a71-4036-47f8-aec1-5a75539de6ee", 00:22:23.440 "is_configured": true, 00:22:23.440 "data_offset": 0, 00:22:23.440 "data_size": 65536 00:22:23.440 } 00:22:23.440 ] 00:22:23.440 }' 00:22:23.440 20:36:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:23.440 20:36:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:24.010 20:36:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:24.010 20:36:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:22:24.269 20:36:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:22:24.269 20:36:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:22:24.528 [2024-07-15 20:36:16.785075] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:22:24.529 20:36:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:24.529 20:36:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:24.529 20:36:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:24.529 20:36:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:24.529 20:36:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # 
local strip_size=0 00:22:24.529 20:36:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:24.529 20:36:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:24.529 20:36:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:24.529 20:36:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:24.529 20:36:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:24.529 20:36:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:24.529 20:36:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:24.788 20:36:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:24.788 "name": "Existed_Raid", 00:22:24.788 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:24.788 "strip_size_kb": 0, 00:22:24.788 "state": "configuring", 00:22:24.788 "raid_level": "raid1", 00:22:24.788 "superblock": false, 00:22:24.788 "num_base_bdevs": 4, 00:22:24.788 "num_base_bdevs_discovered": 3, 00:22:24.788 "num_base_bdevs_operational": 4, 00:22:24.788 "base_bdevs_list": [ 00:22:24.788 { 00:22:24.788 "name": "BaseBdev1", 00:22:24.788 "uuid": "e34a8775-d7e5-49e3-84bc-08d84fd26330", 00:22:24.788 "is_configured": true, 00:22:24.788 "data_offset": 0, 00:22:24.788 "data_size": 65536 00:22:24.788 }, 00:22:24.788 { 00:22:24.788 "name": null, 00:22:24.788 "uuid": "ffa1012d-3d6b-4c1c-9d73-1689ac9efe86", 00:22:24.788 "is_configured": false, 00:22:24.788 "data_offset": 0, 00:22:24.788 "data_size": 65536 00:22:24.788 }, 00:22:24.788 { 00:22:24.788 "name": "BaseBdev3", 00:22:24.788 "uuid": "8cd0eff0-165c-4b15-8ce2-02d43cfbabd7", 00:22:24.788 "is_configured": true, 
00:22:24.788 "data_offset": 0, 00:22:24.788 "data_size": 65536 00:22:24.788 }, 00:22:24.788 { 00:22:24.788 "name": "BaseBdev4", 00:22:24.788 "uuid": "f3778a71-4036-47f8-aec1-5a75539de6ee", 00:22:24.788 "is_configured": true, 00:22:24.788 "data_offset": 0, 00:22:24.788 "data_size": 65536 00:22:24.788 } 00:22:24.788 ] 00:22:24.788 }' 00:22:24.788 20:36:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:24.788 20:36:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:25.357 20:36:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:25.357 20:36:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:22:25.616 20:36:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:22:25.616 20:36:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:22:25.913 [2024-07-15 20:36:18.148729] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:22:25.913 20:36:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:25.913 20:36:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:25.913 20:36:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:25.913 20:36:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:25.913 20:36:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:25.913 20:36:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=4 00:22:25.913 20:36:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:25.913 20:36:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:25.913 20:36:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:25.913 20:36:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:25.913 20:36:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:25.913 20:36:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:26.179 20:36:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:26.179 "name": "Existed_Raid", 00:22:26.179 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:26.179 "strip_size_kb": 0, 00:22:26.179 "state": "configuring", 00:22:26.179 "raid_level": "raid1", 00:22:26.179 "superblock": false, 00:22:26.179 "num_base_bdevs": 4, 00:22:26.179 "num_base_bdevs_discovered": 2, 00:22:26.179 "num_base_bdevs_operational": 4, 00:22:26.179 "base_bdevs_list": [ 00:22:26.179 { 00:22:26.179 "name": null, 00:22:26.179 "uuid": "e34a8775-d7e5-49e3-84bc-08d84fd26330", 00:22:26.179 "is_configured": false, 00:22:26.179 "data_offset": 0, 00:22:26.179 "data_size": 65536 00:22:26.179 }, 00:22:26.179 { 00:22:26.179 "name": null, 00:22:26.179 "uuid": "ffa1012d-3d6b-4c1c-9d73-1689ac9efe86", 00:22:26.179 "is_configured": false, 00:22:26.179 "data_offset": 0, 00:22:26.179 "data_size": 65536 00:22:26.179 }, 00:22:26.179 { 00:22:26.179 "name": "BaseBdev3", 00:22:26.179 "uuid": "8cd0eff0-165c-4b15-8ce2-02d43cfbabd7", 00:22:26.179 "is_configured": true, 00:22:26.179 "data_offset": 0, 00:22:26.179 "data_size": 65536 00:22:26.179 }, 00:22:26.179 { 00:22:26.179 "name": 
"BaseBdev4", 00:22:26.179 "uuid": "f3778a71-4036-47f8-aec1-5a75539de6ee", 00:22:26.179 "is_configured": true, 00:22:26.179 "data_offset": 0, 00:22:26.179 "data_size": 65536 00:22:26.179 } 00:22:26.179 ] 00:22:26.179 }' 00:22:26.179 20:36:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:26.179 20:36:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:27.117 20:36:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:27.117 20:36:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:22:27.376 20:36:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:22:27.376 20:36:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:22:27.635 [2024-07-15 20:36:19.781411] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:27.635 20:36:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:27.635 20:36:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:27.635 20:36:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:27.635 20:36:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:27.635 20:36:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:27.635 20:36:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:27.635 20:36:19 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:27.635 20:36:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:27.635 20:36:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:27.635 20:36:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:27.635 20:36:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:27.635 20:36:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:27.895 20:36:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:27.895 "name": "Existed_Raid", 00:22:27.895 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:27.895 "strip_size_kb": 0, 00:22:27.895 "state": "configuring", 00:22:27.895 "raid_level": "raid1", 00:22:27.895 "superblock": false, 00:22:27.895 "num_base_bdevs": 4, 00:22:27.895 "num_base_bdevs_discovered": 3, 00:22:27.895 "num_base_bdevs_operational": 4, 00:22:27.895 "base_bdevs_list": [ 00:22:27.895 { 00:22:27.895 "name": null, 00:22:27.895 "uuid": "e34a8775-d7e5-49e3-84bc-08d84fd26330", 00:22:27.895 "is_configured": false, 00:22:27.895 "data_offset": 0, 00:22:27.895 "data_size": 65536 00:22:27.895 }, 00:22:27.895 { 00:22:27.895 "name": "BaseBdev2", 00:22:27.895 "uuid": "ffa1012d-3d6b-4c1c-9d73-1689ac9efe86", 00:22:27.895 "is_configured": true, 00:22:27.895 "data_offset": 0, 00:22:27.895 "data_size": 65536 00:22:27.895 }, 00:22:27.895 { 00:22:27.895 "name": "BaseBdev3", 00:22:27.895 "uuid": "8cd0eff0-165c-4b15-8ce2-02d43cfbabd7", 00:22:27.895 "is_configured": true, 00:22:27.895 "data_offset": 0, 00:22:27.895 "data_size": 65536 00:22:27.895 }, 00:22:27.895 { 00:22:27.895 "name": "BaseBdev4", 00:22:27.895 "uuid": "f3778a71-4036-47f8-aec1-5a75539de6ee", 00:22:27.895 
"is_configured": true, 00:22:27.895 "data_offset": 0, 00:22:27.895 "data_size": 65536 00:22:27.895 } 00:22:27.895 ] 00:22:27.895 }' 00:22:27.895 20:36:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:27.895 20:36:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:28.463 20:36:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:28.463 20:36:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:22:28.722 20:36:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:22:28.722 20:36:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:28.722 20:36:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:22:28.983 20:36:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u e34a8775-d7e5-49e3-84bc-08d84fd26330 00:22:29.242 [2024-07-15 20:36:21.437205] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:22:29.242 [2024-07-15 20:36:21.437249] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1f09610 00:22:29.242 [2024-07-15 20:36:21.437258] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:22:29.242 [2024-07-15 20:36:21.437452] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f0aa70 00:22:29.242 [2024-07-15 20:36:21.437578] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1f09610 00:22:29.242 [2024-07-15 
20:36:21.437589] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1f09610 00:22:29.242 [2024-07-15 20:36:21.437760] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:29.242 NewBaseBdev 00:22:29.242 20:36:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:22:29.242 20:36:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:22:29.242 20:36:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:22:29.242 20:36:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:22:29.242 20:36:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:22:29.242 20:36:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:22:29.242 20:36:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:29.502 20:36:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:22:29.761 [ 00:22:29.761 { 00:22:29.761 "name": "NewBaseBdev", 00:22:29.761 "aliases": [ 00:22:29.761 "e34a8775-d7e5-49e3-84bc-08d84fd26330" 00:22:29.761 ], 00:22:29.761 "product_name": "Malloc disk", 00:22:29.761 "block_size": 512, 00:22:29.761 "num_blocks": 65536, 00:22:29.761 "uuid": "e34a8775-d7e5-49e3-84bc-08d84fd26330", 00:22:29.761 "assigned_rate_limits": { 00:22:29.761 "rw_ios_per_sec": 0, 00:22:29.761 "rw_mbytes_per_sec": 0, 00:22:29.761 "r_mbytes_per_sec": 0, 00:22:29.761 "w_mbytes_per_sec": 0 00:22:29.761 }, 00:22:29.761 "claimed": true, 00:22:29.761 "claim_type": "exclusive_write", 00:22:29.761 "zoned": 
false, 00:22:29.761 "supported_io_types": { 00:22:29.761 "read": true, 00:22:29.761 "write": true, 00:22:29.761 "unmap": true, 00:22:29.761 "flush": true, 00:22:29.761 "reset": true, 00:22:29.761 "nvme_admin": false, 00:22:29.761 "nvme_io": false, 00:22:29.761 "nvme_io_md": false, 00:22:29.761 "write_zeroes": true, 00:22:29.761 "zcopy": true, 00:22:29.761 "get_zone_info": false, 00:22:29.761 "zone_management": false, 00:22:29.761 "zone_append": false, 00:22:29.761 "compare": false, 00:22:29.761 "compare_and_write": false, 00:22:29.761 "abort": true, 00:22:29.761 "seek_hole": false, 00:22:29.761 "seek_data": false, 00:22:29.761 "copy": true, 00:22:29.761 "nvme_iov_md": false 00:22:29.761 }, 00:22:29.761 "memory_domains": [ 00:22:29.761 { 00:22:29.761 "dma_device_id": "system", 00:22:29.761 "dma_device_type": 1 00:22:29.762 }, 00:22:29.762 { 00:22:29.762 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:29.762 "dma_device_type": 2 00:22:29.762 } 00:22:29.762 ], 00:22:29.762 "driver_specific": {} 00:22:29.762 } 00:22:29.762 ] 00:22:29.762 20:36:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:22:29.762 20:36:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:22:29.762 20:36:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:29.762 20:36:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:29.762 20:36:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:29.762 20:36:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:29.762 20:36:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:29.762 20:36:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:29.762 20:36:21 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:29.762 20:36:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:29.762 20:36:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:29.762 20:36:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:29.762 20:36:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:30.022 20:36:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:30.022 "name": "Existed_Raid", 00:22:30.022 "uuid": "3735e051-de42-4175-9e2f-e376a30bb000", 00:22:30.022 "strip_size_kb": 0, 00:22:30.022 "state": "online", 00:22:30.022 "raid_level": "raid1", 00:22:30.022 "superblock": false, 00:22:30.022 "num_base_bdevs": 4, 00:22:30.022 "num_base_bdevs_discovered": 4, 00:22:30.022 "num_base_bdevs_operational": 4, 00:22:30.022 "base_bdevs_list": [ 00:22:30.022 { 00:22:30.022 "name": "NewBaseBdev", 00:22:30.022 "uuid": "e34a8775-d7e5-49e3-84bc-08d84fd26330", 00:22:30.022 "is_configured": true, 00:22:30.022 "data_offset": 0, 00:22:30.022 "data_size": 65536 00:22:30.022 }, 00:22:30.022 { 00:22:30.022 "name": "BaseBdev2", 00:22:30.022 "uuid": "ffa1012d-3d6b-4c1c-9d73-1689ac9efe86", 00:22:30.022 "is_configured": true, 00:22:30.022 "data_offset": 0, 00:22:30.022 "data_size": 65536 00:22:30.022 }, 00:22:30.022 { 00:22:30.022 "name": "BaseBdev3", 00:22:30.022 "uuid": "8cd0eff0-165c-4b15-8ce2-02d43cfbabd7", 00:22:30.022 "is_configured": true, 00:22:30.022 "data_offset": 0, 00:22:30.022 "data_size": 65536 00:22:30.022 }, 00:22:30.022 { 00:22:30.022 "name": "BaseBdev4", 00:22:30.022 "uuid": "f3778a71-4036-47f8-aec1-5a75539de6ee", 00:22:30.022 "is_configured": true, 00:22:30.022 "data_offset": 0, 00:22:30.022 
"data_size": 65536 00:22:30.022 } 00:22:30.022 ] 00:22:30.022 }' 00:22:30.022 20:36:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:30.022 20:36:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:30.961 20:36:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:22:30.961 20:36:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:22:30.961 20:36:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:22:30.961 20:36:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:22:30.961 20:36:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:22:30.961 20:36:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:22:30.961 20:36:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:22:30.961 20:36:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:22:31.221 [2024-07-15 20:36:23.370681] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:31.221 20:36:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:22:31.221 "name": "Existed_Raid", 00:22:31.221 "aliases": [ 00:22:31.221 "3735e051-de42-4175-9e2f-e376a30bb000" 00:22:31.221 ], 00:22:31.221 "product_name": "Raid Volume", 00:22:31.221 "block_size": 512, 00:22:31.221 "num_blocks": 65536, 00:22:31.221 "uuid": "3735e051-de42-4175-9e2f-e376a30bb000", 00:22:31.221 "assigned_rate_limits": { 00:22:31.221 "rw_ios_per_sec": 0, 00:22:31.221 "rw_mbytes_per_sec": 0, 00:22:31.221 "r_mbytes_per_sec": 0, 00:22:31.221 "w_mbytes_per_sec": 0 00:22:31.221 }, 00:22:31.221 "claimed": false, 
00:22:31.221 "zoned": false, 00:22:31.221 "supported_io_types": { 00:22:31.221 "read": true, 00:22:31.221 "write": true, 00:22:31.221 "unmap": false, 00:22:31.221 "flush": false, 00:22:31.221 "reset": true, 00:22:31.221 "nvme_admin": false, 00:22:31.221 "nvme_io": false, 00:22:31.221 "nvme_io_md": false, 00:22:31.221 "write_zeroes": true, 00:22:31.221 "zcopy": false, 00:22:31.221 "get_zone_info": false, 00:22:31.221 "zone_management": false, 00:22:31.221 "zone_append": false, 00:22:31.221 "compare": false, 00:22:31.221 "compare_and_write": false, 00:22:31.221 "abort": false, 00:22:31.221 "seek_hole": false, 00:22:31.221 "seek_data": false, 00:22:31.221 "copy": false, 00:22:31.221 "nvme_iov_md": false 00:22:31.221 }, 00:22:31.221 "memory_domains": [ 00:22:31.221 { 00:22:31.221 "dma_device_id": "system", 00:22:31.221 "dma_device_type": 1 00:22:31.221 }, 00:22:31.221 { 00:22:31.221 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:31.221 "dma_device_type": 2 00:22:31.221 }, 00:22:31.221 { 00:22:31.221 "dma_device_id": "system", 00:22:31.221 "dma_device_type": 1 00:22:31.221 }, 00:22:31.221 { 00:22:31.221 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:31.221 "dma_device_type": 2 00:22:31.221 }, 00:22:31.221 { 00:22:31.221 "dma_device_id": "system", 00:22:31.221 "dma_device_type": 1 00:22:31.221 }, 00:22:31.221 { 00:22:31.221 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:31.221 "dma_device_type": 2 00:22:31.221 }, 00:22:31.221 { 00:22:31.221 "dma_device_id": "system", 00:22:31.221 "dma_device_type": 1 00:22:31.221 }, 00:22:31.221 { 00:22:31.221 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:31.221 "dma_device_type": 2 00:22:31.221 } 00:22:31.221 ], 00:22:31.221 "driver_specific": { 00:22:31.221 "raid": { 00:22:31.221 "uuid": "3735e051-de42-4175-9e2f-e376a30bb000", 00:22:31.221 "strip_size_kb": 0, 00:22:31.221 "state": "online", 00:22:31.221 "raid_level": "raid1", 00:22:31.221 "superblock": false, 00:22:31.221 "num_base_bdevs": 4, 00:22:31.221 
"num_base_bdevs_discovered": 4, 00:22:31.221 "num_base_bdevs_operational": 4, 00:22:31.221 "base_bdevs_list": [ 00:22:31.221 { 00:22:31.221 "name": "NewBaseBdev", 00:22:31.221 "uuid": "e34a8775-d7e5-49e3-84bc-08d84fd26330", 00:22:31.221 "is_configured": true, 00:22:31.221 "data_offset": 0, 00:22:31.221 "data_size": 65536 00:22:31.221 }, 00:22:31.221 { 00:22:31.221 "name": "BaseBdev2", 00:22:31.221 "uuid": "ffa1012d-3d6b-4c1c-9d73-1689ac9efe86", 00:22:31.221 "is_configured": true, 00:22:31.221 "data_offset": 0, 00:22:31.221 "data_size": 65536 00:22:31.221 }, 00:22:31.221 { 00:22:31.221 "name": "BaseBdev3", 00:22:31.221 "uuid": "8cd0eff0-165c-4b15-8ce2-02d43cfbabd7", 00:22:31.221 "is_configured": true, 00:22:31.221 "data_offset": 0, 00:22:31.221 "data_size": 65536 00:22:31.221 }, 00:22:31.221 { 00:22:31.221 "name": "BaseBdev4", 00:22:31.221 "uuid": "f3778a71-4036-47f8-aec1-5a75539de6ee", 00:22:31.221 "is_configured": true, 00:22:31.221 "data_offset": 0, 00:22:31.221 "data_size": 65536 00:22:31.221 } 00:22:31.221 ] 00:22:31.221 } 00:22:31.221 } 00:22:31.221 }' 00:22:31.221 20:36:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:22:31.221 20:36:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:22:31.221 BaseBdev2 00:22:31.221 BaseBdev3 00:22:31.221 BaseBdev4' 00:22:31.221 20:36:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:31.221 20:36:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:22:31.221 20:36:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:31.789 20:36:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:31.789 "name": "NewBaseBdev", 
00:22:31.789 "aliases": [ 00:22:31.789 "e34a8775-d7e5-49e3-84bc-08d84fd26330" 00:22:31.789 ], 00:22:31.789 "product_name": "Malloc disk", 00:22:31.789 "block_size": 512, 00:22:31.789 "num_blocks": 65536, 00:22:31.789 "uuid": "e34a8775-d7e5-49e3-84bc-08d84fd26330", 00:22:31.789 "assigned_rate_limits": { 00:22:31.789 "rw_ios_per_sec": 0, 00:22:31.789 "rw_mbytes_per_sec": 0, 00:22:31.789 "r_mbytes_per_sec": 0, 00:22:31.789 "w_mbytes_per_sec": 0 00:22:31.789 }, 00:22:31.789 "claimed": true, 00:22:31.789 "claim_type": "exclusive_write", 00:22:31.789 "zoned": false, 00:22:31.789 "supported_io_types": { 00:22:31.789 "read": true, 00:22:31.789 "write": true, 00:22:31.789 "unmap": true, 00:22:31.789 "flush": true, 00:22:31.789 "reset": true, 00:22:31.789 "nvme_admin": false, 00:22:31.789 "nvme_io": false, 00:22:31.789 "nvme_io_md": false, 00:22:31.789 "write_zeroes": true, 00:22:31.789 "zcopy": true, 00:22:31.789 "get_zone_info": false, 00:22:31.789 "zone_management": false, 00:22:31.789 "zone_append": false, 00:22:31.789 "compare": false, 00:22:31.789 "compare_and_write": false, 00:22:31.789 "abort": true, 00:22:31.789 "seek_hole": false, 00:22:31.789 "seek_data": false, 00:22:31.789 "copy": true, 00:22:31.789 "nvme_iov_md": false 00:22:31.789 }, 00:22:31.789 "memory_domains": [ 00:22:31.789 { 00:22:31.789 "dma_device_id": "system", 00:22:31.789 "dma_device_type": 1 00:22:31.789 }, 00:22:31.789 { 00:22:31.789 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:31.789 "dma_device_type": 2 00:22:31.789 } 00:22:31.789 ], 00:22:31.789 "driver_specific": {} 00:22:31.789 }' 00:22:31.789 20:36:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:31.790 20:36:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:31.790 20:36:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:31.790 20:36:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 
00:22:31.790 20:36:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:32.049 20:36:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:32.049 20:36:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:32.049 20:36:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:32.049 20:36:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:32.049 20:36:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:32.049 20:36:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:32.308 20:36:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:32.308 20:36:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:32.308 20:36:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:22:32.308 20:36:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:32.308 20:36:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:32.308 "name": "BaseBdev2", 00:22:32.308 "aliases": [ 00:22:32.308 "ffa1012d-3d6b-4c1c-9d73-1689ac9efe86" 00:22:32.308 ], 00:22:32.308 "product_name": "Malloc disk", 00:22:32.308 "block_size": 512, 00:22:32.308 "num_blocks": 65536, 00:22:32.308 "uuid": "ffa1012d-3d6b-4c1c-9d73-1689ac9efe86", 00:22:32.308 "assigned_rate_limits": { 00:22:32.308 "rw_ios_per_sec": 0, 00:22:32.308 "rw_mbytes_per_sec": 0, 00:22:32.308 "r_mbytes_per_sec": 0, 00:22:32.308 "w_mbytes_per_sec": 0 00:22:32.308 }, 00:22:32.308 "claimed": true, 00:22:32.308 "claim_type": "exclusive_write", 00:22:32.308 "zoned": false, 00:22:32.308 "supported_io_types": { 00:22:32.308 
"read": true, 00:22:32.308 "write": true, 00:22:32.308 "unmap": true, 00:22:32.308 "flush": true, 00:22:32.308 "reset": true, 00:22:32.308 "nvme_admin": false, 00:22:32.308 "nvme_io": false, 00:22:32.308 "nvme_io_md": false, 00:22:32.308 "write_zeroes": true, 00:22:32.308 "zcopy": true, 00:22:32.308 "get_zone_info": false, 00:22:32.308 "zone_management": false, 00:22:32.308 "zone_append": false, 00:22:32.308 "compare": false, 00:22:32.308 "compare_and_write": false, 00:22:32.308 "abort": true, 00:22:32.308 "seek_hole": false, 00:22:32.308 "seek_data": false, 00:22:32.308 "copy": true, 00:22:32.308 "nvme_iov_md": false 00:22:32.308 }, 00:22:32.308 "memory_domains": [ 00:22:32.308 { 00:22:32.308 "dma_device_id": "system", 00:22:32.308 "dma_device_type": 1 00:22:32.308 }, 00:22:32.308 { 00:22:32.308 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:32.308 "dma_device_type": 2 00:22:32.308 } 00:22:32.308 ], 00:22:32.308 "driver_specific": {} 00:22:32.308 }' 00:22:32.308 20:36:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:32.308 20:36:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:32.567 20:36:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:32.567 20:36:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:32.567 20:36:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:32.567 20:36:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:32.567 20:36:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:32.567 20:36:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:32.567 20:36:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:32.567 20:36:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 
00:22:32.567 20:36:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:32.827 20:36:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:32.827 20:36:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:32.827 20:36:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:32.827 20:36:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:22:32.827 20:36:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:32.827 "name": "BaseBdev3", 00:22:32.827 "aliases": [ 00:22:32.827 "8cd0eff0-165c-4b15-8ce2-02d43cfbabd7" 00:22:32.827 ], 00:22:32.827 "product_name": "Malloc disk", 00:22:32.827 "block_size": 512, 00:22:32.827 "num_blocks": 65536, 00:22:32.827 "uuid": "8cd0eff0-165c-4b15-8ce2-02d43cfbabd7", 00:22:32.827 "assigned_rate_limits": { 00:22:32.827 "rw_ios_per_sec": 0, 00:22:32.827 "rw_mbytes_per_sec": 0, 00:22:32.827 "r_mbytes_per_sec": 0, 00:22:32.827 "w_mbytes_per_sec": 0 00:22:32.827 }, 00:22:32.827 "claimed": true, 00:22:32.827 "claim_type": "exclusive_write", 00:22:32.827 "zoned": false, 00:22:32.827 "supported_io_types": { 00:22:32.827 "read": true, 00:22:32.827 "write": true, 00:22:32.827 "unmap": true, 00:22:32.827 "flush": true, 00:22:32.827 "reset": true, 00:22:32.827 "nvme_admin": false, 00:22:32.827 "nvme_io": false, 00:22:32.827 "nvme_io_md": false, 00:22:32.827 "write_zeroes": true, 00:22:32.827 "zcopy": true, 00:22:32.827 "get_zone_info": false, 00:22:32.827 "zone_management": false, 00:22:32.827 "zone_append": false, 00:22:32.827 "compare": false, 00:22:32.827 "compare_and_write": false, 00:22:32.827 "abort": true, 00:22:32.827 "seek_hole": false, 00:22:32.827 "seek_data": false, 00:22:32.827 "copy": true, 00:22:32.827 "nvme_iov_md": 
false 00:22:32.827 }, 00:22:32.827 "memory_domains": [ 00:22:32.827 { 00:22:32.827 "dma_device_id": "system", 00:22:32.827 "dma_device_type": 1 00:22:32.827 }, 00:22:32.827 { 00:22:32.827 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:32.827 "dma_device_type": 2 00:22:32.827 } 00:22:32.827 ], 00:22:32.827 "driver_specific": {} 00:22:32.827 }' 00:22:32.827 20:36:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:33.086 20:36:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:33.086 20:36:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:33.086 20:36:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:33.086 20:36:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:33.086 20:36:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:33.086 20:36:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:33.086 20:36:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:33.086 20:36:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:33.086 20:36:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:33.349 20:36:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:33.349 20:36:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:33.349 20:36:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:33.349 20:36:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:22:33.349 20:36:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 
00:22:33.610 20:36:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:33.610 "name": "BaseBdev4", 00:22:33.610 "aliases": [ 00:22:33.610 "f3778a71-4036-47f8-aec1-5a75539de6ee" 00:22:33.610 ], 00:22:33.610 "product_name": "Malloc disk", 00:22:33.610 "block_size": 512, 00:22:33.610 "num_blocks": 65536, 00:22:33.610 "uuid": "f3778a71-4036-47f8-aec1-5a75539de6ee", 00:22:33.610 "assigned_rate_limits": { 00:22:33.610 "rw_ios_per_sec": 0, 00:22:33.610 "rw_mbytes_per_sec": 0, 00:22:33.610 "r_mbytes_per_sec": 0, 00:22:33.610 "w_mbytes_per_sec": 0 00:22:33.610 }, 00:22:33.610 "claimed": true, 00:22:33.610 "claim_type": "exclusive_write", 00:22:33.610 "zoned": false, 00:22:33.610 "supported_io_types": { 00:22:33.610 "read": true, 00:22:33.610 "write": true, 00:22:33.610 "unmap": true, 00:22:33.610 "flush": true, 00:22:33.610 "reset": true, 00:22:33.610 "nvme_admin": false, 00:22:33.610 "nvme_io": false, 00:22:33.610 "nvme_io_md": false, 00:22:33.610 "write_zeroes": true, 00:22:33.610 "zcopy": true, 00:22:33.610 "get_zone_info": false, 00:22:33.610 "zone_management": false, 00:22:33.610 "zone_append": false, 00:22:33.610 "compare": false, 00:22:33.610 "compare_and_write": false, 00:22:33.610 "abort": true, 00:22:33.610 "seek_hole": false, 00:22:33.610 "seek_data": false, 00:22:33.610 "copy": true, 00:22:33.610 "nvme_iov_md": false 00:22:33.610 }, 00:22:33.610 "memory_domains": [ 00:22:33.610 { 00:22:33.610 "dma_device_id": "system", 00:22:33.610 "dma_device_type": 1 00:22:33.610 }, 00:22:33.610 { 00:22:33.610 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:33.610 "dma_device_type": 2 00:22:33.610 } 00:22:33.610 ], 00:22:33.610 "driver_specific": {} 00:22:33.610 }' 00:22:33.610 20:36:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:33.610 20:36:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:33.610 20:36:25 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:33.610 20:36:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:33.610 20:36:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:33.610 20:36:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:33.610 20:36:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:33.869 20:36:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:33.869 20:36:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:33.869 20:36:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:33.869 20:36:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:33.869 20:36:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:33.869 20:36:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:22:34.129 [2024-07-15 20:36:26.370339] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:22:34.129 [2024-07-15 20:36:26.370366] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:34.129 [2024-07-15 20:36:26.370419] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:34.129 [2024-07-15 20:36:26.370705] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:34.129 [2024-07-15 20:36:26.370717] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f09610 name Existed_Raid, state offline 00:22:34.129 20:36:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 1444398 00:22:34.129 20:36:26 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 1444398 ']' 00:22:34.129 20:36:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 1444398 00:22:34.129 20:36:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:22:34.129 20:36:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:22:34.129 20:36:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1444398 00:22:34.129 20:36:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:22:34.129 20:36:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:22:34.129 20:36:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1444398' 00:22:34.129 killing process with pid 1444398 00:22:34.129 20:36:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 1444398 00:22:34.129 [2024-07-15 20:36:26.445976] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:34.129 20:36:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 1444398 00:22:34.129 [2024-07-15 20:36:26.486020] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:34.389 20:36:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:22:34.389 00:22:34.389 real 0m34.588s 00:22:34.389 user 1m3.642s 00:22:34.389 sys 0m6.081s 00:22:34.389 20:36:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:22:34.389 20:36:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:34.389 ************************************ 00:22:34.389 END TEST raid_state_function_test 00:22:34.389 ************************************ 00:22:34.389 20:36:26 bdev_raid -- 
common/autotest_common.sh@1142 -- # return 0 00:22:34.389 20:36:26 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 4 true 00:22:34.389 20:36:26 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:22:34.389 20:36:26 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:22:34.389 20:36:26 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:34.648 ************************************ 00:22:34.648 START TEST raid_state_function_test_sb 00:22:34.648 ************************************ 00:22:34.648 20:36:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 4 true 00:22:34.648 20:36:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:22:34.648 20:36:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:22:34.648 20:36:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:22:34.648 20:36:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:22:34.648 20:36:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:22:34.648 20:36:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:22:34.648 20:36:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:22:34.648 20:36:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:22:34.648 20:36:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:22:34.648 20:36:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:22:34.648 20:36:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:22:34.648 20:36:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # 
(( i <= num_base_bdevs )) 00:22:34.648 20:36:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:22:34.648 20:36:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:22:34.648 20:36:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:22:34.648 20:36:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:22:34.648 20:36:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:22:34.648 20:36:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:22:34.648 20:36:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:22:34.648 20:36:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:22:34.648 20:36:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:22:34.648 20:36:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:22:34.648 20:36:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:22:34.648 20:36:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:22:34.648 20:36:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:22:34.648 20:36:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:22:34.648 20:36:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:22:34.648 20:36:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:22:34.648 20:36:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=1449996 00:22:34.648 20:36:26 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:22:34.648 20:36:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1449996' 00:22:34.648 Process raid pid: 1449996 00:22:34.648 20:36:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 1449996 /var/tmp/spdk-raid.sock 00:22:34.648 20:36:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 1449996 ']' 00:22:34.648 20:36:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:34.648 20:36:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:22:34.649 20:36:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:34.649 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:22:34.649 20:36:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:22:34.649 20:36:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:34.649 [2024-07-15 20:36:26.866410] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
00:22:34.649 [2024-07-15 20:36:26.866476] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:22:34.649 [2024-07-15 20:36:26.986522] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:34.908 [2024-07-15 20:36:27.093780] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:22:34.908 [2024-07-15 20:36:27.161520] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:34.908 [2024-07-15 20:36:27.161550] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:35.474 20:36:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:22:35.474 20:36:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:22:35.474 20:36:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:22:35.732 [2024-07-15 20:36:28.024595] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:22:35.732 [2024-07-15 20:36:28.024636] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:22:35.732 [2024-07-15 20:36:28.024647] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:22:35.732 [2024-07-15 20:36:28.024659] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:22:35.732 [2024-07-15 20:36:28.024668] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:22:35.732 [2024-07-15 20:36:28.024679] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 
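The `bdev_raid_create` call above registers a raid bdev whose base bdevs do not exist yet, and the subsequent `verify_raid_bdev_state Existed_Raid configuring raid1 0 4` check confirms it stays in the `configuring` state. A minimal Python sketch of that verification logic follows; it is not the actual shell helper, and `raid_bdev_info` is a hypothetical dict shaped like the `bdev_raid_get_bdevs all` JSON dumped further down in this log:

```python
import json

# Hypothetical raid bdev info, modeled on the Existed_Raid dump in this
# log: no base bdevs discovered yet, so the state is "configuring".
raid_bdev_info = json.loads("""
{
  "name": "Existed_Raid",
  "strip_size_kb": 0,
  "state": "configuring",
  "raid_level": "raid1",
  "superblock": true,
  "num_base_bdevs": 4,
  "num_base_bdevs_discovered": 0,
  "num_base_bdevs_operational": 4
}
""")

def verify_raid_bdev_state(info, expected_state, raid_level,
                           strip_size_kb, num_operational):
    # Mirrors the comparisons the shell helper performs on the fields it
    # pulls out of the bdev_raid_get_bdevs JSON.
    assert info["state"] == expected_state
    assert info["raid_level"] == raid_level
    assert info["strip_size_kb"] == strip_size_kb
    assert info["num_base_bdevs_operational"] == num_operational

# Same expectations as `verify_raid_bdev_state Existed_Raid configuring raid1 0 4`.
verify_raid_bdev_state(raid_bdev_info, "configuring", "raid1", 0, 4)
print("Existed_Raid state check passed")
```

The design point the test exercises: with superblock support (`-s`), raid creation is allowed to precede its base bdevs, and the raid bdev only transitions out of `configuring` once all `num_base_bdevs` are discovered and claimed.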
00:22:35.732 [2024-07-15 20:36:28.024687] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:22:35.732 [2024-07-15 20:36:28.024698] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:22:35.732 20:36:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:35.732 20:36:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:35.732 20:36:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:35.732 20:36:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:35.732 20:36:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:35.732 20:36:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:35.732 20:36:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:35.732 20:36:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:35.732 20:36:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:35.732 20:36:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:35.732 20:36:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:35.732 20:36:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:35.990 20:36:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:35.990 "name": "Existed_Raid", 00:22:35.990 "uuid": "0d39474c-90b2-4ce0-b6c3-a6bc40b44a0e", 
00:22:35.990 "strip_size_kb": 0, 00:22:35.990 "state": "configuring", 00:22:35.990 "raid_level": "raid1", 00:22:35.990 "superblock": true, 00:22:35.990 "num_base_bdevs": 4, 00:22:35.990 "num_base_bdevs_discovered": 0, 00:22:35.990 "num_base_bdevs_operational": 4, 00:22:35.990 "base_bdevs_list": [ 00:22:35.990 { 00:22:35.990 "name": "BaseBdev1", 00:22:35.990 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:35.990 "is_configured": false, 00:22:35.990 "data_offset": 0, 00:22:35.990 "data_size": 0 00:22:35.990 }, 00:22:35.990 { 00:22:35.990 "name": "BaseBdev2", 00:22:35.990 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:35.990 "is_configured": false, 00:22:35.990 "data_offset": 0, 00:22:35.990 "data_size": 0 00:22:35.990 }, 00:22:35.990 { 00:22:35.990 "name": "BaseBdev3", 00:22:35.990 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:35.990 "is_configured": false, 00:22:35.990 "data_offset": 0, 00:22:35.990 "data_size": 0 00:22:35.990 }, 00:22:35.990 { 00:22:35.990 "name": "BaseBdev4", 00:22:35.990 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:35.990 "is_configured": false, 00:22:35.990 "data_offset": 0, 00:22:35.990 "data_size": 0 00:22:35.990 } 00:22:35.990 ] 00:22:35.990 }' 00:22:35.990 20:36:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:35.990 20:36:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:36.556 20:36:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:22:36.816 [2024-07-15 20:36:29.095281] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:22:36.816 [2024-07-15 20:36:29.095315] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2718aa0 name Existed_Raid, state configuring 00:22:36.816 20:36:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:22:37.073 [2024-07-15 20:36:29.384072] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:22:37.073 [2024-07-15 20:36:29.384101] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:22:37.073 [2024-07-15 20:36:29.384111] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:22:37.073 [2024-07-15 20:36:29.384122] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:22:37.073 [2024-07-15 20:36:29.384131] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:22:37.073 [2024-07-15 20:36:29.384142] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:22:37.073 [2024-07-15 20:36:29.384151] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:22:37.073 [2024-07-15 20:36:29.384161] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:22:37.073 20:36:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:22:37.331 [2024-07-15 20:36:29.638589] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:37.331 BaseBdev1 00:22:37.331 20:36:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:22:37.331 20:36:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:22:37.331 20:36:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:22:37.331 20:36:29 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:22:37.331 20:36:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:22:37.331 20:36:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:22:37.331 20:36:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:37.589 20:36:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:22:37.846 [ 00:22:37.846 { 00:22:37.846 "name": "BaseBdev1", 00:22:37.846 "aliases": [ 00:22:37.846 "e5589618-fd8c-4b04-b8c8-7e39fdab6f6b" 00:22:37.846 ], 00:22:37.846 "product_name": "Malloc disk", 00:22:37.846 "block_size": 512, 00:22:37.846 "num_blocks": 65536, 00:22:37.846 "uuid": "e5589618-fd8c-4b04-b8c8-7e39fdab6f6b", 00:22:37.846 "assigned_rate_limits": { 00:22:37.846 "rw_ios_per_sec": 0, 00:22:37.846 "rw_mbytes_per_sec": 0, 00:22:37.846 "r_mbytes_per_sec": 0, 00:22:37.846 "w_mbytes_per_sec": 0 00:22:37.846 }, 00:22:37.846 "claimed": true, 00:22:37.846 "claim_type": "exclusive_write", 00:22:37.846 "zoned": false, 00:22:37.846 "supported_io_types": { 00:22:37.846 "read": true, 00:22:37.846 "write": true, 00:22:37.846 "unmap": true, 00:22:37.846 "flush": true, 00:22:37.846 "reset": true, 00:22:37.846 "nvme_admin": false, 00:22:37.846 "nvme_io": false, 00:22:37.846 "nvme_io_md": false, 00:22:37.846 "write_zeroes": true, 00:22:37.846 "zcopy": true, 00:22:37.846 "get_zone_info": false, 00:22:37.846 "zone_management": false, 00:22:37.846 "zone_append": false, 00:22:37.846 "compare": false, 00:22:37.846 "compare_and_write": false, 00:22:37.846 "abort": true, 00:22:37.846 "seek_hole": false, 00:22:37.846 "seek_data": false, 
00:22:37.846 "copy": true, 00:22:37.846 "nvme_iov_md": false 00:22:37.847 }, 00:22:37.847 "memory_domains": [ 00:22:37.847 { 00:22:37.847 "dma_device_id": "system", 00:22:37.847 "dma_device_type": 1 00:22:37.847 }, 00:22:37.847 { 00:22:37.847 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:37.847 "dma_device_type": 2 00:22:37.847 } 00:22:37.847 ], 00:22:37.847 "driver_specific": {} 00:22:37.847 } 00:22:37.847 ] 00:22:38.104 20:36:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:22:38.104 20:36:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:38.104 20:36:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:38.104 20:36:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:38.104 20:36:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:38.104 20:36:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:38.104 20:36:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:38.104 20:36:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:38.104 20:36:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:38.104 20:36:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:38.104 20:36:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:38.104 20:36:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:38.104 20:36:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- 
# jq -r '.[] | select(.name == "Existed_Raid")' 00:22:38.363 20:36:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:38.363 "name": "Existed_Raid", 00:22:38.363 "uuid": "fec38f76-97ee-41c0-b49b-221a8e1bad36", 00:22:38.363 "strip_size_kb": 0, 00:22:38.363 "state": "configuring", 00:22:38.363 "raid_level": "raid1", 00:22:38.363 "superblock": true, 00:22:38.363 "num_base_bdevs": 4, 00:22:38.363 "num_base_bdevs_discovered": 1, 00:22:38.363 "num_base_bdevs_operational": 4, 00:22:38.363 "base_bdevs_list": [ 00:22:38.363 { 00:22:38.363 "name": "BaseBdev1", 00:22:38.363 "uuid": "e5589618-fd8c-4b04-b8c8-7e39fdab6f6b", 00:22:38.363 "is_configured": true, 00:22:38.363 "data_offset": 2048, 00:22:38.363 "data_size": 63488 00:22:38.363 }, 00:22:38.363 { 00:22:38.363 "name": "BaseBdev2", 00:22:38.363 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:38.363 "is_configured": false, 00:22:38.363 "data_offset": 0, 00:22:38.363 "data_size": 0 00:22:38.363 }, 00:22:38.363 { 00:22:38.363 "name": "BaseBdev3", 00:22:38.363 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:38.363 "is_configured": false, 00:22:38.363 "data_offset": 0, 00:22:38.363 "data_size": 0 00:22:38.363 }, 00:22:38.363 { 00:22:38.363 "name": "BaseBdev4", 00:22:38.363 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:38.363 "is_configured": false, 00:22:38.363 "data_offset": 0, 00:22:38.363 "data_size": 0 00:22:38.363 } 00:22:38.363 ] 00:22:38.363 }' 00:22:38.363 20:36:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:38.363 20:36:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:38.931 20:36:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:22:39.190 [2024-07-15 20:36:31.311097] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: 
Existed_Raid 00:22:39.190 [2024-07-15 20:36:31.311136] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2718310 name Existed_Raid, state configuring 00:22:39.190 20:36:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:22:39.190 [2024-07-15 20:36:31.559794] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:39.190 [2024-07-15 20:36:31.561239] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:22:39.190 [2024-07-15 20:36:31.561272] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:22:39.190 [2024-07-15 20:36:31.561282] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:22:39.190 [2024-07-15 20:36:31.561295] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:22:39.190 [2024-07-15 20:36:31.561304] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:22:39.190 [2024-07-15 20:36:31.561315] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:22:39.449 20:36:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:22:39.449 20:36:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:22:39.450 20:36:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:39.450 20:36:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:39.450 20:36:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:39.450 20:36:31 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:39.450 20:36:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:39.450 20:36:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:39.450 20:36:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:39.450 20:36:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:39.450 20:36:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:39.450 20:36:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:39.450 20:36:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:39.450 20:36:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:39.709 20:36:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:39.709 "name": "Existed_Raid", 00:22:39.709 "uuid": "a9481c4d-5090-4cae-873b-ad2d8ad05588", 00:22:39.709 "strip_size_kb": 0, 00:22:39.709 "state": "configuring", 00:22:39.709 "raid_level": "raid1", 00:22:39.709 "superblock": true, 00:22:39.709 "num_base_bdevs": 4, 00:22:39.709 "num_base_bdevs_discovered": 1, 00:22:39.709 "num_base_bdevs_operational": 4, 00:22:39.709 "base_bdevs_list": [ 00:22:39.709 { 00:22:39.709 "name": "BaseBdev1", 00:22:39.709 "uuid": "e5589618-fd8c-4b04-b8c8-7e39fdab6f6b", 00:22:39.709 "is_configured": true, 00:22:39.709 "data_offset": 2048, 00:22:39.709 "data_size": 63488 00:22:39.709 }, 00:22:39.709 { 00:22:39.709 "name": "BaseBdev2", 00:22:39.709 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:39.709 "is_configured": false, 
00:22:39.709 "data_offset": 0, 00:22:39.709 "data_size": 0 00:22:39.709 }, 00:22:39.709 { 00:22:39.709 "name": "BaseBdev3", 00:22:39.709 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:39.709 "is_configured": false, 00:22:39.709 "data_offset": 0, 00:22:39.709 "data_size": 0 00:22:39.709 }, 00:22:39.709 { 00:22:39.709 "name": "BaseBdev4", 00:22:39.709 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:39.709 "is_configured": false, 00:22:39.709 "data_offset": 0, 00:22:39.709 "data_size": 0 00:22:39.709 } 00:22:39.709 ] 00:22:39.709 }' 00:22:39.709 20:36:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:39.709 20:36:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:40.279 20:36:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:22:40.538 [2024-07-15 20:36:32.666169] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:40.538 BaseBdev2 00:22:40.538 20:36:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:22:40.538 20:36:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:22:40.538 20:36:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:22:40.538 20:36:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:22:40.538 20:36:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:22:40.538 20:36:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:22:40.538 20:36:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_wait_for_examine 00:22:40.798 20:36:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:22:40.798 [ 00:22:40.798 { 00:22:40.798 "name": "BaseBdev2", 00:22:40.798 "aliases": [ 00:22:40.798 "22ea0ec8-ab6a-45dd-b7ed-7a4d60a11a2a" 00:22:40.798 ], 00:22:40.798 "product_name": "Malloc disk", 00:22:40.798 "block_size": 512, 00:22:40.798 "num_blocks": 65536, 00:22:40.798 "uuid": "22ea0ec8-ab6a-45dd-b7ed-7a4d60a11a2a", 00:22:40.798 "assigned_rate_limits": { 00:22:40.798 "rw_ios_per_sec": 0, 00:22:40.798 "rw_mbytes_per_sec": 0, 00:22:40.798 "r_mbytes_per_sec": 0, 00:22:40.798 "w_mbytes_per_sec": 0 00:22:40.798 }, 00:22:40.798 "claimed": true, 00:22:40.798 "claim_type": "exclusive_write", 00:22:40.798 "zoned": false, 00:22:40.798 "supported_io_types": { 00:22:40.798 "read": true, 00:22:40.798 "write": true, 00:22:40.798 "unmap": true, 00:22:40.798 "flush": true, 00:22:40.798 "reset": true, 00:22:40.798 "nvme_admin": false, 00:22:40.798 "nvme_io": false, 00:22:40.798 "nvme_io_md": false, 00:22:40.798 "write_zeroes": true, 00:22:40.798 "zcopy": true, 00:22:40.798 "get_zone_info": false, 00:22:40.798 "zone_management": false, 00:22:40.798 "zone_append": false, 00:22:40.798 "compare": false, 00:22:40.798 "compare_and_write": false, 00:22:40.798 "abort": true, 00:22:40.798 "seek_hole": false, 00:22:40.798 "seek_data": false, 00:22:40.798 "copy": true, 00:22:40.798 "nvme_iov_md": false 00:22:40.798 }, 00:22:40.798 "memory_domains": [ 00:22:40.798 { 00:22:40.798 "dma_device_id": "system", 00:22:40.798 "dma_device_type": 1 00:22:40.798 }, 00:22:40.798 { 00:22:40.798 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:40.798 "dma_device_type": 2 00:22:40.798 } 00:22:40.798 ], 00:22:40.798 "driver_specific": {} 00:22:40.798 } 00:22:40.798 ] 00:22:41.058 20:36:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 
-- # return 0 00:22:41.058 20:36:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:22:41.058 20:36:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:22:41.058 20:36:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:41.058 20:36:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:41.058 20:36:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:41.058 20:36:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:41.058 20:36:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:41.058 20:36:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:41.058 20:36:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:41.058 20:36:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:41.058 20:36:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:41.058 20:36:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:41.058 20:36:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:41.058 20:36:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:41.319 20:36:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:41.319 "name": "Existed_Raid", 00:22:41.319 "uuid": "a9481c4d-5090-4cae-873b-ad2d8ad05588", 00:22:41.319 "strip_size_kb": 0, 
00:22:41.319 "state": "configuring", 00:22:41.319 "raid_level": "raid1", 00:22:41.319 "superblock": true, 00:22:41.319 "num_base_bdevs": 4, 00:22:41.319 "num_base_bdevs_discovered": 2, 00:22:41.319 "num_base_bdevs_operational": 4, 00:22:41.319 "base_bdevs_list": [ 00:22:41.319 { 00:22:41.319 "name": "BaseBdev1", 00:22:41.319 "uuid": "e5589618-fd8c-4b04-b8c8-7e39fdab6f6b", 00:22:41.319 "is_configured": true, 00:22:41.319 "data_offset": 2048, 00:22:41.319 "data_size": 63488 00:22:41.319 }, 00:22:41.319 { 00:22:41.319 "name": "BaseBdev2", 00:22:41.319 "uuid": "22ea0ec8-ab6a-45dd-b7ed-7a4d60a11a2a", 00:22:41.319 "is_configured": true, 00:22:41.319 "data_offset": 2048, 00:22:41.319 "data_size": 63488 00:22:41.319 }, 00:22:41.319 { 00:22:41.319 "name": "BaseBdev3", 00:22:41.319 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:41.319 "is_configured": false, 00:22:41.319 "data_offset": 0, 00:22:41.319 "data_size": 0 00:22:41.319 }, 00:22:41.319 { 00:22:41.319 "name": "BaseBdev4", 00:22:41.319 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:41.319 "is_configured": false, 00:22:41.319 "data_offset": 0, 00:22:41.319 "data_size": 0 00:22:41.319 } 00:22:41.319 ] 00:22:41.319 }' 00:22:41.319 20:36:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:41.319 20:36:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:41.885 20:36:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:22:41.885 [2024-07-15 20:36:34.249786] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:22:41.885 BaseBdev3 00:22:42.145 20:36:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:22:42.145 20:36:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local 
bdev_name=BaseBdev3 00:22:42.145 20:36:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:22:42.145 20:36:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:22:42.145 20:36:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:22:42.145 20:36:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:22:42.145 20:36:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:42.145 20:36:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:22:42.405 [ 00:22:42.405 { 00:22:42.405 "name": "BaseBdev3", 00:22:42.405 "aliases": [ 00:22:42.405 "8f0b87aa-3f17-410f-9f86-1019d2009a46" 00:22:42.405 ], 00:22:42.405 "product_name": "Malloc disk", 00:22:42.405 "block_size": 512, 00:22:42.405 "num_blocks": 65536, 00:22:42.405 "uuid": "8f0b87aa-3f17-410f-9f86-1019d2009a46", 00:22:42.405 "assigned_rate_limits": { 00:22:42.405 "rw_ios_per_sec": 0, 00:22:42.405 "rw_mbytes_per_sec": 0, 00:22:42.405 "r_mbytes_per_sec": 0, 00:22:42.405 "w_mbytes_per_sec": 0 00:22:42.405 }, 00:22:42.405 "claimed": true, 00:22:42.405 "claim_type": "exclusive_write", 00:22:42.405 "zoned": false, 00:22:42.405 "supported_io_types": { 00:22:42.405 "read": true, 00:22:42.405 "write": true, 00:22:42.405 "unmap": true, 00:22:42.405 "flush": true, 00:22:42.405 "reset": true, 00:22:42.405 "nvme_admin": false, 00:22:42.405 "nvme_io": false, 00:22:42.405 "nvme_io_md": false, 00:22:42.405 "write_zeroes": true, 00:22:42.405 "zcopy": true, 00:22:42.405 "get_zone_info": false, 00:22:42.405 "zone_management": false, 00:22:42.405 "zone_append": false, 00:22:42.405 
"compare": false, 00:22:42.405 "compare_and_write": false, 00:22:42.405 "abort": true, 00:22:42.405 "seek_hole": false, 00:22:42.405 "seek_data": false, 00:22:42.405 "copy": true, 00:22:42.405 "nvme_iov_md": false 00:22:42.405 }, 00:22:42.405 "memory_domains": [ 00:22:42.405 { 00:22:42.405 "dma_device_id": "system", 00:22:42.405 "dma_device_type": 1 00:22:42.405 }, 00:22:42.405 { 00:22:42.405 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:42.405 "dma_device_type": 2 00:22:42.405 } 00:22:42.405 ], 00:22:42.405 "driver_specific": {} 00:22:42.405 } 00:22:42.405 ] 00:22:42.405 20:36:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:22:42.405 20:36:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:22:42.405 20:36:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:22:42.405 20:36:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:42.405 20:36:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:42.405 20:36:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:42.405 20:36:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:42.405 20:36:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:42.405 20:36:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:42.405 20:36:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:42.405 20:36:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:42.405 20:36:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:42.405 20:36:34 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:42.405 20:36:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:42.405 20:36:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:42.665 20:36:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:42.665 "name": "Existed_Raid", 00:22:42.665 "uuid": "a9481c4d-5090-4cae-873b-ad2d8ad05588", 00:22:42.665 "strip_size_kb": 0, 00:22:42.665 "state": "configuring", 00:22:42.665 "raid_level": "raid1", 00:22:42.665 "superblock": true, 00:22:42.665 "num_base_bdevs": 4, 00:22:42.665 "num_base_bdevs_discovered": 3, 00:22:42.665 "num_base_bdevs_operational": 4, 00:22:42.665 "base_bdevs_list": [ 00:22:42.665 { 00:22:42.665 "name": "BaseBdev1", 00:22:42.665 "uuid": "e5589618-fd8c-4b04-b8c8-7e39fdab6f6b", 00:22:42.665 "is_configured": true, 00:22:42.665 "data_offset": 2048, 00:22:42.665 "data_size": 63488 00:22:42.665 }, 00:22:42.665 { 00:22:42.665 "name": "BaseBdev2", 00:22:42.665 "uuid": "22ea0ec8-ab6a-45dd-b7ed-7a4d60a11a2a", 00:22:42.665 "is_configured": true, 00:22:42.665 "data_offset": 2048, 00:22:42.665 "data_size": 63488 00:22:42.665 }, 00:22:42.665 { 00:22:42.665 "name": "BaseBdev3", 00:22:42.665 "uuid": "8f0b87aa-3f17-410f-9f86-1019d2009a46", 00:22:42.665 "is_configured": true, 00:22:42.665 "data_offset": 2048, 00:22:42.665 "data_size": 63488 00:22:42.665 }, 00:22:42.665 { 00:22:42.665 "name": "BaseBdev4", 00:22:42.665 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:42.665 "is_configured": false, 00:22:42.665 "data_offset": 0, 00:22:42.665 "data_size": 0 00:22:42.665 } 00:22:42.665 ] 00:22:42.665 }' 00:22:42.665 20:36:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:42.665 20:36:34 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:43.233 20:36:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:22:43.492 [2024-07-15 20:36:35.801301] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:22:43.492 [2024-07-15 20:36:35.801475] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2719350 00:22:43.492 [2024-07-15 20:36:35.801489] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:22:43.492 [2024-07-15 20:36:35.801669] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2719020 00:22:43.492 [2024-07-15 20:36:35.801793] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2719350 00:22:43.492 [2024-07-15 20:36:35.801803] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x2719350 00:22:43.492 [2024-07-15 20:36:35.801895] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:43.492 BaseBdev4 00:22:43.492 20:36:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:22:43.492 20:36:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:22:43.492 20:36:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:22:43.492 20:36:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:22:43.492 20:36:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:22:43.492 20:36:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:22:43.492 20:36:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:43.751 20:36:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:22:44.011 [ 00:22:44.011 { 00:22:44.011 "name": "BaseBdev4", 00:22:44.011 "aliases": [ 00:22:44.011 "78677ebf-4560-47bd-9cd3-100d911c2d5d" 00:22:44.011 ], 00:22:44.011 "product_name": "Malloc disk", 00:22:44.011 "block_size": 512, 00:22:44.011 "num_blocks": 65536, 00:22:44.011 "uuid": "78677ebf-4560-47bd-9cd3-100d911c2d5d", 00:22:44.011 "assigned_rate_limits": { 00:22:44.011 "rw_ios_per_sec": 0, 00:22:44.011 "rw_mbytes_per_sec": 0, 00:22:44.011 "r_mbytes_per_sec": 0, 00:22:44.011 "w_mbytes_per_sec": 0 00:22:44.011 }, 00:22:44.011 "claimed": true, 00:22:44.011 "claim_type": "exclusive_write", 00:22:44.011 "zoned": false, 00:22:44.011 "supported_io_types": { 00:22:44.011 "read": true, 00:22:44.011 "write": true, 00:22:44.011 "unmap": true, 00:22:44.011 "flush": true, 00:22:44.011 "reset": true, 00:22:44.011 "nvme_admin": false, 00:22:44.011 "nvme_io": false, 00:22:44.011 "nvme_io_md": false, 00:22:44.011 "write_zeroes": true, 00:22:44.011 "zcopy": true, 00:22:44.011 "get_zone_info": false, 00:22:44.011 "zone_management": false, 00:22:44.011 "zone_append": false, 00:22:44.011 "compare": false, 00:22:44.011 "compare_and_write": false, 00:22:44.011 "abort": true, 00:22:44.011 "seek_hole": false, 00:22:44.011 "seek_data": false, 00:22:44.011 "copy": true, 00:22:44.011 "nvme_iov_md": false 00:22:44.011 }, 00:22:44.011 "memory_domains": [ 00:22:44.011 { 00:22:44.011 "dma_device_id": "system", 00:22:44.011 "dma_device_type": 1 00:22:44.011 }, 00:22:44.011 { 00:22:44.011 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:44.011 "dma_device_type": 2 00:22:44.011 } 00:22:44.011 ], 00:22:44.011 "driver_specific": {} 00:22:44.011 } 00:22:44.011 ] 
00:22:44.011 20:36:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:22:44.011 20:36:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:22:44.011 20:36:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:22:44.011 20:36:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:22:44.011 20:36:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:44.011 20:36:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:44.011 20:36:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:44.011 20:36:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:44.011 20:36:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:44.011 20:36:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:44.011 20:36:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:44.011 20:36:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:44.011 20:36:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:44.011 20:36:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:44.011 20:36:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:44.270 20:36:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:44.270 "name": "Existed_Raid", 00:22:44.270 
"uuid": "a9481c4d-5090-4cae-873b-ad2d8ad05588", 00:22:44.270 "strip_size_kb": 0, 00:22:44.270 "state": "online", 00:22:44.270 "raid_level": "raid1", 00:22:44.270 "superblock": true, 00:22:44.270 "num_base_bdevs": 4, 00:22:44.270 "num_base_bdevs_discovered": 4, 00:22:44.270 "num_base_bdevs_operational": 4, 00:22:44.270 "base_bdevs_list": [ 00:22:44.270 { 00:22:44.270 "name": "BaseBdev1", 00:22:44.270 "uuid": "e5589618-fd8c-4b04-b8c8-7e39fdab6f6b", 00:22:44.270 "is_configured": true, 00:22:44.270 "data_offset": 2048, 00:22:44.270 "data_size": 63488 00:22:44.270 }, 00:22:44.270 { 00:22:44.270 "name": "BaseBdev2", 00:22:44.270 "uuid": "22ea0ec8-ab6a-45dd-b7ed-7a4d60a11a2a", 00:22:44.270 "is_configured": true, 00:22:44.270 "data_offset": 2048, 00:22:44.270 "data_size": 63488 00:22:44.270 }, 00:22:44.270 { 00:22:44.270 "name": "BaseBdev3", 00:22:44.270 "uuid": "8f0b87aa-3f17-410f-9f86-1019d2009a46", 00:22:44.270 "is_configured": true, 00:22:44.270 "data_offset": 2048, 00:22:44.270 "data_size": 63488 00:22:44.270 }, 00:22:44.270 { 00:22:44.270 "name": "BaseBdev4", 00:22:44.270 "uuid": "78677ebf-4560-47bd-9cd3-100d911c2d5d", 00:22:44.270 "is_configured": true, 00:22:44.270 "data_offset": 2048, 00:22:44.270 "data_size": 63488 00:22:44.270 } 00:22:44.270 ] 00:22:44.270 }' 00:22:44.270 20:36:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:44.270 20:36:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:44.837 20:36:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:22:44.837 20:36:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:22:44.837 20:36:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:22:44.837 20:36:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:22:44.837 20:36:37 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:22:44.837 20:36:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:22:44.837 20:36:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:22:44.837 20:36:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:22:45.095 [2024-07-15 20:36:37.281579] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:45.095 20:36:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:22:45.095 "name": "Existed_Raid", 00:22:45.095 "aliases": [ 00:22:45.095 "a9481c4d-5090-4cae-873b-ad2d8ad05588" 00:22:45.095 ], 00:22:45.095 "product_name": "Raid Volume", 00:22:45.095 "block_size": 512, 00:22:45.095 "num_blocks": 63488, 00:22:45.095 "uuid": "a9481c4d-5090-4cae-873b-ad2d8ad05588", 00:22:45.095 "assigned_rate_limits": { 00:22:45.095 "rw_ios_per_sec": 0, 00:22:45.095 "rw_mbytes_per_sec": 0, 00:22:45.095 "r_mbytes_per_sec": 0, 00:22:45.095 "w_mbytes_per_sec": 0 00:22:45.095 }, 00:22:45.095 "claimed": false, 00:22:45.095 "zoned": false, 00:22:45.095 "supported_io_types": { 00:22:45.095 "read": true, 00:22:45.095 "write": true, 00:22:45.095 "unmap": false, 00:22:45.095 "flush": false, 00:22:45.095 "reset": true, 00:22:45.095 "nvme_admin": false, 00:22:45.095 "nvme_io": false, 00:22:45.095 "nvme_io_md": false, 00:22:45.095 "write_zeroes": true, 00:22:45.095 "zcopy": false, 00:22:45.095 "get_zone_info": false, 00:22:45.095 "zone_management": false, 00:22:45.095 "zone_append": false, 00:22:45.095 "compare": false, 00:22:45.095 "compare_and_write": false, 00:22:45.095 "abort": false, 00:22:45.095 "seek_hole": false, 00:22:45.095 "seek_data": false, 00:22:45.095 "copy": false, 00:22:45.095 "nvme_iov_md": false 00:22:45.095 }, 00:22:45.095 
"memory_domains": [ 00:22:45.095 { 00:22:45.095 "dma_device_id": "system", 00:22:45.095 "dma_device_type": 1 00:22:45.095 }, 00:22:45.095 { 00:22:45.095 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:45.095 "dma_device_type": 2 00:22:45.095 }, 00:22:45.095 { 00:22:45.095 "dma_device_id": "system", 00:22:45.095 "dma_device_type": 1 00:22:45.095 }, 00:22:45.095 { 00:22:45.095 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:45.095 "dma_device_type": 2 00:22:45.095 }, 00:22:45.095 { 00:22:45.095 "dma_device_id": "system", 00:22:45.095 "dma_device_type": 1 00:22:45.095 }, 00:22:45.095 { 00:22:45.095 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:45.095 "dma_device_type": 2 00:22:45.095 }, 00:22:45.095 { 00:22:45.095 "dma_device_id": "system", 00:22:45.095 "dma_device_type": 1 00:22:45.095 }, 00:22:45.095 { 00:22:45.095 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:45.095 "dma_device_type": 2 00:22:45.095 } 00:22:45.095 ], 00:22:45.095 "driver_specific": { 00:22:45.095 "raid": { 00:22:45.095 "uuid": "a9481c4d-5090-4cae-873b-ad2d8ad05588", 00:22:45.095 "strip_size_kb": 0, 00:22:45.095 "state": "online", 00:22:45.095 "raid_level": "raid1", 00:22:45.095 "superblock": true, 00:22:45.095 "num_base_bdevs": 4, 00:22:45.095 "num_base_bdevs_discovered": 4, 00:22:45.095 "num_base_bdevs_operational": 4, 00:22:45.095 "base_bdevs_list": [ 00:22:45.095 { 00:22:45.095 "name": "BaseBdev1", 00:22:45.095 "uuid": "e5589618-fd8c-4b04-b8c8-7e39fdab6f6b", 00:22:45.095 "is_configured": true, 00:22:45.095 "data_offset": 2048, 00:22:45.095 "data_size": 63488 00:22:45.095 }, 00:22:45.095 { 00:22:45.095 "name": "BaseBdev2", 00:22:45.095 "uuid": "22ea0ec8-ab6a-45dd-b7ed-7a4d60a11a2a", 00:22:45.095 "is_configured": true, 00:22:45.095 "data_offset": 2048, 00:22:45.095 "data_size": 63488 00:22:45.095 }, 00:22:45.095 { 00:22:45.095 "name": "BaseBdev3", 00:22:45.095 "uuid": "8f0b87aa-3f17-410f-9f86-1019d2009a46", 00:22:45.095 "is_configured": true, 00:22:45.095 "data_offset": 2048, 00:22:45.095 
"data_size": 63488 00:22:45.095 }, 00:22:45.095 { 00:22:45.095 "name": "BaseBdev4", 00:22:45.095 "uuid": "78677ebf-4560-47bd-9cd3-100d911c2d5d", 00:22:45.095 "is_configured": true, 00:22:45.095 "data_offset": 2048, 00:22:45.095 "data_size": 63488 00:22:45.095 } 00:22:45.095 ] 00:22:45.095 } 00:22:45.095 } 00:22:45.095 }' 00:22:45.095 20:36:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:22:45.095 20:36:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:22:45.095 BaseBdev2 00:22:45.095 BaseBdev3 00:22:45.095 BaseBdev4' 00:22:45.095 20:36:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:45.095 20:36:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:22:45.095 20:36:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:45.353 20:36:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:45.353 "name": "BaseBdev1", 00:22:45.353 "aliases": [ 00:22:45.353 "e5589618-fd8c-4b04-b8c8-7e39fdab6f6b" 00:22:45.353 ], 00:22:45.353 "product_name": "Malloc disk", 00:22:45.353 "block_size": 512, 00:22:45.353 "num_blocks": 65536, 00:22:45.353 "uuid": "e5589618-fd8c-4b04-b8c8-7e39fdab6f6b", 00:22:45.353 "assigned_rate_limits": { 00:22:45.353 "rw_ios_per_sec": 0, 00:22:45.353 "rw_mbytes_per_sec": 0, 00:22:45.353 "r_mbytes_per_sec": 0, 00:22:45.353 "w_mbytes_per_sec": 0 00:22:45.353 }, 00:22:45.353 "claimed": true, 00:22:45.353 "claim_type": "exclusive_write", 00:22:45.353 "zoned": false, 00:22:45.353 "supported_io_types": { 00:22:45.353 "read": true, 00:22:45.353 "write": true, 00:22:45.353 "unmap": true, 00:22:45.353 "flush": true, 00:22:45.353 "reset": true, 
00:22:45.353 "nvme_admin": false, 00:22:45.353 "nvme_io": false, 00:22:45.353 "nvme_io_md": false, 00:22:45.353 "write_zeroes": true, 00:22:45.353 "zcopy": true, 00:22:45.353 "get_zone_info": false, 00:22:45.353 "zone_management": false, 00:22:45.353 "zone_append": false, 00:22:45.353 "compare": false, 00:22:45.353 "compare_and_write": false, 00:22:45.353 "abort": true, 00:22:45.353 "seek_hole": false, 00:22:45.353 "seek_data": false, 00:22:45.353 "copy": true, 00:22:45.353 "nvme_iov_md": false 00:22:45.353 }, 00:22:45.353 "memory_domains": [ 00:22:45.353 { 00:22:45.353 "dma_device_id": "system", 00:22:45.353 "dma_device_type": 1 00:22:45.353 }, 00:22:45.353 { 00:22:45.353 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:45.353 "dma_device_type": 2 00:22:45.353 } 00:22:45.353 ], 00:22:45.353 "driver_specific": {} 00:22:45.353 }' 00:22:45.353 20:36:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:45.353 20:36:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:45.353 20:36:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:45.353 20:36:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:45.612 20:36:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:45.612 20:36:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:45.612 20:36:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:45.612 20:36:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:45.612 20:36:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:45.612 20:36:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:45.612 20:36:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:22:45.612 20:36:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:45.612 20:36:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:45.612 20:36:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:22:45.612 20:36:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:45.870 20:36:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:45.870 "name": "BaseBdev2", 00:22:45.870 "aliases": [ 00:22:45.870 "22ea0ec8-ab6a-45dd-b7ed-7a4d60a11a2a" 00:22:45.870 ], 00:22:45.870 "product_name": "Malloc disk", 00:22:45.870 "block_size": 512, 00:22:45.870 "num_blocks": 65536, 00:22:45.870 "uuid": "22ea0ec8-ab6a-45dd-b7ed-7a4d60a11a2a", 00:22:45.870 "assigned_rate_limits": { 00:22:45.870 "rw_ios_per_sec": 0, 00:22:45.870 "rw_mbytes_per_sec": 0, 00:22:45.870 "r_mbytes_per_sec": 0, 00:22:45.870 "w_mbytes_per_sec": 0 00:22:45.870 }, 00:22:45.870 "claimed": true, 00:22:45.870 "claim_type": "exclusive_write", 00:22:45.870 "zoned": false, 00:22:45.870 "supported_io_types": { 00:22:45.870 "read": true, 00:22:45.870 "write": true, 00:22:45.870 "unmap": true, 00:22:45.870 "flush": true, 00:22:45.870 "reset": true, 00:22:45.870 "nvme_admin": false, 00:22:45.870 "nvme_io": false, 00:22:45.870 "nvme_io_md": false, 00:22:45.870 "write_zeroes": true, 00:22:45.870 "zcopy": true, 00:22:45.870 "get_zone_info": false, 00:22:45.870 "zone_management": false, 00:22:45.870 "zone_append": false, 00:22:45.870 "compare": false, 00:22:45.870 "compare_and_write": false, 00:22:45.870 "abort": true, 00:22:45.870 "seek_hole": false, 00:22:45.870 "seek_data": false, 00:22:45.870 "copy": true, 00:22:45.870 "nvme_iov_md": false 00:22:45.870 }, 00:22:45.870 "memory_domains": [ 00:22:45.870 { 
00:22:45.871 "dma_device_id": "system", 00:22:45.871 "dma_device_type": 1 00:22:45.871 }, 00:22:45.871 { 00:22:45.871 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:45.871 "dma_device_type": 2 00:22:45.871 } 00:22:45.871 ], 00:22:45.871 "driver_specific": {} 00:22:45.871 }' 00:22:45.871 20:36:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:46.129 20:36:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:46.129 20:36:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:46.129 20:36:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:46.129 20:36:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:46.129 20:36:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:46.129 20:36:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:46.129 20:36:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:46.129 20:36:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:46.129 20:36:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:46.388 20:36:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:46.388 20:36:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:46.388 20:36:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:46.388 20:36:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:22:46.388 20:36:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:46.647 20:36:38 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:46.647 "name": "BaseBdev3", 00:22:46.647 "aliases": [ 00:22:46.647 "8f0b87aa-3f17-410f-9f86-1019d2009a46" 00:22:46.647 ], 00:22:46.647 "product_name": "Malloc disk", 00:22:46.647 "block_size": 512, 00:22:46.647 "num_blocks": 65536, 00:22:46.647 "uuid": "8f0b87aa-3f17-410f-9f86-1019d2009a46", 00:22:46.647 "assigned_rate_limits": { 00:22:46.647 "rw_ios_per_sec": 0, 00:22:46.647 "rw_mbytes_per_sec": 0, 00:22:46.647 "r_mbytes_per_sec": 0, 00:22:46.647 "w_mbytes_per_sec": 0 00:22:46.647 }, 00:22:46.647 "claimed": true, 00:22:46.647 "claim_type": "exclusive_write", 00:22:46.647 "zoned": false, 00:22:46.647 "supported_io_types": { 00:22:46.647 "read": true, 00:22:46.647 "write": true, 00:22:46.647 "unmap": true, 00:22:46.647 "flush": true, 00:22:46.647 "reset": true, 00:22:46.647 "nvme_admin": false, 00:22:46.647 "nvme_io": false, 00:22:46.647 "nvme_io_md": false, 00:22:46.647 "write_zeroes": true, 00:22:46.647 "zcopy": true, 00:22:46.647 "get_zone_info": false, 00:22:46.647 "zone_management": false, 00:22:46.647 "zone_append": false, 00:22:46.647 "compare": false, 00:22:46.647 "compare_and_write": false, 00:22:46.647 "abort": true, 00:22:46.647 "seek_hole": false, 00:22:46.647 "seek_data": false, 00:22:46.647 "copy": true, 00:22:46.647 "nvme_iov_md": false 00:22:46.647 }, 00:22:46.647 "memory_domains": [ 00:22:46.647 { 00:22:46.647 "dma_device_id": "system", 00:22:46.647 "dma_device_type": 1 00:22:46.647 }, 00:22:46.647 { 00:22:46.647 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:46.647 "dma_device_type": 2 00:22:46.647 } 00:22:46.647 ], 00:22:46.647 "driver_specific": {} 00:22:46.647 }' 00:22:46.647 20:36:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:46.647 20:36:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:46.647 20:36:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 
-- # [[ 512 == 512 ]] 00:22:46.647 20:36:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:46.647 20:36:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:46.647 20:36:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:46.647 20:36:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:46.906 20:36:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:46.906 20:36:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:46.906 20:36:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:46.906 20:36:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:46.906 20:36:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:46.906 20:36:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:46.906 20:36:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:22:46.906 20:36:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:47.165 20:36:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:47.165 "name": "BaseBdev4", 00:22:47.165 "aliases": [ 00:22:47.165 "78677ebf-4560-47bd-9cd3-100d911c2d5d" 00:22:47.165 ], 00:22:47.165 "product_name": "Malloc disk", 00:22:47.165 "block_size": 512, 00:22:47.165 "num_blocks": 65536, 00:22:47.165 "uuid": "78677ebf-4560-47bd-9cd3-100d911c2d5d", 00:22:47.165 "assigned_rate_limits": { 00:22:47.165 "rw_ios_per_sec": 0, 00:22:47.165 "rw_mbytes_per_sec": 0, 00:22:47.165 "r_mbytes_per_sec": 0, 00:22:47.165 "w_mbytes_per_sec": 0 
00:22:47.165 }, 00:22:47.165 "claimed": true, 00:22:47.165 "claim_type": "exclusive_write", 00:22:47.165 "zoned": false, 00:22:47.165 "supported_io_types": { 00:22:47.165 "read": true, 00:22:47.165 "write": true, 00:22:47.165 "unmap": true, 00:22:47.165 "flush": true, 00:22:47.165 "reset": true, 00:22:47.165 "nvme_admin": false, 00:22:47.165 "nvme_io": false, 00:22:47.165 "nvme_io_md": false, 00:22:47.165 "write_zeroes": true, 00:22:47.165 "zcopy": true, 00:22:47.165 "get_zone_info": false, 00:22:47.165 "zone_management": false, 00:22:47.165 "zone_append": false, 00:22:47.165 "compare": false, 00:22:47.165 "compare_and_write": false, 00:22:47.165 "abort": true, 00:22:47.165 "seek_hole": false, 00:22:47.165 "seek_data": false, 00:22:47.165 "copy": true, 00:22:47.165 "nvme_iov_md": false 00:22:47.165 }, 00:22:47.165 "memory_domains": [ 00:22:47.165 { 00:22:47.165 "dma_device_id": "system", 00:22:47.165 "dma_device_type": 1 00:22:47.165 }, 00:22:47.165 { 00:22:47.165 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:47.165 "dma_device_type": 2 00:22:47.165 } 00:22:47.165 ], 00:22:47.165 "driver_specific": {} 00:22:47.165 }' 00:22:47.165 20:36:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:47.165 20:36:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:47.165 20:36:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:47.165 20:36:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:47.423 20:36:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:47.423 20:36:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:47.423 20:36:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:47.423 20:36:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:47.423 
20:36:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:47.423 20:36:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:47.423 20:36:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:47.682 20:36:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:47.682 20:36:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:22:47.941 [2024-07-15 20:36:40.064739] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:22:47.941 20:36:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:22:47.941 20:36:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:22:47.941 20:36:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:22:47.941 20:36:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:22:47.941 20:36:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:22:47.941 20:36:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:22:47.941 20:36:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:47.941 20:36:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:47.941 20:36:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:47.941 20:36:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:47.941 20:36:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 
00:22:47.941 20:36:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:47.941 20:36:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:47.941 20:36:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:47.941 20:36:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:47.941 20:36:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:47.941 20:36:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:48.200 20:36:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:48.200 "name": "Existed_Raid", 00:22:48.200 "uuid": "a9481c4d-5090-4cae-873b-ad2d8ad05588", 00:22:48.200 "strip_size_kb": 0, 00:22:48.200 "state": "online", 00:22:48.200 "raid_level": "raid1", 00:22:48.200 "superblock": true, 00:22:48.200 "num_base_bdevs": 4, 00:22:48.200 "num_base_bdevs_discovered": 3, 00:22:48.200 "num_base_bdevs_operational": 3, 00:22:48.200 "base_bdevs_list": [ 00:22:48.200 { 00:22:48.200 "name": null, 00:22:48.200 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:48.200 "is_configured": false, 00:22:48.200 "data_offset": 2048, 00:22:48.200 "data_size": 63488 00:22:48.200 }, 00:22:48.200 { 00:22:48.200 "name": "BaseBdev2", 00:22:48.200 "uuid": "22ea0ec8-ab6a-45dd-b7ed-7a4d60a11a2a", 00:22:48.200 "is_configured": true, 00:22:48.200 "data_offset": 2048, 00:22:48.200 "data_size": 63488 00:22:48.200 }, 00:22:48.200 { 00:22:48.200 "name": "BaseBdev3", 00:22:48.200 "uuid": "8f0b87aa-3f17-410f-9f86-1019d2009a46", 00:22:48.200 "is_configured": true, 00:22:48.200 "data_offset": 2048, 00:22:48.200 "data_size": 63488 00:22:48.200 }, 00:22:48.200 { 00:22:48.200 "name": 
"BaseBdev4", 00:22:48.200 "uuid": "78677ebf-4560-47bd-9cd3-100d911c2d5d", 00:22:48.200 "is_configured": true, 00:22:48.200 "data_offset": 2048, 00:22:48.200 "data_size": 63488 00:22:48.200 } 00:22:48.200 ] 00:22:48.200 }' 00:22:48.200 20:36:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:48.200 20:36:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:48.767 20:36:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:22:48.767 20:36:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:22:48.767 20:36:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:48.767 20:36:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:22:49.026 20:36:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:22:49.026 20:36:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:22:49.026 20:36:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:22:49.285 [2024-07-15 20:36:41.526472] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:22:49.285 20:36:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:22:49.285 20:36:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:22:49.285 20:36:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:49.285 20:36:41 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:22:49.543 20:36:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:22:49.543 20:36:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:22:49.543 20:36:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:22:49.543 [2024-07-15 20:36:41.889249] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:22:49.802 20:36:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:22:49.802 20:36:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:22:49.802 20:36:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:49.802 20:36:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:22:49.802 20:36:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:22:49.802 20:36:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:22:49.802 20:36:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:22:50.061 [2024-07-15 20:36:42.328018] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:22:50.061 [2024-07-15 20:36:42.328116] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:50.061 [2024-07-15 20:36:42.348063] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:50.061 [2024-07-15 20:36:42.348093] bdev_raid.c: 
451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:50.061 [2024-07-15 20:36:42.348104] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2719350 name Existed_Raid, state offline 00:22:50.061 20:36:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:22:50.061 20:36:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:22:50.061 20:36:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:50.061 20:36:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:22:50.319 20:36:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:22:50.319 20:36:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:22:50.319 20:36:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:22:50.319 20:36:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:22:50.319 20:36:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:22:50.319 20:36:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:22:50.578 BaseBdev2 00:22:50.579 20:36:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:22:50.579 20:36:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:22:50.579 20:36:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:22:50.579 20:36:42 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@899 -- # local i 00:22:50.579 20:36:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:22:50.579 20:36:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:22:50.579 20:36:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:50.579 20:36:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:22:50.837 [ 00:22:50.837 { 00:22:50.837 "name": "BaseBdev2", 00:22:50.837 "aliases": [ 00:22:50.837 "483f2349-ea09-41ae-a0fe-9f5f21d7a6f5" 00:22:50.837 ], 00:22:50.837 "product_name": "Malloc disk", 00:22:50.837 "block_size": 512, 00:22:50.837 "num_blocks": 65536, 00:22:50.837 "uuid": "483f2349-ea09-41ae-a0fe-9f5f21d7a6f5", 00:22:50.837 "assigned_rate_limits": { 00:22:50.837 "rw_ios_per_sec": 0, 00:22:50.837 "rw_mbytes_per_sec": 0, 00:22:50.837 "r_mbytes_per_sec": 0, 00:22:50.837 "w_mbytes_per_sec": 0 00:22:50.837 }, 00:22:50.837 "claimed": false, 00:22:50.837 "zoned": false, 00:22:50.837 "supported_io_types": { 00:22:50.837 "read": true, 00:22:50.837 "write": true, 00:22:50.837 "unmap": true, 00:22:50.837 "flush": true, 00:22:50.837 "reset": true, 00:22:50.837 "nvme_admin": false, 00:22:50.837 "nvme_io": false, 00:22:50.837 "nvme_io_md": false, 00:22:50.837 "write_zeroes": true, 00:22:50.837 "zcopy": true, 00:22:50.837 "get_zone_info": false, 00:22:50.837 "zone_management": false, 00:22:50.837 "zone_append": false, 00:22:50.837 "compare": false, 00:22:50.837 "compare_and_write": false, 00:22:50.837 "abort": true, 00:22:50.837 "seek_hole": false, 00:22:50.837 "seek_data": false, 00:22:50.837 "copy": true, 00:22:50.837 "nvme_iov_md": false 00:22:50.837 }, 00:22:50.837 
"memory_domains": [ 00:22:50.837 { 00:22:50.837 "dma_device_id": "system", 00:22:50.837 "dma_device_type": 1 00:22:50.837 }, 00:22:50.837 { 00:22:50.837 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:50.837 "dma_device_type": 2 00:22:50.837 } 00:22:50.837 ], 00:22:50.837 "driver_specific": {} 00:22:50.837 } 00:22:50.837 ] 00:22:50.838 20:36:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:22:50.838 20:36:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:22:50.838 20:36:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:22:50.838 20:36:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:22:51.095 BaseBdev3 00:22:51.095 20:36:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:22:51.095 20:36:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:22:51.095 20:36:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:22:51.095 20:36:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:22:51.095 20:36:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:22:51.095 20:36:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:22:51.095 20:36:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:51.095 20:36:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b 
BaseBdev3 -t 2000 00:22:51.354 [ 00:22:51.354 { 00:22:51.354 "name": "BaseBdev3", 00:22:51.354 "aliases": [ 00:22:51.354 "281e7b86-6f77-4d0e-9944-7b0654b9120f" 00:22:51.354 ], 00:22:51.354 "product_name": "Malloc disk", 00:22:51.354 "block_size": 512, 00:22:51.354 "num_blocks": 65536, 00:22:51.354 "uuid": "281e7b86-6f77-4d0e-9944-7b0654b9120f", 00:22:51.354 "assigned_rate_limits": { 00:22:51.354 "rw_ios_per_sec": 0, 00:22:51.354 "rw_mbytes_per_sec": 0, 00:22:51.354 "r_mbytes_per_sec": 0, 00:22:51.354 "w_mbytes_per_sec": 0 00:22:51.354 }, 00:22:51.354 "claimed": false, 00:22:51.354 "zoned": false, 00:22:51.354 "supported_io_types": { 00:22:51.354 "read": true, 00:22:51.354 "write": true, 00:22:51.354 "unmap": true, 00:22:51.354 "flush": true, 00:22:51.354 "reset": true, 00:22:51.354 "nvme_admin": false, 00:22:51.354 "nvme_io": false, 00:22:51.354 "nvme_io_md": false, 00:22:51.354 "write_zeroes": true, 00:22:51.354 "zcopy": true, 00:22:51.354 "get_zone_info": false, 00:22:51.354 "zone_management": false, 00:22:51.354 "zone_append": false, 00:22:51.354 "compare": false, 00:22:51.354 "compare_and_write": false, 00:22:51.354 "abort": true, 00:22:51.354 "seek_hole": false, 00:22:51.354 "seek_data": false, 00:22:51.354 "copy": true, 00:22:51.354 "nvme_iov_md": false 00:22:51.354 }, 00:22:51.354 "memory_domains": [ 00:22:51.354 { 00:22:51.354 "dma_device_id": "system", 00:22:51.354 "dma_device_type": 1 00:22:51.354 }, 00:22:51.354 { 00:22:51.354 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:51.354 "dma_device_type": 2 00:22:51.354 } 00:22:51.354 ], 00:22:51.354 "driver_specific": {} 00:22:51.354 } 00:22:51.354 ] 00:22:51.354 20:36:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:22:51.354 20:36:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:22:51.354 20:36:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:22:51.354 20:36:43 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:22:51.614 BaseBdev4 00:22:51.614 20:36:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:22:51.614 20:36:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:22:51.614 20:36:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:22:51.614 20:36:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:22:51.614 20:36:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:22:51.614 20:36:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:22:51.614 20:36:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:51.614 20:36:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:22:51.873 [ 00:22:51.873 { 00:22:51.873 "name": "BaseBdev4", 00:22:51.873 "aliases": [ 00:22:51.873 "69802234-6ec7-4d84-a79f-d616937fadd6" 00:22:51.873 ], 00:22:51.873 "product_name": "Malloc disk", 00:22:51.873 "block_size": 512, 00:22:51.873 "num_blocks": 65536, 00:22:51.873 "uuid": "69802234-6ec7-4d84-a79f-d616937fadd6", 00:22:51.873 "assigned_rate_limits": { 00:22:51.873 "rw_ios_per_sec": 0, 00:22:51.873 "rw_mbytes_per_sec": 0, 00:22:51.873 "r_mbytes_per_sec": 0, 00:22:51.873 "w_mbytes_per_sec": 0 00:22:51.873 }, 00:22:51.873 "claimed": false, 00:22:51.873 "zoned": false, 00:22:51.873 "supported_io_types": { 00:22:51.873 "read": true, 
00:22:51.873 "write": true, 00:22:51.873 "unmap": true, 00:22:51.873 "flush": true, 00:22:51.873 "reset": true, 00:22:51.873 "nvme_admin": false, 00:22:51.873 "nvme_io": false, 00:22:51.873 "nvme_io_md": false, 00:22:51.873 "write_zeroes": true, 00:22:51.873 "zcopy": true, 00:22:51.873 "get_zone_info": false, 00:22:51.873 "zone_management": false, 00:22:51.873 "zone_append": false, 00:22:51.873 "compare": false, 00:22:51.873 "compare_and_write": false, 00:22:51.873 "abort": true, 00:22:51.873 "seek_hole": false, 00:22:51.873 "seek_data": false, 00:22:51.873 "copy": true, 00:22:51.874 "nvme_iov_md": false 00:22:51.874 }, 00:22:51.874 "memory_domains": [ 00:22:51.874 { 00:22:51.874 "dma_device_id": "system", 00:22:51.874 "dma_device_type": 1 00:22:51.874 }, 00:22:51.874 { 00:22:51.874 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:51.874 "dma_device_type": 2 00:22:51.874 } 00:22:51.874 ], 00:22:51.874 "driver_specific": {} 00:22:51.874 } 00:22:51.874 ] 00:22:51.874 20:36:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:22:51.874 20:36:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:22:51.874 20:36:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:22:51.874 20:36:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:22:51.874 [2024-07-15 20:36:44.233068] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:22:51.874 [2024-07-15 20:36:44.233119] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:22:51.874 [2024-07-15 20:36:44.233138] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:51.874 [2024-07-15 20:36:44.234750] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:22:51.874 [2024-07-15 20:36:44.234800] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:22:51.874 20:36:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:51.874 20:36:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:51.874 20:36:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:51.874 20:36:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:51.874 20:36:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:51.874 20:36:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:51.874 20:36:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:51.874 20:36:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:51.874 20:36:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:51.874 20:36:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:52.132 20:36:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:52.132 20:36:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:52.132 20:36:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:52.132 "name": "Existed_Raid", 00:22:52.132 "uuid": "941a1b64-cd10-4566-b749-64ede524de1d", 00:22:52.132 "strip_size_kb": 0, 00:22:52.132 "state": 
"configuring", 00:22:52.132 "raid_level": "raid1", 00:22:52.132 "superblock": true, 00:22:52.132 "num_base_bdevs": 4, 00:22:52.132 "num_base_bdevs_discovered": 3, 00:22:52.132 "num_base_bdevs_operational": 4, 00:22:52.132 "base_bdevs_list": [ 00:22:52.132 { 00:22:52.132 "name": "BaseBdev1", 00:22:52.132 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:52.132 "is_configured": false, 00:22:52.132 "data_offset": 0, 00:22:52.132 "data_size": 0 00:22:52.132 }, 00:22:52.132 { 00:22:52.132 "name": "BaseBdev2", 00:22:52.132 "uuid": "483f2349-ea09-41ae-a0fe-9f5f21d7a6f5", 00:22:52.132 "is_configured": true, 00:22:52.132 "data_offset": 2048, 00:22:52.132 "data_size": 63488 00:22:52.132 }, 00:22:52.132 { 00:22:52.132 "name": "BaseBdev3", 00:22:52.132 "uuid": "281e7b86-6f77-4d0e-9944-7b0654b9120f", 00:22:52.132 "is_configured": true, 00:22:52.132 "data_offset": 2048, 00:22:52.132 "data_size": 63488 00:22:52.132 }, 00:22:52.132 { 00:22:52.132 "name": "BaseBdev4", 00:22:52.132 "uuid": "69802234-6ec7-4d84-a79f-d616937fadd6", 00:22:52.132 "is_configured": true, 00:22:52.132 "data_offset": 2048, 00:22:52.132 "data_size": 63488 00:22:52.132 } 00:22:52.132 ] 00:22:52.132 }' 00:22:52.132 20:36:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:52.132 20:36:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:52.713 20:36:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:22:52.713 [2024-07-15 20:36:45.059210] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:22:52.713 20:36:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:52.713 20:36:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:52.713 
20:36:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:52.713 20:36:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:52.713 20:36:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:52.713 20:36:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:52.713 20:36:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:52.713 20:36:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:52.713 20:36:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:52.713 20:36:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:52.714 20:36:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:52.714 20:36:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:52.972 20:36:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:52.972 "name": "Existed_Raid", 00:22:52.972 "uuid": "941a1b64-cd10-4566-b749-64ede524de1d", 00:22:52.972 "strip_size_kb": 0, 00:22:52.972 "state": "configuring", 00:22:52.972 "raid_level": "raid1", 00:22:52.972 "superblock": true, 00:22:52.972 "num_base_bdevs": 4, 00:22:52.972 "num_base_bdevs_discovered": 2, 00:22:52.972 "num_base_bdevs_operational": 4, 00:22:52.972 "base_bdevs_list": [ 00:22:52.972 { 00:22:52.972 "name": "BaseBdev1", 00:22:52.972 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:52.972 "is_configured": false, 00:22:52.972 "data_offset": 0, 00:22:52.972 "data_size": 0 00:22:52.972 }, 00:22:52.972 { 00:22:52.972 
"name": null, 00:22:52.972 "uuid": "483f2349-ea09-41ae-a0fe-9f5f21d7a6f5", 00:22:52.972 "is_configured": false, 00:22:52.972 "data_offset": 2048, 00:22:52.972 "data_size": 63488 00:22:52.972 }, 00:22:52.972 { 00:22:52.972 "name": "BaseBdev3", 00:22:52.972 "uuid": "281e7b86-6f77-4d0e-9944-7b0654b9120f", 00:22:52.972 "is_configured": true, 00:22:52.972 "data_offset": 2048, 00:22:52.972 "data_size": 63488 00:22:52.972 }, 00:22:52.972 { 00:22:52.972 "name": "BaseBdev4", 00:22:52.972 "uuid": "69802234-6ec7-4d84-a79f-d616937fadd6", 00:22:52.972 "is_configured": true, 00:22:52.972 "data_offset": 2048, 00:22:52.972 "data_size": 63488 00:22:52.972 } 00:22:52.972 ] 00:22:52.972 }' 00:22:52.972 20:36:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:52.972 20:36:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:53.538 20:36:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:53.538 20:36:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:22:53.796 20:36:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:22:53.797 20:36:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:22:54.055 [2024-07-15 20:36:46.335768] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:54.055 BaseBdev1 00:22:54.055 20:36:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:22:54.055 20:36:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:22:54.055 20:36:46 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:22:54.055 20:36:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:22:54.055 20:36:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:22:54.055 20:36:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:22:54.055 20:36:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:54.344 20:36:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:22:54.601 [ 00:22:54.601 { 00:22:54.601 "name": "BaseBdev1", 00:22:54.601 "aliases": [ 00:22:54.601 "67fd00aa-7447-4b48-96cd-6d46ac06d94f" 00:22:54.601 ], 00:22:54.601 "product_name": "Malloc disk", 00:22:54.601 "block_size": 512, 00:22:54.601 "num_blocks": 65536, 00:22:54.601 "uuid": "67fd00aa-7447-4b48-96cd-6d46ac06d94f", 00:22:54.601 "assigned_rate_limits": { 00:22:54.601 "rw_ios_per_sec": 0, 00:22:54.601 "rw_mbytes_per_sec": 0, 00:22:54.601 "r_mbytes_per_sec": 0, 00:22:54.601 "w_mbytes_per_sec": 0 00:22:54.601 }, 00:22:54.601 "claimed": true, 00:22:54.601 "claim_type": "exclusive_write", 00:22:54.601 "zoned": false, 00:22:54.601 "supported_io_types": { 00:22:54.601 "read": true, 00:22:54.601 "write": true, 00:22:54.601 "unmap": true, 00:22:54.601 "flush": true, 00:22:54.601 "reset": true, 00:22:54.601 "nvme_admin": false, 00:22:54.601 "nvme_io": false, 00:22:54.601 "nvme_io_md": false, 00:22:54.601 "write_zeroes": true, 00:22:54.601 "zcopy": true, 00:22:54.601 "get_zone_info": false, 00:22:54.601 "zone_management": false, 00:22:54.601 "zone_append": false, 00:22:54.601 "compare": false, 00:22:54.601 
"compare_and_write": false, 00:22:54.601 "abort": true, 00:22:54.601 "seek_hole": false, 00:22:54.601 "seek_data": false, 00:22:54.601 "copy": true, 00:22:54.601 "nvme_iov_md": false 00:22:54.601 }, 00:22:54.601 "memory_domains": [ 00:22:54.601 { 00:22:54.601 "dma_device_id": "system", 00:22:54.601 "dma_device_type": 1 00:22:54.601 }, 00:22:54.601 { 00:22:54.601 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:54.601 "dma_device_type": 2 00:22:54.601 } 00:22:54.601 ], 00:22:54.601 "driver_specific": {} 00:22:54.601 } 00:22:54.601 ] 00:22:54.601 20:36:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:22:54.601 20:36:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:54.601 20:36:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:54.601 20:36:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:54.601 20:36:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:54.601 20:36:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:54.601 20:36:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:54.601 20:36:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:54.601 20:36:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:54.601 20:36:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:54.601 20:36:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:54.601 20:36:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:54.601 20:36:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:54.859 20:36:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:54.859 "name": "Existed_Raid", 00:22:54.859 "uuid": "941a1b64-cd10-4566-b749-64ede524de1d", 00:22:54.859 "strip_size_kb": 0, 00:22:54.859 "state": "configuring", 00:22:54.859 "raid_level": "raid1", 00:22:54.859 "superblock": true, 00:22:54.859 "num_base_bdevs": 4, 00:22:54.859 "num_base_bdevs_discovered": 3, 00:22:54.859 "num_base_bdevs_operational": 4, 00:22:54.859 "base_bdevs_list": [ 00:22:54.859 { 00:22:54.859 "name": "BaseBdev1", 00:22:54.859 "uuid": "67fd00aa-7447-4b48-96cd-6d46ac06d94f", 00:22:54.859 "is_configured": true, 00:22:54.859 "data_offset": 2048, 00:22:54.859 "data_size": 63488 00:22:54.859 }, 00:22:54.859 { 00:22:54.859 "name": null, 00:22:54.859 "uuid": "483f2349-ea09-41ae-a0fe-9f5f21d7a6f5", 00:22:54.859 "is_configured": false, 00:22:54.859 "data_offset": 2048, 00:22:54.859 "data_size": 63488 00:22:54.859 }, 00:22:54.859 { 00:22:54.859 "name": "BaseBdev3", 00:22:54.859 "uuid": "281e7b86-6f77-4d0e-9944-7b0654b9120f", 00:22:54.859 "is_configured": true, 00:22:54.859 "data_offset": 2048, 00:22:54.859 "data_size": 63488 00:22:54.859 }, 00:22:54.859 { 00:22:54.859 "name": "BaseBdev4", 00:22:54.859 "uuid": "69802234-6ec7-4d84-a79f-d616937fadd6", 00:22:54.859 "is_configured": true, 00:22:54.859 "data_offset": 2048, 00:22:54.859 "data_size": 63488 00:22:54.859 } 00:22:54.859 ] 00:22:54.859 }' 00:22:54.859 20:36:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:54.859 20:36:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:55.424 20:36:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:55.424 20:36:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:22:55.683 20:36:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:22:55.683 20:36:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:22:55.942 [2024-07-15 20:36:48.080435] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:22:55.942 20:36:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:55.942 20:36:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:55.942 20:36:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:55.942 20:36:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:55.942 20:36:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:55.942 20:36:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:55.942 20:36:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:55.942 20:36:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:55.942 20:36:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:55.942 20:36:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:55.942 20:36:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:22:55.942 20:36:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:56.201 20:36:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:56.201 "name": "Existed_Raid", 00:22:56.201 "uuid": "941a1b64-cd10-4566-b749-64ede524de1d", 00:22:56.201 "strip_size_kb": 0, 00:22:56.201 "state": "configuring", 00:22:56.201 "raid_level": "raid1", 00:22:56.201 "superblock": true, 00:22:56.201 "num_base_bdevs": 4, 00:22:56.201 "num_base_bdevs_discovered": 2, 00:22:56.201 "num_base_bdevs_operational": 4, 00:22:56.201 "base_bdevs_list": [ 00:22:56.201 { 00:22:56.201 "name": "BaseBdev1", 00:22:56.201 "uuid": "67fd00aa-7447-4b48-96cd-6d46ac06d94f", 00:22:56.201 "is_configured": true, 00:22:56.201 "data_offset": 2048, 00:22:56.201 "data_size": 63488 00:22:56.201 }, 00:22:56.201 { 00:22:56.201 "name": null, 00:22:56.201 "uuid": "483f2349-ea09-41ae-a0fe-9f5f21d7a6f5", 00:22:56.201 "is_configured": false, 00:22:56.201 "data_offset": 2048, 00:22:56.201 "data_size": 63488 00:22:56.201 }, 00:22:56.201 { 00:22:56.201 "name": null, 00:22:56.201 "uuid": "281e7b86-6f77-4d0e-9944-7b0654b9120f", 00:22:56.201 "is_configured": false, 00:22:56.201 "data_offset": 2048, 00:22:56.201 "data_size": 63488 00:22:56.201 }, 00:22:56.201 { 00:22:56.201 "name": "BaseBdev4", 00:22:56.201 "uuid": "69802234-6ec7-4d84-a79f-d616937fadd6", 00:22:56.201 "is_configured": true, 00:22:56.201 "data_offset": 2048, 00:22:56.201 "data_size": 63488 00:22:56.201 } 00:22:56.201 ] 00:22:56.201 }' 00:22:56.201 20:36:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:56.201 20:36:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:56.769 20:36:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:22:56.769 20:36:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:22:56.769 20:36:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:22:56.769 20:36:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:22:57.028 [2024-07-15 20:36:49.187399] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:22:57.028 20:36:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:57.028 20:36:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:57.028 20:36:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:57.028 20:36:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:57.028 20:36:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:57.028 20:36:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:57.028 20:36:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:57.028 20:36:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:57.028 20:36:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:57.028 20:36:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:57.028 20:36:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:22:57.028 20:36:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:57.028 20:36:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:57.028 "name": "Existed_Raid", 00:22:57.028 "uuid": "941a1b64-cd10-4566-b749-64ede524de1d", 00:22:57.028 "strip_size_kb": 0, 00:22:57.028 "state": "configuring", 00:22:57.028 "raid_level": "raid1", 00:22:57.028 "superblock": true, 00:22:57.028 "num_base_bdevs": 4, 00:22:57.028 "num_base_bdevs_discovered": 3, 00:22:57.028 "num_base_bdevs_operational": 4, 00:22:57.028 "base_bdevs_list": [ 00:22:57.028 { 00:22:57.028 "name": "BaseBdev1", 00:22:57.028 "uuid": "67fd00aa-7447-4b48-96cd-6d46ac06d94f", 00:22:57.028 "is_configured": true, 00:22:57.028 "data_offset": 2048, 00:22:57.028 "data_size": 63488 00:22:57.028 }, 00:22:57.028 { 00:22:57.028 "name": null, 00:22:57.028 "uuid": "483f2349-ea09-41ae-a0fe-9f5f21d7a6f5", 00:22:57.028 "is_configured": false, 00:22:57.028 "data_offset": 2048, 00:22:57.028 "data_size": 63488 00:22:57.028 }, 00:22:57.028 { 00:22:57.028 "name": "BaseBdev3", 00:22:57.028 "uuid": "281e7b86-6f77-4d0e-9944-7b0654b9120f", 00:22:57.029 "is_configured": true, 00:22:57.029 "data_offset": 2048, 00:22:57.029 "data_size": 63488 00:22:57.029 }, 00:22:57.029 { 00:22:57.029 "name": "BaseBdev4", 00:22:57.029 "uuid": "69802234-6ec7-4d84-a79f-d616937fadd6", 00:22:57.029 "is_configured": true, 00:22:57.029 "data_offset": 2048, 00:22:57.029 "data_size": 63488 00:22:57.029 } 00:22:57.029 ] 00:22:57.029 }' 00:22:57.029 20:36:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:57.029 20:36:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:57.967 20:36:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:22:57.967 20:36:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:22:57.967 20:36:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:22:57.967 20:36:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:22:58.226 [2024-07-15 20:36:50.402649] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:22:58.226 20:36:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:58.226 20:36:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:58.226 20:36:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:58.226 20:36:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:58.226 20:36:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:58.226 20:36:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:58.226 20:36:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:58.226 20:36:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:58.226 20:36:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:58.226 20:36:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:58.226 20:36:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:58.226 20:36:50 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:58.485 20:36:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:58.485 "name": "Existed_Raid", 00:22:58.485 "uuid": "941a1b64-cd10-4566-b749-64ede524de1d", 00:22:58.485 "strip_size_kb": 0, 00:22:58.485 "state": "configuring", 00:22:58.485 "raid_level": "raid1", 00:22:58.485 "superblock": true, 00:22:58.485 "num_base_bdevs": 4, 00:22:58.485 "num_base_bdevs_discovered": 2, 00:22:58.485 "num_base_bdevs_operational": 4, 00:22:58.485 "base_bdevs_list": [ 00:22:58.485 { 00:22:58.485 "name": null, 00:22:58.485 "uuid": "67fd00aa-7447-4b48-96cd-6d46ac06d94f", 00:22:58.485 "is_configured": false, 00:22:58.485 "data_offset": 2048, 00:22:58.485 "data_size": 63488 00:22:58.485 }, 00:22:58.485 { 00:22:58.485 "name": null, 00:22:58.485 "uuid": "483f2349-ea09-41ae-a0fe-9f5f21d7a6f5", 00:22:58.485 "is_configured": false, 00:22:58.485 "data_offset": 2048, 00:22:58.485 "data_size": 63488 00:22:58.485 }, 00:22:58.485 { 00:22:58.485 "name": "BaseBdev3", 00:22:58.485 "uuid": "281e7b86-6f77-4d0e-9944-7b0654b9120f", 00:22:58.485 "is_configured": true, 00:22:58.485 "data_offset": 2048, 00:22:58.485 "data_size": 63488 00:22:58.485 }, 00:22:58.485 { 00:22:58.485 "name": "BaseBdev4", 00:22:58.485 "uuid": "69802234-6ec7-4d84-a79f-d616937fadd6", 00:22:58.485 "is_configured": true, 00:22:58.485 "data_offset": 2048, 00:22:58.485 "data_size": 63488 00:22:58.485 } 00:22:58.485 ] 00:22:58.485 }' 00:22:58.485 20:36:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:58.485 20:36:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:59.054 20:36:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:59.054 20:36:51 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:22:59.623 20:36:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:22:59.623 20:36:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:22:59.882 [2024-07-15 20:36:52.018782] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:59.882 20:36:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:59.882 20:36:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:59.882 20:36:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:59.882 20:36:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:59.882 20:36:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:59.882 20:36:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:59.882 20:36:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:59.882 20:36:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:59.882 20:36:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:59.882 20:36:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:59.882 20:36:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:59.882 20:36:52 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:00.141 20:36:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:00.141 "name": "Existed_Raid", 00:23:00.141 "uuid": "941a1b64-cd10-4566-b749-64ede524de1d", 00:23:00.141 "strip_size_kb": 0, 00:23:00.141 "state": "configuring", 00:23:00.141 "raid_level": "raid1", 00:23:00.141 "superblock": true, 00:23:00.141 "num_base_bdevs": 4, 00:23:00.141 "num_base_bdevs_discovered": 3, 00:23:00.141 "num_base_bdevs_operational": 4, 00:23:00.141 "base_bdevs_list": [ 00:23:00.141 { 00:23:00.141 "name": null, 00:23:00.141 "uuid": "67fd00aa-7447-4b48-96cd-6d46ac06d94f", 00:23:00.141 "is_configured": false, 00:23:00.141 "data_offset": 2048, 00:23:00.141 "data_size": 63488 00:23:00.141 }, 00:23:00.141 { 00:23:00.141 "name": "BaseBdev2", 00:23:00.141 "uuid": "483f2349-ea09-41ae-a0fe-9f5f21d7a6f5", 00:23:00.141 "is_configured": true, 00:23:00.141 "data_offset": 2048, 00:23:00.141 "data_size": 63488 00:23:00.141 }, 00:23:00.141 { 00:23:00.141 "name": "BaseBdev3", 00:23:00.141 "uuid": "281e7b86-6f77-4d0e-9944-7b0654b9120f", 00:23:00.141 "is_configured": true, 00:23:00.141 "data_offset": 2048, 00:23:00.141 "data_size": 63488 00:23:00.141 }, 00:23:00.141 { 00:23:00.141 "name": "BaseBdev4", 00:23:00.141 "uuid": "69802234-6ec7-4d84-a79f-d616937fadd6", 00:23:00.141 "is_configured": true, 00:23:00.141 "data_offset": 2048, 00:23:00.141 "data_size": 63488 00:23:00.141 } 00:23:00.141 ] 00:23:00.141 }' 00:23:00.141 20:36:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:00.141 20:36:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:00.710 20:36:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:00.710 20:36:52 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:23:00.969 20:36:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:23:00.969 20:36:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:00.969 20:36:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:23:01.228 20:36:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 67fd00aa-7447-4b48-96cd-6d46ac06d94f 00:23:01.488 [2024-07-15 20:36:53.680207] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:23:01.488 [2024-07-15 20:36:53.680403] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x271b180 00:23:01.488 [2024-07-15 20:36:53.680416] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:23:01.488 [2024-07-15 20:36:53.680609] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x271bc20 00:23:01.488 [2024-07-15 20:36:53.680753] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x271b180 00:23:01.488 [2024-07-15 20:36:53.680763] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x271b180 00:23:01.488 [2024-07-15 20:36:53.680868] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:01.488 NewBaseBdev 00:23:01.488 20:36:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:23:01.488 20:36:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:23:01.488 20:36:53 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:23:01.488 20:36:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:23:01.488 20:36:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:23:01.488 20:36:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:23:01.488 20:36:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:23:01.747 20:36:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:23:02.006 [ 00:23:02.006 { 00:23:02.006 "name": "NewBaseBdev", 00:23:02.006 "aliases": [ 00:23:02.006 "67fd00aa-7447-4b48-96cd-6d46ac06d94f" 00:23:02.006 ], 00:23:02.006 "product_name": "Malloc disk", 00:23:02.006 "block_size": 512, 00:23:02.006 "num_blocks": 65536, 00:23:02.006 "uuid": "67fd00aa-7447-4b48-96cd-6d46ac06d94f", 00:23:02.006 "assigned_rate_limits": { 00:23:02.006 "rw_ios_per_sec": 0, 00:23:02.006 "rw_mbytes_per_sec": 0, 00:23:02.006 "r_mbytes_per_sec": 0, 00:23:02.006 "w_mbytes_per_sec": 0 00:23:02.006 }, 00:23:02.006 "claimed": true, 00:23:02.006 "claim_type": "exclusive_write", 00:23:02.006 "zoned": false, 00:23:02.006 "supported_io_types": { 00:23:02.006 "read": true, 00:23:02.006 "write": true, 00:23:02.006 "unmap": true, 00:23:02.006 "flush": true, 00:23:02.006 "reset": true, 00:23:02.006 "nvme_admin": false, 00:23:02.006 "nvme_io": false, 00:23:02.006 "nvme_io_md": false, 00:23:02.006 "write_zeroes": true, 00:23:02.006 "zcopy": true, 00:23:02.006 "get_zone_info": false, 00:23:02.006 "zone_management": false, 00:23:02.006 "zone_append": false, 00:23:02.006 "compare": false, 00:23:02.006 
"compare_and_write": false, 00:23:02.006 "abort": true, 00:23:02.006 "seek_hole": false, 00:23:02.006 "seek_data": false, 00:23:02.006 "copy": true, 00:23:02.006 "nvme_iov_md": false 00:23:02.006 }, 00:23:02.006 "memory_domains": [ 00:23:02.006 { 00:23:02.006 "dma_device_id": "system", 00:23:02.006 "dma_device_type": 1 00:23:02.006 }, 00:23:02.006 { 00:23:02.006 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:02.006 "dma_device_type": 2 00:23:02.006 } 00:23:02.006 ], 00:23:02.006 "driver_specific": {} 00:23:02.006 } 00:23:02.006 ] 00:23:02.006 20:36:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:23:02.006 20:36:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:23:02.006 20:36:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:02.006 20:36:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:02.006 20:36:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:02.006 20:36:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:02.006 20:36:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:02.006 20:36:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:02.006 20:36:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:02.006 20:36:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:02.006 20:36:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:02.006 20:36:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:23:02.006 20:36:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:02.265 20:36:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:02.265 "name": "Existed_Raid", 00:23:02.265 "uuid": "941a1b64-cd10-4566-b749-64ede524de1d", 00:23:02.265 "strip_size_kb": 0, 00:23:02.265 "state": "online", 00:23:02.265 "raid_level": "raid1", 00:23:02.265 "superblock": true, 00:23:02.265 "num_base_bdevs": 4, 00:23:02.265 "num_base_bdevs_discovered": 4, 00:23:02.265 "num_base_bdevs_operational": 4, 00:23:02.265 "base_bdevs_list": [ 00:23:02.265 { 00:23:02.265 "name": "NewBaseBdev", 00:23:02.265 "uuid": "67fd00aa-7447-4b48-96cd-6d46ac06d94f", 00:23:02.265 "is_configured": true, 00:23:02.265 "data_offset": 2048, 00:23:02.265 "data_size": 63488 00:23:02.265 }, 00:23:02.265 { 00:23:02.265 "name": "BaseBdev2", 00:23:02.265 "uuid": "483f2349-ea09-41ae-a0fe-9f5f21d7a6f5", 00:23:02.265 "is_configured": true, 00:23:02.265 "data_offset": 2048, 00:23:02.265 "data_size": 63488 00:23:02.265 }, 00:23:02.265 { 00:23:02.265 "name": "BaseBdev3", 00:23:02.265 "uuid": "281e7b86-6f77-4d0e-9944-7b0654b9120f", 00:23:02.265 "is_configured": true, 00:23:02.265 "data_offset": 2048, 00:23:02.265 "data_size": 63488 00:23:02.265 }, 00:23:02.265 { 00:23:02.265 "name": "BaseBdev4", 00:23:02.265 "uuid": "69802234-6ec7-4d84-a79f-d616937fadd6", 00:23:02.265 "is_configured": true, 00:23:02.265 "data_offset": 2048, 00:23:02.265 "data_size": 63488 00:23:02.265 } 00:23:02.265 ] 00:23:02.265 }' 00:23:02.265 20:36:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:02.265 20:36:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:02.833 20:36:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:23:02.833 20:36:55 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:23:02.833 20:36:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:23:02.833 20:36:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:23:02.833 20:36:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:23:02.833 20:36:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:23:02.833 20:36:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:23:02.833 20:36:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:23:03.093 [2024-07-15 20:36:55.240704] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:03.093 20:36:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:23:03.093 "name": "Existed_Raid", 00:23:03.093 "aliases": [ 00:23:03.093 "941a1b64-cd10-4566-b749-64ede524de1d" 00:23:03.093 ], 00:23:03.093 "product_name": "Raid Volume", 00:23:03.093 "block_size": 512, 00:23:03.093 "num_blocks": 63488, 00:23:03.093 "uuid": "941a1b64-cd10-4566-b749-64ede524de1d", 00:23:03.093 "assigned_rate_limits": { 00:23:03.093 "rw_ios_per_sec": 0, 00:23:03.093 "rw_mbytes_per_sec": 0, 00:23:03.093 "r_mbytes_per_sec": 0, 00:23:03.093 "w_mbytes_per_sec": 0 00:23:03.093 }, 00:23:03.093 "claimed": false, 00:23:03.093 "zoned": false, 00:23:03.093 "supported_io_types": { 00:23:03.093 "read": true, 00:23:03.093 "write": true, 00:23:03.093 "unmap": false, 00:23:03.093 "flush": false, 00:23:03.093 "reset": true, 00:23:03.093 "nvme_admin": false, 00:23:03.093 "nvme_io": false, 00:23:03.093 "nvme_io_md": false, 00:23:03.093 "write_zeroes": true, 00:23:03.093 "zcopy": false, 00:23:03.093 
"get_zone_info": false, 00:23:03.093 "zone_management": false, 00:23:03.093 "zone_append": false, 00:23:03.093 "compare": false, 00:23:03.093 "compare_and_write": false, 00:23:03.093 "abort": false, 00:23:03.093 "seek_hole": false, 00:23:03.093 "seek_data": false, 00:23:03.093 "copy": false, 00:23:03.093 "nvme_iov_md": false 00:23:03.093 }, 00:23:03.093 "memory_domains": [ 00:23:03.093 { 00:23:03.093 "dma_device_id": "system", 00:23:03.093 "dma_device_type": 1 00:23:03.093 }, 00:23:03.093 { 00:23:03.093 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:03.093 "dma_device_type": 2 00:23:03.093 }, 00:23:03.093 { 00:23:03.093 "dma_device_id": "system", 00:23:03.093 "dma_device_type": 1 00:23:03.093 }, 00:23:03.093 { 00:23:03.093 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:03.093 "dma_device_type": 2 00:23:03.093 }, 00:23:03.093 { 00:23:03.093 "dma_device_id": "system", 00:23:03.093 "dma_device_type": 1 00:23:03.093 }, 00:23:03.093 { 00:23:03.093 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:03.093 "dma_device_type": 2 00:23:03.093 }, 00:23:03.093 { 00:23:03.093 "dma_device_id": "system", 00:23:03.093 "dma_device_type": 1 00:23:03.093 }, 00:23:03.093 { 00:23:03.093 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:03.093 "dma_device_type": 2 00:23:03.093 } 00:23:03.093 ], 00:23:03.093 "driver_specific": { 00:23:03.093 "raid": { 00:23:03.093 "uuid": "941a1b64-cd10-4566-b749-64ede524de1d", 00:23:03.093 "strip_size_kb": 0, 00:23:03.093 "state": "online", 00:23:03.093 "raid_level": "raid1", 00:23:03.093 "superblock": true, 00:23:03.093 "num_base_bdevs": 4, 00:23:03.093 "num_base_bdevs_discovered": 4, 00:23:03.093 "num_base_bdevs_operational": 4, 00:23:03.093 "base_bdevs_list": [ 00:23:03.093 { 00:23:03.093 "name": "NewBaseBdev", 00:23:03.093 "uuid": "67fd00aa-7447-4b48-96cd-6d46ac06d94f", 00:23:03.093 "is_configured": true, 00:23:03.093 "data_offset": 2048, 00:23:03.093 "data_size": 63488 00:23:03.093 }, 00:23:03.093 { 00:23:03.093 "name": "BaseBdev2", 00:23:03.093 
"uuid": "483f2349-ea09-41ae-a0fe-9f5f21d7a6f5", 00:23:03.093 "is_configured": true, 00:23:03.093 "data_offset": 2048, 00:23:03.093 "data_size": 63488 00:23:03.093 }, 00:23:03.093 { 00:23:03.093 "name": "BaseBdev3", 00:23:03.093 "uuid": "281e7b86-6f77-4d0e-9944-7b0654b9120f", 00:23:03.093 "is_configured": true, 00:23:03.093 "data_offset": 2048, 00:23:03.093 "data_size": 63488 00:23:03.093 }, 00:23:03.093 { 00:23:03.093 "name": "BaseBdev4", 00:23:03.093 "uuid": "69802234-6ec7-4d84-a79f-d616937fadd6", 00:23:03.093 "is_configured": true, 00:23:03.093 "data_offset": 2048, 00:23:03.093 "data_size": 63488 00:23:03.093 } 00:23:03.093 ] 00:23:03.093 } 00:23:03.093 } 00:23:03.093 }' 00:23:03.093 20:36:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:23:03.093 20:36:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:23:03.093 BaseBdev2 00:23:03.093 BaseBdev3 00:23:03.093 BaseBdev4' 00:23:03.093 20:36:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:03.093 20:36:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:23:03.093 20:36:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:03.352 20:36:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:03.352 "name": "NewBaseBdev", 00:23:03.352 "aliases": [ 00:23:03.352 "67fd00aa-7447-4b48-96cd-6d46ac06d94f" 00:23:03.352 ], 00:23:03.352 "product_name": "Malloc disk", 00:23:03.352 "block_size": 512, 00:23:03.352 "num_blocks": 65536, 00:23:03.352 "uuid": "67fd00aa-7447-4b48-96cd-6d46ac06d94f", 00:23:03.352 "assigned_rate_limits": { 00:23:03.352 "rw_ios_per_sec": 0, 00:23:03.352 "rw_mbytes_per_sec": 0, 
00:23:03.352 "r_mbytes_per_sec": 0, 00:23:03.352 "w_mbytes_per_sec": 0 00:23:03.352 }, 00:23:03.352 "claimed": true, 00:23:03.352 "claim_type": "exclusive_write", 00:23:03.352 "zoned": false, 00:23:03.352 "supported_io_types": { 00:23:03.352 "read": true, 00:23:03.352 "write": true, 00:23:03.352 "unmap": true, 00:23:03.352 "flush": true, 00:23:03.352 "reset": true, 00:23:03.352 "nvme_admin": false, 00:23:03.352 "nvme_io": false, 00:23:03.352 "nvme_io_md": false, 00:23:03.352 "write_zeroes": true, 00:23:03.352 "zcopy": true, 00:23:03.352 "get_zone_info": false, 00:23:03.352 "zone_management": false, 00:23:03.352 "zone_append": false, 00:23:03.352 "compare": false, 00:23:03.352 "compare_and_write": false, 00:23:03.352 "abort": true, 00:23:03.352 "seek_hole": false, 00:23:03.352 "seek_data": false, 00:23:03.352 "copy": true, 00:23:03.352 "nvme_iov_md": false 00:23:03.352 }, 00:23:03.352 "memory_domains": [ 00:23:03.352 { 00:23:03.352 "dma_device_id": "system", 00:23:03.352 "dma_device_type": 1 00:23:03.352 }, 00:23:03.352 { 00:23:03.352 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:03.352 "dma_device_type": 2 00:23:03.352 } 00:23:03.352 ], 00:23:03.352 "driver_specific": {} 00:23:03.352 }' 00:23:03.352 20:36:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:03.352 20:36:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:03.352 20:36:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:03.352 20:36:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:03.352 20:36:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:03.352 20:36:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:03.352 20:36:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:03.610 20:36:55 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:03.610 20:36:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:03.610 20:36:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:03.610 20:36:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:03.610 20:36:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:03.610 20:36:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:03.610 20:36:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:23:03.610 20:36:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:03.868 20:36:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:03.868 "name": "BaseBdev2", 00:23:03.868 "aliases": [ 00:23:03.868 "483f2349-ea09-41ae-a0fe-9f5f21d7a6f5" 00:23:03.868 ], 00:23:03.868 "product_name": "Malloc disk", 00:23:03.868 "block_size": 512, 00:23:03.868 "num_blocks": 65536, 00:23:03.868 "uuid": "483f2349-ea09-41ae-a0fe-9f5f21d7a6f5", 00:23:03.868 "assigned_rate_limits": { 00:23:03.868 "rw_ios_per_sec": 0, 00:23:03.868 "rw_mbytes_per_sec": 0, 00:23:03.868 "r_mbytes_per_sec": 0, 00:23:03.868 "w_mbytes_per_sec": 0 00:23:03.868 }, 00:23:03.868 "claimed": true, 00:23:03.868 "claim_type": "exclusive_write", 00:23:03.868 "zoned": false, 00:23:03.868 "supported_io_types": { 00:23:03.868 "read": true, 00:23:03.868 "write": true, 00:23:03.868 "unmap": true, 00:23:03.868 "flush": true, 00:23:03.868 "reset": true, 00:23:03.868 "nvme_admin": false, 00:23:03.868 "nvme_io": false, 00:23:03.868 "nvme_io_md": false, 00:23:03.868 "write_zeroes": true, 00:23:03.868 "zcopy": true, 00:23:03.868 
"get_zone_info": false, 00:23:03.868 "zone_management": false, 00:23:03.868 "zone_append": false, 00:23:03.868 "compare": false, 00:23:03.868 "compare_and_write": false, 00:23:03.868 "abort": true, 00:23:03.868 "seek_hole": false, 00:23:03.868 "seek_data": false, 00:23:03.868 "copy": true, 00:23:03.868 "nvme_iov_md": false 00:23:03.868 }, 00:23:03.868 "memory_domains": [ 00:23:03.868 { 00:23:03.868 "dma_device_id": "system", 00:23:03.868 "dma_device_type": 1 00:23:03.868 }, 00:23:03.868 { 00:23:03.868 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:03.868 "dma_device_type": 2 00:23:03.868 } 00:23:03.868 ], 00:23:03.868 "driver_specific": {} 00:23:03.868 }' 00:23:03.868 20:36:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:03.868 20:36:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:03.868 20:36:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:03.868 20:36:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:04.126 20:36:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:04.126 20:36:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:04.126 20:36:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:04.126 20:36:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:04.126 20:36:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:04.126 20:36:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:04.126 20:36:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:04.126 20:36:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:04.126 20:36:56 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:04.126 20:36:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:23:04.126 20:36:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:04.384 20:36:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:04.384 "name": "BaseBdev3", 00:23:04.384 "aliases": [ 00:23:04.384 "281e7b86-6f77-4d0e-9944-7b0654b9120f" 00:23:04.384 ], 00:23:04.384 "product_name": "Malloc disk", 00:23:04.384 "block_size": 512, 00:23:04.384 "num_blocks": 65536, 00:23:04.384 "uuid": "281e7b86-6f77-4d0e-9944-7b0654b9120f", 00:23:04.384 "assigned_rate_limits": { 00:23:04.384 "rw_ios_per_sec": 0, 00:23:04.384 "rw_mbytes_per_sec": 0, 00:23:04.384 "r_mbytes_per_sec": 0, 00:23:04.384 "w_mbytes_per_sec": 0 00:23:04.384 }, 00:23:04.384 "claimed": true, 00:23:04.384 "claim_type": "exclusive_write", 00:23:04.384 "zoned": false, 00:23:04.384 "supported_io_types": { 00:23:04.384 "read": true, 00:23:04.384 "write": true, 00:23:04.384 "unmap": true, 00:23:04.384 "flush": true, 00:23:04.384 "reset": true, 00:23:04.384 "nvme_admin": false, 00:23:04.384 "nvme_io": false, 00:23:04.384 "nvme_io_md": false, 00:23:04.384 "write_zeroes": true, 00:23:04.384 "zcopy": true, 00:23:04.384 "get_zone_info": false, 00:23:04.384 "zone_management": false, 00:23:04.384 "zone_append": false, 00:23:04.384 "compare": false, 00:23:04.384 "compare_and_write": false, 00:23:04.384 "abort": true, 00:23:04.384 "seek_hole": false, 00:23:04.384 "seek_data": false, 00:23:04.384 "copy": true, 00:23:04.384 "nvme_iov_md": false 00:23:04.384 }, 00:23:04.384 "memory_domains": [ 00:23:04.384 { 00:23:04.384 "dma_device_id": "system", 00:23:04.384 "dma_device_type": 1 00:23:04.384 }, 00:23:04.384 { 00:23:04.384 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:04.384 
"dma_device_type": 2 00:23:04.384 } 00:23:04.384 ], 00:23:04.384 "driver_specific": {} 00:23:04.384 }' 00:23:04.384 20:36:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:04.643 20:36:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:04.643 20:36:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:04.643 20:36:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:04.643 20:36:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:04.643 20:36:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:04.643 20:36:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:04.643 20:36:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:04.643 20:36:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:04.643 20:36:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:04.902 20:36:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:04.902 20:36:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:04.902 20:36:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:04.902 20:36:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:23:04.902 20:36:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:05.161 20:36:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:05.161 "name": "BaseBdev4", 00:23:05.161 "aliases": [ 00:23:05.161 
"69802234-6ec7-4d84-a79f-d616937fadd6" 00:23:05.161 ], 00:23:05.161 "product_name": "Malloc disk", 00:23:05.161 "block_size": 512, 00:23:05.161 "num_blocks": 65536, 00:23:05.161 "uuid": "69802234-6ec7-4d84-a79f-d616937fadd6", 00:23:05.161 "assigned_rate_limits": { 00:23:05.161 "rw_ios_per_sec": 0, 00:23:05.161 "rw_mbytes_per_sec": 0, 00:23:05.161 "r_mbytes_per_sec": 0, 00:23:05.161 "w_mbytes_per_sec": 0 00:23:05.161 }, 00:23:05.161 "claimed": true, 00:23:05.161 "claim_type": "exclusive_write", 00:23:05.161 "zoned": false, 00:23:05.161 "supported_io_types": { 00:23:05.161 "read": true, 00:23:05.161 "write": true, 00:23:05.161 "unmap": true, 00:23:05.161 "flush": true, 00:23:05.161 "reset": true, 00:23:05.161 "nvme_admin": false, 00:23:05.161 "nvme_io": false, 00:23:05.161 "nvme_io_md": false, 00:23:05.161 "write_zeroes": true, 00:23:05.161 "zcopy": true, 00:23:05.161 "get_zone_info": false, 00:23:05.161 "zone_management": false, 00:23:05.161 "zone_append": false, 00:23:05.161 "compare": false, 00:23:05.161 "compare_and_write": false, 00:23:05.161 "abort": true, 00:23:05.161 "seek_hole": false, 00:23:05.161 "seek_data": false, 00:23:05.161 "copy": true, 00:23:05.161 "nvme_iov_md": false 00:23:05.161 }, 00:23:05.161 "memory_domains": [ 00:23:05.161 { 00:23:05.161 "dma_device_id": "system", 00:23:05.161 "dma_device_type": 1 00:23:05.161 }, 00:23:05.161 { 00:23:05.161 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:05.161 "dma_device_type": 2 00:23:05.161 } 00:23:05.161 ], 00:23:05.161 "driver_specific": {} 00:23:05.161 }' 00:23:05.161 20:36:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:05.161 20:36:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:05.161 20:36:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:05.161 20:36:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:05.161 20:36:57 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:05.161 20:36:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:05.161 20:36:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:05.421 20:36:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:05.421 20:36:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:05.421 20:36:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:05.421 20:36:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:05.421 20:36:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:05.421 20:36:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:23:05.680 [2024-07-15 20:36:57.907571] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:23:05.680 [2024-07-15 20:36:57.907604] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:05.680 [2024-07-15 20:36:57.907659] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:05.680 [2024-07-15 20:36:57.907969] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:05.680 [2024-07-15 20:36:57.907983] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x271b180 name Existed_Raid, state offline 00:23:05.680 20:36:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 1449996 00:23:05.680 20:36:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 1449996 ']' 00:23:05.680 20:36:57 bdev_raid.raid_state_function_test_sb 
-- common/autotest_common.sh@952 -- # kill -0 1449996 00:23:05.680 20:36:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:23:05.680 20:36:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:23:05.680 20:36:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1449996 00:23:05.680 20:36:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:23:05.680 20:36:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:23:05.680 20:36:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1449996' 00:23:05.680 killing process with pid 1449996 00:23:05.680 20:36:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 1449996 00:23:05.680 [2024-07-15 20:36:57.975015] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:05.680 20:36:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 1449996 00:23:05.680 [2024-07-15 20:36:58.042454] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:06.248 20:36:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:23:06.248 00:23:06.248 real 0m31.573s 00:23:06.248 user 0m57.802s 00:23:06.248 sys 0m5.764s 00:23:06.248 20:36:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:23:06.248 20:36:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:06.248 ************************************ 00:23:06.248 END TEST raid_state_function_test_sb 00:23:06.248 ************************************ 00:23:06.249 20:36:58 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:23:06.249 20:36:58 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test 
raid_superblock_test raid_superblock_test raid1 4 00:23:06.249 20:36:58 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:23:06.249 20:36:58 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:23:06.249 20:36:58 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:23:06.249 ************************************ 00:23:06.249 START TEST raid_superblock_test 00:23:06.249 ************************************ 00:23:06.249 20:36:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 4 00:23:06.249 20:36:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:23:06.249 20:36:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=4 00:23:06.249 20:36:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:23:06.249 20:36:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:23:06.249 20:36:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:23:06.249 20:36:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:23:06.249 20:36:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:23:06.249 20:36:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:23:06.249 20:36:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:23:06.249 20:36:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:23:06.249 20:36:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:23:06.249 20:36:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:23:06.249 20:36:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:23:06.249 20:36:58 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:23:06.249 20:36:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:23:06.249 20:36:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=1454717 00:23:06.249 20:36:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 1454717 /var/tmp/spdk-raid.sock 00:23:06.249 20:36:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:23:06.249 20:36:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 1454717 ']' 00:23:06.249 20:36:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:23:06.249 20:36:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:23:06.249 20:36:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:23:06.249 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:23:06.249 20:36:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:23:06.249 20:36:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:23:06.249 [2024-07-15 20:36:58.522363] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
00:23:06.249 [2024-07-15 20:36:58.522433] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1454717 ] 00:23:06.508 [2024-07-15 20:36:58.652984] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:06.509 [2024-07-15 20:36:58.757040] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:23:06.509 [2024-07-15 20:36:58.814996] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:06.509 [2024-07-15 20:36:58.815029] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:07.077 20:36:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:23:07.077 20:36:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:23:07.077 20:36:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:23:07.077 20:36:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:23:07.077 20:36:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:23:07.077 20:36:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:23:07.077 20:36:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:23:07.077 20:36:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:23:07.077 20:36:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:23:07.077 20:36:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:23:07.077 20:36:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:23:07.336 malloc1 00:23:07.336 20:36:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:23:07.595 [2024-07-15 20:36:59.799802] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:23:07.595 [2024-07-15 20:36:59.799851] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:07.595 [2024-07-15 20:36:59.799871] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b06570 00:23:07.595 [2024-07-15 20:36:59.799883] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:07.595 [2024-07-15 20:36:59.801496] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:07.595 [2024-07-15 20:36:59.801524] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:23:07.595 pt1 00:23:07.595 20:36:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:23:07.595 20:36:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:23:07.595 20:36:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:23:07.595 20:36:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:23:07.595 20:36:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:23:07.595 20:36:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:23:07.595 20:36:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:23:07.595 20:36:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:23:07.595 20:36:59 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:23:07.854 malloc2 00:23:07.854 20:37:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:23:08.113 [2024-07-15 20:37:00.293933] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:23:08.113 [2024-07-15 20:37:00.293981] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:08.113 [2024-07-15 20:37:00.293999] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b07970 00:23:08.113 [2024-07-15 20:37:00.294011] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:08.113 [2024-07-15 20:37:00.295496] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:08.113 [2024-07-15 20:37:00.295522] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:23:08.113 pt2 00:23:08.113 20:37:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:23:08.113 20:37:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:23:08.113 20:37:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:23:08.113 20:37:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:23:08.113 20:37:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:23:08.113 20:37:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:23:08.113 20:37:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:23:08.113 20:37:00 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:23:08.113 20:37:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:23:08.373 malloc3 00:23:08.373 20:37:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:23:08.690 [2024-07-15 20:37:00.795868] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:23:08.690 [2024-07-15 20:37:00.795915] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:08.690 [2024-07-15 20:37:00.795941] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c9e340 00:23:08.690 [2024-07-15 20:37:00.795954] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:08.690 [2024-07-15 20:37:00.797355] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:08.690 [2024-07-15 20:37:00.797384] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:23:08.690 pt3 00:23:08.690 20:37:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:23:08.690 20:37:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:23:08.690 20:37:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc4 00:23:08.690 20:37:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt4 00:23:08.690 20:37:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:23:08.690 20:37:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:23:08.690 
20:37:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:23:08.690 20:37:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:23:08.690 20:37:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:23:08.690 malloc4 00:23:08.948 20:37:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:23:08.948 [2024-07-15 20:37:01.293686] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:23:08.948 [2024-07-15 20:37:01.293731] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:08.948 [2024-07-15 20:37:01.293749] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ca0c60 00:23:08.948 [2024-07-15 20:37:01.293762] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:08.948 [2024-07-15 20:37:01.295138] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:08.948 [2024-07-15 20:37:01.295164] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:23:08.948 pt4 00:23:08.948 20:37:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:23:08.948 20:37:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:23:08.948 20:37:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:23:09.207 [2024-07-15 20:37:01.534365] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 
00:23:09.207 [2024-07-15 20:37:01.535525] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:23:09.207 [2024-07-15 20:37:01.535579] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:23:09.207 [2024-07-15 20:37:01.535622] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:23:09.207 [2024-07-15 20:37:01.535785] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1afe530 00:23:09.207 [2024-07-15 20:37:01.535796] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:23:09.207 [2024-07-15 20:37:01.535983] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1afc770 00:23:09.207 [2024-07-15 20:37:01.536129] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1afe530 00:23:09.207 [2024-07-15 20:37:01.536140] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1afe530 00:23:09.207 [2024-07-15 20:37:01.536228] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:09.207 20:37:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:23:09.207 20:37:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:09.207 20:37:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:09.207 20:37:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:09.207 20:37:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:09.207 20:37:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:09.207 20:37:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:09.207 20:37:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local 
num_base_bdevs 00:23:09.207 20:37:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:09.207 20:37:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:09.207 20:37:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:09.207 20:37:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:09.466 20:37:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:09.466 "name": "raid_bdev1", 00:23:09.466 "uuid": "f1c94521-4807-42e5-811c-5e21c70468a1", 00:23:09.466 "strip_size_kb": 0, 00:23:09.466 "state": "online", 00:23:09.466 "raid_level": "raid1", 00:23:09.466 "superblock": true, 00:23:09.466 "num_base_bdevs": 4, 00:23:09.466 "num_base_bdevs_discovered": 4, 00:23:09.466 "num_base_bdevs_operational": 4, 00:23:09.466 "base_bdevs_list": [ 00:23:09.466 { 00:23:09.466 "name": "pt1", 00:23:09.466 "uuid": "00000000-0000-0000-0000-000000000001", 00:23:09.466 "is_configured": true, 00:23:09.466 "data_offset": 2048, 00:23:09.466 "data_size": 63488 00:23:09.466 }, 00:23:09.466 { 00:23:09.466 "name": "pt2", 00:23:09.466 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:09.466 "is_configured": true, 00:23:09.466 "data_offset": 2048, 00:23:09.466 "data_size": 63488 00:23:09.466 }, 00:23:09.466 { 00:23:09.466 "name": "pt3", 00:23:09.466 "uuid": "00000000-0000-0000-0000-000000000003", 00:23:09.466 "is_configured": true, 00:23:09.466 "data_offset": 2048, 00:23:09.466 "data_size": 63488 00:23:09.466 }, 00:23:09.466 { 00:23:09.466 "name": "pt4", 00:23:09.466 "uuid": "00000000-0000-0000-0000-000000000004", 00:23:09.466 "is_configured": true, 00:23:09.466 "data_offset": 2048, 00:23:09.466 "data_size": 63488 00:23:09.466 } 00:23:09.466 ] 00:23:09.466 }' 00:23:09.466 20:37:01 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:09.466 20:37:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:23:10.033 20:37:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:23:10.033 20:37:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:23:10.033 20:37:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:23:10.033 20:37:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:23:10.033 20:37:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:23:10.033 20:37:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:23:10.291 20:37:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:10.291 20:37:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:23:10.291 [2024-07-15 20:37:02.637585] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:10.291 20:37:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:23:10.291 "name": "raid_bdev1", 00:23:10.292 "aliases": [ 00:23:10.292 "f1c94521-4807-42e5-811c-5e21c70468a1" 00:23:10.292 ], 00:23:10.292 "product_name": "Raid Volume", 00:23:10.292 "block_size": 512, 00:23:10.292 "num_blocks": 63488, 00:23:10.292 "uuid": "f1c94521-4807-42e5-811c-5e21c70468a1", 00:23:10.292 "assigned_rate_limits": { 00:23:10.292 "rw_ios_per_sec": 0, 00:23:10.292 "rw_mbytes_per_sec": 0, 00:23:10.292 "r_mbytes_per_sec": 0, 00:23:10.292 "w_mbytes_per_sec": 0 00:23:10.292 }, 00:23:10.292 "claimed": false, 00:23:10.292 "zoned": false, 00:23:10.292 "supported_io_types": { 00:23:10.292 "read": true, 00:23:10.292 "write": true, 00:23:10.292 
"unmap": false, 00:23:10.292 "flush": false, 00:23:10.292 "reset": true, 00:23:10.292 "nvme_admin": false, 00:23:10.292 "nvme_io": false, 00:23:10.292 "nvme_io_md": false, 00:23:10.292 "write_zeroes": true, 00:23:10.292 "zcopy": false, 00:23:10.292 "get_zone_info": false, 00:23:10.292 "zone_management": false, 00:23:10.292 "zone_append": false, 00:23:10.292 "compare": false, 00:23:10.292 "compare_and_write": false, 00:23:10.292 "abort": false, 00:23:10.292 "seek_hole": false, 00:23:10.292 "seek_data": false, 00:23:10.292 "copy": false, 00:23:10.292 "nvme_iov_md": false 00:23:10.292 }, 00:23:10.292 "memory_domains": [ 00:23:10.292 { 00:23:10.292 "dma_device_id": "system", 00:23:10.292 "dma_device_type": 1 00:23:10.292 }, 00:23:10.292 { 00:23:10.292 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:10.292 "dma_device_type": 2 00:23:10.292 }, 00:23:10.292 { 00:23:10.292 "dma_device_id": "system", 00:23:10.292 "dma_device_type": 1 00:23:10.292 }, 00:23:10.292 { 00:23:10.292 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:10.292 "dma_device_type": 2 00:23:10.292 }, 00:23:10.292 { 00:23:10.292 "dma_device_id": "system", 00:23:10.292 "dma_device_type": 1 00:23:10.292 }, 00:23:10.292 { 00:23:10.292 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:10.292 "dma_device_type": 2 00:23:10.292 }, 00:23:10.292 { 00:23:10.292 "dma_device_id": "system", 00:23:10.292 "dma_device_type": 1 00:23:10.292 }, 00:23:10.292 { 00:23:10.292 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:10.292 "dma_device_type": 2 00:23:10.292 } 00:23:10.292 ], 00:23:10.292 "driver_specific": { 00:23:10.292 "raid": { 00:23:10.292 "uuid": "f1c94521-4807-42e5-811c-5e21c70468a1", 00:23:10.292 "strip_size_kb": 0, 00:23:10.292 "state": "online", 00:23:10.292 "raid_level": "raid1", 00:23:10.292 "superblock": true, 00:23:10.292 "num_base_bdevs": 4, 00:23:10.292 "num_base_bdevs_discovered": 4, 00:23:10.292 "num_base_bdevs_operational": 4, 00:23:10.292 "base_bdevs_list": [ 00:23:10.292 { 00:23:10.292 "name": "pt1", 
00:23:10.292 "uuid": "00000000-0000-0000-0000-000000000001", 00:23:10.292 "is_configured": true, 00:23:10.292 "data_offset": 2048, 00:23:10.292 "data_size": 63488 00:23:10.292 }, 00:23:10.292 { 00:23:10.292 "name": "pt2", 00:23:10.292 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:10.292 "is_configured": true, 00:23:10.292 "data_offset": 2048, 00:23:10.292 "data_size": 63488 00:23:10.292 }, 00:23:10.292 { 00:23:10.292 "name": "pt3", 00:23:10.292 "uuid": "00000000-0000-0000-0000-000000000003", 00:23:10.292 "is_configured": true, 00:23:10.292 "data_offset": 2048, 00:23:10.292 "data_size": 63488 00:23:10.292 }, 00:23:10.292 { 00:23:10.292 "name": "pt4", 00:23:10.292 "uuid": "00000000-0000-0000-0000-000000000004", 00:23:10.292 "is_configured": true, 00:23:10.292 "data_offset": 2048, 00:23:10.292 "data_size": 63488 00:23:10.292 } 00:23:10.292 ] 00:23:10.292 } 00:23:10.292 } 00:23:10.292 }' 00:23:10.292 20:37:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:23:10.550 20:37:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:23:10.551 pt2 00:23:10.551 pt3 00:23:10.551 pt4' 00:23:10.551 20:37:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:10.551 20:37:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:23:10.551 20:37:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:10.809 20:37:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:10.809 "name": "pt1", 00:23:10.809 "aliases": [ 00:23:10.809 "00000000-0000-0000-0000-000000000001" 00:23:10.809 ], 00:23:10.809 "product_name": "passthru", 00:23:10.809 "block_size": 512, 00:23:10.809 "num_blocks": 65536, 00:23:10.809 "uuid": 
"00000000-0000-0000-0000-000000000001", 00:23:10.809 "assigned_rate_limits": { 00:23:10.809 "rw_ios_per_sec": 0, 00:23:10.809 "rw_mbytes_per_sec": 0, 00:23:10.809 "r_mbytes_per_sec": 0, 00:23:10.809 "w_mbytes_per_sec": 0 00:23:10.809 }, 00:23:10.809 "claimed": true, 00:23:10.809 "claim_type": "exclusive_write", 00:23:10.809 "zoned": false, 00:23:10.809 "supported_io_types": { 00:23:10.809 "read": true, 00:23:10.809 "write": true, 00:23:10.809 "unmap": true, 00:23:10.809 "flush": true, 00:23:10.809 "reset": true, 00:23:10.809 "nvme_admin": false, 00:23:10.809 "nvme_io": false, 00:23:10.809 "nvme_io_md": false, 00:23:10.809 "write_zeroes": true, 00:23:10.809 "zcopy": true, 00:23:10.809 "get_zone_info": false, 00:23:10.809 "zone_management": false, 00:23:10.809 "zone_append": false, 00:23:10.809 "compare": false, 00:23:10.809 "compare_and_write": false, 00:23:10.809 "abort": true, 00:23:10.809 "seek_hole": false, 00:23:10.809 "seek_data": false, 00:23:10.809 "copy": true, 00:23:10.809 "nvme_iov_md": false 00:23:10.809 }, 00:23:10.809 "memory_domains": [ 00:23:10.809 { 00:23:10.809 "dma_device_id": "system", 00:23:10.809 "dma_device_type": 1 00:23:10.809 }, 00:23:10.809 { 00:23:10.809 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:10.809 "dma_device_type": 2 00:23:10.809 } 00:23:10.809 ], 00:23:10.809 "driver_specific": { 00:23:10.809 "passthru": { 00:23:10.809 "name": "pt1", 00:23:10.809 "base_bdev_name": "malloc1" 00:23:10.809 } 00:23:10.809 } 00:23:10.809 }' 00:23:10.809 20:37:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:10.809 20:37:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:10.809 20:37:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:10.809 20:37:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:10.809 20:37:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:10.809 20:37:03 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:10.809 20:37:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:10.809 20:37:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:11.072 20:37:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:11.072 20:37:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:11.072 20:37:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:11.072 20:37:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:11.072 20:37:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:11.072 20:37:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:23:11.072 20:37:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:11.331 20:37:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:11.331 "name": "pt2", 00:23:11.331 "aliases": [ 00:23:11.331 "00000000-0000-0000-0000-000000000002" 00:23:11.331 ], 00:23:11.331 "product_name": "passthru", 00:23:11.331 "block_size": 512, 00:23:11.331 "num_blocks": 65536, 00:23:11.331 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:11.331 "assigned_rate_limits": { 00:23:11.331 "rw_ios_per_sec": 0, 00:23:11.331 "rw_mbytes_per_sec": 0, 00:23:11.331 "r_mbytes_per_sec": 0, 00:23:11.331 "w_mbytes_per_sec": 0 00:23:11.331 }, 00:23:11.331 "claimed": true, 00:23:11.331 "claim_type": "exclusive_write", 00:23:11.331 "zoned": false, 00:23:11.331 "supported_io_types": { 00:23:11.331 "read": true, 00:23:11.331 "write": true, 00:23:11.331 "unmap": true, 00:23:11.331 "flush": true, 00:23:11.331 "reset": true, 00:23:11.331 "nvme_admin": false, 00:23:11.331 
"nvme_io": false, 00:23:11.331 "nvme_io_md": false, 00:23:11.331 "write_zeroes": true, 00:23:11.331 "zcopy": true, 00:23:11.331 "get_zone_info": false, 00:23:11.331 "zone_management": false, 00:23:11.331 "zone_append": false, 00:23:11.331 "compare": false, 00:23:11.331 "compare_and_write": false, 00:23:11.331 "abort": true, 00:23:11.331 "seek_hole": false, 00:23:11.331 "seek_data": false, 00:23:11.331 "copy": true, 00:23:11.331 "nvme_iov_md": false 00:23:11.331 }, 00:23:11.331 "memory_domains": [ 00:23:11.331 { 00:23:11.331 "dma_device_id": "system", 00:23:11.331 "dma_device_type": 1 00:23:11.331 }, 00:23:11.331 { 00:23:11.331 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:11.331 "dma_device_type": 2 00:23:11.331 } 00:23:11.331 ], 00:23:11.331 "driver_specific": { 00:23:11.331 "passthru": { 00:23:11.331 "name": "pt2", 00:23:11.331 "base_bdev_name": "malloc2" 00:23:11.331 } 00:23:11.331 } 00:23:11.331 }' 00:23:11.331 20:37:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:11.331 20:37:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:11.331 20:37:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:11.331 20:37:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:11.331 20:37:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:11.590 20:37:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:11.591 20:37:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:11.591 20:37:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:11.591 20:37:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:11.591 20:37:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:11.591 20:37:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:23:11.591 20:37:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:11.591 20:37:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:11.591 20:37:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:23:11.591 20:37:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:11.850 20:37:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:11.850 "name": "pt3", 00:23:11.850 "aliases": [ 00:23:11.850 "00000000-0000-0000-0000-000000000003" 00:23:11.850 ], 00:23:11.850 "product_name": "passthru", 00:23:11.850 "block_size": 512, 00:23:11.850 "num_blocks": 65536, 00:23:11.850 "uuid": "00000000-0000-0000-0000-000000000003", 00:23:11.850 "assigned_rate_limits": { 00:23:11.850 "rw_ios_per_sec": 0, 00:23:11.850 "rw_mbytes_per_sec": 0, 00:23:11.850 "r_mbytes_per_sec": 0, 00:23:11.850 "w_mbytes_per_sec": 0 00:23:11.850 }, 00:23:11.850 "claimed": true, 00:23:11.850 "claim_type": "exclusive_write", 00:23:11.850 "zoned": false, 00:23:11.850 "supported_io_types": { 00:23:11.850 "read": true, 00:23:11.850 "write": true, 00:23:11.850 "unmap": true, 00:23:11.850 "flush": true, 00:23:11.850 "reset": true, 00:23:11.850 "nvme_admin": false, 00:23:11.850 "nvme_io": false, 00:23:11.850 "nvme_io_md": false, 00:23:11.850 "write_zeroes": true, 00:23:11.850 "zcopy": true, 00:23:11.850 "get_zone_info": false, 00:23:11.850 "zone_management": false, 00:23:11.850 "zone_append": false, 00:23:11.850 "compare": false, 00:23:11.850 "compare_and_write": false, 00:23:11.850 "abort": true, 00:23:11.850 "seek_hole": false, 00:23:11.850 "seek_data": false, 00:23:11.850 "copy": true, 00:23:11.850 "nvme_iov_md": false 00:23:11.850 }, 00:23:11.850 "memory_domains": [ 00:23:11.850 { 00:23:11.850 "dma_device_id": "system", 00:23:11.850 
"dma_device_type": 1 00:23:11.850 }, 00:23:11.850 { 00:23:11.850 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:11.850 "dma_device_type": 2 00:23:11.850 } 00:23:11.850 ], 00:23:11.850 "driver_specific": { 00:23:11.850 "passthru": { 00:23:11.850 "name": "pt3", 00:23:11.850 "base_bdev_name": "malloc3" 00:23:11.850 } 00:23:11.850 } 00:23:11.850 }' 00:23:11.850 20:37:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:11.850 20:37:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:12.109 20:37:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:12.109 20:37:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:12.109 20:37:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:12.109 20:37:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:12.109 20:37:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:12.109 20:37:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:12.109 20:37:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:12.109 20:37:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:12.109 20:37:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:12.368 20:37:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:12.368 20:37:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:12.368 20:37:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:23:12.368 20:37:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:12.627 20:37:04 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:12.627 "name": "pt4", 00:23:12.627 "aliases": [ 00:23:12.627 "00000000-0000-0000-0000-000000000004" 00:23:12.627 ], 00:23:12.627 "product_name": "passthru", 00:23:12.627 "block_size": 512, 00:23:12.627 "num_blocks": 65536, 00:23:12.627 "uuid": "00000000-0000-0000-0000-000000000004", 00:23:12.627 "assigned_rate_limits": { 00:23:12.627 "rw_ios_per_sec": 0, 00:23:12.627 "rw_mbytes_per_sec": 0, 00:23:12.627 "r_mbytes_per_sec": 0, 00:23:12.627 "w_mbytes_per_sec": 0 00:23:12.627 }, 00:23:12.627 "claimed": true, 00:23:12.627 "claim_type": "exclusive_write", 00:23:12.627 "zoned": false, 00:23:12.627 "supported_io_types": { 00:23:12.627 "read": true, 00:23:12.627 "write": true, 00:23:12.627 "unmap": true, 00:23:12.627 "flush": true, 00:23:12.627 "reset": true, 00:23:12.627 "nvme_admin": false, 00:23:12.627 "nvme_io": false, 00:23:12.627 "nvme_io_md": false, 00:23:12.627 "write_zeroes": true, 00:23:12.627 "zcopy": true, 00:23:12.627 "get_zone_info": false, 00:23:12.627 "zone_management": false, 00:23:12.627 "zone_append": false, 00:23:12.627 "compare": false, 00:23:12.627 "compare_and_write": false, 00:23:12.627 "abort": true, 00:23:12.627 "seek_hole": false, 00:23:12.627 "seek_data": false, 00:23:12.627 "copy": true, 00:23:12.627 "nvme_iov_md": false 00:23:12.627 }, 00:23:12.627 "memory_domains": [ 00:23:12.627 { 00:23:12.627 "dma_device_id": "system", 00:23:12.627 "dma_device_type": 1 00:23:12.627 }, 00:23:12.627 { 00:23:12.627 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:12.627 "dma_device_type": 2 00:23:12.627 } 00:23:12.627 ], 00:23:12.627 "driver_specific": { 00:23:12.627 "passthru": { 00:23:12.627 "name": "pt4", 00:23:12.627 "base_bdev_name": "malloc4" 00:23:12.627 } 00:23:12.627 } 00:23:12.627 }' 00:23:12.627 20:37:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:12.627 20:37:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:12.627 20:37:04 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:12.627 20:37:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:12.627 20:37:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:12.627 20:37:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:12.627 20:37:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:12.627 20:37:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:12.886 20:37:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:12.886 20:37:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:12.886 20:37:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:12.886 20:37:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:12.886 20:37:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:23:12.886 20:37:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:13.144 [2024-07-15 20:37:05.348740] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:13.144 20:37:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=f1c94521-4807-42e5-811c-5e21c70468a1 00:23:13.144 20:37:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z f1c94521-4807-42e5-811c-5e21c70468a1 ']' 00:23:13.144 20:37:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:13.418 [2024-07-15 20:37:05.601100] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:13.418 
[2024-07-15 20:37:05.601123] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:13.418 [2024-07-15 20:37:05.601173] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:13.418 [2024-07-15 20:37:05.601258] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:13.418 [2024-07-15 20:37:05.601271] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1afe530 name raid_bdev1, state offline 00:23:13.418 20:37:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:13.418 20:37:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:23:13.677 20:37:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:23:13.677 20:37:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:23:13.677 20:37:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:23:13.677 20:37:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:23:13.934 20:37:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:23:13.934 20:37:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:23:14.192 20:37:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:23:14.192 20:37:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:23:14.450 20:37:06 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:23:14.450 20:37:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:23:14.450 20:37:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:23:14.708 20:37:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:23:14.708 20:37:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:23:14.708 20:37:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:23:14.708 20:37:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:23:14.708 20:37:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:23:14.708 20:37:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:14.708 20:37:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:23:14.708 20:37:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:14.708 20:37:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:23:14.708 20:37:07 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:14.708 20:37:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:23:14.708 20:37:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:14.708 20:37:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:23:14.708 20:37:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:23:14.967 [2024-07-15 20:37:07.305544] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:23:14.967 [2024-07-15 20:37:07.306953] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:23:14.967 [2024-07-15 20:37:07.307000] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:23:14.967 [2024-07-15 20:37:07.307035] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:23:14.967 [2024-07-15 20:37:07.307082] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:23:14.967 [2024-07-15 20:37:07.307122] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:23:14.967 [2024-07-15 20:37:07.307146] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:23:14.967 [2024-07-15 20:37:07.307174] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:23:14.967 [2024-07-15 
20:37:07.307193] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:14.967 [2024-07-15 20:37:07.307204] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ca9ff0 name raid_bdev1, state configuring 00:23:14.967 request: 00:23:14.967 { 00:23:14.967 "name": "raid_bdev1", 00:23:14.967 "raid_level": "raid1", 00:23:14.967 "base_bdevs": [ 00:23:14.967 "malloc1", 00:23:14.967 "malloc2", 00:23:14.967 "malloc3", 00:23:14.967 "malloc4" 00:23:14.967 ], 00:23:14.967 "superblock": false, 00:23:14.967 "method": "bdev_raid_create", 00:23:14.967 "req_id": 1 00:23:14.967 } 00:23:14.967 Got JSON-RPC error response 00:23:14.967 response: 00:23:14.967 { 00:23:14.967 "code": -17, 00:23:14.967 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:23:14.967 } 00:23:14.967 20:37:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:23:14.967 20:37:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:23:14.967 20:37:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:23:14.967 20:37:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:23:14.967 20:37:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:14.967 20:37:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:23:15.226 20:37:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:23:15.226 20:37:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:23:15.226 20:37:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:23:15.485 [2024-07-15 20:37:07.734620] 
vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:23:15.485 [2024-07-15 20:37:07.734666] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:15.485 [2024-07-15 20:37:07.734687] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b067a0 00:23:15.485 [2024-07-15 20:37:07.734699] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:15.485 [2024-07-15 20:37:07.736311] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:15.485 [2024-07-15 20:37:07.736340] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:23:15.485 [2024-07-15 20:37:07.736407] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:23:15.485 [2024-07-15 20:37:07.736434] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:23:15.485 pt1 00:23:15.485 20:37:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 4 00:23:15.485 20:37:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:15.485 20:37:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:15.485 20:37:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:15.485 20:37:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:15.485 20:37:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:15.485 20:37:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:15.485 20:37:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:15.485 20:37:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:15.485 20:37:07 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:15.485 20:37:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:15.485 20:37:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:15.743 20:37:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:15.743 "name": "raid_bdev1", 00:23:15.743 "uuid": "f1c94521-4807-42e5-811c-5e21c70468a1", 00:23:15.743 "strip_size_kb": 0, 00:23:15.743 "state": "configuring", 00:23:15.743 "raid_level": "raid1", 00:23:15.743 "superblock": true, 00:23:15.743 "num_base_bdevs": 4, 00:23:15.743 "num_base_bdevs_discovered": 1, 00:23:15.743 "num_base_bdevs_operational": 4, 00:23:15.743 "base_bdevs_list": [ 00:23:15.743 { 00:23:15.743 "name": "pt1", 00:23:15.743 "uuid": "00000000-0000-0000-0000-000000000001", 00:23:15.743 "is_configured": true, 00:23:15.743 "data_offset": 2048, 00:23:15.743 "data_size": 63488 00:23:15.743 }, 00:23:15.743 { 00:23:15.743 "name": null, 00:23:15.743 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:15.743 "is_configured": false, 00:23:15.743 "data_offset": 2048, 00:23:15.743 "data_size": 63488 00:23:15.743 }, 00:23:15.743 { 00:23:15.743 "name": null, 00:23:15.743 "uuid": "00000000-0000-0000-0000-000000000003", 00:23:15.743 "is_configured": false, 00:23:15.743 "data_offset": 2048, 00:23:15.743 "data_size": 63488 00:23:15.743 }, 00:23:15.743 { 00:23:15.743 "name": null, 00:23:15.743 "uuid": "00000000-0000-0000-0000-000000000004", 00:23:15.743 "is_configured": false, 00:23:15.743 "data_offset": 2048, 00:23:15.743 "data_size": 63488 00:23:15.743 } 00:23:15.743 ] 00:23:15.743 }' 00:23:15.743 20:37:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:15.743 20:37:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set 
+x 00:23:16.310 20:37:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 4 -gt 2 ']' 00:23:16.310 20:37:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:23:16.568 [2024-07-15 20:37:08.829557] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:23:16.569 [2024-07-15 20:37:08.829611] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:16.569 [2024-07-15 20:37:08.829632] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c9f940 00:23:16.569 [2024-07-15 20:37:08.829645] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:16.569 [2024-07-15 20:37:08.830027] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:16.569 [2024-07-15 20:37:08.830045] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:23:16.569 [2024-07-15 20:37:08.830113] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:23:16.569 [2024-07-15 20:37:08.830133] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:23:16.569 pt2 00:23:16.569 20:37:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:23:16.827 [2024-07-15 20:37:09.070188] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:23:16.827 20:37:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 4 00:23:16.827 20:37:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:16.827 20:37:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local 
expected_state=configuring 00:23:16.827 20:37:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:16.827 20:37:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:16.827 20:37:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:16.827 20:37:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:16.827 20:37:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:16.827 20:37:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:16.827 20:37:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:16.827 20:37:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:16.827 20:37:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:17.086 20:37:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:17.086 "name": "raid_bdev1", 00:23:17.086 "uuid": "f1c94521-4807-42e5-811c-5e21c70468a1", 00:23:17.086 "strip_size_kb": 0, 00:23:17.086 "state": "configuring", 00:23:17.086 "raid_level": "raid1", 00:23:17.086 "superblock": true, 00:23:17.086 "num_base_bdevs": 4, 00:23:17.086 "num_base_bdevs_discovered": 1, 00:23:17.086 "num_base_bdevs_operational": 4, 00:23:17.086 "base_bdevs_list": [ 00:23:17.086 { 00:23:17.086 "name": "pt1", 00:23:17.086 "uuid": "00000000-0000-0000-0000-000000000001", 00:23:17.086 "is_configured": true, 00:23:17.086 "data_offset": 2048, 00:23:17.086 "data_size": 63488 00:23:17.086 }, 00:23:17.086 { 00:23:17.086 "name": null, 00:23:17.086 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:17.086 "is_configured": false, 00:23:17.086 "data_offset": 2048, 00:23:17.086 
"data_size": 63488 00:23:17.086 }, 00:23:17.086 { 00:23:17.086 "name": null, 00:23:17.086 "uuid": "00000000-0000-0000-0000-000000000003", 00:23:17.086 "is_configured": false, 00:23:17.086 "data_offset": 2048, 00:23:17.086 "data_size": 63488 00:23:17.086 }, 00:23:17.086 { 00:23:17.086 "name": null, 00:23:17.086 "uuid": "00000000-0000-0000-0000-000000000004", 00:23:17.086 "is_configured": false, 00:23:17.086 "data_offset": 2048, 00:23:17.086 "data_size": 63488 00:23:17.086 } 00:23:17.086 ] 00:23:17.086 }' 00:23:17.086 20:37:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:17.086 20:37:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:23:17.653 20:37:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:23:17.653 20:37:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:23:17.653 20:37:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:23:17.913 [2024-07-15 20:37:10.173102] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:23:17.913 [2024-07-15 20:37:10.173158] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:17.913 [2024-07-15 20:37:10.173176] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1afd060 00:23:17.913 [2024-07-15 20:37:10.173189] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:17.913 [2024-07-15 20:37:10.173548] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:17.913 [2024-07-15 20:37:10.173566] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:23:17.913 [2024-07-15 20:37:10.173630] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev 
pt2 00:23:17.913 [2024-07-15 20:37:10.173649] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:23:17.913 pt2 00:23:17.913 20:37:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:23:17.913 20:37:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:23:17.913 20:37:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:23:18.171 [2024-07-15 20:37:10.421761] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:23:18.171 [2024-07-15 20:37:10.421787] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:18.171 [2024-07-15 20:37:10.421805] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1aff8d0 00:23:18.171 [2024-07-15 20:37:10.421816] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:18.171 [2024-07-15 20:37:10.422094] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:18.171 [2024-07-15 20:37:10.422112] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:23:18.171 [2024-07-15 20:37:10.422158] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:23:18.171 [2024-07-15 20:37:10.422174] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:23:18.171 pt3 00:23:18.171 20:37:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:23:18.171 20:37:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:23:18.171 20:37:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 
00000000-0000-0000-0000-000000000004 00:23:18.430 [2024-07-15 20:37:10.670424] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:23:18.430 [2024-07-15 20:37:10.670465] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:18.430 [2024-07-15 20:37:10.670486] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b00b80 00:23:18.430 [2024-07-15 20:37:10.670498] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:18.430 [2024-07-15 20:37:10.670789] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:18.430 [2024-07-15 20:37:10.670805] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:23:18.430 [2024-07-15 20:37:10.670854] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:23:18.431 [2024-07-15 20:37:10.670872] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:23:18.431 [2024-07-15 20:37:10.670998] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1afd780 00:23:18.431 [2024-07-15 20:37:10.671009] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:23:18.431 [2024-07-15 20:37:10.671182] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b02fa0 00:23:18.431 [2024-07-15 20:37:10.671317] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1afd780 00:23:18.431 [2024-07-15 20:37:10.671327] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1afd780 00:23:18.431 [2024-07-15 20:37:10.671424] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:18.431 pt4 00:23:18.431 20:37:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:23:18.431 20:37:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:23:18.431 20:37:10 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:23:18.431 20:37:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:18.431 20:37:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:18.431 20:37:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:18.431 20:37:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:18.431 20:37:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:18.431 20:37:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:18.431 20:37:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:18.431 20:37:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:18.431 20:37:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:18.431 20:37:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:18.431 20:37:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:18.700 20:37:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:18.700 "name": "raid_bdev1", 00:23:18.700 "uuid": "f1c94521-4807-42e5-811c-5e21c70468a1", 00:23:18.700 "strip_size_kb": 0, 00:23:18.700 "state": "online", 00:23:18.700 "raid_level": "raid1", 00:23:18.700 "superblock": true, 00:23:18.700 "num_base_bdevs": 4, 00:23:18.700 "num_base_bdevs_discovered": 4, 00:23:18.700 "num_base_bdevs_operational": 4, 00:23:18.700 "base_bdevs_list": [ 00:23:18.700 { 00:23:18.700 "name": "pt1", 00:23:18.700 "uuid": "00000000-0000-0000-0000-000000000001", 
00:23:18.700 "is_configured": true, 00:23:18.700 "data_offset": 2048, 00:23:18.700 "data_size": 63488 00:23:18.700 }, 00:23:18.700 { 00:23:18.700 "name": "pt2", 00:23:18.700 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:18.700 "is_configured": true, 00:23:18.700 "data_offset": 2048, 00:23:18.700 "data_size": 63488 00:23:18.700 }, 00:23:18.700 { 00:23:18.700 "name": "pt3", 00:23:18.700 "uuid": "00000000-0000-0000-0000-000000000003", 00:23:18.700 "is_configured": true, 00:23:18.700 "data_offset": 2048, 00:23:18.700 "data_size": 63488 00:23:18.700 }, 00:23:18.700 { 00:23:18.700 "name": "pt4", 00:23:18.700 "uuid": "00000000-0000-0000-0000-000000000004", 00:23:18.700 "is_configured": true, 00:23:18.700 "data_offset": 2048, 00:23:18.700 "data_size": 63488 00:23:18.700 } 00:23:18.700 ] 00:23:18.701 }' 00:23:18.701 20:37:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:18.701 20:37:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:23:19.268 20:37:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:23:19.268 20:37:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:23:19.268 20:37:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:23:19.268 20:37:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:23:19.268 20:37:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:23:19.268 20:37:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:23:19.268 20:37:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:23:19.268 20:37:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:19.528 [2024-07-15 20:37:11.773673] 
bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:19.528 20:37:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:23:19.528 "name": "raid_bdev1", 00:23:19.528 "aliases": [ 00:23:19.528 "f1c94521-4807-42e5-811c-5e21c70468a1" 00:23:19.528 ], 00:23:19.528 "product_name": "Raid Volume", 00:23:19.528 "block_size": 512, 00:23:19.528 "num_blocks": 63488, 00:23:19.528 "uuid": "f1c94521-4807-42e5-811c-5e21c70468a1", 00:23:19.528 "assigned_rate_limits": { 00:23:19.528 "rw_ios_per_sec": 0, 00:23:19.528 "rw_mbytes_per_sec": 0, 00:23:19.528 "r_mbytes_per_sec": 0, 00:23:19.528 "w_mbytes_per_sec": 0 00:23:19.528 }, 00:23:19.528 "claimed": false, 00:23:19.528 "zoned": false, 00:23:19.528 "supported_io_types": { 00:23:19.528 "read": true, 00:23:19.528 "write": true, 00:23:19.528 "unmap": false, 00:23:19.528 "flush": false, 00:23:19.528 "reset": true, 00:23:19.528 "nvme_admin": false, 00:23:19.528 "nvme_io": false, 00:23:19.528 "nvme_io_md": false, 00:23:19.528 "write_zeroes": true, 00:23:19.528 "zcopy": false, 00:23:19.528 "get_zone_info": false, 00:23:19.528 "zone_management": false, 00:23:19.528 "zone_append": false, 00:23:19.528 "compare": false, 00:23:19.528 "compare_and_write": false, 00:23:19.528 "abort": false, 00:23:19.528 "seek_hole": false, 00:23:19.528 "seek_data": false, 00:23:19.528 "copy": false, 00:23:19.528 "nvme_iov_md": false 00:23:19.528 }, 00:23:19.528 "memory_domains": [ 00:23:19.528 { 00:23:19.528 "dma_device_id": "system", 00:23:19.528 "dma_device_type": 1 00:23:19.528 }, 00:23:19.528 { 00:23:19.528 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:19.528 "dma_device_type": 2 00:23:19.528 }, 00:23:19.528 { 00:23:19.528 "dma_device_id": "system", 00:23:19.528 "dma_device_type": 1 00:23:19.528 }, 00:23:19.528 { 00:23:19.528 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:19.528 "dma_device_type": 2 00:23:19.528 }, 00:23:19.528 { 00:23:19.528 "dma_device_id": "system", 00:23:19.528 
"dma_device_type": 1 00:23:19.528 }, 00:23:19.528 { 00:23:19.528 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:19.528 "dma_device_type": 2 00:23:19.528 }, 00:23:19.528 { 00:23:19.528 "dma_device_id": "system", 00:23:19.528 "dma_device_type": 1 00:23:19.528 }, 00:23:19.528 { 00:23:19.528 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:19.528 "dma_device_type": 2 00:23:19.528 } 00:23:19.528 ], 00:23:19.528 "driver_specific": { 00:23:19.528 "raid": { 00:23:19.528 "uuid": "f1c94521-4807-42e5-811c-5e21c70468a1", 00:23:19.528 "strip_size_kb": 0, 00:23:19.528 "state": "online", 00:23:19.528 "raid_level": "raid1", 00:23:19.528 "superblock": true, 00:23:19.528 "num_base_bdevs": 4, 00:23:19.528 "num_base_bdevs_discovered": 4, 00:23:19.528 "num_base_bdevs_operational": 4, 00:23:19.528 "base_bdevs_list": [ 00:23:19.528 { 00:23:19.528 "name": "pt1", 00:23:19.528 "uuid": "00000000-0000-0000-0000-000000000001", 00:23:19.528 "is_configured": true, 00:23:19.528 "data_offset": 2048, 00:23:19.528 "data_size": 63488 00:23:19.528 }, 00:23:19.528 { 00:23:19.528 "name": "pt2", 00:23:19.528 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:19.528 "is_configured": true, 00:23:19.528 "data_offset": 2048, 00:23:19.528 "data_size": 63488 00:23:19.528 }, 00:23:19.528 { 00:23:19.528 "name": "pt3", 00:23:19.528 "uuid": "00000000-0000-0000-0000-000000000003", 00:23:19.528 "is_configured": true, 00:23:19.528 "data_offset": 2048, 00:23:19.528 "data_size": 63488 00:23:19.528 }, 00:23:19.528 { 00:23:19.528 "name": "pt4", 00:23:19.528 "uuid": "00000000-0000-0000-0000-000000000004", 00:23:19.528 "is_configured": true, 00:23:19.528 "data_offset": 2048, 00:23:19.528 "data_size": 63488 00:23:19.528 } 00:23:19.528 ] 00:23:19.528 } 00:23:19.528 } 00:23:19.528 }' 00:23:19.528 20:37:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:23:19.528 20:37:11 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:23:19.528 pt2 00:23:19.528 pt3 00:23:19.528 pt4' 00:23:19.528 20:37:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:19.528 20:37:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:23:19.528 20:37:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:19.787 20:37:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:19.787 "name": "pt1", 00:23:19.787 "aliases": [ 00:23:19.787 "00000000-0000-0000-0000-000000000001" 00:23:19.787 ], 00:23:19.787 "product_name": "passthru", 00:23:19.787 "block_size": 512, 00:23:19.787 "num_blocks": 65536, 00:23:19.787 "uuid": "00000000-0000-0000-0000-000000000001", 00:23:19.787 "assigned_rate_limits": { 00:23:19.787 "rw_ios_per_sec": 0, 00:23:19.787 "rw_mbytes_per_sec": 0, 00:23:19.787 "r_mbytes_per_sec": 0, 00:23:19.787 "w_mbytes_per_sec": 0 00:23:19.787 }, 00:23:19.787 "claimed": true, 00:23:19.787 "claim_type": "exclusive_write", 00:23:19.787 "zoned": false, 00:23:19.787 "supported_io_types": { 00:23:19.787 "read": true, 00:23:19.787 "write": true, 00:23:19.787 "unmap": true, 00:23:19.787 "flush": true, 00:23:19.787 "reset": true, 00:23:19.787 "nvme_admin": false, 00:23:19.787 "nvme_io": false, 00:23:19.787 "nvme_io_md": false, 00:23:19.787 "write_zeroes": true, 00:23:19.787 "zcopy": true, 00:23:19.787 "get_zone_info": false, 00:23:19.787 "zone_management": false, 00:23:19.787 "zone_append": false, 00:23:19.787 "compare": false, 00:23:19.787 "compare_and_write": false, 00:23:19.787 "abort": true, 00:23:19.787 "seek_hole": false, 00:23:19.787 "seek_data": false, 00:23:19.787 "copy": true, 00:23:19.787 "nvme_iov_md": false 00:23:19.787 }, 00:23:19.787 "memory_domains": [ 00:23:19.787 { 00:23:19.787 "dma_device_id": "system", 00:23:19.787 
"dma_device_type": 1 00:23:19.787 }, 00:23:19.787 { 00:23:19.787 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:19.787 "dma_device_type": 2 00:23:19.787 } 00:23:19.787 ], 00:23:19.787 "driver_specific": { 00:23:19.788 "passthru": { 00:23:19.788 "name": "pt1", 00:23:19.788 "base_bdev_name": "malloc1" 00:23:19.788 } 00:23:19.788 } 00:23:19.788 }' 00:23:19.788 20:37:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:19.788 20:37:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:20.047 20:37:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:20.047 20:37:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:20.047 20:37:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:20.047 20:37:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:20.047 20:37:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:20.047 20:37:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:20.047 20:37:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:20.047 20:37:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:20.047 20:37:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:20.305 20:37:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:20.305 20:37:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:20.305 20:37:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:23:20.305 20:37:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:20.564 20:37:12 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:20.564 "name": "pt2", 00:23:20.564 "aliases": [ 00:23:20.564 "00000000-0000-0000-0000-000000000002" 00:23:20.564 ], 00:23:20.564 "product_name": "passthru", 00:23:20.564 "block_size": 512, 00:23:20.564 "num_blocks": 65536, 00:23:20.564 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:20.564 "assigned_rate_limits": { 00:23:20.564 "rw_ios_per_sec": 0, 00:23:20.564 "rw_mbytes_per_sec": 0, 00:23:20.564 "r_mbytes_per_sec": 0, 00:23:20.564 "w_mbytes_per_sec": 0 00:23:20.564 }, 00:23:20.564 "claimed": true, 00:23:20.564 "claim_type": "exclusive_write", 00:23:20.564 "zoned": false, 00:23:20.564 "supported_io_types": { 00:23:20.564 "read": true, 00:23:20.564 "write": true, 00:23:20.564 "unmap": true, 00:23:20.564 "flush": true, 00:23:20.564 "reset": true, 00:23:20.564 "nvme_admin": false, 00:23:20.564 "nvme_io": false, 00:23:20.564 "nvme_io_md": false, 00:23:20.564 "write_zeroes": true, 00:23:20.564 "zcopy": true, 00:23:20.564 "get_zone_info": false, 00:23:20.564 "zone_management": false, 00:23:20.564 "zone_append": false, 00:23:20.564 "compare": false, 00:23:20.564 "compare_and_write": false, 00:23:20.564 "abort": true, 00:23:20.564 "seek_hole": false, 00:23:20.564 "seek_data": false, 00:23:20.564 "copy": true, 00:23:20.564 "nvme_iov_md": false 00:23:20.564 }, 00:23:20.564 "memory_domains": [ 00:23:20.564 { 00:23:20.564 "dma_device_id": "system", 00:23:20.564 "dma_device_type": 1 00:23:20.564 }, 00:23:20.564 { 00:23:20.564 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:20.564 "dma_device_type": 2 00:23:20.564 } 00:23:20.564 ], 00:23:20.564 "driver_specific": { 00:23:20.564 "passthru": { 00:23:20.564 "name": "pt2", 00:23:20.564 "base_bdev_name": "malloc2" 00:23:20.564 } 00:23:20.564 } 00:23:20.564 }' 00:23:20.564 20:37:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:20.564 20:37:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:20.564 20:37:12 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:20.564 20:37:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:20.564 20:37:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:20.564 20:37:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:20.564 20:37:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:20.564 20:37:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:20.822 20:37:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:20.822 20:37:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:20.822 20:37:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:20.822 20:37:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:20.822 20:37:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:20.822 20:37:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:23:20.822 20:37:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:21.080 20:37:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:21.080 "name": "pt3", 00:23:21.080 "aliases": [ 00:23:21.080 "00000000-0000-0000-0000-000000000003" 00:23:21.080 ], 00:23:21.080 "product_name": "passthru", 00:23:21.080 "block_size": 512, 00:23:21.080 "num_blocks": 65536, 00:23:21.080 "uuid": "00000000-0000-0000-0000-000000000003", 00:23:21.080 "assigned_rate_limits": { 00:23:21.080 "rw_ios_per_sec": 0, 00:23:21.080 "rw_mbytes_per_sec": 0, 00:23:21.080 "r_mbytes_per_sec": 0, 00:23:21.080 "w_mbytes_per_sec": 0 00:23:21.080 }, 00:23:21.080 "claimed": true, 00:23:21.080 
"claim_type": "exclusive_write", 00:23:21.080 "zoned": false, 00:23:21.080 "supported_io_types": { 00:23:21.080 "read": true, 00:23:21.080 "write": true, 00:23:21.080 "unmap": true, 00:23:21.080 "flush": true, 00:23:21.080 "reset": true, 00:23:21.080 "nvme_admin": false, 00:23:21.080 "nvme_io": false, 00:23:21.080 "nvme_io_md": false, 00:23:21.080 "write_zeroes": true, 00:23:21.080 "zcopy": true, 00:23:21.080 "get_zone_info": false, 00:23:21.080 "zone_management": false, 00:23:21.080 "zone_append": false, 00:23:21.080 "compare": false, 00:23:21.080 "compare_and_write": false, 00:23:21.080 "abort": true, 00:23:21.080 "seek_hole": false, 00:23:21.080 "seek_data": false, 00:23:21.080 "copy": true, 00:23:21.080 "nvme_iov_md": false 00:23:21.080 }, 00:23:21.081 "memory_domains": [ 00:23:21.081 { 00:23:21.081 "dma_device_id": "system", 00:23:21.081 "dma_device_type": 1 00:23:21.081 }, 00:23:21.081 { 00:23:21.081 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:21.081 "dma_device_type": 2 00:23:21.081 } 00:23:21.081 ], 00:23:21.081 "driver_specific": { 00:23:21.081 "passthru": { 00:23:21.081 "name": "pt3", 00:23:21.081 "base_bdev_name": "malloc3" 00:23:21.081 } 00:23:21.081 } 00:23:21.081 }' 00:23:21.081 20:37:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:21.081 20:37:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:21.081 20:37:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:21.081 20:37:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:21.081 20:37:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:21.339 20:37:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:21.339 20:37:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:21.339 20:37:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 
00:23:21.339 20:37:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:21.339 20:37:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:21.339 20:37:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:21.339 20:37:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:21.339 20:37:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:21.339 20:37:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:23:21.339 20:37:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:21.599 20:37:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:21.599 "name": "pt4", 00:23:21.599 "aliases": [ 00:23:21.599 "00000000-0000-0000-0000-000000000004" 00:23:21.599 ], 00:23:21.599 "product_name": "passthru", 00:23:21.599 "block_size": 512, 00:23:21.599 "num_blocks": 65536, 00:23:21.599 "uuid": "00000000-0000-0000-0000-000000000004", 00:23:21.599 "assigned_rate_limits": { 00:23:21.599 "rw_ios_per_sec": 0, 00:23:21.599 "rw_mbytes_per_sec": 0, 00:23:21.599 "r_mbytes_per_sec": 0, 00:23:21.599 "w_mbytes_per_sec": 0 00:23:21.599 }, 00:23:21.599 "claimed": true, 00:23:21.599 "claim_type": "exclusive_write", 00:23:21.599 "zoned": false, 00:23:21.599 "supported_io_types": { 00:23:21.599 "read": true, 00:23:21.599 "write": true, 00:23:21.599 "unmap": true, 00:23:21.599 "flush": true, 00:23:21.599 "reset": true, 00:23:21.599 "nvme_admin": false, 00:23:21.599 "nvme_io": false, 00:23:21.599 "nvme_io_md": false, 00:23:21.599 "write_zeroes": true, 00:23:21.599 "zcopy": true, 00:23:21.599 "get_zone_info": false, 00:23:21.599 "zone_management": false, 00:23:21.599 "zone_append": false, 00:23:21.599 "compare": false, 00:23:21.599 
"compare_and_write": false, 00:23:21.599 "abort": true, 00:23:21.599 "seek_hole": false, 00:23:21.599 "seek_data": false, 00:23:21.599 "copy": true, 00:23:21.599 "nvme_iov_md": false 00:23:21.599 }, 00:23:21.599 "memory_domains": [ 00:23:21.599 { 00:23:21.599 "dma_device_id": "system", 00:23:21.599 "dma_device_type": 1 00:23:21.599 }, 00:23:21.599 { 00:23:21.599 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:21.599 "dma_device_type": 2 00:23:21.599 } 00:23:21.599 ], 00:23:21.599 "driver_specific": { 00:23:21.599 "passthru": { 00:23:21.599 "name": "pt4", 00:23:21.599 "base_bdev_name": "malloc4" 00:23:21.599 } 00:23:21.599 } 00:23:21.599 }' 00:23:21.599 20:37:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:21.599 20:37:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:21.599 20:37:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:21.599 20:37:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:21.877 20:37:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:21.877 20:37:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:21.878 20:37:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:21.878 20:37:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:21.878 20:37:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:21.878 20:37:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:21.878 20:37:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:21.878 20:37:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:21.878 20:37:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:21.878 20:37:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:23:22.141 [2024-07-15 20:37:14.404653] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:22.141 20:37:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' f1c94521-4807-42e5-811c-5e21c70468a1 '!=' f1c94521-4807-42e5-811c-5e21c70468a1 ']' 00:23:22.141 20:37:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:23:22.141 20:37:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:23:22.141 20:37:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:23:22.141 20:37:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:23:22.420 [2024-07-15 20:37:14.657048] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:23:22.420 20:37:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:23:22.420 20:37:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:22.420 20:37:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:22.420 20:37:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:22.420 20:37:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:22.420 20:37:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:23:22.420 20:37:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:22.420 20:37:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:22.420 20:37:14 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:22.420 20:37:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:22.420 20:37:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:22.420 20:37:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:22.679 20:37:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:22.679 "name": "raid_bdev1", 00:23:22.679 "uuid": "f1c94521-4807-42e5-811c-5e21c70468a1", 00:23:22.679 "strip_size_kb": 0, 00:23:22.679 "state": "online", 00:23:22.679 "raid_level": "raid1", 00:23:22.679 "superblock": true, 00:23:22.679 "num_base_bdevs": 4, 00:23:22.679 "num_base_bdevs_discovered": 3, 00:23:22.679 "num_base_bdevs_operational": 3, 00:23:22.679 "base_bdevs_list": [ 00:23:22.679 { 00:23:22.679 "name": null, 00:23:22.679 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:22.679 "is_configured": false, 00:23:22.679 "data_offset": 2048, 00:23:22.679 "data_size": 63488 00:23:22.679 }, 00:23:22.679 { 00:23:22.679 "name": "pt2", 00:23:22.679 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:22.679 "is_configured": true, 00:23:22.679 "data_offset": 2048, 00:23:22.679 "data_size": 63488 00:23:22.679 }, 00:23:22.679 { 00:23:22.679 "name": "pt3", 00:23:22.679 "uuid": "00000000-0000-0000-0000-000000000003", 00:23:22.679 "is_configured": true, 00:23:22.679 "data_offset": 2048, 00:23:22.679 "data_size": 63488 00:23:22.679 }, 00:23:22.679 { 00:23:22.679 "name": "pt4", 00:23:22.679 "uuid": "00000000-0000-0000-0000-000000000004", 00:23:22.679 "is_configured": true, 00:23:22.679 "data_offset": 2048, 00:23:22.679 "data_size": 63488 00:23:22.679 } 00:23:22.679 ] 00:23:22.679 }' 00:23:22.679 20:37:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:22.679 
20:37:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:23:23.245 20:37:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:23.503 [2024-07-15 20:37:15.739903] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:23.503 [2024-07-15 20:37:15.739934] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:23.503 [2024-07-15 20:37:15.739980] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:23.503 [2024-07-15 20:37:15.740042] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:23.503 [2024-07-15 20:37:15.740053] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1afd780 name raid_bdev1, state offline 00:23:23.503 20:37:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:23.503 20:37:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:23:23.761 20:37:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:23:23.761 20:37:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:23:23.761 20:37:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:23:23.761 20:37:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:23:23.761 20:37:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:23:24.019 20:37:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:23:24.019 20:37:16 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:23:24.019 20:37:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:23:24.278 20:37:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:23:24.278 20:37:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:23:24.278 20:37:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:23:24.537 20:37:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:23:24.537 20:37:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:23:24.537 20:37:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:23:24.537 20:37:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:23:24.537 20:37:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:23:24.795 [2024-07-15 20:37:16.975101] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:23:24.795 [2024-07-15 20:37:16.975148] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:24.795 [2024-07-15 20:37:16.975169] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ca0700 00:23:24.795 [2024-07-15 20:37:16.975181] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:24.795 [2024-07-15 20:37:16.976813] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:24.795 [2024-07-15 20:37:16.976841] vbdev_passthru.c: 709:vbdev_passthru_register: 
*NOTICE*: created pt_bdev for: pt2 00:23:24.796 [2024-07-15 20:37:16.976909] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:23:24.796 [2024-07-15 20:37:16.976948] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:23:24.796 pt2 00:23:24.796 20:37:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:23:24.796 20:37:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:24.796 20:37:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:24.796 20:37:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:24.796 20:37:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:24.796 20:37:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:23:24.796 20:37:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:24.796 20:37:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:24.796 20:37:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:24.796 20:37:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:24.796 20:37:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:24.796 20:37:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:25.054 20:37:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:25.054 "name": "raid_bdev1", 00:23:25.054 "uuid": "f1c94521-4807-42e5-811c-5e21c70468a1", 00:23:25.054 "strip_size_kb": 0, 00:23:25.054 "state": "configuring", 
00:23:25.054 "raid_level": "raid1", 00:23:25.054 "superblock": true, 00:23:25.054 "num_base_bdevs": 4, 00:23:25.054 "num_base_bdevs_discovered": 1, 00:23:25.054 "num_base_bdevs_operational": 3, 00:23:25.054 "base_bdevs_list": [ 00:23:25.054 { 00:23:25.054 "name": null, 00:23:25.054 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:25.054 "is_configured": false, 00:23:25.054 "data_offset": 2048, 00:23:25.054 "data_size": 63488 00:23:25.054 }, 00:23:25.054 { 00:23:25.054 "name": "pt2", 00:23:25.054 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:25.054 "is_configured": true, 00:23:25.054 "data_offset": 2048, 00:23:25.054 "data_size": 63488 00:23:25.054 }, 00:23:25.054 { 00:23:25.054 "name": null, 00:23:25.054 "uuid": "00000000-0000-0000-0000-000000000003", 00:23:25.054 "is_configured": false, 00:23:25.054 "data_offset": 2048, 00:23:25.054 "data_size": 63488 00:23:25.054 }, 00:23:25.054 { 00:23:25.054 "name": null, 00:23:25.054 "uuid": "00000000-0000-0000-0000-000000000004", 00:23:25.054 "is_configured": false, 00:23:25.054 "data_offset": 2048, 00:23:25.054 "data_size": 63488 00:23:25.054 } 00:23:25.054 ] 00:23:25.054 }' 00:23:25.054 20:37:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:25.054 20:37:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:23:25.620 20:37:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i++ )) 00:23:25.620 20:37:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:23:25.621 20:37:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:23:25.621 [2024-07-15 20:37:17.989795] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:23:25.621 [2024-07-15 20:37:17.989841] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:25.621 [2024-07-15 20:37:17.989862] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b06a10 00:23:25.621 [2024-07-15 20:37:17.989875] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:25.621 [2024-07-15 20:37:17.990216] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:25.621 [2024-07-15 20:37:17.990233] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:23:25.621 [2024-07-15 20:37:17.990289] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:23:25.621 [2024-07-15 20:37:17.990307] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:23:25.621 pt3 00:23:25.879 20:37:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:23:25.879 20:37:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:25.879 20:37:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:25.879 20:37:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:25.879 20:37:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:25.879 20:37:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:23:25.879 20:37:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:25.879 20:37:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:25.879 20:37:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:25.879 20:37:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:25.879 20:37:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:25.879 20:37:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:26.137 20:37:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:26.137 "name": "raid_bdev1", 00:23:26.137 "uuid": "f1c94521-4807-42e5-811c-5e21c70468a1", 00:23:26.137 "strip_size_kb": 0, 00:23:26.137 "state": "configuring", 00:23:26.137 "raid_level": "raid1", 00:23:26.137 "superblock": true, 00:23:26.137 "num_base_bdevs": 4, 00:23:26.137 "num_base_bdevs_discovered": 2, 00:23:26.137 "num_base_bdevs_operational": 3, 00:23:26.137 "base_bdevs_list": [ 00:23:26.137 { 00:23:26.137 "name": null, 00:23:26.138 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:26.138 "is_configured": false, 00:23:26.138 "data_offset": 2048, 00:23:26.138 "data_size": 63488 00:23:26.138 }, 00:23:26.138 { 00:23:26.138 "name": "pt2", 00:23:26.138 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:26.138 "is_configured": true, 00:23:26.138 "data_offset": 2048, 00:23:26.138 "data_size": 63488 00:23:26.138 }, 00:23:26.138 { 00:23:26.138 "name": "pt3", 00:23:26.138 "uuid": "00000000-0000-0000-0000-000000000003", 00:23:26.138 "is_configured": true, 00:23:26.138 "data_offset": 2048, 00:23:26.138 "data_size": 63488 00:23:26.138 }, 00:23:26.138 { 00:23:26.138 "name": null, 00:23:26.138 "uuid": "00000000-0000-0000-0000-000000000004", 00:23:26.138 "is_configured": false, 00:23:26.138 "data_offset": 2048, 00:23:26.138 "data_size": 63488 00:23:26.138 } 00:23:26.138 ] 00:23:26.138 }' 00:23:26.138 20:37:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:26.138 20:37:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:23:26.704 20:37:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i++ )) 00:23:26.704 20:37:18 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:23:26.704 20:37:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@518 -- # i=3 00:23:26.704 20:37:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:23:26.963 [2024-07-15 20:37:19.096736] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:23:26.963 [2024-07-15 20:37:19.096784] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:26.963 [2024-07-15 20:37:19.096804] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ca9520 00:23:26.963 [2024-07-15 20:37:19.096816] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:26.963 [2024-07-15 20:37:19.097180] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:26.963 [2024-07-15 20:37:19.097197] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:23:26.963 [2024-07-15 20:37:19.097265] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:23:26.963 [2024-07-15 20:37:19.097285] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:23:26.963 [2024-07-15 20:37:19.097398] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1afdea0 00:23:26.963 [2024-07-15 20:37:19.097409] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:23:26.963 [2024-07-15 20:37:19.097580] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b02600 00:23:26.963 [2024-07-15 20:37:19.097712] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1afdea0 00:23:26.963 [2024-07-15 20:37:19.097722] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 
0x1afdea0 00:23:26.963 [2024-07-15 20:37:19.097819] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:26.963 pt4 00:23:26.963 20:37:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:23:26.963 20:37:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:26.963 20:37:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:26.963 20:37:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:26.963 20:37:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:26.963 20:37:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:23:26.963 20:37:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:26.963 20:37:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:26.963 20:37:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:26.963 20:37:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:26.963 20:37:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:26.963 20:37:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:27.221 20:37:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:27.221 "name": "raid_bdev1", 00:23:27.221 "uuid": "f1c94521-4807-42e5-811c-5e21c70468a1", 00:23:27.221 "strip_size_kb": 0, 00:23:27.221 "state": "online", 00:23:27.221 "raid_level": "raid1", 00:23:27.221 "superblock": true, 00:23:27.221 "num_base_bdevs": 4, 00:23:27.221 "num_base_bdevs_discovered": 3, 00:23:27.221 
"num_base_bdevs_operational": 3, 00:23:27.221 "base_bdevs_list": [ 00:23:27.221 { 00:23:27.221 "name": null, 00:23:27.221 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:27.221 "is_configured": false, 00:23:27.221 "data_offset": 2048, 00:23:27.221 "data_size": 63488 00:23:27.221 }, 00:23:27.221 { 00:23:27.221 "name": "pt2", 00:23:27.221 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:27.221 "is_configured": true, 00:23:27.221 "data_offset": 2048, 00:23:27.221 "data_size": 63488 00:23:27.221 }, 00:23:27.221 { 00:23:27.221 "name": "pt3", 00:23:27.221 "uuid": "00000000-0000-0000-0000-000000000003", 00:23:27.221 "is_configured": true, 00:23:27.221 "data_offset": 2048, 00:23:27.221 "data_size": 63488 00:23:27.221 }, 00:23:27.221 { 00:23:27.221 "name": "pt4", 00:23:27.221 "uuid": "00000000-0000-0000-0000-000000000004", 00:23:27.221 "is_configured": true, 00:23:27.221 "data_offset": 2048, 00:23:27.221 "data_size": 63488 00:23:27.221 } 00:23:27.221 ] 00:23:27.221 }' 00:23:27.221 20:37:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:27.221 20:37:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:23:27.787 20:37:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:28.045 [2024-07-15 20:37:20.199778] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:28.045 [2024-07-15 20:37:20.199807] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:28.045 [2024-07-15 20:37:20.199860] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:28.045 [2024-07-15 20:37:20.199933] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:28.045 [2024-07-15 20:37:20.199946] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: 
raid_bdev_cleanup, 0x1afdea0 name raid_bdev1, state offline 00:23:28.045 20:37:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:28.045 20:37:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:23:28.304 20:37:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:23:28.304 20:37:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:23:28.304 20:37:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@531 -- # '[' 4 -gt 2 ']' 00:23:28.304 20:37:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@533 -- # i=3 00:23:28.304 20:37:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@534 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:23:28.561 20:37:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:23:28.819 [2024-07-15 20:37:20.953742] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:23:28.819 [2024-07-15 20:37:20.953787] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:28.819 [2024-07-15 20:37:20.953806] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ca9520 00:23:28.819 [2024-07-15 20:37:20.953818] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:28.819 [2024-07-15 20:37:20.955450] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:28.819 [2024-07-15 20:37:20.955480] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:23:28.819 [2024-07-15 20:37:20.955551] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid 
superblock found on bdev pt1 00:23:28.819 [2024-07-15 20:37:20.955579] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:23:28.819 [2024-07-15 20:37:20.955685] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:23:28.819 [2024-07-15 20:37:20.955698] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:28.819 [2024-07-15 20:37:20.955712] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1afd060 name raid_bdev1, state configuring 00:23:28.819 [2024-07-15 20:37:20.955735] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:23:28.819 [2024-07-15 20:37:20.955810] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:23:28.819 pt1 00:23:28.819 20:37:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # '[' 4 -gt 2 ']' 00:23:28.819 20:37:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@544 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:23:28.819 20:37:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:28.819 20:37:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:28.819 20:37:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:28.819 20:37:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:28.819 20:37:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:23:28.819 20:37:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:28.819 20:37:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:28.819 20:37:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:28.819 20:37:20 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:28.819 20:37:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:28.819 20:37:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:29.076 20:37:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:29.076 "name": "raid_bdev1", 00:23:29.076 "uuid": "f1c94521-4807-42e5-811c-5e21c70468a1", 00:23:29.076 "strip_size_kb": 0, 00:23:29.076 "state": "configuring", 00:23:29.076 "raid_level": "raid1", 00:23:29.076 "superblock": true, 00:23:29.076 "num_base_bdevs": 4, 00:23:29.076 "num_base_bdevs_discovered": 2, 00:23:29.076 "num_base_bdevs_operational": 3, 00:23:29.076 "base_bdevs_list": [ 00:23:29.076 { 00:23:29.076 "name": null, 00:23:29.076 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:29.076 "is_configured": false, 00:23:29.076 "data_offset": 2048, 00:23:29.076 "data_size": 63488 00:23:29.076 }, 00:23:29.076 { 00:23:29.076 "name": "pt2", 00:23:29.076 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:29.076 "is_configured": true, 00:23:29.076 "data_offset": 2048, 00:23:29.076 "data_size": 63488 00:23:29.076 }, 00:23:29.076 { 00:23:29.076 "name": "pt3", 00:23:29.076 "uuid": "00000000-0000-0000-0000-000000000003", 00:23:29.076 "is_configured": true, 00:23:29.076 "data_offset": 2048, 00:23:29.076 "data_size": 63488 00:23:29.076 }, 00:23:29.076 { 00:23:29.076 "name": null, 00:23:29.076 "uuid": "00000000-0000-0000-0000-000000000004", 00:23:29.076 "is_configured": false, 00:23:29.076 "data_offset": 2048, 00:23:29.076 "data_size": 63488 00:23:29.076 } 00:23:29.076 ] 00:23:29.076 }' 00:23:29.076 20:37:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:29.076 20:37:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set 
+x 00:23:29.640 20:37:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs configuring 00:23:29.640 20:37:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:23:29.898 20:37:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # [[ false == \f\a\l\s\e ]] 00:23:29.898 20:37:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@548 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:23:30.156 [2024-07-15 20:37:22.301332] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:23:30.156 [2024-07-15 20:37:22.301384] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:30.156 [2024-07-15 20:37:22.301403] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1afd310 00:23:30.156 [2024-07-15 20:37:22.301416] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:30.156 [2024-07-15 20:37:22.301783] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:30.156 [2024-07-15 20:37:22.301801] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:23:30.156 [2024-07-15 20:37:22.301868] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:23:30.156 [2024-07-15 20:37:22.301888] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:23:30.156 [2024-07-15 20:37:22.302018] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1b00b40 00:23:30.156 [2024-07-15 20:37:22.302029] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:23:30.156 [2024-07-15 20:37:22.302208] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: 
raid_bdev_create_cb, 0x1ca0990 00:23:30.156 [2024-07-15 20:37:22.302343] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1b00b40 00:23:30.156 [2024-07-15 20:37:22.302352] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1b00b40 00:23:30.156 [2024-07-15 20:37:22.302460] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:30.156 pt4 00:23:30.156 20:37:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:23:30.156 20:37:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:30.156 20:37:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:30.156 20:37:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:30.156 20:37:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:30.156 20:37:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:23:30.156 20:37:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:30.156 20:37:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:30.156 20:37:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:30.156 20:37:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:30.156 20:37:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:30.156 20:37:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:30.414 20:37:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:30.414 "name": "raid_bdev1", 
00:23:30.414 "uuid": "f1c94521-4807-42e5-811c-5e21c70468a1", 00:23:30.414 "strip_size_kb": 0, 00:23:30.414 "state": "online", 00:23:30.414 "raid_level": "raid1", 00:23:30.414 "superblock": true, 00:23:30.414 "num_base_bdevs": 4, 00:23:30.414 "num_base_bdevs_discovered": 3, 00:23:30.414 "num_base_bdevs_operational": 3, 00:23:30.414 "base_bdevs_list": [ 00:23:30.414 { 00:23:30.414 "name": null, 00:23:30.414 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:30.414 "is_configured": false, 00:23:30.414 "data_offset": 2048, 00:23:30.414 "data_size": 63488 00:23:30.414 }, 00:23:30.414 { 00:23:30.414 "name": "pt2", 00:23:30.414 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:30.414 "is_configured": true, 00:23:30.414 "data_offset": 2048, 00:23:30.414 "data_size": 63488 00:23:30.414 }, 00:23:30.414 { 00:23:30.414 "name": "pt3", 00:23:30.414 "uuid": "00000000-0000-0000-0000-000000000003", 00:23:30.414 "is_configured": true, 00:23:30.414 "data_offset": 2048, 00:23:30.414 "data_size": 63488 00:23:30.414 }, 00:23:30.414 { 00:23:30.414 "name": "pt4", 00:23:30.414 "uuid": "00000000-0000-0000-0000-000000000004", 00:23:30.414 "is_configured": true, 00:23:30.414 "data_offset": 2048, 00:23:30.414 "data_size": 63488 00:23:30.414 } 00:23:30.414 ] 00:23:30.414 }' 00:23:30.414 20:37:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:30.414 20:37:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:23:31.007 20:37:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:23:31.007 20:37:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:23:31.265 20:37:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:23:31.265 20:37:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:31.265 20:37:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:23:31.265 [2024-07-15 20:37:23.637201] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:31.524 20:37:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' f1c94521-4807-42e5-811c-5e21c70468a1 '!=' f1c94521-4807-42e5-811c-5e21c70468a1 ']' 00:23:31.524 20:37:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 1454717 00:23:31.524 20:37:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 1454717 ']' 00:23:31.524 20:37:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 1454717 00:23:31.524 20:37:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:23:31.524 20:37:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:23:31.524 20:37:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1454717 00:23:31.524 20:37:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:23:31.524 20:37:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:23:31.524 20:37:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1454717' 00:23:31.524 killing process with pid 1454717 00:23:31.524 20:37:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 1454717 00:23:31.524 [2024-07-15 20:37:23.728267] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:31.524 [2024-07-15 20:37:23.728332] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:31.524 [2024-07-15 20:37:23.728402] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: 
raid bdev base bdevs is 0, going to free all in destruct 00:23:31.524 [2024-07-15 20:37:23.728414] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b00b40 name raid_bdev1, state offline 00:23:31.524 20:37:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 1454717 00:23:31.524 [2024-07-15 20:37:23.770605] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:31.782 20:37:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:23:31.782 00:23:31.782 real 0m25.539s 00:23:31.782 user 0m46.679s 00:23:31.782 sys 0m4.663s 00:23:31.782 20:37:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:23:31.782 20:37:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:23:31.782 ************************************ 00:23:31.782 END TEST raid_superblock_test 00:23:31.782 ************************************ 00:23:31.782 20:37:24 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:23:31.782 20:37:24 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid1 4 read 00:23:31.782 20:37:24 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:23:31.782 20:37:24 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:23:31.782 20:37:24 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:23:31.782 ************************************ 00:23:31.782 START TEST raid_read_error_test 00:23:31.782 ************************************ 00:23:31.782 20:37:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 4 read 00:23:31.782 20:37:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:23:31.782 20:37:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:23:31.782 20:37:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:23:31.782 
20:37:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:23:31.782 20:37:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:23:31.782 20:37:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:23:31.782 20:37:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:23:31.782 20:37:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:23:31.782 20:37:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:23:31.782 20:37:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:23:31.782 20:37:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:23:31.782 20:37:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:23:31.782 20:37:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:23:31.782 20:37:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:23:31.782 20:37:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:23:31.782 20:37:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:23:31.782 20:37:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:23:31.782 20:37:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:23:31.782 20:37:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:23:31.782 20:37:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:23:31.782 20:37:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:23:31.782 20:37:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:23:31.782 20:37:24 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:23:31.782 20:37:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:23:31.782 20:37:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:23:31.782 20:37:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:23:31.782 20:37:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:23:31.782 20:37:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.TPFixNyZGa 00:23:31.782 20:37:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1458568 00:23:31.782 20:37:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1458568 /var/tmp/spdk-raid.sock 00:23:31.782 20:37:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:23:31.782 20:37:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 1458568 ']' 00:23:31.782 20:37:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:23:31.782 20:37:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:23:31.782 20:37:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:23:31.782 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:23:31.782 20:37:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:23:31.782 20:37:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:23:32.040 [2024-07-15 20:37:24.165754] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 00:23:32.040 [2024-07-15 20:37:24.165824] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1458568 ] 00:23:32.040 [2024-07-15 20:37:24.284794] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:32.040 [2024-07-15 20:37:24.390965] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:23:32.298 [2024-07-15 20:37:24.463318] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:32.298 [2024-07-15 20:37:24.463355] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:32.863 20:37:25 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:23:32.863 20:37:25 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:23:32.863 20:37:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:23:32.863 20:37:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:23:33.122 BaseBdev1_malloc 00:23:33.122 20:37:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:23:33.381 true 00:23:33.381 20:37:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:23:33.639 [2024-07-15 20:37:25.844419] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:23:33.639 [2024-07-15 20:37:25.844465] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:33.639 [2024-07-15 20:37:25.844492] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23d20d0 00:23:33.639 [2024-07-15 20:37:25.844505] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:33.639 [2024-07-15 20:37:25.846403] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:33.639 [2024-07-15 20:37:25.846436] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:23:33.639 BaseBdev1 00:23:33.639 20:37:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:23:33.639 20:37:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:23:33.921 BaseBdev2_malloc 00:23:33.921 20:37:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:23:34.179 true 00:23:34.179 20:37:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:23:34.438 [2024-07-15 20:37:26.580165] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:23:34.438 [2024-07-15 20:37:26.580209] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:34.438 [2024-07-15 20:37:26.580237] vbdev_passthru.c: 680:vbdev_passthru_register: 
*NOTICE*: io_device created at: 0x0x23d6910 00:23:34.438 [2024-07-15 20:37:26.580250] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:34.438 [2024-07-15 20:37:26.581883] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:34.438 [2024-07-15 20:37:26.581911] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:23:34.438 BaseBdev2 00:23:34.438 20:37:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:23:34.438 20:37:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:23:34.702 BaseBdev3_malloc 00:23:34.702 20:37:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:23:34.966 true 00:23:34.966 20:37:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:23:34.966 [2024-07-15 20:37:27.310702] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:23:34.966 [2024-07-15 20:37:27.310746] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:34.966 [2024-07-15 20:37:27.310771] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23d8bd0 00:23:34.966 [2024-07-15 20:37:27.310785] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:34.966 [2024-07-15 20:37:27.312360] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:34.966 [2024-07-15 20:37:27.312388] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:23:34.966 
BaseBdev3 00:23:34.966 20:37:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:23:34.966 20:37:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:23:35.224 BaseBdev4_malloc 00:23:35.224 20:37:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:23:35.483 true 00:23:35.483 20:37:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:23:35.741 [2024-07-15 20:37:28.041200] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:23:35.741 [2024-07-15 20:37:28.041244] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:35.741 [2024-07-15 20:37:28.041273] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23d9aa0 00:23:35.741 [2024-07-15 20:37:28.041291] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:35.741 [2024-07-15 20:37:28.042913] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:35.741 [2024-07-15 20:37:28.042949] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:23:35.741 BaseBdev4 00:23:35.741 20:37:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:23:36.068 [2024-07-15 20:37:28.285877] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:36.068 [2024-07-15 
20:37:28.287264] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:36.068 [2024-07-15 20:37:28.287335] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:23:36.068 [2024-07-15 20:37:28.287398] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:23:36.068 [2024-07-15 20:37:28.287640] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x23d3c20 00:23:36.068 [2024-07-15 20:37:28.287651] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:23:36.068 [2024-07-15 20:37:28.287852] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2228260 00:23:36.068 [2024-07-15 20:37:28.288019] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x23d3c20 00:23:36.068 [2024-07-15 20:37:28.288044] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x23d3c20 00:23:36.068 [2024-07-15 20:37:28.288156] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:36.068 20:37:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:23:36.068 20:37:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:36.068 20:37:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:36.068 20:37:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:36.069 20:37:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:36.069 20:37:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:36.069 20:37:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:36.069 20:37:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
00:23:36.069 20:37:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:36.069 20:37:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:36.069 20:37:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:36.069 20:37:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:36.355 20:37:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:36.356 "name": "raid_bdev1", 00:23:36.356 "uuid": "4261054e-ff9b-4a50-b391-971a52e0c69b", 00:23:36.356 "strip_size_kb": 0, 00:23:36.356 "state": "online", 00:23:36.356 "raid_level": "raid1", 00:23:36.356 "superblock": true, 00:23:36.356 "num_base_bdevs": 4, 00:23:36.356 "num_base_bdevs_discovered": 4, 00:23:36.356 "num_base_bdevs_operational": 4, 00:23:36.356 "base_bdevs_list": [ 00:23:36.356 { 00:23:36.356 "name": "BaseBdev1", 00:23:36.356 "uuid": "5bc1841d-b1d5-5408-a7cc-4648d4e7c9ec", 00:23:36.356 "is_configured": true, 00:23:36.356 "data_offset": 2048, 00:23:36.356 "data_size": 63488 00:23:36.356 }, 00:23:36.356 { 00:23:36.356 "name": "BaseBdev2", 00:23:36.356 "uuid": "ae988ff2-0567-5682-9af4-d8fe8227295c", 00:23:36.356 "is_configured": true, 00:23:36.356 "data_offset": 2048, 00:23:36.356 "data_size": 63488 00:23:36.356 }, 00:23:36.356 { 00:23:36.356 "name": "BaseBdev3", 00:23:36.356 "uuid": "287fc0d4-874d-50df-b949-601a0d427491", 00:23:36.356 "is_configured": true, 00:23:36.356 "data_offset": 2048, 00:23:36.356 "data_size": 63488 00:23:36.356 }, 00:23:36.356 { 00:23:36.356 "name": "BaseBdev4", 00:23:36.356 "uuid": "3468e768-4f8a-554e-854a-0da443577b67", 00:23:36.356 "is_configured": true, 00:23:36.356 "data_offset": 2048, 00:23:36.356 "data_size": 63488 00:23:36.356 } 00:23:36.356 ] 00:23:36.356 }' 00:23:36.356 20:37:28 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:36.356 20:37:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:23:36.922 20:37:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:23:36.922 20:37:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:23:36.922 [2024-07-15 20:37:29.300834] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2227c60 00:23:37.858 20:37:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:23:38.116 20:37:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:23:38.116 20:37:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:23:38.116 20:37:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ read = \w\r\i\t\e ]] 00:23:38.116 20:37:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:23:38.116 20:37:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:23:38.116 20:37:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:38.116 20:37:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:38.117 20:37:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:38.117 20:37:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:38.117 20:37:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:38.117 20:37:30 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:38.117 20:37:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:38.117 20:37:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:38.117 20:37:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:38.117 20:37:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:38.117 20:37:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:38.375 20:37:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:38.375 "name": "raid_bdev1", 00:23:38.375 "uuid": "4261054e-ff9b-4a50-b391-971a52e0c69b", 00:23:38.375 "strip_size_kb": 0, 00:23:38.375 "state": "online", 00:23:38.375 "raid_level": "raid1", 00:23:38.375 "superblock": true, 00:23:38.375 "num_base_bdevs": 4, 00:23:38.375 "num_base_bdevs_discovered": 4, 00:23:38.375 "num_base_bdevs_operational": 4, 00:23:38.375 "base_bdevs_list": [ 00:23:38.375 { 00:23:38.375 "name": "BaseBdev1", 00:23:38.375 "uuid": "5bc1841d-b1d5-5408-a7cc-4648d4e7c9ec", 00:23:38.375 "is_configured": true, 00:23:38.375 "data_offset": 2048, 00:23:38.375 "data_size": 63488 00:23:38.375 }, 00:23:38.375 { 00:23:38.375 "name": "BaseBdev2", 00:23:38.375 "uuid": "ae988ff2-0567-5682-9af4-d8fe8227295c", 00:23:38.375 "is_configured": true, 00:23:38.375 "data_offset": 2048, 00:23:38.375 "data_size": 63488 00:23:38.375 }, 00:23:38.375 { 00:23:38.375 "name": "BaseBdev3", 00:23:38.375 "uuid": "287fc0d4-874d-50df-b949-601a0d427491", 00:23:38.375 "is_configured": true, 00:23:38.375 "data_offset": 2048, 00:23:38.375 "data_size": 63488 00:23:38.375 }, 00:23:38.375 { 00:23:38.375 "name": "BaseBdev4", 00:23:38.375 "uuid": "3468e768-4f8a-554e-854a-0da443577b67", 00:23:38.375 "is_configured": 
true, 00:23:38.375 "data_offset": 2048, 00:23:38.375 "data_size": 63488 00:23:38.375 } 00:23:38.375 ] 00:23:38.375 }' 00:23:38.375 20:37:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:38.375 20:37:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:23:39.312 20:37:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:39.312 [2024-07-15 20:37:31.550838] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:39.312 [2024-07-15 20:37:31.550874] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:39.312 [2024-07-15 20:37:31.554153] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:39.312 [2024-07-15 20:37:31.554191] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:39.312 [2024-07-15 20:37:31.554311] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:39.312 [2024-07-15 20:37:31.554322] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x23d3c20 name raid_bdev1, state offline 00:23:39.312 0 00:23:39.312 20:37:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1458568 00:23:39.312 20:37:31 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 1458568 ']' 00:23:39.312 20:37:31 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 1458568 00:23:39.312 20:37:31 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:23:39.312 20:37:31 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:23:39.312 20:37:31 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1458568 00:23:39.312 20:37:31 
bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:23:39.312 20:37:31 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:23:39.312 20:37:31 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1458568' 00:23:39.312 killing process with pid 1458568 00:23:39.312 20:37:31 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 1458568 00:23:39.312 [2024-07-15 20:37:31.616892] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:39.312 20:37:31 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 1458568 00:23:39.312 [2024-07-15 20:37:31.646838] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:39.570 20:37:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.TPFixNyZGa 00:23:39.570 20:37:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:23:39.570 20:37:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:23:39.570 20:37:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:23:39.570 20:37:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:23:39.570 20:37:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:23:39.571 20:37:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:23:39.571 20:37:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:23:39.571 00:23:39.571 real 0m7.788s 00:23:39.571 user 0m12.455s 00:23:39.571 sys 0m1.436s 00:23:39.571 20:37:31 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:23:39.571 20:37:31 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:23:39.571 ************************************ 00:23:39.571 END TEST 
raid_read_error_test 00:23:39.571 ************************************ 00:23:39.571 20:37:31 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:23:39.571 20:37:31 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid1 4 write 00:23:39.571 20:37:31 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:23:39.571 20:37:31 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:23:39.571 20:37:31 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:23:39.829 ************************************ 00:23:39.829 START TEST raid_write_error_test 00:23:39.829 ************************************ 00:23:39.829 20:37:31 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 4 write 00:23:39.829 20:37:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:23:39.829 20:37:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:23:39.829 20:37:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:23:39.829 20:37:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:23:39.829 20:37:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:23:39.829 20:37:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:23:39.829 20:37:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:23:39.829 20:37:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:23:39.829 20:37:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:23:39.829 20:37:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:23:39.829 20:37:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:23:39.829 20:37:31 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:23:39.829 20:37:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:23:39.829 20:37:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:23:39.829 20:37:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:23:39.829 20:37:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:23:39.829 20:37:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:23:39.829 20:37:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:23:39.829 20:37:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:23:39.829 20:37:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:23:39.829 20:37:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:23:39.829 20:37:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:23:39.829 20:37:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:23:39.829 20:37:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:23:39.829 20:37:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:23:39.829 20:37:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:23:39.829 20:37:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:23:39.829 20:37:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.vsQoQNBHKJ 00:23:39.829 20:37:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1459638 00:23:39.829 20:37:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1459638 /var/tmp/spdk-raid.sock 00:23:39.829 
20:37:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:23:39.829 20:37:31 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 1459638 ']' 00:23:39.829 20:37:31 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:23:39.829 20:37:31 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:23:39.829 20:37:31 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:23:39.829 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:23:39.829 20:37:31 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:23:39.829 20:37:31 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:23:39.829 [2024-07-15 20:37:32.036838] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
00:23:39.830 [2024-07-15 20:37:32.036903] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1459638 ] 00:23:39.830 [2024-07-15 20:37:32.165330] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:40.088 [2024-07-15 20:37:32.275032] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:23:40.088 [2024-07-15 20:37:32.337755] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:40.088 [2024-07-15 20:37:32.337788] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:40.347 20:37:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:23:40.347 20:37:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:23:40.347 20:37:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:23:40.347 20:37:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:23:40.605 BaseBdev1_malloc 00:23:40.605 20:37:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:23:40.605 true 00:23:40.863 20:37:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:23:40.863 [2024-07-15 20:37:33.219502] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:23:40.863 [2024-07-15 20:37:33.219547] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base 
bdev opened 00:23:40.863 [2024-07-15 20:37:33.219568] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x17130d0 00:23:40.863 [2024-07-15 20:37:33.219581] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:40.863 [2024-07-15 20:37:33.221458] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:40.863 [2024-07-15 20:37:33.221488] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:23:40.863 BaseBdev1 00:23:40.863 20:37:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:23:40.863 20:37:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:23:41.121 BaseBdev2_malloc 00:23:41.121 20:37:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:23:41.378 true 00:23:41.378 20:37:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:23:41.636 [2024-07-15 20:37:33.958021] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:23:41.636 [2024-07-15 20:37:33.958067] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:41.636 [2024-07-15 20:37:33.958088] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1717910 00:23:41.636 [2024-07-15 20:37:33.958101] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:41.636 [2024-07-15 20:37:33.959734] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:41.637 [2024-07-15 20:37:33.959762] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:23:41.637 BaseBdev2 00:23:41.637 20:37:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:23:41.637 20:37:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:23:41.895 BaseBdev3_malloc 00:23:41.895 20:37:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:23:42.153 true 00:23:42.153 20:37:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:23:42.411 [2024-07-15 20:37:34.673697] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:23:42.411 [2024-07-15 20:37:34.673742] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:42.411 [2024-07-15 20:37:34.673765] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1719bd0 00:23:42.411 [2024-07-15 20:37:34.673777] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:42.411 [2024-07-15 20:37:34.675427] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:42.411 [2024-07-15 20:37:34.675461] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:23:42.411 BaseBdev3 00:23:42.411 20:37:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:23:42.411 20:37:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:23:42.669 BaseBdev4_malloc 00:23:42.669 20:37:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:23:42.927 true 00:23:42.927 20:37:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:23:43.185 [2024-07-15 20:37:35.404226] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:23:43.186 [2024-07-15 20:37:35.404270] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:43.186 [2024-07-15 20:37:35.404291] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x171aaa0 00:23:43.186 [2024-07-15 20:37:35.404303] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:43.186 [2024-07-15 20:37:35.405894] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:43.186 [2024-07-15 20:37:35.405921] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:23:43.186 BaseBdev4 00:23:43.186 20:37:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:23:43.444 [2024-07-15 20:37:35.644896] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:43.444 [2024-07-15 20:37:35.646232] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:43.444 [2024-07-15 20:37:35.646300] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:23:43.444 [2024-07-15 20:37:35.646362] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:23:43.444 [2024-07-15 20:37:35.646593] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1714c20 00:23:43.444 [2024-07-15 20:37:35.646604] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:23:43.444 [2024-07-15 20:37:35.646802] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1569260 00:23:43.444 [2024-07-15 20:37:35.646969] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1714c20 00:23:43.444 [2024-07-15 20:37:35.646980] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1714c20 00:23:43.444 [2024-07-15 20:37:35.647088] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:43.444 20:37:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:23:43.444 20:37:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:43.444 20:37:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:43.444 20:37:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:43.444 20:37:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:43.444 20:37:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:43.444 20:37:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:43.444 20:37:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:43.444 20:37:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:43.444 20:37:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:43.444 20:37:35 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:43.445 20:37:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:43.703 20:37:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:43.703 "name": "raid_bdev1", 00:23:43.703 "uuid": "dd679a73-b50a-470a-a79e-95bd9b73e184", 00:23:43.703 "strip_size_kb": 0, 00:23:43.703 "state": "online", 00:23:43.703 "raid_level": "raid1", 00:23:43.703 "superblock": true, 00:23:43.703 "num_base_bdevs": 4, 00:23:43.703 "num_base_bdevs_discovered": 4, 00:23:43.703 "num_base_bdevs_operational": 4, 00:23:43.703 "base_bdevs_list": [ 00:23:43.703 { 00:23:43.703 "name": "BaseBdev1", 00:23:43.703 "uuid": "fde73da3-bbef-5e06-b0fa-cb1a6d510856", 00:23:43.703 "is_configured": true, 00:23:43.703 "data_offset": 2048, 00:23:43.703 "data_size": 63488 00:23:43.703 }, 00:23:43.703 { 00:23:43.703 "name": "BaseBdev2", 00:23:43.703 "uuid": "12ca61ae-bc02-5427-8ac2-6f0b0c42af89", 00:23:43.703 "is_configured": true, 00:23:43.703 "data_offset": 2048, 00:23:43.703 "data_size": 63488 00:23:43.703 }, 00:23:43.703 { 00:23:43.703 "name": "BaseBdev3", 00:23:43.703 "uuid": "575d7230-80a6-5968-bba9-7a0b3960eb86", 00:23:43.703 "is_configured": true, 00:23:43.703 "data_offset": 2048, 00:23:43.703 "data_size": 63488 00:23:43.703 }, 00:23:43.703 { 00:23:43.703 "name": "BaseBdev4", 00:23:43.703 "uuid": "024f72a5-ad37-518f-8623-231b615f28b6", 00:23:43.703 "is_configured": true, 00:23:43.703 "data_offset": 2048, 00:23:43.703 "data_size": 63488 00:23:43.703 } 00:23:43.703 ] 00:23:43.704 }' 00:23:43.704 20:37:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:43.704 20:37:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:23:44.270 20:37:36 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@824 -- # sleep 1 00:23:44.270 20:37:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:23:44.270 [2024-07-15 20:37:36.607727] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1568c60 00:23:45.204 20:37:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:23:45.463 [2024-07-15 20:37:37.731418] bdev_raid.c:2221:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1' 00:23:45.463 [2024-07-15 20:37:37.731479] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:23:45.463 [2024-07-15 20:37:37.731692] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1568c60 00:23:45.463 20:37:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:23:45.463 20:37:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:23:45.463 20:37:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ write = \w\r\i\t\e ]] 00:23:45.463 20:37:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # expected_num_base_bdevs=3 00:23:45.463 20:37:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:23:45.463 20:37:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:45.463 20:37:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:45.463 20:37:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:45.463 20:37:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local 
strip_size=0 00:23:45.463 20:37:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:23:45.463 20:37:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:45.463 20:37:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:45.463 20:37:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:45.463 20:37:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:45.463 20:37:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:45.463 20:37:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:45.721 20:37:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:45.721 "name": "raid_bdev1", 00:23:45.721 "uuid": "dd679a73-b50a-470a-a79e-95bd9b73e184", 00:23:45.721 "strip_size_kb": 0, 00:23:45.721 "state": "online", 00:23:45.721 "raid_level": "raid1", 00:23:45.721 "superblock": true, 00:23:45.721 "num_base_bdevs": 4, 00:23:45.721 "num_base_bdevs_discovered": 3, 00:23:45.721 "num_base_bdevs_operational": 3, 00:23:45.721 "base_bdevs_list": [ 00:23:45.721 { 00:23:45.721 "name": null, 00:23:45.721 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:45.721 "is_configured": false, 00:23:45.721 "data_offset": 2048, 00:23:45.721 "data_size": 63488 00:23:45.721 }, 00:23:45.721 { 00:23:45.721 "name": "BaseBdev2", 00:23:45.721 "uuid": "12ca61ae-bc02-5427-8ac2-6f0b0c42af89", 00:23:45.721 "is_configured": true, 00:23:45.721 "data_offset": 2048, 00:23:45.721 "data_size": 63488 00:23:45.721 }, 00:23:45.721 { 00:23:45.721 "name": "BaseBdev3", 00:23:45.721 "uuid": "575d7230-80a6-5968-bba9-7a0b3960eb86", 00:23:45.721 "is_configured": true, 00:23:45.721 "data_offset": 2048, 
00:23:45.721 "data_size": 63488 00:23:45.721 }, 00:23:45.721 { 00:23:45.721 "name": "BaseBdev4", 00:23:45.721 "uuid": "024f72a5-ad37-518f-8623-231b615f28b6", 00:23:45.721 "is_configured": true, 00:23:45.721 "data_offset": 2048, 00:23:45.721 "data_size": 63488 00:23:45.721 } 00:23:45.721 ] 00:23:45.721 }' 00:23:45.721 20:37:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:45.721 20:37:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:23:46.288 20:37:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:46.546 [2024-07-15 20:37:38.782038] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:46.546 [2024-07-15 20:37:38.782074] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:46.546 [2024-07-15 20:37:38.785193] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:46.546 [2024-07-15 20:37:38.785228] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:46.546 [2024-07-15 20:37:38.785325] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:46.546 [2024-07-15 20:37:38.785336] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1714c20 name raid_bdev1, state offline 00:23:46.546 0 00:23:46.546 20:37:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1459638 00:23:46.546 20:37:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 1459638 ']' 00:23:46.546 20:37:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 1459638 00:23:46.546 20:37:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:23:46.546 20:37:38 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:23:46.546 20:37:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1459638 00:23:46.546 20:37:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:23:46.546 20:37:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:23:46.546 20:37:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1459638' 00:23:46.546 killing process with pid 1459638 00:23:46.546 20:37:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 1459638 00:23:46.546 [2024-07-15 20:37:38.866257] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:46.546 20:37:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 1459638 00:23:46.546 [2024-07-15 20:37:38.898771] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:46.804 20:37:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.vsQoQNBHKJ 00:23:46.804 20:37:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:23:46.804 20:37:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:23:46.804 20:37:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:23:46.804 20:37:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:23:46.804 20:37:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:23:46.804 20:37:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:23:46.804 20:37:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:23:46.804 00:23:46.804 real 0m7.182s 00:23:46.804 user 0m11.863s 00:23:46.804 sys 0m1.325s 00:23:46.804 20:37:39 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@1124 -- # xtrace_disable 00:23:46.804 20:37:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:23:46.804 ************************************ 00:23:46.804 END TEST raid_write_error_test 00:23:46.804 ************************************ 00:23:47.062 20:37:39 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:23:47.062 20:37:39 bdev_raid -- bdev/bdev_raid.sh@875 -- # '[' true = true ']' 00:23:47.062 20:37:39 bdev_raid -- bdev/bdev_raid.sh@876 -- # for n in 2 4 00:23:47.062 20:37:39 bdev_raid -- bdev/bdev_raid.sh@877 -- # run_test raid_rebuild_test raid_rebuild_test raid1 2 false false true 00:23:47.062 20:37:39 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:23:47.062 20:37:39 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:23:47.062 20:37:39 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:23:47.062 ************************************ 00:23:47.062 START TEST raid_rebuild_test 00:23:47.062 ************************************ 00:23:47.062 20:37:39 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 false false true 00:23:47.062 20:37:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:23:47.062 20:37:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:23:47.062 20:37:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:23:47.062 20:37:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:23:47.062 20:37:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@572 -- # local verify=true 00:23:47.062 20:37:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:23:47.062 20:37:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:47.062 20:37:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 
00:23:47.062 20:37:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:23:47.062 20:37:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:47.062 20:37:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:23:47.062 20:37:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:23:47.062 20:37:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:47.062 20:37:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:23:47.062 20:37:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:23:47.062 20:37:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:23:47.062 20:37:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # local strip_size 00:23:47.062 20:37:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@576 -- # local create_arg 00:23:47.062 20:37:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:23:47.062 20:37:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@578 -- # local data_offset 00:23:47.062 20:37:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:23:47.062 20:37:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:23:47.062 20:37:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:23:47.062 20:37:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@596 -- # raid_pid=1460698 00:23:47.062 20:37:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@597 -- # waitforlisten 1460698 /var/tmp/spdk-raid.sock 00:23:47.062 20:37:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:23:47.062 20:37:39 
bdev_raid.raid_rebuild_test -- common/autotest_common.sh@829 -- # '[' -z 1460698 ']' 00:23:47.062 20:37:39 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:23:47.062 20:37:39 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:23:47.062 20:37:39 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:23:47.062 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:23:47.062 20:37:39 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:23:47.062 20:37:39 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:23:47.062 [2024-07-15 20:37:39.292436] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 00:23:47.062 [2024-07-15 20:37:39.292489] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1460698 ] 00:23:47.062 I/O size of 3145728 is greater than zero copy threshold (65536). 00:23:47.062 Zero copy mechanism will not be used. 
00:23:47.062 [2024-07-15 20:37:39.405881] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:47.320 [2024-07-15 20:37:39.509474] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:23:47.320 [2024-07-15 20:37:39.569480] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:47.320 [2024-07-15 20:37:39.569516] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:47.884 20:37:40 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:23:47.884 20:37:40 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@862 -- # return 0 00:23:47.884 20:37:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:23:47.884 20:37:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:23:48.142 BaseBdev1_malloc 00:23:48.142 20:37:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:23:48.400 [2024-07-15 20:37:40.593394] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:23:48.400 [2024-07-15 20:37:40.593442] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:48.400 [2024-07-15 20:37:40.593466] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd70d40 00:23:48.400 [2024-07-15 20:37:40.593479] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:48.400 [2024-07-15 20:37:40.595061] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:48.400 [2024-07-15 20:37:40.595090] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:23:48.400 BaseBdev1 00:23:48.400 20:37:40 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:23:48.400 20:37:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:23:48.658 BaseBdev2_malloc 00:23:48.658 20:37:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:23:48.658 [2024-07-15 20:37:40.963257] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:23:48.658 [2024-07-15 20:37:40.963303] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:48.658 [2024-07-15 20:37:40.963327] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd71860 00:23:48.658 [2024-07-15 20:37:40.963339] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:48.658 [2024-07-15 20:37:40.964736] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:48.658 [2024-07-15 20:37:40.964763] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:23:48.658 BaseBdev2 00:23:48.658 20:37:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:23:48.916 spare_malloc 00:23:48.916 20:37:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:23:49.173 spare_delay 00:23:49.173 20:37:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_create -b spare_delay -p spare 00:23:49.431 [2024-07-15 20:37:41.693772] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:23:49.431 [2024-07-15 20:37:41.693819] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:49.431 [2024-07-15 20:37:41.693842] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf1fec0 00:23:49.431 [2024-07-15 20:37:41.693854] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:49.431 [2024-07-15 20:37:41.695459] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:49.431 [2024-07-15 20:37:41.695487] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:23:49.431 spare 00:23:49.431 20:37:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:23:49.689 [2024-07-15 20:37:41.934418] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:49.689 [2024-07-15 20:37:41.935739] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:49.689 [2024-07-15 20:37:41.935818] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xf21070 00:23:49.689 [2024-07-15 20:37:41.935830] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:23:49.689 [2024-07-15 20:37:41.936050] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf1a490 00:23:49.689 [2024-07-15 20:37:41.936195] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xf21070 00:23:49.689 [2024-07-15 20:37:41.936205] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xf21070 00:23:49.689 [2024-07-15 20:37:41.936321] bdev_raid.c: 331:raid_bdev_destroy_cb: 
*DEBUG*: raid_bdev_destroy_cb 00:23:49.689 20:37:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:49.689 20:37:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:49.689 20:37:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:49.689 20:37:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:49.689 20:37:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:49.689 20:37:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:49.689 20:37:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:49.689 20:37:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:49.689 20:37:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:49.689 20:37:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:49.689 20:37:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:49.689 20:37:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:49.947 20:37:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:49.947 "name": "raid_bdev1", 00:23:49.947 "uuid": "3dd4a87c-306f-4f61-b31c-0acba1403e14", 00:23:49.947 "strip_size_kb": 0, 00:23:49.947 "state": "online", 00:23:49.947 "raid_level": "raid1", 00:23:49.947 "superblock": false, 00:23:49.947 "num_base_bdevs": 2, 00:23:49.947 "num_base_bdevs_discovered": 2, 00:23:49.947 "num_base_bdevs_operational": 2, 00:23:49.947 "base_bdevs_list": [ 00:23:49.947 { 00:23:49.947 "name": "BaseBdev1", 00:23:49.947 "uuid": 
"03f0532b-7132-5089-b464-28b4c49cc2cc", 00:23:49.947 "is_configured": true, 00:23:49.947 "data_offset": 0, 00:23:49.947 "data_size": 65536 00:23:49.947 }, 00:23:49.947 { 00:23:49.947 "name": "BaseBdev2", 00:23:49.947 "uuid": "ba9c2eeb-8a15-5af8-a657-1d5cde7265e7", 00:23:49.947 "is_configured": true, 00:23:49.947 "data_offset": 0, 00:23:49.947 "data_size": 65536 00:23:49.947 } 00:23:49.947 ] 00:23:49.947 }' 00:23:49.947 20:37:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:49.947 20:37:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:23:50.554 20:37:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:50.554 20:37:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:23:50.840 [2024-07-15 20:37:43.021536] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:50.840 20:37:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:23:50.840 20:37:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:50.840 20:37:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:23:51.097 20:37:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:23:51.097 20:37:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:23:51.097 20:37:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:23:51.097 20:37:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:23:51.097 20:37:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:23:51.097 20:37:43 
bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:51.097 20:37:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:23:51.097 20:37:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:23:51.097 20:37:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:23:51.097 20:37:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:23:51.097 20:37:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:23:51.097 20:37:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:23:51.097 20:37:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:51.097 20:37:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:23:51.354 [2024-07-15 20:37:43.510626] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf1a490 00:23:51.354 /dev/nbd0 00:23:51.354 20:37:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:23:51.354 20:37:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:23:51.354 20:37:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:23:51.354 20:37:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:23:51.354 20:37:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:23:51.354 20:37:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:23:51.354 20:37:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:23:51.354 20:37:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:23:51.354 20:37:43 bdev_raid.raid_rebuild_test -- 
common/autotest_common.sh@882 -- # (( i = 1 )) 00:23:51.354 20:37:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:23:51.354 20:37:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:51.354 1+0 records in 00:23:51.354 1+0 records out 00:23:51.354 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000286003 s, 14.3 MB/s 00:23:51.354 20:37:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:51.354 20:37:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:23:51.354 20:37:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:51.354 20:37:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:23:51.354 20:37:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:23:51.354 20:37:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:51.354 20:37:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:51.354 20:37:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:23:51.354 20:37:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:23:51.354 20:37:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=65536 oflag=direct 00:23:57.907 65536+0 records in 00:23:57.907 65536+0 records out 00:23:57.907 33554432 bytes (34 MB, 32 MiB) copied, 6.16983 s, 5.4 MB/s 00:23:57.907 20:37:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:23:57.907 20:37:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local 
rpc_server=/var/tmp/spdk-raid.sock 00:23:57.907 20:37:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:23:57.907 20:37:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:23:57.907 20:37:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:23:57.907 20:37:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:57.907 20:37:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:23:57.907 20:37:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:23:57.907 20:37:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:23:57.907 20:37:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:23:57.907 20:37:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:57.907 20:37:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:57.907 20:37:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:23:57.907 20:37:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:23:57.907 20:37:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:23:57.907 20:37:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:23:57.907 [2024-07-15 20:37:49.949286] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:57.907 [2024-07-15 20:37:50.129822] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:23:57.907 20:37:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:57.907 20:37:50 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:57.907 20:37:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:57.907 20:37:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:57.907 20:37:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:57.907 20:37:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:57.907 20:37:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:57.907 20:37:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:57.907 20:37:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:57.907 20:37:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:57.907 20:37:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:57.907 20:37:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:58.165 20:37:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:58.165 "name": "raid_bdev1", 00:23:58.165 "uuid": "3dd4a87c-306f-4f61-b31c-0acba1403e14", 00:23:58.165 "strip_size_kb": 0, 00:23:58.165 "state": "online", 00:23:58.165 "raid_level": "raid1", 00:23:58.165 "superblock": false, 00:23:58.165 "num_base_bdevs": 2, 00:23:58.165 "num_base_bdevs_discovered": 1, 00:23:58.165 "num_base_bdevs_operational": 1, 00:23:58.165 "base_bdevs_list": [ 00:23:58.165 { 00:23:58.165 "name": null, 00:23:58.165 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:58.165 "is_configured": false, 00:23:58.165 "data_offset": 0, 00:23:58.165 "data_size": 65536 00:23:58.165 }, 00:23:58.165 { 00:23:58.165 "name": "BaseBdev2", 
00:23:58.165 "uuid": "ba9c2eeb-8a15-5af8-a657-1d5cde7265e7", 00:23:58.165 "is_configured": true, 00:23:58.165 "data_offset": 0, 00:23:58.165 "data_size": 65536 00:23:58.165 } 00:23:58.165 ] 00:23:58.165 }' 00:23:58.165 20:37:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:58.165 20:37:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:23:58.731 20:37:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:58.989 [2024-07-15 20:37:51.240763] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:58.989 [2024-07-15 20:37:51.245742] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf21880 00:23:58.989 [2024-07-15 20:37:51.247975] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:58.989 20:37:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@646 -- # sleep 1 00:23:59.924 20:37:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:59.924 20:37:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:59.924 20:37:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:59.924 20:37:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:59.924 20:37:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:59.924 20:37:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:59.924 20:37:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:00.182 20:37:52 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:00.182 "name": "raid_bdev1", 00:24:00.182 "uuid": "3dd4a87c-306f-4f61-b31c-0acba1403e14", 00:24:00.182 "strip_size_kb": 0, 00:24:00.182 "state": "online", 00:24:00.182 "raid_level": "raid1", 00:24:00.182 "superblock": false, 00:24:00.182 "num_base_bdevs": 2, 00:24:00.182 "num_base_bdevs_discovered": 2, 00:24:00.182 "num_base_bdevs_operational": 2, 00:24:00.182 "process": { 00:24:00.182 "type": "rebuild", 00:24:00.182 "target": "spare", 00:24:00.182 "progress": { 00:24:00.182 "blocks": 24576, 00:24:00.182 "percent": 37 00:24:00.182 } 00:24:00.182 }, 00:24:00.182 "base_bdevs_list": [ 00:24:00.182 { 00:24:00.182 "name": "spare", 00:24:00.182 "uuid": "f0506a51-d084-595e-a810-647ff7345c85", 00:24:00.182 "is_configured": true, 00:24:00.182 "data_offset": 0, 00:24:00.182 "data_size": 65536 00:24:00.182 }, 00:24:00.182 { 00:24:00.182 "name": "BaseBdev2", 00:24:00.182 "uuid": "ba9c2eeb-8a15-5af8-a657-1d5cde7265e7", 00:24:00.182 "is_configured": true, 00:24:00.182 "data_offset": 0, 00:24:00.182 "data_size": 65536 00:24:00.182 } 00:24:00.182 ] 00:24:00.182 }' 00:24:00.182 20:37:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:00.451 20:37:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:00.451 20:37:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:00.451 20:37:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:00.451 20:37:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:24:00.709 [2024-07-15 20:37:52.835281] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:00.709 [2024-07-15 20:37:52.860364] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev 
raid_bdev1: No such device 00:24:00.709 [2024-07-15 20:37:52.860411] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:00.709 [2024-07-15 20:37:52.860426] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:00.709 [2024-07-15 20:37:52.860435] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:00.709 20:37:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:00.709 20:37:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:00.709 20:37:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:00.709 20:37:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:00.709 20:37:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:00.709 20:37:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:00.709 20:37:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:00.709 20:37:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:00.709 20:37:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:00.709 20:37:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:00.709 20:37:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:00.709 20:37:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:00.966 20:37:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:00.966 "name": "raid_bdev1", 00:24:00.966 "uuid": "3dd4a87c-306f-4f61-b31c-0acba1403e14", 00:24:00.966 
"strip_size_kb": 0, 00:24:00.966 "state": "online", 00:24:00.966 "raid_level": "raid1", 00:24:00.966 "superblock": false, 00:24:00.966 "num_base_bdevs": 2, 00:24:00.966 "num_base_bdevs_discovered": 1, 00:24:00.966 "num_base_bdevs_operational": 1, 00:24:00.966 "base_bdevs_list": [ 00:24:00.966 { 00:24:00.966 "name": null, 00:24:00.966 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:00.966 "is_configured": false, 00:24:00.966 "data_offset": 0, 00:24:00.966 "data_size": 65536 00:24:00.966 }, 00:24:00.966 { 00:24:00.966 "name": "BaseBdev2", 00:24:00.966 "uuid": "ba9c2eeb-8a15-5af8-a657-1d5cde7265e7", 00:24:00.966 "is_configured": true, 00:24:00.966 "data_offset": 0, 00:24:00.966 "data_size": 65536 00:24:00.966 } 00:24:00.966 ] 00:24:00.966 }' 00:24:00.966 20:37:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:00.966 20:37:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:24:01.532 20:37:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:01.532 20:37:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:01.532 20:37:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:01.532 20:37:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:01.532 20:37:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:01.532 20:37:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:01.532 20:37:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:01.789 20:37:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:01.789 "name": "raid_bdev1", 00:24:01.789 "uuid": "3dd4a87c-306f-4f61-b31c-0acba1403e14", 
00:24:01.789 "strip_size_kb": 0, 00:24:01.789 "state": "online", 00:24:01.789 "raid_level": "raid1", 00:24:01.789 "superblock": false, 00:24:01.789 "num_base_bdevs": 2, 00:24:01.789 "num_base_bdevs_discovered": 1, 00:24:01.789 "num_base_bdevs_operational": 1, 00:24:01.789 "base_bdevs_list": [ 00:24:01.789 { 00:24:01.789 "name": null, 00:24:01.789 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:01.789 "is_configured": false, 00:24:01.789 "data_offset": 0, 00:24:01.789 "data_size": 65536 00:24:01.789 }, 00:24:01.789 { 00:24:01.789 "name": "BaseBdev2", 00:24:01.789 "uuid": "ba9c2eeb-8a15-5af8-a657-1d5cde7265e7", 00:24:01.789 "is_configured": true, 00:24:01.789 "data_offset": 0, 00:24:01.789 "data_size": 65536 00:24:01.789 } 00:24:01.789 ] 00:24:01.789 }' 00:24:01.789 20:37:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:01.789 20:37:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:01.789 20:37:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:01.789 20:37:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:01.789 20:37:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:02.047 [2024-07-15 20:37:54.321423] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:02.047 [2024-07-15 20:37:54.326345] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf1a490 00:24:02.047 [2024-07-15 20:37:54.327812] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:02.047 20:37:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@662 -- # sleep 1 00:24:02.979 20:37:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 
rebuild spare 00:24:02.979 20:37:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:02.979 20:37:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:02.979 20:37:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:02.979 20:37:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:02.979 20:37:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:02.979 20:37:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:03.544 20:37:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:03.544 "name": "raid_bdev1", 00:24:03.544 "uuid": "3dd4a87c-306f-4f61-b31c-0acba1403e14", 00:24:03.544 "strip_size_kb": 0, 00:24:03.544 "state": "online", 00:24:03.544 "raid_level": "raid1", 00:24:03.544 "superblock": false, 00:24:03.544 "num_base_bdevs": 2, 00:24:03.544 "num_base_bdevs_discovered": 2, 00:24:03.544 "num_base_bdevs_operational": 2, 00:24:03.544 "process": { 00:24:03.544 "type": "rebuild", 00:24:03.544 "target": "spare", 00:24:03.544 "progress": { 00:24:03.544 "blocks": 24576, 00:24:03.544 "percent": 37 00:24:03.544 } 00:24:03.544 }, 00:24:03.544 "base_bdevs_list": [ 00:24:03.544 { 00:24:03.544 "name": "spare", 00:24:03.544 "uuid": "f0506a51-d084-595e-a810-647ff7345c85", 00:24:03.544 "is_configured": true, 00:24:03.544 "data_offset": 0, 00:24:03.545 "data_size": 65536 00:24:03.545 }, 00:24:03.545 { 00:24:03.545 "name": "BaseBdev2", 00:24:03.545 "uuid": "ba9c2eeb-8a15-5af8-a657-1d5cde7265e7", 00:24:03.545 "is_configured": true, 00:24:03.545 "data_offset": 0, 00:24:03.545 "data_size": 65536 00:24:03.545 } 00:24:03.545 ] 00:24:03.545 }' 00:24:03.545 20:37:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # 
jq -r '.process.type // "none"' 00:24:03.545 20:37:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:03.545 20:37:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:03.545 20:37:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:03.545 20:37:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:24:03.545 20:37:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:24:03.545 20:37:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:24:03.545 20:37:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:24:03.545 20:37:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@705 -- # local timeout=813 00:24:03.545 20:37:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:03.545 20:37:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:03.545 20:37:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:03.545 20:37:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:03.545 20:37:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:03.545 20:37:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:03.545 20:37:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:03.545 20:37:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:03.545 20:37:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:03.545 "name": "raid_bdev1", 00:24:03.545 
"uuid": "3dd4a87c-306f-4f61-b31c-0acba1403e14", 00:24:03.545 "strip_size_kb": 0, 00:24:03.545 "state": "online", 00:24:03.545 "raid_level": "raid1", 00:24:03.545 "superblock": false, 00:24:03.545 "num_base_bdevs": 2, 00:24:03.545 "num_base_bdevs_discovered": 2, 00:24:03.545 "num_base_bdevs_operational": 2, 00:24:03.545 "process": { 00:24:03.545 "type": "rebuild", 00:24:03.545 "target": "spare", 00:24:03.545 "progress": { 00:24:03.545 "blocks": 30720, 00:24:03.545 "percent": 46 00:24:03.545 } 00:24:03.545 }, 00:24:03.545 "base_bdevs_list": [ 00:24:03.545 { 00:24:03.545 "name": "spare", 00:24:03.545 "uuid": "f0506a51-d084-595e-a810-647ff7345c85", 00:24:03.545 "is_configured": true, 00:24:03.545 "data_offset": 0, 00:24:03.545 "data_size": 65536 00:24:03.545 }, 00:24:03.545 { 00:24:03.545 "name": "BaseBdev2", 00:24:03.545 "uuid": "ba9c2eeb-8a15-5af8-a657-1d5cde7265e7", 00:24:03.545 "is_configured": true, 00:24:03.545 "data_offset": 0, 00:24:03.545 "data_size": 65536 00:24:03.545 } 00:24:03.545 ] 00:24:03.545 }' 00:24:03.545 20:37:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:03.803 20:37:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:03.803 20:37:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:03.803 20:37:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:03.804 20:37:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@710 -- # sleep 1 00:24:04.739 20:37:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:04.739 20:37:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:04.739 20:37:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:04.739 20:37:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local 
process_type=rebuild 00:24:04.739 20:37:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:04.739 20:37:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:04.739 20:37:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:04.739 20:37:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:04.998 20:37:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:04.998 "name": "raid_bdev1", 00:24:04.998 "uuid": "3dd4a87c-306f-4f61-b31c-0acba1403e14", 00:24:04.998 "strip_size_kb": 0, 00:24:04.998 "state": "online", 00:24:04.998 "raid_level": "raid1", 00:24:04.998 "superblock": false, 00:24:04.998 "num_base_bdevs": 2, 00:24:04.998 "num_base_bdevs_discovered": 2, 00:24:04.998 "num_base_bdevs_operational": 2, 00:24:04.998 "process": { 00:24:04.998 "type": "rebuild", 00:24:04.998 "target": "spare", 00:24:04.998 "progress": { 00:24:04.998 "blocks": 57344, 00:24:04.998 "percent": 87 00:24:04.998 } 00:24:04.998 }, 00:24:04.998 "base_bdevs_list": [ 00:24:04.998 { 00:24:04.998 "name": "spare", 00:24:04.998 "uuid": "f0506a51-d084-595e-a810-647ff7345c85", 00:24:04.998 "is_configured": true, 00:24:04.998 "data_offset": 0, 00:24:04.998 "data_size": 65536 00:24:04.998 }, 00:24:04.998 { 00:24:04.998 "name": "BaseBdev2", 00:24:04.998 "uuid": "ba9c2eeb-8a15-5af8-a657-1d5cde7265e7", 00:24:04.998 "is_configured": true, 00:24:04.998 "data_offset": 0, 00:24:04.998 "data_size": 65536 00:24:04.998 } 00:24:04.998 ] 00:24:04.998 }' 00:24:04.998 20:37:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:04.998 20:37:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:04.998 20:37:57 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:04.998 20:37:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:04.998 20:37:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@710 -- # sleep 1 00:24:05.256 [2024-07-15 20:37:57.552985] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:24:05.256 [2024-07-15 20:37:57.553051] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:24:05.256 [2024-07-15 20:37:57.553089] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:06.191 20:37:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:06.191 20:37:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:06.191 20:37:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:06.191 20:37:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:06.191 20:37:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:06.191 20:37:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:06.191 20:37:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:06.191 20:37:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:06.449 20:37:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:06.449 "name": "raid_bdev1", 00:24:06.449 "uuid": "3dd4a87c-306f-4f61-b31c-0acba1403e14", 00:24:06.449 "strip_size_kb": 0, 00:24:06.449 "state": "online", 00:24:06.449 "raid_level": "raid1", 00:24:06.449 "superblock": false, 00:24:06.449 "num_base_bdevs": 2, 00:24:06.449 
"num_base_bdevs_discovered": 2, 00:24:06.449 "num_base_bdevs_operational": 2, 00:24:06.449 "base_bdevs_list": [ 00:24:06.449 { 00:24:06.449 "name": "spare", 00:24:06.449 "uuid": "f0506a51-d084-595e-a810-647ff7345c85", 00:24:06.449 "is_configured": true, 00:24:06.450 "data_offset": 0, 00:24:06.450 "data_size": 65536 00:24:06.450 }, 00:24:06.450 { 00:24:06.450 "name": "BaseBdev2", 00:24:06.450 "uuid": "ba9c2eeb-8a15-5af8-a657-1d5cde7265e7", 00:24:06.450 "is_configured": true, 00:24:06.450 "data_offset": 0, 00:24:06.450 "data_size": 65536 00:24:06.450 } 00:24:06.450 ] 00:24:06.450 }' 00:24:06.450 20:37:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:06.450 20:37:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:24:06.450 20:37:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:06.450 20:37:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:24:06.450 20:37:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@708 -- # break 00:24:06.450 20:37:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:06.450 20:37:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:06.450 20:37:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:06.450 20:37:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:06.450 20:37:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:06.450 20:37:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:06.450 20:37:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:06.708 20:37:58 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:06.708 "name": "raid_bdev1", 00:24:06.708 "uuid": "3dd4a87c-306f-4f61-b31c-0acba1403e14", 00:24:06.708 "strip_size_kb": 0, 00:24:06.708 "state": "online", 00:24:06.708 "raid_level": "raid1", 00:24:06.708 "superblock": false, 00:24:06.708 "num_base_bdevs": 2, 00:24:06.708 "num_base_bdevs_discovered": 2, 00:24:06.708 "num_base_bdevs_operational": 2, 00:24:06.708 "base_bdevs_list": [ 00:24:06.708 { 00:24:06.708 "name": "spare", 00:24:06.708 "uuid": "f0506a51-d084-595e-a810-647ff7345c85", 00:24:06.708 "is_configured": true, 00:24:06.708 "data_offset": 0, 00:24:06.708 "data_size": 65536 00:24:06.708 }, 00:24:06.708 { 00:24:06.708 "name": "BaseBdev2", 00:24:06.708 "uuid": "ba9c2eeb-8a15-5af8-a657-1d5cde7265e7", 00:24:06.708 "is_configured": true, 00:24:06.708 "data_offset": 0, 00:24:06.708 "data_size": 65536 00:24:06.708 } 00:24:06.708 ] 00:24:06.708 }' 00:24:06.708 20:37:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:06.708 20:37:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:06.708 20:37:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:06.708 20:37:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:06.708 20:37:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:06.708 20:37:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:06.708 20:37:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:06.708 20:37:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:06.708 20:37:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:06.708 20:37:58 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:06.708 20:37:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:06.708 20:37:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:06.708 20:37:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:06.708 20:37:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:06.708 20:37:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:06.708 20:37:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:06.966 20:37:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:06.966 "name": "raid_bdev1", 00:24:06.966 "uuid": "3dd4a87c-306f-4f61-b31c-0acba1403e14", 00:24:06.966 "strip_size_kb": 0, 00:24:06.966 "state": "online", 00:24:06.966 "raid_level": "raid1", 00:24:06.966 "superblock": false, 00:24:06.966 "num_base_bdevs": 2, 00:24:06.966 "num_base_bdevs_discovered": 2, 00:24:06.966 "num_base_bdevs_operational": 2, 00:24:06.966 "base_bdevs_list": [ 00:24:06.966 { 00:24:06.966 "name": "spare", 00:24:06.966 "uuid": "f0506a51-d084-595e-a810-647ff7345c85", 00:24:06.966 "is_configured": true, 00:24:06.966 "data_offset": 0, 00:24:06.966 "data_size": 65536 00:24:06.966 }, 00:24:06.966 { 00:24:06.966 "name": "BaseBdev2", 00:24:06.966 "uuid": "ba9c2eeb-8a15-5af8-a657-1d5cde7265e7", 00:24:06.966 "is_configured": true, 00:24:06.966 "data_offset": 0, 00:24:06.966 "data_size": 65536 00:24:06.966 } 00:24:06.966 ] 00:24:06.966 }' 00:24:06.966 20:37:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:06.966 20:37:59 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:24:07.533 20:37:59 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:24:07.792 [2024-07-15 20:38:00.032787] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:07.792 [2024-07-15 20:38:00.032849] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:24:07.792 [2024-07-15 20:38:00.033038] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:07.792 [2024-07-15 20:38:00.033172] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:07.792 [2024-07-15 20:38:00.033216] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf21070 name raid_bdev1, state offline 00:24:07.792 20:38:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # jq length 00:24:07.792 20:38:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:08.050 20:38:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:24:08.050 20:38:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:24:08.050 20:38:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:24:08.050 20:38:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:24:08.050 20:38:00 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:08.050 20:38:00 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:24:08.050 20:38:00 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:24:08.050 20:38:00 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 
00:24:08.050 20:38:00 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:24:08.050 20:38:00 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:24:08.050 20:38:00 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:24:08.050 20:38:00 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:24:08.050 20:38:00 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:24:08.309 /dev/nbd0 00:24:08.309 20:38:00 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:24:08.309 20:38:00 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:24:08.309 20:38:00 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:24:08.309 20:38:00 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:24:08.309 20:38:00 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:24:08.309 20:38:00 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:24:08.309 20:38:00 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:24:08.309 20:38:00 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:24:08.309 20:38:00 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:24:08.309 20:38:00 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:24:08.309 20:38:00 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:08.309 1+0 records in 00:24:08.309 1+0 records out 00:24:08.309 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000234857 s, 17.4 MB/s 00:24:08.309 20:38:00 bdev_raid.raid_rebuild_test -- 
common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:08.309 20:38:00 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:24:08.309 20:38:00 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:08.309 20:38:00 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:24:08.309 20:38:00 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:24:08.309 20:38:00 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:08.309 20:38:00 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:24:08.309 20:38:00 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:24:08.567 /dev/nbd1 00:24:08.567 20:38:00 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:24:08.567 20:38:00 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:24:08.567 20:38:00 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:24:08.567 20:38:00 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:24:08.567 20:38:00 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:24:08.567 20:38:00 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:24:08.567 20:38:00 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:24:08.567 20:38:00 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:24:08.567 20:38:00 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:24:08.567 20:38:00 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 
00:24:08.567 20:38:00 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:08.567 1+0 records in 00:24:08.567 1+0 records out 00:24:08.567 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000301338 s, 13.6 MB/s 00:24:08.567 20:38:00 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:08.567 20:38:00 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:24:08.567 20:38:00 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:08.567 20:38:00 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:24:08.567 20:38:00 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:24:08.567 20:38:00 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:08.567 20:38:00 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:24:08.567 20:38:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@737 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:24:08.567 20:38:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:24:08.567 20:38:00 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:08.567 20:38:00 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:24:08.567 20:38:00 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:24:08.567 20:38:00 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:24:08.567 20:38:00 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:08.567 20:38:00 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:24:08.826 20:38:01 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:24:08.826 20:38:01 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:24:08.826 20:38:01 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:24:08.826 20:38:01 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:08.826 20:38:01 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:08.826 20:38:01 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:24:08.826 20:38:01 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:24:08.826 20:38:01 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:24:08.826 20:38:01 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:08.826 20:38:01 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:24:09.084 20:38:01 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:24:09.084 20:38:01 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:24:09.084 20:38:01 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:24:09.084 20:38:01 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:09.084 20:38:01 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:09.084 20:38:01 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:24:09.084 20:38:01 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:24:09.084 20:38:01 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:24:09.084 20:38:01 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:24:09.084 20:38:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@782 -- # killprocess 1460698 00:24:09.084 20:38:01 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@948 -- # '[' -z 1460698 ']' 00:24:09.084 20:38:01 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@952 -- # kill -0 1460698 00:24:09.084 20:38:01 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # uname 00:24:09.084 20:38:01 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:24:09.084 20:38:01 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1460698 00:24:09.360 20:38:01 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:24:09.360 20:38:01 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:24:09.360 20:38:01 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1460698' 00:24:09.360 killing process with pid 1460698 00:24:09.360 20:38:01 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@967 -- # kill 1460698 00:24:09.360 Received shutdown signal, test time was about 60.000000 seconds 00:24:09.360 00:24:09.360 Latency(us) 00:24:09.360 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:09.360 =================================================================================================================== 00:24:09.360 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:24:09.360 [2024-07-15 20:38:01.488445] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:24:09.360 20:38:01 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@972 -- # wait 1460698 00:24:09.360 [2024-07-15 20:38:01.515116] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:24:09.630 20:38:01 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@784 -- # return 0 00:24:09.630 00:24:09.630 real 0m22.508s 00:24:09.630 user 0m29.441s 00:24:09.630 sys 0m5.317s 00:24:09.630 20:38:01 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:24:09.630 20:38:01 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:24:09.630 ************************************ 00:24:09.630 END TEST raid_rebuild_test 00:24:09.630 ************************************ 00:24:09.630 20:38:01 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:24:09.630 20:38:01 bdev_raid -- bdev/bdev_raid.sh@878 -- # run_test raid_rebuild_test_sb raid_rebuild_test raid1 2 true false true 00:24:09.630 20:38:01 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:24:09.630 20:38:01 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:24:09.630 20:38:01 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:24:09.630 ************************************ 00:24:09.630 START TEST raid_rebuild_test_sb 00:24:09.630 ************************************ 00:24:09.630 20:38:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true false true 00:24:09.630 20:38:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:24:09.630 20:38:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:24:09.630 20:38:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:24:09.630 20:38:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:24:09.630 20:38:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@572 -- # local verify=true 00:24:09.630 20:38:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:24:09.630 20:38:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:09.630 20:38:01 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:24:09.630 20:38:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:09.630 20:38:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:09.630 20:38:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:24:09.630 20:38:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:09.630 20:38:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:09.630 20:38:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:24:09.630 20:38:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:24:09.630 20:38:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:24:09.630 20:38:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # local strip_size 00:24:09.630 20:38:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@576 -- # local create_arg 00:24:09.630 20:38:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:24:09.630 20:38:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@578 -- # local data_offset 00:24:09.630 20:38:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:24:09.630 20:38:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:24:09.630 20:38:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:24:09.630 20:38:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:24:09.630 20:38:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@596 -- # raid_pid=1463748 00:24:09.630 20:38:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@597 -- # waitforlisten 1463748 /var/tmp/spdk-raid.sock 00:24:09.630 20:38:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@595 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:24:09.630 20:38:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@829 -- # '[' -z 1463748 ']' 00:24:09.630 20:38:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:24:09.630 20:38:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:24:09.630 20:38:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:24:09.630 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:24:09.630 20:38:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:24:09.630 20:38:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:09.630 [2024-07-15 20:38:01.899946] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 00:24:09.630 [2024-07-15 20:38:01.900023] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1463748 ] 00:24:09.630 I/O size of 3145728 is greater than zero copy threshold (65536). 00:24:09.630 Zero copy mechanism will not be used. 
00:24:09.899 [2024-07-15 20:38:02.031305] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:09.899 [2024-07-15 20:38:02.132808] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:24:09.899 [2024-07-15 20:38:02.194157] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:09.899 [2024-07-15 20:38:02.194195] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:10.832 20:38:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:24:10.832 20:38:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@862 -- # return 0 00:24:10.832 20:38:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:10.832 20:38:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:24:11.089 BaseBdev1_malloc 00:24:11.089 20:38:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:24:11.347 [2024-07-15 20:38:03.574577] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:24:11.347 [2024-07-15 20:38:03.574630] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:11.347 [2024-07-15 20:38:03.574654] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1bbcd40 00:24:11.347 [2024-07-15 20:38:03.574667] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:11.347 [2024-07-15 20:38:03.576272] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:11.347 [2024-07-15 20:38:03.576303] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:24:11.347 BaseBdev1 
00:24:11.347 20:38:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:11.347 20:38:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:24:11.604 BaseBdev2_malloc 00:24:11.604 20:38:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:24:11.862 [2024-07-15 20:38:04.076840] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:24:11.862 [2024-07-15 20:38:04.076889] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:11.862 [2024-07-15 20:38:04.076913] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1bbd860 00:24:11.862 [2024-07-15 20:38:04.076930] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:11.862 [2024-07-15 20:38:04.078319] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:11.862 [2024-07-15 20:38:04.078345] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:24:11.862 BaseBdev2 00:24:11.862 20:38:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:24:12.120 spare_malloc 00:24:12.120 20:38:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:24:12.378 spare_delay 00:24:12.378 20:38:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@608 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:12.636 [2024-07-15 20:38:04.819309] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:12.636 [2024-07-15 20:38:04.819352] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:12.636 [2024-07-15 20:38:04.819374] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d6bec0 00:24:12.636 [2024-07-15 20:38:04.819386] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:12.636 [2024-07-15 20:38:04.820856] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:12.636 [2024-07-15 20:38:04.820884] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:12.636 spare 00:24:12.636 20:38:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:24:12.894 [2024-07-15 20:38:05.068000] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:12.894 [2024-07-15 20:38:05.069182] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:24:12.894 [2024-07-15 20:38:05.069344] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1d6d070 00:24:12.894 [2024-07-15 20:38:05.069357] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:24:12.894 [2024-07-15 20:38:05.069542] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d66490 00:24:12.894 [2024-07-15 20:38:05.069680] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1d6d070 00:24:12.894 [2024-07-15 20:38:05.069690] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, 
raid_bdev 0x1d6d070 00:24:12.894 [2024-07-15 20:38:05.069782] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:12.894 20:38:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:12.894 20:38:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:12.894 20:38:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:12.894 20:38:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:12.894 20:38:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:12.894 20:38:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:12.894 20:38:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:12.894 20:38:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:12.894 20:38:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:12.894 20:38:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:12.894 20:38:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:12.894 20:38:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:13.151 20:38:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:13.151 "name": "raid_bdev1", 00:24:13.151 "uuid": "25dbe2d2-d6e0-4e82-a8fe-3a60dd0a5326", 00:24:13.151 "strip_size_kb": 0, 00:24:13.152 "state": "online", 00:24:13.152 "raid_level": "raid1", 00:24:13.152 "superblock": true, 00:24:13.152 "num_base_bdevs": 2, 00:24:13.152 "num_base_bdevs_discovered": 2, 00:24:13.152 
"num_base_bdevs_operational": 2, 00:24:13.152 "base_bdevs_list": [ 00:24:13.152 { 00:24:13.152 "name": "BaseBdev1", 00:24:13.152 "uuid": "6bcfe2c9-8cbf-5049-9343-515f302380cd", 00:24:13.152 "is_configured": true, 00:24:13.152 "data_offset": 2048, 00:24:13.152 "data_size": 63488 00:24:13.152 }, 00:24:13.152 { 00:24:13.152 "name": "BaseBdev2", 00:24:13.152 "uuid": "b29c6917-1d9e-5b8d-8e6f-c68cbf2379ed", 00:24:13.152 "is_configured": true, 00:24:13.152 "data_offset": 2048, 00:24:13.152 "data_size": 63488 00:24:13.152 } 00:24:13.152 ] 00:24:13.152 }' 00:24:13.152 20:38:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:13.152 20:38:05 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:13.718 20:38:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:13.718 20:38:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:24:13.718 [2024-07-15 20:38:06.075009] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:13.976 20:38:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:24:13.976 20:38:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:13.976 20:38:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:24:14.234 20:38:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:24:14.234 20:38:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:24:14.234 20:38:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:24:14.234 20:38:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@624 -- # local 
write_unit_size 00:24:14.234 20:38:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:24:14.234 20:38:06 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:14.234 20:38:06 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:24:14.234 20:38:06 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:24:14.234 20:38:06 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:24:14.234 20:38:06 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:24:14.234 20:38:06 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:24:14.234 20:38:06 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:24:14.234 20:38:06 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:14.234 20:38:06 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:24:14.234 [2024-07-15 20:38:06.588191] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d66490 00:24:14.234 /dev/nbd0 00:24:14.493 20:38:06 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:24:14.493 20:38:06 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:24:14.493 20:38:06 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:24:14.493 20:38:06 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:24:14.493 20:38:06 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:24:14.493 20:38:06 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:24:14.493 20:38:06 bdev_raid.raid_rebuild_test_sb 
-- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:24:14.493 20:38:06 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:24:14.493 20:38:06 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:24:14.493 20:38:06 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:24:14.493 20:38:06 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:14.493 1+0 records in 00:24:14.493 1+0 records out 00:24:14.493 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000275774 s, 14.9 MB/s 00:24:14.493 20:38:06 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:14.493 20:38:06 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:24:14.493 20:38:06 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:14.493 20:38:06 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:24:14.493 20:38:06 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:24:14.493 20:38:06 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:14.493 20:38:06 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:14.493 20:38:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:24:14.493 20:38:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:24:14.493 20:38:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=63488 oflag=direct 00:24:21.044 63488+0 records in 00:24:21.044 63488+0 records out 00:24:21.044 32505856 bytes (33 MB, 
31 MiB) copied, 6.08825 s, 5.3 MB/s 00:24:21.044 20:38:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:24:21.044 20:38:12 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:21.044 20:38:12 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:24:21.044 20:38:12 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:24:21.044 20:38:12 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:24:21.044 20:38:12 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:21.045 20:38:12 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:24:21.045 20:38:13 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:24:21.045 20:38:13 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:24:21.045 [2024-07-15 20:38:13.015800] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:21.045 20:38:13 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:24:21.045 20:38:13 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:21.045 20:38:13 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:21.045 20:38:13 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:24:21.045 20:38:13 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:24:21.045 20:38:13 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:24:21.045 20:38:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_remove_base_bdev BaseBdev1 00:24:21.045 [2024-07-15 20:38:13.248474] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:24:21.045 20:38:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:21.045 20:38:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:21.045 20:38:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:21.045 20:38:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:21.045 20:38:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:21.045 20:38:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:21.045 20:38:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:21.045 20:38:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:21.045 20:38:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:21.045 20:38:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:21.045 20:38:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:21.045 20:38:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:21.302 20:38:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:21.302 "name": "raid_bdev1", 00:24:21.302 "uuid": "25dbe2d2-d6e0-4e82-a8fe-3a60dd0a5326", 00:24:21.302 "strip_size_kb": 0, 00:24:21.302 "state": "online", 00:24:21.302 "raid_level": "raid1", 00:24:21.302 "superblock": true, 00:24:21.302 "num_base_bdevs": 2, 00:24:21.302 "num_base_bdevs_discovered": 1, 00:24:21.302 
"num_base_bdevs_operational": 1, 00:24:21.302 "base_bdevs_list": [ 00:24:21.302 { 00:24:21.302 "name": null, 00:24:21.302 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:21.302 "is_configured": false, 00:24:21.302 "data_offset": 2048, 00:24:21.302 "data_size": 63488 00:24:21.302 }, 00:24:21.302 { 00:24:21.302 "name": "BaseBdev2", 00:24:21.302 "uuid": "b29c6917-1d9e-5b8d-8e6f-c68cbf2379ed", 00:24:21.302 "is_configured": true, 00:24:21.302 "data_offset": 2048, 00:24:21.302 "data_size": 63488 00:24:21.302 } 00:24:21.302 ] 00:24:21.302 }' 00:24:21.302 20:38:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:21.302 20:38:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:21.868 20:38:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:22.126 [2024-07-15 20:38:14.351404] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:22.126 [2024-07-15 20:38:14.356399] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d6cce0 00:24:22.126 [2024-07-15 20:38:14.358611] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:22.126 20:38:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@646 -- # sleep 1 00:24:23.060 20:38:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:23.060 20:38:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:23.060 20:38:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:23.060 20:38:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:23.060 20:38:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 
00:24:23.060 20:38:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:23.060 20:38:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:23.318 20:38:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:23.318 "name": "raid_bdev1", 00:24:23.318 "uuid": "25dbe2d2-d6e0-4e82-a8fe-3a60dd0a5326", 00:24:23.318 "strip_size_kb": 0, 00:24:23.318 "state": "online", 00:24:23.318 "raid_level": "raid1", 00:24:23.318 "superblock": true, 00:24:23.318 "num_base_bdevs": 2, 00:24:23.318 "num_base_bdevs_discovered": 2, 00:24:23.318 "num_base_bdevs_operational": 2, 00:24:23.318 "process": { 00:24:23.318 "type": "rebuild", 00:24:23.318 "target": "spare", 00:24:23.318 "progress": { 00:24:23.318 "blocks": 24576, 00:24:23.318 "percent": 38 00:24:23.318 } 00:24:23.318 }, 00:24:23.318 "base_bdevs_list": [ 00:24:23.318 { 00:24:23.318 "name": "spare", 00:24:23.318 "uuid": "aaaad4b6-2d58-565a-b464-c57674c2e356", 00:24:23.318 "is_configured": true, 00:24:23.318 "data_offset": 2048, 00:24:23.318 "data_size": 63488 00:24:23.318 }, 00:24:23.318 { 00:24:23.318 "name": "BaseBdev2", 00:24:23.318 "uuid": "b29c6917-1d9e-5b8d-8e6f-c68cbf2379ed", 00:24:23.318 "is_configured": true, 00:24:23.318 "data_offset": 2048, 00:24:23.318 "data_size": 63488 00:24:23.318 } 00:24:23.318 ] 00:24:23.318 }' 00:24:23.318 20:38:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:23.318 20:38:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:23.318 20:38:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:23.576 20:38:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:23.576 20:38:15 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:24:23.835 [2024-07-15 20:38:15.957936] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:23.835 [2024-07-15 20:38:15.970825] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:23.835 [2024-07-15 20:38:15.970871] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:23.835 [2024-07-15 20:38:15.970887] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:23.835 [2024-07-15 20:38:15.970896] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:23.835 20:38:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:23.835 20:38:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:23.835 20:38:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:23.835 20:38:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:23.835 20:38:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:23.835 20:38:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:23.835 20:38:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:23.835 20:38:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:23.835 20:38:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:23.835 20:38:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:23.835 20:38:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:23.835 20:38:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:24.094 20:38:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:24.094 "name": "raid_bdev1", 00:24:24.094 "uuid": "25dbe2d2-d6e0-4e82-a8fe-3a60dd0a5326", 00:24:24.094 "strip_size_kb": 0, 00:24:24.094 "state": "online", 00:24:24.094 "raid_level": "raid1", 00:24:24.094 "superblock": true, 00:24:24.094 "num_base_bdevs": 2, 00:24:24.094 "num_base_bdevs_discovered": 1, 00:24:24.094 "num_base_bdevs_operational": 1, 00:24:24.094 "base_bdevs_list": [ 00:24:24.094 { 00:24:24.094 "name": null, 00:24:24.094 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:24.094 "is_configured": false, 00:24:24.094 "data_offset": 2048, 00:24:24.094 "data_size": 63488 00:24:24.094 }, 00:24:24.094 { 00:24:24.094 "name": "BaseBdev2", 00:24:24.094 "uuid": "b29c6917-1d9e-5b8d-8e6f-c68cbf2379ed", 00:24:24.094 "is_configured": true, 00:24:24.094 "data_offset": 2048, 00:24:24.094 "data_size": 63488 00:24:24.094 } 00:24:24.094 ] 00:24:24.094 }' 00:24:24.094 20:38:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:24.094 20:38:16 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:24.660 20:38:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:24.660 20:38:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:24.660 20:38:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:24.660 20:38:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:24.660 20:38:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:24.660 20:38:16 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:24.660 20:38:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:24.918 20:38:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:24.918 "name": "raid_bdev1", 00:24:24.918 "uuid": "25dbe2d2-d6e0-4e82-a8fe-3a60dd0a5326", 00:24:24.918 "strip_size_kb": 0, 00:24:24.918 "state": "online", 00:24:24.918 "raid_level": "raid1", 00:24:24.918 "superblock": true, 00:24:24.918 "num_base_bdevs": 2, 00:24:24.918 "num_base_bdevs_discovered": 1, 00:24:24.918 "num_base_bdevs_operational": 1, 00:24:24.918 "base_bdevs_list": [ 00:24:24.918 { 00:24:24.918 "name": null, 00:24:24.918 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:24.918 "is_configured": false, 00:24:24.918 "data_offset": 2048, 00:24:24.918 "data_size": 63488 00:24:24.918 }, 00:24:24.918 { 00:24:24.918 "name": "BaseBdev2", 00:24:24.918 "uuid": "b29c6917-1d9e-5b8d-8e6f-c68cbf2379ed", 00:24:24.918 "is_configured": true, 00:24:24.918 "data_offset": 2048, 00:24:24.918 "data_size": 63488 00:24:24.918 } 00:24:24.918 ] 00:24:24.918 }' 00:24:24.918 20:38:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:24.918 20:38:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:24.918 20:38:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:24.918 20:38:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:24.918 20:38:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:25.176 [2024-07-15 20:38:17.439122] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:25.176 [2024-07-15 20:38:17.444818] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d6cce0 00:24:25.176 [2024-07-15 20:38:17.446338] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:25.176 20:38:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@662 -- # sleep 1 00:24:26.109 20:38:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:26.109 20:38:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:26.109 20:38:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:26.109 20:38:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:26.109 20:38:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:26.109 20:38:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:26.109 20:38:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:26.367 20:38:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:26.367 "name": "raid_bdev1", 00:24:26.367 "uuid": "25dbe2d2-d6e0-4e82-a8fe-3a60dd0a5326", 00:24:26.367 "strip_size_kb": 0, 00:24:26.367 "state": "online", 00:24:26.367 "raid_level": "raid1", 00:24:26.367 "superblock": true, 00:24:26.367 "num_base_bdevs": 2, 00:24:26.367 "num_base_bdevs_discovered": 2, 00:24:26.367 "num_base_bdevs_operational": 2, 00:24:26.367 "process": { 00:24:26.367 "type": "rebuild", 00:24:26.367 "target": "spare", 00:24:26.367 "progress": { 00:24:26.367 "blocks": 24576, 00:24:26.367 "percent": 38 00:24:26.367 } 00:24:26.367 }, 00:24:26.367 
"base_bdevs_list": [ 00:24:26.367 { 00:24:26.367 "name": "spare", 00:24:26.367 "uuid": "aaaad4b6-2d58-565a-b464-c57674c2e356", 00:24:26.367 "is_configured": true, 00:24:26.367 "data_offset": 2048, 00:24:26.367 "data_size": 63488 00:24:26.367 }, 00:24:26.367 { 00:24:26.367 "name": "BaseBdev2", 00:24:26.367 "uuid": "b29c6917-1d9e-5b8d-8e6f-c68cbf2379ed", 00:24:26.367 "is_configured": true, 00:24:26.367 "data_offset": 2048, 00:24:26.367 "data_size": 63488 00:24:26.367 } 00:24:26.367 ] 00:24:26.367 }' 00:24:26.367 20:38:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:26.625 20:38:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:26.625 20:38:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:26.625 20:38:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:26.625 20:38:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:24:26.625 20:38:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:24:26.625 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:24:26.625 20:38:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:24:26.625 20:38:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:24:26.625 20:38:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:24:26.625 20:38:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@705 -- # local timeout=836 00:24:26.625 20:38:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:26.625 20:38:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:26.625 20:38:18 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:26.625 20:38:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:26.625 20:38:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:26.625 20:38:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:26.625 20:38:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:26.625 20:38:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:26.883 20:38:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:26.883 "name": "raid_bdev1", 00:24:26.883 "uuid": "25dbe2d2-d6e0-4e82-a8fe-3a60dd0a5326", 00:24:26.883 "strip_size_kb": 0, 00:24:26.883 "state": "online", 00:24:26.883 "raid_level": "raid1", 00:24:26.883 "superblock": true, 00:24:26.883 "num_base_bdevs": 2, 00:24:26.883 "num_base_bdevs_discovered": 2, 00:24:26.883 "num_base_bdevs_operational": 2, 00:24:26.883 "process": { 00:24:26.883 "type": "rebuild", 00:24:26.883 "target": "spare", 00:24:26.883 "progress": { 00:24:26.883 "blocks": 30720, 00:24:26.883 "percent": 48 00:24:26.883 } 00:24:26.883 }, 00:24:26.883 "base_bdevs_list": [ 00:24:26.883 { 00:24:26.883 "name": "spare", 00:24:26.883 "uuid": "aaaad4b6-2d58-565a-b464-c57674c2e356", 00:24:26.883 "is_configured": true, 00:24:26.883 "data_offset": 2048, 00:24:26.883 "data_size": 63488 00:24:26.883 }, 00:24:26.883 { 00:24:26.883 "name": "BaseBdev2", 00:24:26.883 "uuid": "b29c6917-1d9e-5b8d-8e6f-c68cbf2379ed", 00:24:26.883 "is_configured": true, 00:24:26.883 "data_offset": 2048, 00:24:26.883 "data_size": 63488 00:24:26.883 } 00:24:26.883 ] 00:24:26.883 }' 00:24:26.883 20:38:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r 
'.process.type // "none"' 00:24:26.883 20:38:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:26.883 20:38:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:26.883 20:38:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:26.883 20:38:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@710 -- # sleep 1 00:24:27.889 20:38:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:27.889 20:38:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:27.889 20:38:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:27.889 20:38:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:27.889 20:38:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:27.889 20:38:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:27.889 20:38:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:27.889 20:38:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:28.147 20:38:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:28.148 "name": "raid_bdev1", 00:24:28.148 "uuid": "25dbe2d2-d6e0-4e82-a8fe-3a60dd0a5326", 00:24:28.148 "strip_size_kb": 0, 00:24:28.148 "state": "online", 00:24:28.148 "raid_level": "raid1", 00:24:28.148 "superblock": true, 00:24:28.148 "num_base_bdevs": 2, 00:24:28.148 "num_base_bdevs_discovered": 2, 00:24:28.148 "num_base_bdevs_operational": 2, 00:24:28.148 "process": { 00:24:28.148 "type": "rebuild", 00:24:28.148 "target": "spare", 
00:24:28.148 "progress": { 00:24:28.148 "blocks": 59392, 00:24:28.148 "percent": 93 00:24:28.148 } 00:24:28.148 }, 00:24:28.148 "base_bdevs_list": [ 00:24:28.148 { 00:24:28.148 "name": "spare", 00:24:28.148 "uuid": "aaaad4b6-2d58-565a-b464-c57674c2e356", 00:24:28.148 "is_configured": true, 00:24:28.148 "data_offset": 2048, 00:24:28.148 "data_size": 63488 00:24:28.148 }, 00:24:28.148 { 00:24:28.148 "name": "BaseBdev2", 00:24:28.148 "uuid": "b29c6917-1d9e-5b8d-8e6f-c68cbf2379ed", 00:24:28.148 "is_configured": true, 00:24:28.148 "data_offset": 2048, 00:24:28.148 "data_size": 63488 00:24:28.148 } 00:24:28.148 ] 00:24:28.148 }' 00:24:28.148 20:38:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:28.148 20:38:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:28.148 20:38:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:28.148 20:38:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:28.148 20:38:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@710 -- # sleep 1 00:24:28.405 [2024-07-15 20:38:20.571422] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:24:28.405 [2024-07-15 20:38:20.571488] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:24:28.405 [2024-07-15 20:38:20.571571] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:29.339 20:38:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:29.339 20:38:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:29.339 20:38:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:29.339 20:38:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # 
local process_type=rebuild 00:24:29.339 20:38:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:29.339 20:38:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:29.339 20:38:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:29.339 20:38:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:29.597 20:38:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:29.597 "name": "raid_bdev1", 00:24:29.597 "uuid": "25dbe2d2-d6e0-4e82-a8fe-3a60dd0a5326", 00:24:29.597 "strip_size_kb": 0, 00:24:29.597 "state": "online", 00:24:29.597 "raid_level": "raid1", 00:24:29.597 "superblock": true, 00:24:29.597 "num_base_bdevs": 2, 00:24:29.597 "num_base_bdevs_discovered": 2, 00:24:29.597 "num_base_bdevs_operational": 2, 00:24:29.597 "base_bdevs_list": [ 00:24:29.597 { 00:24:29.597 "name": "spare", 00:24:29.597 "uuid": "aaaad4b6-2d58-565a-b464-c57674c2e356", 00:24:29.597 "is_configured": true, 00:24:29.597 "data_offset": 2048, 00:24:29.597 "data_size": 63488 00:24:29.597 }, 00:24:29.597 { 00:24:29.597 "name": "BaseBdev2", 00:24:29.597 "uuid": "b29c6917-1d9e-5b8d-8e6f-c68cbf2379ed", 00:24:29.597 "is_configured": true, 00:24:29.597 "data_offset": 2048, 00:24:29.597 "data_size": 63488 00:24:29.597 } 00:24:29.597 ] 00:24:29.597 }' 00:24:29.597 20:38:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:29.597 20:38:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:24:29.597 20:38:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:29.597 20:38:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:24:29.597 
20:38:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@708 -- # break 00:24:29.597 20:38:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:29.597 20:38:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:29.597 20:38:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:29.597 20:38:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:29.597 20:38:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:29.597 20:38:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:29.597 20:38:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:29.855 20:38:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:29.855 "name": "raid_bdev1", 00:24:29.855 "uuid": "25dbe2d2-d6e0-4e82-a8fe-3a60dd0a5326", 00:24:29.855 "strip_size_kb": 0, 00:24:29.855 "state": "online", 00:24:29.855 "raid_level": "raid1", 00:24:29.855 "superblock": true, 00:24:29.855 "num_base_bdevs": 2, 00:24:29.855 "num_base_bdevs_discovered": 2, 00:24:29.855 "num_base_bdevs_operational": 2, 00:24:29.855 "base_bdevs_list": [ 00:24:29.855 { 00:24:29.855 "name": "spare", 00:24:29.855 "uuid": "aaaad4b6-2d58-565a-b464-c57674c2e356", 00:24:29.855 "is_configured": true, 00:24:29.855 "data_offset": 2048, 00:24:29.855 "data_size": 63488 00:24:29.855 }, 00:24:29.855 { 00:24:29.855 "name": "BaseBdev2", 00:24:29.855 "uuid": "b29c6917-1d9e-5b8d-8e6f-c68cbf2379ed", 00:24:29.855 "is_configured": true, 00:24:29.855 "data_offset": 2048, 00:24:29.855 "data_size": 63488 00:24:29.855 } 00:24:29.855 ] 00:24:29.855 }' 00:24:29.855 20:38:22 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:29.855 20:38:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:29.855 20:38:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:29.855 20:38:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:29.855 20:38:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:29.855 20:38:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:29.855 20:38:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:29.855 20:38:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:29.855 20:38:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:29.855 20:38:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:29.855 20:38:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:29.855 20:38:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:29.855 20:38:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:29.855 20:38:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:29.855 20:38:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:29.855 20:38:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:30.113 20:38:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:30.113 "name": "raid_bdev1", 00:24:30.113 "uuid": 
"25dbe2d2-d6e0-4e82-a8fe-3a60dd0a5326", 00:24:30.113 "strip_size_kb": 0, 00:24:30.113 "state": "online", 00:24:30.113 "raid_level": "raid1", 00:24:30.113 "superblock": true, 00:24:30.113 "num_base_bdevs": 2, 00:24:30.113 "num_base_bdevs_discovered": 2, 00:24:30.113 "num_base_bdevs_operational": 2, 00:24:30.113 "base_bdevs_list": [ 00:24:30.113 { 00:24:30.113 "name": "spare", 00:24:30.113 "uuid": "aaaad4b6-2d58-565a-b464-c57674c2e356", 00:24:30.113 "is_configured": true, 00:24:30.113 "data_offset": 2048, 00:24:30.113 "data_size": 63488 00:24:30.113 }, 00:24:30.113 { 00:24:30.113 "name": "BaseBdev2", 00:24:30.113 "uuid": "b29c6917-1d9e-5b8d-8e6f-c68cbf2379ed", 00:24:30.113 "is_configured": true, 00:24:30.113 "data_offset": 2048, 00:24:30.113 "data_size": 63488 00:24:30.113 } 00:24:30.113 ] 00:24:30.113 }' 00:24:30.113 20:38:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:30.113 20:38:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:31.055 20:38:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:24:31.055 [2024-07-15 20:38:23.308058] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:31.055 [2024-07-15 20:38:23.308088] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:24:31.055 [2024-07-15 20:38:23.308151] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:31.055 [2024-07-15 20:38:23.308207] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:31.055 [2024-07-15 20:38:23.308219] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1d6d070 name raid_bdev1, state offline 00:24:31.055 20:38:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # jq length 00:24:31.055 20:38:23 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:31.313 20:38:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:24:31.313 20:38:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:24:31.313 20:38:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:24:31.313 20:38:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:24:31.313 20:38:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:31.313 20:38:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:24:31.313 20:38:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:24:31.313 20:38:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:24:31.313 20:38:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:24:31.313 20:38:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:24:31.313 20:38:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:24:31.313 20:38:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:24:31.313 20:38:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:24:31.880 /dev/nbd0 00:24:31.880 20:38:24 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:24:31.880 20:38:24 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:24:31.880 20:38:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # 
local nbd_name=nbd0 00:24:31.880 20:38:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:24:31.880 20:38:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:24:31.880 20:38:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:24:31.880 20:38:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:24:31.880 20:38:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:24:31.880 20:38:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:24:31.880 20:38:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:24:31.880 20:38:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:31.880 1+0 records in 00:24:31.880 1+0 records out 00:24:31.880 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000265414 s, 15.4 MB/s 00:24:31.880 20:38:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:31.880 20:38:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:24:31.880 20:38:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:31.880 20:38:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:24:31.880 20:38:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:24:31.880 20:38:24 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:31.880 20:38:24 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:24:31.880 20:38:24 bdev_raid.raid_rebuild_test_sb -- 
bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:24:32.138 /dev/nbd1 00:24:32.138 20:38:24 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:24:32.138 20:38:24 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:24:32.138 20:38:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:24:32.138 20:38:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:24:32.138 20:38:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:24:32.138 20:38:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:24:32.138 20:38:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:24:32.138 20:38:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:24:32.138 20:38:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:24:32.138 20:38:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:24:32.138 20:38:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:32.138 1+0 records in 00:24:32.138 1+0 records out 00:24:32.138 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000372683 s, 11.0 MB/s 00:24:32.138 20:38:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:32.138 20:38:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:24:32.138 20:38:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 
00:24:32.138 20:38:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:24:32.138 20:38:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:24:32.138 20:38:24 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:32.138 20:38:24 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:24:32.138 20:38:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:24:32.397 20:38:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:24:32.397 20:38:24 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:32.397 20:38:24 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:24:32.397 20:38:24 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:24:32.397 20:38:24 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:24:32.397 20:38:24 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:32.397 20:38:24 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:24:32.655 20:38:24 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:24:32.655 20:38:24 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:24:32.655 20:38:24 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:24:32.655 20:38:24 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:32.655 20:38:24 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:32.655 20:38:24 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 
/proc/partitions 00:24:32.655 20:38:24 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:24:32.655 20:38:24 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:24:32.655 20:38:24 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:32.655 20:38:24 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:24:32.655 20:38:25 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:24:32.913 20:38:25 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:24:32.913 20:38:25 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:24:32.913 20:38:25 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:32.913 20:38:25 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:32.913 20:38:25 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:24:32.913 20:38:25 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:24:32.913 20:38:25 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:24:32.913 20:38:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:24:32.913 20:38:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:24:32.913 20:38:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:33.172 [2024-07-15 20:38:25.448125] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:33.172 [2024-07-15 20:38:25.448178] 
vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:33.172 [2024-07-15 20:38:25.448201] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d6c500 00:24:33.172 [2024-07-15 20:38:25.448214] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:33.172 [2024-07-15 20:38:25.449858] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:33.172 [2024-07-15 20:38:25.449889] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:33.172 [2024-07-15 20:38:25.449987] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:24:33.172 [2024-07-15 20:38:25.450015] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:33.172 [2024-07-15 20:38:25.450118] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:24:33.172 spare 00:24:33.172 20:38:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:33.172 20:38:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:33.172 20:38:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:33.172 20:38:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:33.172 20:38:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:33.172 20:38:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:33.172 20:38:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:33.172 20:38:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:33.172 20:38:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:33.172 20:38:25 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:33.172 20:38:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:33.172 20:38:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:33.172 [2024-07-15 20:38:25.550433] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1d6b260 00:24:33.172 [2024-07-15 20:38:25.550453] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:24:33.172 [2024-07-15 20:38:25.550658] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d66490 00:24:33.172 [2024-07-15 20:38:25.550813] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1d6b260 00:24:33.172 [2024-07-15 20:38:25.550823] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1d6b260 00:24:33.430 [2024-07-15 20:38:25.550945] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:33.430 20:38:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:33.430 "name": "raid_bdev1", 00:24:33.430 "uuid": "25dbe2d2-d6e0-4e82-a8fe-3a60dd0a5326", 00:24:33.430 "strip_size_kb": 0, 00:24:33.430 "state": "online", 00:24:33.430 "raid_level": "raid1", 00:24:33.430 "superblock": true, 00:24:33.430 "num_base_bdevs": 2, 00:24:33.430 "num_base_bdevs_discovered": 2, 00:24:33.430 "num_base_bdevs_operational": 2, 00:24:33.430 "base_bdevs_list": [ 00:24:33.430 { 00:24:33.430 "name": "spare", 00:24:33.430 "uuid": "aaaad4b6-2d58-565a-b464-c57674c2e356", 00:24:33.430 "is_configured": true, 00:24:33.430 "data_offset": 2048, 00:24:33.430 "data_size": 63488 00:24:33.430 }, 00:24:33.430 { 00:24:33.430 "name": "BaseBdev2", 00:24:33.430 "uuid": "b29c6917-1d9e-5b8d-8e6f-c68cbf2379ed", 00:24:33.430 
"is_configured": true, 00:24:33.430 "data_offset": 2048, 00:24:33.430 "data_size": 63488 00:24:33.430 } 00:24:33.430 ] 00:24:33.430 }' 00:24:33.430 20:38:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:33.430 20:38:25 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:33.997 20:38:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:33.997 20:38:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:33.997 20:38:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:33.997 20:38:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:33.997 20:38:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:33.997 20:38:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:33.997 20:38:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:34.256 20:38:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:34.256 "name": "raid_bdev1", 00:24:34.256 "uuid": "25dbe2d2-d6e0-4e82-a8fe-3a60dd0a5326", 00:24:34.256 "strip_size_kb": 0, 00:24:34.256 "state": "online", 00:24:34.256 "raid_level": "raid1", 00:24:34.256 "superblock": true, 00:24:34.256 "num_base_bdevs": 2, 00:24:34.256 "num_base_bdevs_discovered": 2, 00:24:34.256 "num_base_bdevs_operational": 2, 00:24:34.256 "base_bdevs_list": [ 00:24:34.256 { 00:24:34.256 "name": "spare", 00:24:34.256 "uuid": "aaaad4b6-2d58-565a-b464-c57674c2e356", 00:24:34.256 "is_configured": true, 00:24:34.256 "data_offset": 2048, 00:24:34.256 "data_size": 63488 00:24:34.256 }, 00:24:34.256 { 00:24:34.256 "name": "BaseBdev2", 00:24:34.256 "uuid": 
"b29c6917-1d9e-5b8d-8e6f-c68cbf2379ed", 00:24:34.256 "is_configured": true, 00:24:34.256 "data_offset": 2048, 00:24:34.256 "data_size": 63488 00:24:34.256 } 00:24:34.256 ] 00:24:34.256 }' 00:24:34.256 20:38:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:34.256 20:38:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:34.256 20:38:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:34.256 20:38:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:34.256 20:38:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:34.256 20:38:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:24:34.514 20:38:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:24:34.514 20:38:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:24:34.772 [2024-07-15 20:38:26.972284] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:34.772 20:38:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:34.772 20:38:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:34.772 20:38:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:34.772 20:38:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:34.772 20:38:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:34.772 20:38:26 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:34.772 20:38:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:34.772 20:38:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:34.772 20:38:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:34.772 20:38:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:34.772 20:38:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:34.772 20:38:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:35.031 20:38:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:35.031 "name": "raid_bdev1", 00:24:35.031 "uuid": "25dbe2d2-d6e0-4e82-a8fe-3a60dd0a5326", 00:24:35.031 "strip_size_kb": 0, 00:24:35.031 "state": "online", 00:24:35.031 "raid_level": "raid1", 00:24:35.031 "superblock": true, 00:24:35.031 "num_base_bdevs": 2, 00:24:35.031 "num_base_bdevs_discovered": 1, 00:24:35.031 "num_base_bdevs_operational": 1, 00:24:35.031 "base_bdevs_list": [ 00:24:35.031 { 00:24:35.031 "name": null, 00:24:35.031 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:35.031 "is_configured": false, 00:24:35.031 "data_offset": 2048, 00:24:35.031 "data_size": 63488 00:24:35.031 }, 00:24:35.031 { 00:24:35.031 "name": "BaseBdev2", 00:24:35.031 "uuid": "b29c6917-1d9e-5b8d-8e6f-c68cbf2379ed", 00:24:35.031 "is_configured": true, 00:24:35.031 "data_offset": 2048, 00:24:35.031 "data_size": 63488 00:24:35.031 } 00:24:35.031 ] 00:24:35.031 }' 00:24:35.031 20:38:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:35.031 20:38:27 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:35.597 20:38:27 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:35.854 [2024-07-15 20:38:28.051153] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:35.854 [2024-07-15 20:38:28.051309] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:24:35.854 [2024-07-15 20:38:28.051328] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:24:35.854 [2024-07-15 20:38:28.051356] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:35.854 [2024-07-15 20:38:28.056150] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d66490 00:24:35.854 [2024-07-15 20:38:28.058475] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:35.854 20:38:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@755 -- # sleep 1 00:24:36.790 20:38:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:36.790 20:38:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:36.790 20:38:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:36.790 20:38:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:36.790 20:38:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:36.790 20:38:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:36.790 20:38:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 
00:24:37.049 20:38:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:37.049 "name": "raid_bdev1", 00:24:37.049 "uuid": "25dbe2d2-d6e0-4e82-a8fe-3a60dd0a5326", 00:24:37.049 "strip_size_kb": 0, 00:24:37.049 "state": "online", 00:24:37.049 "raid_level": "raid1", 00:24:37.049 "superblock": true, 00:24:37.049 "num_base_bdevs": 2, 00:24:37.049 "num_base_bdevs_discovered": 2, 00:24:37.049 "num_base_bdevs_operational": 2, 00:24:37.049 "process": { 00:24:37.049 "type": "rebuild", 00:24:37.049 "target": "spare", 00:24:37.049 "progress": { 00:24:37.049 "blocks": 24576, 00:24:37.049 "percent": 38 00:24:37.049 } 00:24:37.049 }, 00:24:37.049 "base_bdevs_list": [ 00:24:37.049 { 00:24:37.049 "name": "spare", 00:24:37.049 "uuid": "aaaad4b6-2d58-565a-b464-c57674c2e356", 00:24:37.049 "is_configured": true, 00:24:37.049 "data_offset": 2048, 00:24:37.049 "data_size": 63488 00:24:37.049 }, 00:24:37.049 { 00:24:37.049 "name": "BaseBdev2", 00:24:37.049 "uuid": "b29c6917-1d9e-5b8d-8e6f-c68cbf2379ed", 00:24:37.049 "is_configured": true, 00:24:37.049 "data_offset": 2048, 00:24:37.049 "data_size": 63488 00:24:37.049 } 00:24:37.049 ] 00:24:37.049 }' 00:24:37.049 20:38:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:37.049 20:38:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:37.049 20:38:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:37.308 20:38:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:37.308 20:38:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:24:37.308 [2024-07-15 20:38:29.660750] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:37.308 [2024-07-15 20:38:29.671011] 
bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:37.309 [2024-07-15 20:38:29.671054] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:37.309 [2024-07-15 20:38:29.671069] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:37.309 [2024-07-15 20:38:29.671078] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:37.567 20:38:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:37.567 20:38:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:37.567 20:38:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:37.567 20:38:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:37.567 20:38:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:37.567 20:38:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:37.567 20:38:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:37.567 20:38:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:37.567 20:38:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:37.567 20:38:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:37.567 20:38:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:37.567 20:38:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:37.826 20:38:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # 
raid_bdev_info='{ 00:24:37.826 "name": "raid_bdev1", 00:24:37.826 "uuid": "25dbe2d2-d6e0-4e82-a8fe-3a60dd0a5326", 00:24:37.826 "strip_size_kb": 0, 00:24:37.826 "state": "online", 00:24:37.826 "raid_level": "raid1", 00:24:37.826 "superblock": true, 00:24:37.826 "num_base_bdevs": 2, 00:24:37.826 "num_base_bdevs_discovered": 1, 00:24:37.826 "num_base_bdevs_operational": 1, 00:24:37.826 "base_bdevs_list": [ 00:24:37.826 { 00:24:37.826 "name": null, 00:24:37.826 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:37.826 "is_configured": false, 00:24:37.826 "data_offset": 2048, 00:24:37.826 "data_size": 63488 00:24:37.826 }, 00:24:37.826 { 00:24:37.826 "name": "BaseBdev2", 00:24:37.826 "uuid": "b29c6917-1d9e-5b8d-8e6f-c68cbf2379ed", 00:24:37.826 "is_configured": true, 00:24:37.826 "data_offset": 2048, 00:24:37.826 "data_size": 63488 00:24:37.826 } 00:24:37.826 ] 00:24:37.826 }' 00:24:37.826 20:38:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:37.826 20:38:29 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:38.394 20:38:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:38.394 [2024-07-15 20:38:30.762159] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:38.394 [2024-07-15 20:38:30.762215] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:38.394 [2024-07-15 20:38:30.762241] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d6c730 00:24:38.394 [2024-07-15 20:38:30.762254] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:38.394 [2024-07-15 20:38:30.762639] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:38.394 [2024-07-15 20:38:30.762657] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: 
created pt_bdev for: spare 00:24:38.394 [2024-07-15 20:38:30.762739] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:24:38.394 [2024-07-15 20:38:30.762752] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:24:38.394 [2024-07-15 20:38:30.762763] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:24:38.394 [2024-07-15 20:38:30.762783] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:38.394 [2024-07-15 20:38:30.767682] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d6daa0 00:24:38.394 [2024-07-15 20:38:30.769146] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:38.394 spare 00:24:38.652 20:38:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@762 -- # sleep 1 00:24:39.593 20:38:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:39.593 20:38:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:39.593 20:38:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:39.593 20:38:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:39.593 20:38:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:39.593 20:38:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:39.593 20:38:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:39.852 20:38:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:39.852 "name": "raid_bdev1", 
00:24:39.852 "uuid": "25dbe2d2-d6e0-4e82-a8fe-3a60dd0a5326", 00:24:39.852 "strip_size_kb": 0, 00:24:39.852 "state": "online", 00:24:39.852 "raid_level": "raid1", 00:24:39.852 "superblock": true, 00:24:39.852 "num_base_bdevs": 2, 00:24:39.852 "num_base_bdevs_discovered": 2, 00:24:39.852 "num_base_bdevs_operational": 2, 00:24:39.852 "process": { 00:24:39.852 "type": "rebuild", 00:24:39.852 "target": "spare", 00:24:39.852 "progress": { 00:24:39.852 "blocks": 24576, 00:24:39.852 "percent": 38 00:24:39.852 } 00:24:39.852 }, 00:24:39.852 "base_bdevs_list": [ 00:24:39.852 { 00:24:39.852 "name": "spare", 00:24:39.852 "uuid": "aaaad4b6-2d58-565a-b464-c57674c2e356", 00:24:39.852 "is_configured": true, 00:24:39.852 "data_offset": 2048, 00:24:39.852 "data_size": 63488 00:24:39.852 }, 00:24:39.852 { 00:24:39.852 "name": "BaseBdev2", 00:24:39.852 "uuid": "b29c6917-1d9e-5b8d-8e6f-c68cbf2379ed", 00:24:39.852 "is_configured": true, 00:24:39.852 "data_offset": 2048, 00:24:39.852 "data_size": 63488 00:24:39.852 } 00:24:39.852 ] 00:24:39.852 }' 00:24:39.852 20:38:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:39.852 20:38:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:39.852 20:38:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:39.852 20:38:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:39.852 20:38:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:24:40.111 [2024-07-15 20:38:32.376475] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:40.111 [2024-07-15 20:38:32.381775] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:40.111 [2024-07-15 20:38:32.381818] 
bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:40.111 [2024-07-15 20:38:32.381833] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:40.111 [2024-07-15 20:38:32.381841] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:40.111 20:38:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:40.111 20:38:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:40.111 20:38:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:40.111 20:38:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:40.111 20:38:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:40.111 20:38:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:40.111 20:38:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:40.111 20:38:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:40.111 20:38:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:40.111 20:38:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:40.111 20:38:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:40.111 20:38:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:40.370 20:38:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:40.370 "name": "raid_bdev1", 00:24:40.370 "uuid": "25dbe2d2-d6e0-4e82-a8fe-3a60dd0a5326", 00:24:40.370 "strip_size_kb": 0, 00:24:40.370 
"state": "online", 00:24:40.370 "raid_level": "raid1", 00:24:40.370 "superblock": true, 00:24:40.370 "num_base_bdevs": 2, 00:24:40.370 "num_base_bdevs_discovered": 1, 00:24:40.370 "num_base_bdevs_operational": 1, 00:24:40.370 "base_bdevs_list": [ 00:24:40.370 { 00:24:40.370 "name": null, 00:24:40.370 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:40.370 "is_configured": false, 00:24:40.370 "data_offset": 2048, 00:24:40.370 "data_size": 63488 00:24:40.370 }, 00:24:40.370 { 00:24:40.370 "name": "BaseBdev2", 00:24:40.370 "uuid": "b29c6917-1d9e-5b8d-8e6f-c68cbf2379ed", 00:24:40.370 "is_configured": true, 00:24:40.370 "data_offset": 2048, 00:24:40.370 "data_size": 63488 00:24:40.370 } 00:24:40.370 ] 00:24:40.370 }' 00:24:40.370 20:38:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:40.370 20:38:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:40.937 20:38:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:40.937 20:38:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:40.937 20:38:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:40.937 20:38:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:40.937 20:38:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:40.937 20:38:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:40.937 20:38:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:41.195 20:38:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:41.195 "name": "raid_bdev1", 00:24:41.195 "uuid": "25dbe2d2-d6e0-4e82-a8fe-3a60dd0a5326", 
00:24:41.195 "strip_size_kb": 0, 00:24:41.196 "state": "online", 00:24:41.196 "raid_level": "raid1", 00:24:41.196 "superblock": true, 00:24:41.196 "num_base_bdevs": 2, 00:24:41.196 "num_base_bdevs_discovered": 1, 00:24:41.196 "num_base_bdevs_operational": 1, 00:24:41.196 "base_bdevs_list": [ 00:24:41.196 { 00:24:41.196 "name": null, 00:24:41.196 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:41.196 "is_configured": false, 00:24:41.196 "data_offset": 2048, 00:24:41.196 "data_size": 63488 00:24:41.196 }, 00:24:41.196 { 00:24:41.196 "name": "BaseBdev2", 00:24:41.196 "uuid": "b29c6917-1d9e-5b8d-8e6f-c68cbf2379ed", 00:24:41.196 "is_configured": true, 00:24:41.196 "data_offset": 2048, 00:24:41.196 "data_size": 63488 00:24:41.196 } 00:24:41.196 ] 00:24:41.196 }' 00:24:41.196 20:38:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:41.196 20:38:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:41.196 20:38:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:41.196 20:38:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:41.196 20:38:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:24:41.568 20:38:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:24:41.862 [2024-07-15 20:38:33.994294] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:24:41.862 [2024-07-15 20:38:33.994347] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:41.862 [2024-07-15 20:38:33.994372] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device 
created at: 0x0x1d67650 00:24:41.862 [2024-07-15 20:38:33.994384] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:41.862 [2024-07-15 20:38:33.994751] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:41.862 [2024-07-15 20:38:33.994770] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:24:41.862 [2024-07-15 20:38:33.994837] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:24:41.862 [2024-07-15 20:38:33.994851] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:24:41.862 [2024-07-15 20:38:33.994862] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:24:41.862 BaseBdev1 00:24:41.862 20:38:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@773 -- # sleep 1 00:24:42.816 20:38:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:42.816 20:38:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:42.816 20:38:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:42.816 20:38:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:42.816 20:38:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:42.816 20:38:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:42.816 20:38:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:42.816 20:38:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:42.816 20:38:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:42.816 20:38:35 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:42.816 20:38:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:42.816 20:38:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:43.075 20:38:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:43.075 "name": "raid_bdev1", 00:24:43.075 "uuid": "25dbe2d2-d6e0-4e82-a8fe-3a60dd0a5326", 00:24:43.075 "strip_size_kb": 0, 00:24:43.075 "state": "online", 00:24:43.075 "raid_level": "raid1", 00:24:43.075 "superblock": true, 00:24:43.075 "num_base_bdevs": 2, 00:24:43.075 "num_base_bdevs_discovered": 1, 00:24:43.075 "num_base_bdevs_operational": 1, 00:24:43.075 "base_bdevs_list": [ 00:24:43.075 { 00:24:43.075 "name": null, 00:24:43.075 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:43.075 "is_configured": false, 00:24:43.075 "data_offset": 2048, 00:24:43.075 "data_size": 63488 00:24:43.075 }, 00:24:43.075 { 00:24:43.075 "name": "BaseBdev2", 00:24:43.075 "uuid": "b29c6917-1d9e-5b8d-8e6f-c68cbf2379ed", 00:24:43.075 "is_configured": true, 00:24:43.075 "data_offset": 2048, 00:24:43.075 "data_size": 63488 00:24:43.075 } 00:24:43.075 ] 00:24:43.075 }' 00:24:43.075 20:38:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:43.075 20:38:35 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:43.643 20:38:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:43.643 20:38:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:43.643 20:38:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:43.643 20:38:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local 
target=none 00:24:43.643 20:38:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:43.643 20:38:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:43.643 20:38:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:43.901 20:38:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:43.901 "name": "raid_bdev1", 00:24:43.902 "uuid": "25dbe2d2-d6e0-4e82-a8fe-3a60dd0a5326", 00:24:43.902 "strip_size_kb": 0, 00:24:43.902 "state": "online", 00:24:43.902 "raid_level": "raid1", 00:24:43.902 "superblock": true, 00:24:43.902 "num_base_bdevs": 2, 00:24:43.902 "num_base_bdevs_discovered": 1, 00:24:43.902 "num_base_bdevs_operational": 1, 00:24:43.902 "base_bdevs_list": [ 00:24:43.902 { 00:24:43.902 "name": null, 00:24:43.902 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:43.902 "is_configured": false, 00:24:43.902 "data_offset": 2048, 00:24:43.902 "data_size": 63488 00:24:43.902 }, 00:24:43.902 { 00:24:43.902 "name": "BaseBdev2", 00:24:43.902 "uuid": "b29c6917-1d9e-5b8d-8e6f-c68cbf2379ed", 00:24:43.902 "is_configured": true, 00:24:43.902 "data_offset": 2048, 00:24:43.902 "data_size": 63488 00:24:43.902 } 00:24:43.902 ] 00:24:43.902 }' 00:24:43.902 20:38:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:43.902 20:38:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:43.902 20:38:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:43.902 20:38:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:43.902 20:38:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@776 -- # NOT 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:24:43.902 20:38:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@648 -- # local es=0 00:24:43.902 20:38:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:24:43.902 20:38:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:43.902 20:38:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:24:43.902 20:38:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:43.902 20:38:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:24:43.902 20:38:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:43.902 20:38:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:24:43.902 20:38:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:43.902 20:38:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:24:43.902 20:38:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:24:44.160 [2024-07-15 20:38:36.424773] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 
00:24:44.160 [2024-07-15 20:38:36.424903] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:24:44.160 [2024-07-15 20:38:36.424919] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:24:44.160 request: 00:24:44.160 { 00:24:44.160 "base_bdev": "BaseBdev1", 00:24:44.161 "raid_bdev": "raid_bdev1", 00:24:44.161 "method": "bdev_raid_add_base_bdev", 00:24:44.161 "req_id": 1 00:24:44.161 } 00:24:44.161 Got JSON-RPC error response 00:24:44.161 response: 00:24:44.161 { 00:24:44.161 "code": -22, 00:24:44.161 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:24:44.161 } 00:24:44.161 20:38:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # es=1 00:24:44.161 20:38:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:24:44.161 20:38:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:24:44.161 20:38:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:24:44.161 20:38:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@777 -- # sleep 1 00:24:45.097 20:38:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:45.097 20:38:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:45.097 20:38:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:45.097 20:38:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:45.097 20:38:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:45.097 20:38:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:45.097 20:38:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 
-- # local raid_bdev_info 00:24:45.097 20:38:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:45.097 20:38:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:45.097 20:38:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:45.097 20:38:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:45.097 20:38:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:45.356 20:38:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:45.356 "name": "raid_bdev1", 00:24:45.356 "uuid": "25dbe2d2-d6e0-4e82-a8fe-3a60dd0a5326", 00:24:45.356 "strip_size_kb": 0, 00:24:45.356 "state": "online", 00:24:45.356 "raid_level": "raid1", 00:24:45.356 "superblock": true, 00:24:45.356 "num_base_bdevs": 2, 00:24:45.356 "num_base_bdevs_discovered": 1, 00:24:45.356 "num_base_bdevs_operational": 1, 00:24:45.356 "base_bdevs_list": [ 00:24:45.356 { 00:24:45.356 "name": null, 00:24:45.356 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:45.356 "is_configured": false, 00:24:45.356 "data_offset": 2048, 00:24:45.356 "data_size": 63488 00:24:45.356 }, 00:24:45.356 { 00:24:45.356 "name": "BaseBdev2", 00:24:45.356 "uuid": "b29c6917-1d9e-5b8d-8e6f-c68cbf2379ed", 00:24:45.356 "is_configured": true, 00:24:45.356 "data_offset": 2048, 00:24:45.356 "data_size": 63488 00:24:45.356 } 00:24:45.356 ] 00:24:45.356 }' 00:24:45.356 20:38:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:45.356 20:38:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:46.292 20:38:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:46.292 20:38:38 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:46.292 20:38:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:46.292 20:38:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:46.292 20:38:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:46.292 20:38:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:46.292 20:38:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:46.292 20:38:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:46.292 "name": "raid_bdev1", 00:24:46.292 "uuid": "25dbe2d2-d6e0-4e82-a8fe-3a60dd0a5326", 00:24:46.292 "strip_size_kb": 0, 00:24:46.292 "state": "online", 00:24:46.292 "raid_level": "raid1", 00:24:46.292 "superblock": true, 00:24:46.292 "num_base_bdevs": 2, 00:24:46.292 "num_base_bdevs_discovered": 1, 00:24:46.292 "num_base_bdevs_operational": 1, 00:24:46.292 "base_bdevs_list": [ 00:24:46.292 { 00:24:46.292 "name": null, 00:24:46.292 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:46.292 "is_configured": false, 00:24:46.292 "data_offset": 2048, 00:24:46.292 "data_size": 63488 00:24:46.292 }, 00:24:46.292 { 00:24:46.292 "name": "BaseBdev2", 00:24:46.292 "uuid": "b29c6917-1d9e-5b8d-8e6f-c68cbf2379ed", 00:24:46.292 "is_configured": true, 00:24:46.292 "data_offset": 2048, 00:24:46.292 "data_size": 63488 00:24:46.292 } 00:24:46.292 ] 00:24:46.292 }' 00:24:46.292 20:38:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:46.292 20:38:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:46.292 20:38:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 
-- # jq -r '.process.target // "none"' 00:24:46.292 20:38:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:46.292 20:38:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@782 -- # killprocess 1463748 00:24:46.292 20:38:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@948 -- # '[' -z 1463748 ']' 00:24:46.292 20:38:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@952 -- # kill -0 1463748 00:24:46.292 20:38:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # uname 00:24:46.292 20:38:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:24:46.292 20:38:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1463748 00:24:46.551 20:38:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:24:46.551 20:38:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:24:46.551 20:38:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1463748' 00:24:46.551 killing process with pid 1463748 00:24:46.551 20:38:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@967 -- # kill 1463748 00:24:46.551 Received shutdown signal, test time was about 60.000000 seconds 00:24:46.551 00:24:46.551 Latency(us) 00:24:46.551 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:46.551 =================================================================================================================== 00:24:46.551 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:24:46.551 [2024-07-15 20:38:38.698711] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:24:46.551 [2024-07-15 20:38:38.698817] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:46.551 [2024-07-15 20:38:38.698857] bdev_raid.c: 
451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:46.551 20:38:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@972 -- # wait 1463748 00:24:46.551 [2024-07-15 20:38:38.698872] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1d6b260 name raid_bdev1, state offline 00:24:46.551 [2024-07-15 20:38:38.724823] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:24:46.810 20:38:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@784 -- # return 0 00:24:46.810 00:24:46.810 real 0m37.103s 00:24:46.810 user 0m53.255s 00:24:46.810 sys 0m7.270s 00:24:46.810 20:38:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:24:46.810 20:38:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:46.810 ************************************ 00:24:46.810 END TEST raid_rebuild_test_sb 00:24:46.810 ************************************ 00:24:46.811 20:38:38 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:24:46.811 20:38:38 bdev_raid -- bdev/bdev_raid.sh@879 -- # run_test raid_rebuild_test_io raid_rebuild_test raid1 2 false true true 00:24:46.811 20:38:38 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:24:46.811 20:38:38 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:24:46.811 20:38:38 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:24:46.811 ************************************ 00:24:46.811 START TEST raid_rebuild_test_io 00:24:46.811 ************************************ 00:24:46.811 20:38:39 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 false true true 00:24:46.811 20:38:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:24:46.811 20:38:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:24:46.811 20:38:39 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@570 -- # local superblock=false 00:24:46.811 20:38:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:24:46.811 20:38:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:24:46.811 20:38:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:24:46.811 20:38:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:46.811 20:38:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:24:46.811 20:38:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:46.811 20:38:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:46.811 20:38:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:24:46.811 20:38:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:46.811 20:38:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:46.811 20:38:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:24:46.811 20:38:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:24:46.811 20:38:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:24:46.811 20:38:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:24:46.811 20:38:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:24:46.811 20:38:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:24:46.811 20:38:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:24:46.811 20:38:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:24:46.811 20:38:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@588 -- # 
strip_size=0 00:24:46.811 20:38:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:24:46.811 20:38:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@596 -- # raid_pid=1468930 00:24:46.811 20:38:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 1468930 /var/tmp/spdk-raid.sock 00:24:46.811 20:38:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:24:46.811 20:38:39 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@829 -- # '[' -z 1468930 ']' 00:24:46.811 20:38:39 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:24:46.811 20:38:39 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@834 -- # local max_retries=100 00:24:46.811 20:38:39 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:24:46.811 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:24:46.811 20:38:39 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@838 -- # xtrace_disable 00:24:46.811 20:38:39 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:24:46.811 [2024-07-15 20:38:39.086917] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 00:24:46.811 [2024-07-15 20:38:39.086985] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1468930 ] 00:24:46.811 I/O size of 3145728 is greater than zero copy threshold (65536). 00:24:46.811 Zero copy mechanism will not be used. 
00:24:47.070 [2024-07-15 20:38:39.216017] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:47.070 [2024-07-15 20:38:39.313539] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:24:47.070 [2024-07-15 20:38:39.377745] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:47.070 [2024-07-15 20:38:39.377785] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:47.637 20:38:40 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:24:47.895 20:38:40 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@862 -- # return 0 00:24:47.895 20:38:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:47.895 20:38:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:24:47.895 BaseBdev1_malloc 00:24:48.154 20:38:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:24:48.154 [2024-07-15 20:38:40.506186] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:24:48.154 [2024-07-15 20:38:40.506233] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:48.154 [2024-07-15 20:38:40.506258] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe7ad40 00:24:48.154 [2024-07-15 20:38:40.506270] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:48.154 [2024-07-15 20:38:40.508007] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:48.154 [2024-07-15 20:38:40.508036] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:24:48.154 BaseBdev1 
00:24:48.154 20:38:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:48.154 20:38:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:24:48.412 BaseBdev2_malloc 00:24:48.412 20:38:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:24:48.671 [2024-07-15 20:38:41.005290] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:24:48.671 [2024-07-15 20:38:41.005338] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:48.671 [2024-07-15 20:38:41.005363] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe7b860 00:24:48.671 [2024-07-15 20:38:41.005375] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:48.671 [2024-07-15 20:38:41.006973] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:48.671 [2024-07-15 20:38:41.007000] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:24:48.671 BaseBdev2 00:24:48.671 20:38:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:24:48.929 spare_malloc 00:24:48.929 20:38:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:24:49.188 spare_delay 00:24:49.188 20:38:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@608 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:49.446 [2024-07-15 20:38:41.741060] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:49.446 [2024-07-15 20:38:41.741108] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:49.446 [2024-07-15 20:38:41.741130] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1029ec0 00:24:49.446 [2024-07-15 20:38:41.741142] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:49.446 [2024-07-15 20:38:41.742757] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:49.446 [2024-07-15 20:38:41.742786] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:49.446 spare 00:24:49.446 20:38:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:24:49.704 [2024-07-15 20:38:41.981712] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:49.704 [2024-07-15 20:38:41.983038] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:24:49.704 [2024-07-15 20:38:41.983119] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x102b070 00:24:49.704 [2024-07-15 20:38:41.983130] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:24:49.704 [2024-07-15 20:38:41.983336] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1024490 00:24:49.704 [2024-07-15 20:38:41.983478] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x102b070 00:24:49.704 [2024-07-15 20:38:41.983488] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, 
raid_bdev 0x102b070 00:24:49.704 [2024-07-15 20:38:41.983607] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:49.704 20:38:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:49.704 20:38:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:49.704 20:38:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:49.704 20:38:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:49.704 20:38:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:49.704 20:38:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:49.704 20:38:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:49.704 20:38:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:49.704 20:38:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:49.704 20:38:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:49.704 20:38:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:49.704 20:38:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:49.963 20:38:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:49.963 "name": "raid_bdev1", 00:24:49.963 "uuid": "aec629e9-b9f2-4fc2-a1a4-f135b2089406", 00:24:49.963 "strip_size_kb": 0, 00:24:49.963 "state": "online", 00:24:49.963 "raid_level": "raid1", 00:24:49.963 "superblock": false, 00:24:49.963 "num_base_bdevs": 2, 00:24:49.963 "num_base_bdevs_discovered": 2, 00:24:49.963 
"num_base_bdevs_operational": 2, 00:24:49.963 "base_bdevs_list": [ 00:24:49.963 { 00:24:49.963 "name": "BaseBdev1", 00:24:49.963 "uuid": "c8ae0a9b-2a4f-5967-9b5d-4d5ca4677a34", 00:24:49.963 "is_configured": true, 00:24:49.963 "data_offset": 0, 00:24:49.963 "data_size": 65536 00:24:49.963 }, 00:24:49.963 { 00:24:49.963 "name": "BaseBdev2", 00:24:49.963 "uuid": "e674a8ca-7942-54bf-a610-ff4938ffc350", 00:24:49.963 "is_configured": true, 00:24:49.963 "data_offset": 0, 00:24:49.963 "data_size": 65536 00:24:49.963 } 00:24:49.963 ] 00:24:49.963 }' 00:24:49.963 20:38:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:49.963 20:38:42 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:24:50.529 20:38:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:24:50.529 20:38:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:50.788 [2024-07-15 20:38:43.064824] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:50.788 20:38:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:24:50.788 20:38:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:50.788 20:38:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:24:51.047 20:38:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:24:51.047 20:38:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:24:51.047 20:38:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev 
BaseBdev1 00:24:51.047 20:38:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:24:51.304 [2024-07-15 20:38:43.447986] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1025bd0 00:24:51.304 I/O size of 3145728 is greater than zero copy threshold (65536). 00:24:51.304 Zero copy mechanism will not be used. 00:24:51.304 Running I/O for 60 seconds... 00:24:51.304 [2024-07-15 20:38:43.565376] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:24:51.304 [2024-07-15 20:38:43.573534] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1025bd0 00:24:51.304 20:38:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:51.304 20:38:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:51.305 20:38:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:51.305 20:38:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:51.305 20:38:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:51.305 20:38:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:51.305 20:38:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:51.305 20:38:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:51.305 20:38:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:51.305 20:38:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:51.305 20:38:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:51.305 20:38:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:51.562 20:38:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:51.562 "name": "raid_bdev1", 00:24:51.562 "uuid": "aec629e9-b9f2-4fc2-a1a4-f135b2089406", 00:24:51.562 "strip_size_kb": 0, 00:24:51.563 "state": "online", 00:24:51.563 "raid_level": "raid1", 00:24:51.563 "superblock": false, 00:24:51.563 "num_base_bdevs": 2, 00:24:51.563 "num_base_bdevs_discovered": 1, 00:24:51.563 "num_base_bdevs_operational": 1, 00:24:51.563 "base_bdevs_list": [ 00:24:51.563 { 00:24:51.563 "name": null, 00:24:51.563 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:51.563 "is_configured": false, 00:24:51.563 "data_offset": 0, 00:24:51.563 "data_size": 65536 00:24:51.563 }, 00:24:51.563 { 00:24:51.563 "name": "BaseBdev2", 00:24:51.563 "uuid": "e674a8ca-7942-54bf-a610-ff4938ffc350", 00:24:51.563 "is_configured": true, 00:24:51.563 "data_offset": 0, 00:24:51.563 "data_size": 65536 00:24:51.563 } 00:24:51.563 ] 00:24:51.563 }' 00:24:51.563 20:38:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:51.563 20:38:43 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:24:52.129 20:38:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:52.388 [2024-07-15 20:38:44.736619] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:52.646 [2024-07-15 20:38:44.796349] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xfad8b0 00:24:52.646 20:38:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:24:52.646 [2024-07-15 20:38:44.798755] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on 
raid bdev raid_bdev1 00:24:52.646 [2024-07-15 20:38:44.917687] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:24:52.646 [2024-07-15 20:38:44.918009] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:24:52.904 [2024-07-15 20:38:45.028803] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:24:52.905 [2024-07-15 20:38:45.029058] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:24:53.162 [2024-07-15 20:38:45.397471] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:24:53.162 [2024-07-15 20:38:45.535434] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:24:53.420 [2024-07-15 20:38:45.780404] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:24:53.678 20:38:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:53.678 20:38:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:53.678 20:38:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:53.678 20:38:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:53.678 20:38:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:53.678 20:38:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:53.678 20:38:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | 
select(.name == "raid_bdev1")' 00:24:53.936 20:38:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:53.936 "name": "raid_bdev1", 00:24:53.936 "uuid": "aec629e9-b9f2-4fc2-a1a4-f135b2089406", 00:24:53.936 "strip_size_kb": 0, 00:24:53.936 "state": "online", 00:24:53.936 "raid_level": "raid1", 00:24:53.936 "superblock": false, 00:24:53.936 "num_base_bdevs": 2, 00:24:53.936 "num_base_bdevs_discovered": 2, 00:24:53.936 "num_base_bdevs_operational": 2, 00:24:53.936 "process": { 00:24:53.936 "type": "rebuild", 00:24:53.936 "target": "spare", 00:24:53.936 "progress": { 00:24:53.936 "blocks": 16384, 00:24:53.936 "percent": 25 00:24:53.936 } 00:24:53.936 }, 00:24:53.936 "base_bdevs_list": [ 00:24:53.936 { 00:24:53.936 "name": "spare", 00:24:53.936 "uuid": "4641dedc-2d1f-5226-b890-0bcd07e114a0", 00:24:53.936 "is_configured": true, 00:24:53.936 "data_offset": 0, 00:24:53.936 "data_size": 65536 00:24:53.936 }, 00:24:53.936 { 00:24:53.936 "name": "BaseBdev2", 00:24:53.936 "uuid": "e674a8ca-7942-54bf-a610-ff4938ffc350", 00:24:53.936 "is_configured": true, 00:24:53.936 "data_offset": 0, 00:24:53.936 "data_size": 65536 00:24:53.936 } 00:24:53.936 ] 00:24:53.936 }' 00:24:53.936 20:38:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:53.936 20:38:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:53.936 20:38:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:53.936 [2024-07-15 20:38:46.155321] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:24:53.937 [2024-07-15 20:38:46.155696] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:24:53.937 20:38:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:53.937 
20:38:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:24:53.937 [2024-07-15 20:38:46.282606] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:24:54.195 [2024-07-15 20:38:46.400902] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:54.195 [2024-07-15 20:38:46.410074] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:24:54.195 [2024-07-15 20:38:46.526655] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:54.195 [2024-07-15 20:38:46.536654] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:54.195 [2024-07-15 20:38:46.536679] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:54.195 [2024-07-15 20:38:46.536689] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:54.195 [2024-07-15 20:38:46.559130] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1025bd0 00:24:54.454 20:38:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:54.454 20:38:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:54.454 20:38:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:54.454 20:38:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:54.454 20:38:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:54.454 20:38:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:54.454 20:38:46 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:54.454 20:38:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:54.454 20:38:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:54.454 20:38:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:54.454 20:38:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:54.454 20:38:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:54.713 20:38:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:54.713 "name": "raid_bdev1", 00:24:54.713 "uuid": "aec629e9-b9f2-4fc2-a1a4-f135b2089406", 00:24:54.713 "strip_size_kb": 0, 00:24:54.713 "state": "online", 00:24:54.713 "raid_level": "raid1", 00:24:54.713 "superblock": false, 00:24:54.713 "num_base_bdevs": 2, 00:24:54.713 "num_base_bdevs_discovered": 1, 00:24:54.713 "num_base_bdevs_operational": 1, 00:24:54.713 "base_bdevs_list": [ 00:24:54.713 { 00:24:54.713 "name": null, 00:24:54.713 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:54.713 "is_configured": false, 00:24:54.713 "data_offset": 0, 00:24:54.713 "data_size": 65536 00:24:54.713 }, 00:24:54.713 { 00:24:54.713 "name": "BaseBdev2", 00:24:54.713 "uuid": "e674a8ca-7942-54bf-a610-ff4938ffc350", 00:24:54.713 "is_configured": true, 00:24:54.713 "data_offset": 0, 00:24:54.713 "data_size": 65536 00:24:54.713 } 00:24:54.713 ] 00:24:54.713 }' 00:24:54.713 20:38:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:54.713 20:38:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:24:55.280 20:38:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 
none none 00:24:55.280 20:38:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:55.280 20:38:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:55.280 20:38:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:55.280 20:38:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:55.280 20:38:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:55.280 20:38:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:55.539 20:38:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:55.539 "name": "raid_bdev1", 00:24:55.539 "uuid": "aec629e9-b9f2-4fc2-a1a4-f135b2089406", 00:24:55.539 "strip_size_kb": 0, 00:24:55.539 "state": "online", 00:24:55.539 "raid_level": "raid1", 00:24:55.539 "superblock": false, 00:24:55.539 "num_base_bdevs": 2, 00:24:55.539 "num_base_bdevs_discovered": 1, 00:24:55.539 "num_base_bdevs_operational": 1, 00:24:55.539 "base_bdevs_list": [ 00:24:55.539 { 00:24:55.539 "name": null, 00:24:55.539 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:55.539 "is_configured": false, 00:24:55.539 "data_offset": 0, 00:24:55.539 "data_size": 65536 00:24:55.539 }, 00:24:55.539 { 00:24:55.539 "name": "BaseBdev2", 00:24:55.539 "uuid": "e674a8ca-7942-54bf-a610-ff4938ffc350", 00:24:55.539 "is_configured": true, 00:24:55.539 "data_offset": 0, 00:24:55.539 "data_size": 65536 00:24:55.539 } 00:24:55.539 ] 00:24:55.539 }' 00:24:55.539 20:38:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:55.539 20:38:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:55.539 20:38:47 bdev_raid.raid_rebuild_test_io 
-- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:55.539 20:38:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:55.540 20:38:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:55.799 [2024-07-15 20:38:48.046923] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:55.799 20:38:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:24:55.799 [2024-07-15 20:38:48.106037] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x102b450 00:24:55.799 [2024-07-15 20:38:48.107491] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:56.058 [2024-07-15 20:38:48.217211] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:24:56.058 [2024-07-15 20:38:48.217536] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:24:56.058 [2024-07-15 20:38:48.337490] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:24:56.058 [2024-07-15 20:38:48.337704] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:24:56.625 [2024-07-15 20:38:48.697167] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:24:56.884 [2024-07-15 20:38:49.026669] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:24:56.884 20:38:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:56.884 20:38:49 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:56.884 20:38:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:56.884 20:38:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:56.884 20:38:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:56.884 20:38:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:56.884 20:38:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:56.884 [2024-07-15 20:38:49.256876] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:24:57.143 20:38:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:57.143 "name": "raid_bdev1", 00:24:57.143 "uuid": "aec629e9-b9f2-4fc2-a1a4-f135b2089406", 00:24:57.143 "strip_size_kb": 0, 00:24:57.143 "state": "online", 00:24:57.143 "raid_level": "raid1", 00:24:57.143 "superblock": false, 00:24:57.143 "num_base_bdevs": 2, 00:24:57.143 "num_base_bdevs_discovered": 2, 00:24:57.143 "num_base_bdevs_operational": 2, 00:24:57.143 "process": { 00:24:57.143 "type": "rebuild", 00:24:57.143 "target": "spare", 00:24:57.143 "progress": { 00:24:57.143 "blocks": 16384, 00:24:57.143 "percent": 25 00:24:57.143 } 00:24:57.143 }, 00:24:57.143 "base_bdevs_list": [ 00:24:57.143 { 00:24:57.143 "name": "spare", 00:24:57.143 "uuid": "4641dedc-2d1f-5226-b890-0bcd07e114a0", 00:24:57.143 "is_configured": true, 00:24:57.143 "data_offset": 0, 00:24:57.143 "data_size": 65536 00:24:57.143 }, 00:24:57.143 { 00:24:57.143 "name": "BaseBdev2", 00:24:57.143 "uuid": "e674a8ca-7942-54bf-a610-ff4938ffc350", 00:24:57.143 "is_configured": true, 00:24:57.143 "data_offset": 0, 
00:24:57.143 "data_size": 65536 00:24:57.143 } 00:24:57.143 ] 00:24:57.143 }' 00:24:57.143 20:38:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:57.143 20:38:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:57.143 20:38:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:57.143 20:38:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:57.143 20:38:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:24:57.143 20:38:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:24:57.143 20:38:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:24:57.143 20:38:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:24:57.143 20:38:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@705 -- # local timeout=867 00:24:57.143 20:38:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:57.143 20:38:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:57.143 20:38:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:57.143 20:38:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:57.143 20:38:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:57.143 20:38:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:57.143 20:38:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:57.143 20:38:49 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:57.402 [2024-07-15 20:38:49.653851] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:24:57.402 20:38:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:57.402 "name": "raid_bdev1", 00:24:57.402 "uuid": "aec629e9-b9f2-4fc2-a1a4-f135b2089406", 00:24:57.402 "strip_size_kb": 0, 00:24:57.402 "state": "online", 00:24:57.402 "raid_level": "raid1", 00:24:57.402 "superblock": false, 00:24:57.402 "num_base_bdevs": 2, 00:24:57.402 "num_base_bdevs_discovered": 2, 00:24:57.402 "num_base_bdevs_operational": 2, 00:24:57.402 "process": { 00:24:57.402 "type": "rebuild", 00:24:57.402 "target": "spare", 00:24:57.402 "progress": { 00:24:57.402 "blocks": 20480, 00:24:57.402 "percent": 31 00:24:57.402 } 00:24:57.402 }, 00:24:57.402 "base_bdevs_list": [ 00:24:57.402 { 00:24:57.402 "name": "spare", 00:24:57.402 "uuid": "4641dedc-2d1f-5226-b890-0bcd07e114a0", 00:24:57.402 "is_configured": true, 00:24:57.402 "data_offset": 0, 00:24:57.402 "data_size": 65536 00:24:57.402 }, 00:24:57.402 { 00:24:57.402 "name": "BaseBdev2", 00:24:57.402 "uuid": "e674a8ca-7942-54bf-a610-ff4938ffc350", 00:24:57.402 "is_configured": true, 00:24:57.402 "data_offset": 0, 00:24:57.402 "data_size": 65536 00:24:57.402 } 00:24:57.402 ] 00:24:57.402 }' 00:24:57.402 20:38:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:57.403 20:38:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:57.403 20:38:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:57.662 20:38:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:57.662 20:38:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:24:57.920 [2024-07-15 20:38:50.155579] 
bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:24:58.179 [2024-07-15 20:38:50.393640] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 32768 offset_begin: 30720 offset_end: 36864 00:24:58.437 [2024-07-15 20:38:50.612534] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 34816 offset_begin: 30720 offset_end: 36864 00:24:58.437 [2024-07-15 20:38:50.612757] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 34816 offset_begin: 30720 offset_end: 36864 00:24:58.437 20:38:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:58.437 20:38:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:58.437 20:38:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:58.437 20:38:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:58.437 20:38:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:58.437 20:38:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:58.437 20:38:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:58.437 20:38:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:58.696 [2024-07-15 20:38:50.934648] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 38912 offset_begin: 36864 offset_end: 43008 00:24:58.696 [2024-07-15 20:38:51.036333] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 40960 offset_begin: 36864 offset_end: 43008 00:24:58.696 20:38:51 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:58.696 "name": "raid_bdev1", 00:24:58.696 "uuid": "aec629e9-b9f2-4fc2-a1a4-f135b2089406", 00:24:58.696 "strip_size_kb": 0, 00:24:58.696 "state": "online", 00:24:58.696 "raid_level": "raid1", 00:24:58.696 "superblock": false, 00:24:58.696 "num_base_bdevs": 2, 00:24:58.696 "num_base_bdevs_discovered": 2, 00:24:58.696 "num_base_bdevs_operational": 2, 00:24:58.696 "process": { 00:24:58.696 "type": "rebuild", 00:24:58.696 "target": "spare", 00:24:58.696 "progress": { 00:24:58.696 "blocks": 38912, 00:24:58.696 "percent": 59 00:24:58.696 } 00:24:58.696 }, 00:24:58.696 "base_bdevs_list": [ 00:24:58.696 { 00:24:58.696 "name": "spare", 00:24:58.696 "uuid": "4641dedc-2d1f-5226-b890-0bcd07e114a0", 00:24:58.696 "is_configured": true, 00:24:58.696 "data_offset": 0, 00:24:58.696 "data_size": 65536 00:24:58.696 }, 00:24:58.696 { 00:24:58.696 "name": "BaseBdev2", 00:24:58.696 "uuid": "e674a8ca-7942-54bf-a610-ff4938ffc350", 00:24:58.696 "is_configured": true, 00:24:58.696 "data_offset": 0, 00:24:58.696 "data_size": 65536 00:24:58.696 } 00:24:58.696 ] 00:24:58.696 }' 00:24:58.696 20:38:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:58.954 20:38:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:58.954 20:38:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:58.954 20:38:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:58.954 20:38:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:24:58.954 [2024-07-15 20:38:51.257677] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 45056 offset_begin: 43008 offset_end: 49152 00:24:58.954 [2024-07-15 20:38:51.257994] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 45056 offset_begin: 43008 offset_end: 49152 
00:24:59.213 [2024-07-15 20:38:51.385503] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 47104 offset_begin: 43008 offset_end: 49152 00:25:00.151 20:38:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:00.151 20:38:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:00.151 20:38:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:00.151 20:38:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:00.151 20:38:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:00.151 20:38:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:00.151 20:38:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:00.151 20:38:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:00.151 20:38:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:00.151 "name": "raid_bdev1", 00:25:00.151 "uuid": "aec629e9-b9f2-4fc2-a1a4-f135b2089406", 00:25:00.151 "strip_size_kb": 0, 00:25:00.151 "state": "online", 00:25:00.151 "raid_level": "raid1", 00:25:00.151 "superblock": false, 00:25:00.151 "num_base_bdevs": 2, 00:25:00.151 "num_base_bdevs_discovered": 2, 00:25:00.151 "num_base_bdevs_operational": 2, 00:25:00.151 "process": { 00:25:00.151 "type": "rebuild", 00:25:00.151 "target": "spare", 00:25:00.151 "progress": { 00:25:00.151 "blocks": 61440, 00:25:00.151 "percent": 93 00:25:00.151 } 00:25:00.151 }, 00:25:00.151 "base_bdevs_list": [ 00:25:00.151 { 00:25:00.151 "name": "spare", 00:25:00.151 "uuid": "4641dedc-2d1f-5226-b890-0bcd07e114a0", 00:25:00.151 "is_configured": true, 
00:25:00.151 "data_offset": 0, 00:25:00.151 "data_size": 65536 00:25:00.151 }, 00:25:00.151 { 00:25:00.151 "name": "BaseBdev2", 00:25:00.151 "uuid": "e674a8ca-7942-54bf-a610-ff4938ffc350", 00:25:00.151 "is_configured": true, 00:25:00.151 "data_offset": 0, 00:25:00.151 "data_size": 65536 00:25:00.151 } 00:25:00.151 ] 00:25:00.151 }' 00:25:00.151 20:38:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:00.151 20:38:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:00.151 20:38:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:00.151 [2024-07-15 20:38:52.510081] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:25:00.151 20:38:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:00.151 20:38:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:25:00.410 [2024-07-15 20:38:52.618321] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:25:00.410 [2024-07-15 20:38:52.620833] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:01.345 20:38:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:01.345 20:38:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:01.345 20:38:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:01.345 20:38:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:01.345 20:38:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:01.345 20:38:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:01.345 20:38:53 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:01.345 20:38:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:01.602 20:38:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:01.602 "name": "raid_bdev1", 00:25:01.602 "uuid": "aec629e9-b9f2-4fc2-a1a4-f135b2089406", 00:25:01.602 "strip_size_kb": 0, 00:25:01.602 "state": "online", 00:25:01.602 "raid_level": "raid1", 00:25:01.602 "superblock": false, 00:25:01.602 "num_base_bdevs": 2, 00:25:01.602 "num_base_bdevs_discovered": 2, 00:25:01.602 "num_base_bdevs_operational": 2, 00:25:01.602 "base_bdevs_list": [ 00:25:01.602 { 00:25:01.602 "name": "spare", 00:25:01.602 "uuid": "4641dedc-2d1f-5226-b890-0bcd07e114a0", 00:25:01.602 "is_configured": true, 00:25:01.602 "data_offset": 0, 00:25:01.602 "data_size": 65536 00:25:01.602 }, 00:25:01.602 { 00:25:01.602 "name": "BaseBdev2", 00:25:01.602 "uuid": "e674a8ca-7942-54bf-a610-ff4938ffc350", 00:25:01.602 "is_configured": true, 00:25:01.602 "data_offset": 0, 00:25:01.602 "data_size": 65536 00:25:01.602 } 00:25:01.602 ] 00:25:01.602 }' 00:25:01.602 20:38:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:01.602 20:38:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:25:01.602 20:38:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:01.602 20:38:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:25:01.602 20:38:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@708 -- # break 00:25:01.602 20:38:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:01.602 20:38:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local 
raid_bdev_name=raid_bdev1 00:25:01.603 20:38:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:01.603 20:38:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:01.603 20:38:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:01.603 20:38:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:01.603 20:38:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:01.861 20:38:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:01.861 "name": "raid_bdev1", 00:25:01.861 "uuid": "aec629e9-b9f2-4fc2-a1a4-f135b2089406", 00:25:01.861 "strip_size_kb": 0, 00:25:01.861 "state": "online", 00:25:01.861 "raid_level": "raid1", 00:25:01.861 "superblock": false, 00:25:01.861 "num_base_bdevs": 2, 00:25:01.861 "num_base_bdevs_discovered": 2, 00:25:01.861 "num_base_bdevs_operational": 2, 00:25:01.861 "base_bdevs_list": [ 00:25:01.861 { 00:25:01.861 "name": "spare", 00:25:01.861 "uuid": "4641dedc-2d1f-5226-b890-0bcd07e114a0", 00:25:01.861 "is_configured": true, 00:25:01.861 "data_offset": 0, 00:25:01.861 "data_size": 65536 00:25:01.861 }, 00:25:01.861 { 00:25:01.861 "name": "BaseBdev2", 00:25:01.861 "uuid": "e674a8ca-7942-54bf-a610-ff4938ffc350", 00:25:01.861 "is_configured": true, 00:25:01.861 "data_offset": 0, 00:25:01.861 "data_size": 65536 00:25:01.861 } 00:25:01.861 ] 00:25:01.861 }' 00:25:01.861 20:38:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:01.861 20:38:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:01.861 20:38:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:01.861 20:38:54 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:01.861 20:38:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:01.861 20:38:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:01.861 20:38:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:01.861 20:38:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:01.861 20:38:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:01.861 20:38:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:01.861 20:38:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:01.861 20:38:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:01.861 20:38:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:01.861 20:38:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:01.861 20:38:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:01.861 20:38:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:02.120 20:38:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:02.120 "name": "raid_bdev1", 00:25:02.120 "uuid": "aec629e9-b9f2-4fc2-a1a4-f135b2089406", 00:25:02.120 "strip_size_kb": 0, 00:25:02.120 "state": "online", 00:25:02.120 "raid_level": "raid1", 00:25:02.120 "superblock": false, 00:25:02.120 "num_base_bdevs": 2, 00:25:02.120 "num_base_bdevs_discovered": 2, 00:25:02.120 "num_base_bdevs_operational": 2, 00:25:02.120 "base_bdevs_list": 
[ 00:25:02.120 { 00:25:02.120 "name": "spare", 00:25:02.120 "uuid": "4641dedc-2d1f-5226-b890-0bcd07e114a0", 00:25:02.120 "is_configured": true, 00:25:02.120 "data_offset": 0, 00:25:02.120 "data_size": 65536 00:25:02.120 }, 00:25:02.120 { 00:25:02.120 "name": "BaseBdev2", 00:25:02.120 "uuid": "e674a8ca-7942-54bf-a610-ff4938ffc350", 00:25:02.120 "is_configured": true, 00:25:02.120 "data_offset": 0, 00:25:02.120 "data_size": 65536 00:25:02.120 } 00:25:02.120 ] 00:25:02.120 }' 00:25:02.120 20:38:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:02.120 20:38:54 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:25:03.057 20:38:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:25:03.057 [2024-07-15 20:38:55.295866] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:03.057 [2024-07-15 20:38:55.295903] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:25:03.057 00:25:03.057 Latency(us) 00:25:03.057 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:03.057 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:25:03.057 raid_bdev1 : 11.89 91.77 275.30 0.00 0.00 14898.65 290.28 111240.24 00:25:03.057 =================================================================================================================== 00:25:03.057 Total : 91.77 275.30 0.00 0.00 14898.65 290.28 111240.24 00:25:03.057 [2024-07-15 20:38:55.372000] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:03.057 [2024-07-15 20:38:55.372030] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:03.057 [2024-07-15 20:38:55.372105] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free 
all in destruct 00:25:03.057 [2024-07-15 20:38:55.372116] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x102b070 name raid_bdev1, state offline 00:25:03.057 0 00:25:03.057 20:38:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:03.057 20:38:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # jq length 00:25:03.317 20:38:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:25:03.317 20:38:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:25:03.317 20:38:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:25:03.317 20:38:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:25:03.317 20:38:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:03.317 20:38:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:25:03.317 20:38:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:03.317 20:38:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:25:03.317 20:38:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:03.317 20:38:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:25:03.317 20:38:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:03.317 20:38:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:03.317 20:38:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:25:03.576 /dev/nbd0 00:25:03.576 20:38:55 bdev_raid.raid_rebuild_test_io 
-- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:25:03.576 20:38:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:25:03.576 20:38:55 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:25:03.576 20:38:55 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:25:03.576 20:38:55 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:03.576 20:38:55 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:03.576 20:38:55 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:25:03.576 20:38:55 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:25:03.576 20:38:55 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:03.576 20:38:55 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:03.576 20:38:55 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:03.576 1+0 records in 00:25:03.576 1+0 records out 00:25:03.576 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000290707 s, 14.1 MB/s 00:25:03.576 20:38:55 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:03.576 20:38:55 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:25:03.576 20:38:55 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:03.576 20:38:55 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:03.576 20:38:55 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:25:03.576 20:38:55 
bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:03.576 20:38:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:03.576 20:38:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:25:03.576 20:38:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev2 ']' 00:25:03.576 20:38:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev2 /dev/nbd1 00:25:03.576 20:38:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:03.576 20:38:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev2') 00:25:03.576 20:38:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:03.576 20:38:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:25:03.576 20:38:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:03.576 20:38:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:25:03.576 20:38:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:03.576 20:38:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:03.576 20:38:55 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev2 /dev/nbd1 00:25:03.855 /dev/nbd1 00:25:03.855 20:38:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:25:03.855 20:38:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:25:03.855 20:38:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:25:03.855 20:38:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:25:03.855 
20:38:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:03.855 20:38:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:03.855 20:38:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:25:03.855 20:38:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:25:03.855 20:38:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:03.855 20:38:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:03.855 20:38:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:03.855 1+0 records in 00:25:03.855 1+0 records out 00:25:03.855 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000275218 s, 14.9 MB/s 00:25:04.142 20:38:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:04.142 20:38:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:25:04.142 20:38:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:04.142 20:38:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:04.142 20:38:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:25:04.142 20:38:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:04.142 20:38:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:04.142 20:38:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:25:04.142 20:38:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # 
nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:25:04.142 20:38:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:04.142 20:38:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:25:04.142 20:38:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:04.142 20:38:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:25:04.142 20:38:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:04.142 20:38:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:25:04.400 20:38:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:25:04.400 20:38:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:25:04.400 20:38:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:25:04.400 20:38:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:04.400 20:38:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:04.400 20:38:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:25:04.400 20:38:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:25:04.400 20:38:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:25:04.400 20:38:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:25:04.400 20:38:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:04.400 20:38:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:25:04.401 20:38:56 bdev_raid.raid_rebuild_test_io -- 
bdev/nbd_common.sh@50 -- # local nbd_list 00:25:04.401 20:38:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:25:04.401 20:38:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:04.401 20:38:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:25:04.660 20:38:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:25:04.660 20:38:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:25:04.660 20:38:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:25:04.660 20:38:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:04.660 20:38:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:04.660 20:38:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:25:04.660 20:38:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:25:04.660 20:38:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:25:04.660 20:38:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:25:04.660 20:38:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@782 -- # killprocess 1468930 00:25:04.660 20:38:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@948 -- # '[' -z 1468930 ']' 00:25:04.660 20:38:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@952 -- # kill -0 1468930 00:25:04.660 20:38:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- # uname 00:25:04.660 20:38:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:25:04.660 20:38:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1468930 
00:25:04.660 20:38:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:25:04.660 20:38:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:25:04.660 20:38:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1468930' 00:25:04.660 killing process with pid 1468930 00:25:04.660 20:38:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@967 -- # kill 1468930 00:25:04.660 Received shutdown signal, test time was about 13.377117 seconds 00:25:04.660 00:25:04.660 Latency(us) 00:25:04.660 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:04.660 =================================================================================================================== 00:25:04.660 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:25:04.660 [2024-07-15 20:38:56.860076] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:25:04.660 20:38:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@972 -- # wait 1468930 00:25:04.660 [2024-07-15 20:38:56.881924] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:25:04.919 20:38:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@784 -- # return 0 00:25:04.919 00:25:04.919 real 0m18.099s 00:25:04.919 user 0m27.528s 00:25:04.919 sys 0m2.837s 00:25:04.919 20:38:57 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:25:04.919 20:38:57 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:25:04.919 ************************************ 00:25:04.919 END TEST raid_rebuild_test_io 00:25:04.919 ************************************ 00:25:04.919 20:38:57 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:25:04.919 20:38:57 bdev_raid -- bdev/bdev_raid.sh@880 -- # run_test raid_rebuild_test_sb_io raid_rebuild_test raid1 2 true true true 00:25:04.919 20:38:57 bdev_raid -- 
common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:25:04.919 20:38:57 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:25:04.919 20:38:57 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:25:04.919 ************************************ 00:25:04.919 START TEST raid_rebuild_test_sb_io 00:25:04.919 ************************************ 00:25:04.919 20:38:57 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true true true 00:25:04.919 20:38:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:25:04.919 20:38:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:25:04.919 20:38:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:25:04.919 20:38:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:25:04.919 20:38:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:25:04.919 20:38:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:25:04.919 20:38:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:04.919 20:38:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:25:04.919 20:38:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:04.919 20:38:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:04.919 20:38:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:25:04.919 20:38:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:04.919 20:38:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:04.919 20:38:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 
00:25:04.919 20:38:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:25:04.919 20:38:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:25:04.919 20:38:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:25:04.919 20:38:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:25:04.919 20:38:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:25:04.919 20:38:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:25:04.919 20:38:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:25:04.919 20:38:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:25:04.919 20:38:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:25:04.919 20:38:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:25:04.919 20:38:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@596 -- # raid_pid=1471578 00:25:04.919 20:38:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 1471578 /var/tmp/spdk-raid.sock 00:25:04.919 20:38:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:25:04.919 20:38:57 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@829 -- # '[' -z 1471578 ']' 00:25:04.919 20:38:57 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:25:04.919 20:38:57 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@834 -- # local max_retries=100 00:25:04.919 20:38:57 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@836 -- # 
echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:25:04.919 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:25:04.919 20:38:57 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@838 -- # xtrace_disable 00:25:04.919 20:38:57 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:04.919 [2024-07-15 20:38:57.276643] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 00:25:04.919 [2024-07-15 20:38:57.276716] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1471578 ] 00:25:04.919 I/O size of 3145728 is greater than zero copy threshold (65536). 00:25:04.919 Zero copy mechanism will not be used. 00:25:05.178 [2024-07-15 20:38:57.407138] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:05.178 [2024-07-15 20:38:57.510432] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:25:05.437 [2024-07-15 20:38:57.570966] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:05.437 [2024-07-15 20:38:57.571001] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:06.001 20:38:58 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:25:06.001 20:38:58 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@862 -- # return 0 00:25:06.001 20:38:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:06.001 20:38:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:25:06.259 BaseBdev1_malloc 00:25:06.259 
20:38:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:25:06.516 [2024-07-15 20:38:58.691903] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:25:06.516 [2024-07-15 20:38:58.691960] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:06.516 [2024-07-15 20:38:58.691985] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a5dd40 00:25:06.516 [2024-07-15 20:38:58.691998] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:06.516 [2024-07-15 20:38:58.693710] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:06.516 [2024-07-15 20:38:58.693740] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:25:06.516 BaseBdev1 00:25:06.516 20:38:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:06.516 20:38:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:25:06.773 BaseBdev2_malloc 00:25:06.773 20:38:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:25:07.030 [2024-07-15 20:38:59.186231] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:25:07.030 [2024-07-15 20:38:59.186279] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:07.030 [2024-07-15 20:38:59.186303] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a5e860 00:25:07.030 [2024-07-15 20:38:59.186315] 
vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:07.030 [2024-07-15 20:38:59.187783] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:07.030 [2024-07-15 20:38:59.187811] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:25:07.030 BaseBdev2 00:25:07.030 20:38:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:25:07.300 spare_malloc 00:25:07.300 20:38:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:25:07.557 spare_delay 00:25:07.557 20:38:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:07.814 [2024-07-15 20:38:59.936942] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:07.814 [2024-07-15 20:38:59.936991] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:07.814 [2024-07-15 20:38:59.937014] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c0cec0 00:25:07.814 [2024-07-15 20:38:59.937027] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:07.814 [2024-07-15 20:38:59.938542] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:07.814 [2024-07-15 20:38:59.938569] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:07.814 spare 00:25:07.814 20:38:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:25:07.814 [2024-07-15 20:39:00.189648] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:07.814 [2024-07-15 20:39:00.191008] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:25:07.814 [2024-07-15 20:39:00.191187] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1c0e070 00:25:07.814 [2024-07-15 20:39:00.191201] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:25:07.814 [2024-07-15 20:39:00.191409] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c07490 00:25:07.814 [2024-07-15 20:39:00.191557] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1c0e070 00:25:07.814 [2024-07-15 20:39:00.191567] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1c0e070 00:25:07.814 [2024-07-15 20:39:00.191671] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:08.071 20:39:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:08.071 20:39:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:08.071 20:39:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:08.071 20:39:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:08.071 20:39:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:08.071 20:39:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:08.071 20:39:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:08.071 20:39:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # 
local num_base_bdevs 00:25:08.071 20:39:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:08.071 20:39:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:08.071 20:39:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:08.071 20:39:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:08.071 20:39:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:08.071 "name": "raid_bdev1", 00:25:08.071 "uuid": "513de924-6959-42a2-bb2d-1b138dd1a5d5", 00:25:08.071 "strip_size_kb": 0, 00:25:08.071 "state": "online", 00:25:08.071 "raid_level": "raid1", 00:25:08.071 "superblock": true, 00:25:08.071 "num_base_bdevs": 2, 00:25:08.071 "num_base_bdevs_discovered": 2, 00:25:08.071 "num_base_bdevs_operational": 2, 00:25:08.071 "base_bdevs_list": [ 00:25:08.071 { 00:25:08.071 "name": "BaseBdev1", 00:25:08.071 "uuid": "7d275379-f566-5634-975d-f1542790ba00", 00:25:08.071 "is_configured": true, 00:25:08.071 "data_offset": 2048, 00:25:08.071 "data_size": 63488 00:25:08.071 }, 00:25:08.071 { 00:25:08.071 "name": "BaseBdev2", 00:25:08.071 "uuid": "dbd6c153-6d34-54f4-97ef-b25f7da02d42", 00:25:08.071 "is_configured": true, 00:25:08.071 "data_offset": 2048, 00:25:08.071 "data_size": 63488 00:25:08.071 } 00:25:08.071 ] 00:25:08.071 }' 00:25:08.071 20:39:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:08.071 20:39:00 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:09.003 20:39:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:25:09.003 20:39:01 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:25:09.003 [2024-07-15 20:39:01.256697] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:09.003 20:39:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:25:09.003 20:39:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:09.003 20:39:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:25:09.260 20:39:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:25:09.260 20:39:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:25:09.260 20:39:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:25:09.260 20:39:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:25:09.260 [2024-07-15 20:39:01.635541] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c0ec50 00:25:09.260 I/O size of 3145728 is greater than zero copy threshold (65536). 00:25:09.260 Zero copy mechanism will not be used. 00:25:09.260 Running I/O for 60 seconds... 
00:25:09.518 [2024-07-15 20:39:01.693880] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:25:09.518 [2024-07-15 20:39:01.702085] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1c0ec50 00:25:09.518 20:39:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:09.518 20:39:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:09.518 20:39:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:09.518 20:39:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:09.518 20:39:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:09.518 20:39:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:09.518 20:39:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:09.518 20:39:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:09.518 20:39:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:09.518 20:39:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:09.518 20:39:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:09.518 20:39:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:09.777 20:39:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:09.777 "name": "raid_bdev1", 00:25:09.777 "uuid": "513de924-6959-42a2-bb2d-1b138dd1a5d5", 00:25:09.777 "strip_size_kb": 0, 00:25:09.777 "state": "online", 00:25:09.777 "raid_level": 
"raid1", 00:25:09.777 "superblock": true, 00:25:09.777 "num_base_bdevs": 2, 00:25:09.777 "num_base_bdevs_discovered": 1, 00:25:09.777 "num_base_bdevs_operational": 1, 00:25:09.777 "base_bdevs_list": [ 00:25:09.777 { 00:25:09.777 "name": null, 00:25:09.777 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:09.777 "is_configured": false, 00:25:09.777 "data_offset": 2048, 00:25:09.777 "data_size": 63488 00:25:09.777 }, 00:25:09.777 { 00:25:09.777 "name": "BaseBdev2", 00:25:09.777 "uuid": "dbd6c153-6d34-54f4-97ef-b25f7da02d42", 00:25:09.777 "is_configured": true, 00:25:09.777 "data_offset": 2048, 00:25:09.777 "data_size": 63488 00:25:09.777 } 00:25:09.777 ] 00:25:09.777 }' 00:25:09.777 20:39:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:09.777 20:39:02 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:10.343 20:39:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:10.601 [2024-07-15 20:39:02.838354] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:10.601 [2024-07-15 20:39:02.889239] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b7a230 00:25:10.601 20:39:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:25:10.601 [2024-07-15 20:39:02.891603] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:10.860 [2024-07-15 20:39:03.169394] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:25:10.860 [2024-07-15 20:39:03.169618] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:25:11.426 [2024-07-15 20:39:03.618884] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: 
process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:25:11.426 [2024-07-15 20:39:03.619149] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:25:11.684 20:39:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:11.684 20:39:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:11.684 20:39:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:11.684 20:39:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:11.684 20:39:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:11.684 20:39:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:11.684 20:39:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:11.684 [2024-07-15 20:39:04.062301] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:25:11.942 20:39:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:11.942 "name": "raid_bdev1", 00:25:11.942 "uuid": "513de924-6959-42a2-bb2d-1b138dd1a5d5", 00:25:11.942 "strip_size_kb": 0, 00:25:11.942 "state": "online", 00:25:11.942 "raid_level": "raid1", 00:25:11.942 "superblock": true, 00:25:11.942 "num_base_bdevs": 2, 00:25:11.942 "num_base_bdevs_discovered": 2, 00:25:11.942 "num_base_bdevs_operational": 2, 00:25:11.942 "process": { 00:25:11.942 "type": "rebuild", 00:25:11.942 "target": "spare", 00:25:11.942 "progress": { 00:25:11.942 "blocks": 14336, 00:25:11.942 "percent": 22 00:25:11.942 } 00:25:11.942 }, 00:25:11.942 "base_bdevs_list": [ 
00:25:11.942 { 00:25:11.942 "name": "spare", 00:25:11.942 "uuid": "5102b499-f668-5cb1-8f0e-4ea73208ab67", 00:25:11.942 "is_configured": true, 00:25:11.942 "data_offset": 2048, 00:25:11.942 "data_size": 63488 00:25:11.942 }, 00:25:11.942 { 00:25:11.942 "name": "BaseBdev2", 00:25:11.942 "uuid": "dbd6c153-6d34-54f4-97ef-b25f7da02d42", 00:25:11.942 "is_configured": true, 00:25:11.942 "data_offset": 2048, 00:25:11.942 "data_size": 63488 00:25:11.942 } 00:25:11.942 ] 00:25:11.942 }' 00:25:11.942 20:39:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:11.942 20:39:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:11.942 20:39:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:11.942 20:39:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:11.942 20:39:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:25:12.200 [2024-07-15 20:39:04.420983] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:25:12.200 [2024-07-15 20:39:04.421278] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:25:12.459 [2024-07-15 20:39:04.673651] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:12.459 [2024-07-15 20:39:04.742377] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 26624 offset_begin: 24576 offset_end: 30720 00:25:12.718 [2024-07-15 20:39:04.851668] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:12.718 [2024-07-15 20:39:04.870012] bdev_raid.c: 331:raid_bdev_destroy_cb: 
*DEBUG*: raid_bdev_destroy_cb 00:25:12.718 [2024-07-15 20:39:04.870044] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:12.718 [2024-07-15 20:39:04.870055] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:25:12.718 [2024-07-15 20:39:04.892736] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1c0ec50 00:25:12.718 20:39:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:12.718 20:39:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:12.718 20:39:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:12.718 20:39:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:12.718 20:39:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:12.718 20:39:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:12.718 20:39:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:12.718 20:39:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:12.718 20:39:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:12.718 20:39:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:12.718 20:39:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:12.718 20:39:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:12.977 20:39:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:12.977 
"name": "raid_bdev1", 00:25:12.977 "uuid": "513de924-6959-42a2-bb2d-1b138dd1a5d5", 00:25:12.977 "strip_size_kb": 0, 00:25:12.977 "state": "online", 00:25:12.977 "raid_level": "raid1", 00:25:12.977 "superblock": true, 00:25:12.977 "num_base_bdevs": 2, 00:25:12.977 "num_base_bdevs_discovered": 1, 00:25:12.977 "num_base_bdevs_operational": 1, 00:25:12.977 "base_bdevs_list": [ 00:25:12.977 { 00:25:12.977 "name": null, 00:25:12.977 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:12.977 "is_configured": false, 00:25:12.977 "data_offset": 2048, 00:25:12.977 "data_size": 63488 00:25:12.977 }, 00:25:12.977 { 00:25:12.977 "name": "BaseBdev2", 00:25:12.977 "uuid": "dbd6c153-6d34-54f4-97ef-b25f7da02d42", 00:25:12.977 "is_configured": true, 00:25:12.977 "data_offset": 2048, 00:25:12.977 "data_size": 63488 00:25:12.977 } 00:25:12.977 ] 00:25:12.977 }' 00:25:12.977 20:39:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:12.977 20:39:05 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:13.544 20:39:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:13.544 20:39:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:13.544 20:39:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:13.544 20:39:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:13.544 20:39:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:13.544 20:39:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:13.544 20:39:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:13.801 20:39:06 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:13.802 "name": "raid_bdev1", 00:25:13.802 "uuid": "513de924-6959-42a2-bb2d-1b138dd1a5d5", 00:25:13.802 "strip_size_kb": 0, 00:25:13.802 "state": "online", 00:25:13.802 "raid_level": "raid1", 00:25:13.802 "superblock": true, 00:25:13.802 "num_base_bdevs": 2, 00:25:13.802 "num_base_bdevs_discovered": 1, 00:25:13.802 "num_base_bdevs_operational": 1, 00:25:13.802 "base_bdevs_list": [ 00:25:13.802 { 00:25:13.802 "name": null, 00:25:13.802 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:13.802 "is_configured": false, 00:25:13.802 "data_offset": 2048, 00:25:13.802 "data_size": 63488 00:25:13.802 }, 00:25:13.802 { 00:25:13.802 "name": "BaseBdev2", 00:25:13.802 "uuid": "dbd6c153-6d34-54f4-97ef-b25f7da02d42", 00:25:13.802 "is_configured": true, 00:25:13.802 "data_offset": 2048, 00:25:13.802 "data_size": 63488 00:25:13.802 } 00:25:13.802 ] 00:25:13.802 }' 00:25:13.802 20:39:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:13.802 20:39:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:13.802 20:39:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:13.802 20:39:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:13.802 20:39:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:14.060 [2024-07-15 20:39:06.330829] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:14.060 20:39:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:25:14.060 [2024-07-15 20:39:06.391034] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c0ee60 00:25:14.060 [2024-07-15 
20:39:06.392537] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:14.318 [2024-07-15 20:39:06.527763] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:25:14.318 [2024-07-15 20:39:06.528204] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:25:14.576 [2024-07-15 20:39:06.764142] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:25:14.577 [2024-07-15 20:39:06.764410] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:25:14.835 [2024-07-15 20:39:07.121972] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:25:14.835 [2024-07-15 20:39:07.122353] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:25:15.093 [2024-07-15 20:39:07.349565] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:25:15.093 [2024-07-15 20:39:07.349742] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:25:15.093 20:39:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:15.093 20:39:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:15.093 20:39:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:15.093 20:39:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:15.093 20:39:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:15.093 20:39:07 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:15.093 20:39:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:15.351 20:39:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:15.351 "name": "raid_bdev1", 00:25:15.351 "uuid": "513de924-6959-42a2-bb2d-1b138dd1a5d5", 00:25:15.351 "strip_size_kb": 0, 00:25:15.351 "state": "online", 00:25:15.351 "raid_level": "raid1", 00:25:15.351 "superblock": true, 00:25:15.351 "num_base_bdevs": 2, 00:25:15.351 "num_base_bdevs_discovered": 2, 00:25:15.351 "num_base_bdevs_operational": 2, 00:25:15.351 "process": { 00:25:15.351 "type": "rebuild", 00:25:15.351 "target": "spare", 00:25:15.351 "progress": { 00:25:15.351 "blocks": 12288, 00:25:15.351 "percent": 19 00:25:15.351 } 00:25:15.351 }, 00:25:15.351 "base_bdevs_list": [ 00:25:15.351 { 00:25:15.351 "name": "spare", 00:25:15.351 "uuid": "5102b499-f668-5cb1-8f0e-4ea73208ab67", 00:25:15.351 "is_configured": true, 00:25:15.351 "data_offset": 2048, 00:25:15.351 "data_size": 63488 00:25:15.351 }, 00:25:15.351 { 00:25:15.351 "name": "BaseBdev2", 00:25:15.351 "uuid": "dbd6c153-6d34-54f4-97ef-b25f7da02d42", 00:25:15.351 "is_configured": true, 00:25:15.351 "data_offset": 2048, 00:25:15.351 "data_size": 63488 00:25:15.351 } 00:25:15.351 ] 00:25:15.351 }' 00:25:15.351 20:39:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:15.351 [2024-07-15 20:39:07.691515] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:25:15.351 20:39:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:15.351 20:39:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // 
"none"' 00:25:15.610 20:39:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:15.610 20:39:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:25:15.610 20:39:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:25:15.611 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:25:15.611 20:39:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:25:15.611 20:39:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:25:15.611 20:39:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:25:15.611 20:39:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@705 -- # local timeout=885 00:25:15.611 20:39:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:15.611 20:39:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:15.611 20:39:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:15.611 20:39:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:15.611 20:39:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:15.611 20:39:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:15.611 20:39:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:15.611 20:39:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:15.611 [2024-07-15 20:39:07.910991] bdev_raid.c: 839:raid_bdev_submit_rw_request: 
*DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:25:15.869 20:39:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:15.869 "name": "raid_bdev1", 00:25:15.869 "uuid": "513de924-6959-42a2-bb2d-1b138dd1a5d5", 00:25:15.869 "strip_size_kb": 0, 00:25:15.869 "state": "online", 00:25:15.869 "raid_level": "raid1", 00:25:15.869 "superblock": true, 00:25:15.869 "num_base_bdevs": 2, 00:25:15.869 "num_base_bdevs_discovered": 2, 00:25:15.869 "num_base_bdevs_operational": 2, 00:25:15.869 "process": { 00:25:15.869 "type": "rebuild", 00:25:15.869 "target": "spare", 00:25:15.869 "progress": { 00:25:15.869 "blocks": 16384, 00:25:15.869 "percent": 25 00:25:15.869 } 00:25:15.869 }, 00:25:15.869 "base_bdevs_list": [ 00:25:15.869 { 00:25:15.869 "name": "spare", 00:25:15.869 "uuid": "5102b499-f668-5cb1-8f0e-4ea73208ab67", 00:25:15.869 "is_configured": true, 00:25:15.869 "data_offset": 2048, 00:25:15.869 "data_size": 63488 00:25:15.869 }, 00:25:15.869 { 00:25:15.870 "name": "BaseBdev2", 00:25:15.870 "uuid": "dbd6c153-6d34-54f4-97ef-b25f7da02d42", 00:25:15.870 "is_configured": true, 00:25:15.870 "data_offset": 2048, 00:25:15.870 "data_size": 63488 00:25:15.870 } 00:25:15.870 ] 00:25:15.870 }' 00:25:15.870 20:39:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:15.870 20:39:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:15.870 20:39:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:15.870 20:39:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:15.870 20:39:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:25:16.128 [2024-07-15 20:39:08.264728] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:25:16.387 
[2024-07-15 20:39:08.528386] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:25:16.645 [2024-07-15 20:39:08.893470] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 26624 offset_begin: 24576 offset_end: 30720 00:25:16.645 [2024-07-15 20:39:09.012834] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:25:16.904 20:39:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:16.904 20:39:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:16.904 20:39:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:16.904 20:39:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:16.904 20:39:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:16.904 20:39:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:16.904 20:39:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:16.904 20:39:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:16.904 [2024-07-15 20:39:09.277826] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 32768 offset_begin: 30720 offset_end: 36864 00:25:17.162 20:39:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:17.162 "name": "raid_bdev1", 00:25:17.162 "uuid": "513de924-6959-42a2-bb2d-1b138dd1a5d5", 00:25:17.162 "strip_size_kb": 0, 00:25:17.162 "state": "online", 00:25:17.162 "raid_level": "raid1", 00:25:17.162 "superblock": true, 
00:25:17.162 "num_base_bdevs": 2, 00:25:17.162 "num_base_bdevs_discovered": 2, 00:25:17.162 "num_base_bdevs_operational": 2, 00:25:17.162 "process": { 00:25:17.162 "type": "rebuild", 00:25:17.162 "target": "spare", 00:25:17.162 "progress": { 00:25:17.162 "blocks": 32768, 00:25:17.162 "percent": 51 00:25:17.162 } 00:25:17.162 }, 00:25:17.162 "base_bdevs_list": [ 00:25:17.162 { 00:25:17.162 "name": "spare", 00:25:17.162 "uuid": "5102b499-f668-5cb1-8f0e-4ea73208ab67", 00:25:17.162 "is_configured": true, 00:25:17.162 "data_offset": 2048, 00:25:17.162 "data_size": 63488 00:25:17.162 }, 00:25:17.162 { 00:25:17.162 "name": "BaseBdev2", 00:25:17.162 "uuid": "dbd6c153-6d34-54f4-97ef-b25f7da02d42", 00:25:17.162 "is_configured": true, 00:25:17.162 "data_offset": 2048, 00:25:17.162 "data_size": 63488 00:25:17.162 } 00:25:17.162 ] 00:25:17.162 }' 00:25:17.162 20:39:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:17.162 20:39:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:17.162 20:39:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:17.162 20:39:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:17.162 20:39:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:25:17.421 [2024-07-15 20:39:09.642532] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 38912 offset_begin: 36864 offset_end: 43008 00:25:18.358 20:39:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:18.358 20:39:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:18.358 20:39:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:18.358 20:39:10 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:18.358 20:39:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:18.358 20:39:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:18.358 20:39:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:18.358 20:39:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:18.358 20:39:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:18.358 "name": "raid_bdev1", 00:25:18.358 "uuid": "513de924-6959-42a2-bb2d-1b138dd1a5d5", 00:25:18.358 "strip_size_kb": 0, 00:25:18.358 "state": "online", 00:25:18.358 "raid_level": "raid1", 00:25:18.358 "superblock": true, 00:25:18.358 "num_base_bdevs": 2, 00:25:18.358 "num_base_bdevs_discovered": 2, 00:25:18.358 "num_base_bdevs_operational": 2, 00:25:18.358 "process": { 00:25:18.358 "type": "rebuild", 00:25:18.358 "target": "spare", 00:25:18.358 "progress": { 00:25:18.358 "blocks": 55296, 00:25:18.358 "percent": 87 00:25:18.358 } 00:25:18.358 }, 00:25:18.358 "base_bdevs_list": [ 00:25:18.358 { 00:25:18.358 "name": "spare", 00:25:18.358 "uuid": "5102b499-f668-5cb1-8f0e-4ea73208ab67", 00:25:18.358 "is_configured": true, 00:25:18.358 "data_offset": 2048, 00:25:18.358 "data_size": 63488 00:25:18.358 }, 00:25:18.358 { 00:25:18.358 "name": "BaseBdev2", 00:25:18.358 "uuid": "dbd6c153-6d34-54f4-97ef-b25f7da02d42", 00:25:18.358 "is_configured": true, 00:25:18.358 "data_offset": 2048, 00:25:18.358 "data_size": 63488 00:25:18.358 } 00:25:18.358 ] 00:25:18.358 }' 00:25:18.358 20:39:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:18.358 20:39:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == 
\r\e\b\u\i\l\d ]] 00:25:18.358 20:39:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:18.690 [2024-07-15 20:39:10.753133] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 57344 offset_begin: 55296 offset_end: 61440 00:25:18.690 20:39:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:18.690 20:39:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:25:18.690 [2024-07-15 20:39:10.862618] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 59392 offset_begin: 55296 offset_end: 61440 00:25:18.948 [2024-07-15 20:39:11.201912] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:25:18.948 [2024-07-15 20:39:11.310159] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:25:18.948 [2024-07-15 20:39:11.311877] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:19.516 20:39:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:19.516 20:39:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:19.516 20:39:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:19.516 20:39:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:19.516 20:39:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:19.516 20:39:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:19.516 20:39:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:19.516 20:39:11 bdev_raid.raid_rebuild_test_sb_io 
-- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:19.774 20:39:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:19.774 "name": "raid_bdev1", 00:25:19.774 "uuid": "513de924-6959-42a2-bb2d-1b138dd1a5d5", 00:25:19.774 "strip_size_kb": 0, 00:25:19.774 "state": "online", 00:25:19.774 "raid_level": "raid1", 00:25:19.774 "superblock": true, 00:25:19.774 "num_base_bdevs": 2, 00:25:19.774 "num_base_bdevs_discovered": 2, 00:25:19.774 "num_base_bdevs_operational": 2, 00:25:19.774 "base_bdevs_list": [ 00:25:19.774 { 00:25:19.774 "name": "spare", 00:25:19.774 "uuid": "5102b499-f668-5cb1-8f0e-4ea73208ab67", 00:25:19.774 "is_configured": true, 00:25:19.774 "data_offset": 2048, 00:25:19.774 "data_size": 63488 00:25:19.774 }, 00:25:19.774 { 00:25:19.774 "name": "BaseBdev2", 00:25:19.774 "uuid": "dbd6c153-6d34-54f4-97ef-b25f7da02d42", 00:25:19.774 "is_configured": true, 00:25:19.774 "data_offset": 2048, 00:25:19.774 "data_size": 63488 00:25:19.774 } 00:25:19.774 ] 00:25:19.774 }' 00:25:19.774 20:39:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:19.774 20:39:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:25:19.774 20:39:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:19.774 20:39:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:25:19.774 20:39:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@708 -- # break 00:25:19.774 20:39:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:19.774 20:39:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:19.774 20:39:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:19.774 20:39:12 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:19.774 20:39:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:19.774 20:39:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:19.774 20:39:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:20.032 20:39:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:20.032 "name": "raid_bdev1", 00:25:20.032 "uuid": "513de924-6959-42a2-bb2d-1b138dd1a5d5", 00:25:20.032 "strip_size_kb": 0, 00:25:20.032 "state": "online", 00:25:20.032 "raid_level": "raid1", 00:25:20.032 "superblock": true, 00:25:20.032 "num_base_bdevs": 2, 00:25:20.032 "num_base_bdevs_discovered": 2, 00:25:20.032 "num_base_bdevs_operational": 2, 00:25:20.032 "base_bdevs_list": [ 00:25:20.032 { 00:25:20.032 "name": "spare", 00:25:20.032 "uuid": "5102b499-f668-5cb1-8f0e-4ea73208ab67", 00:25:20.032 "is_configured": true, 00:25:20.032 "data_offset": 2048, 00:25:20.032 "data_size": 63488 00:25:20.032 }, 00:25:20.032 { 00:25:20.032 "name": "BaseBdev2", 00:25:20.032 "uuid": "dbd6c153-6d34-54f4-97ef-b25f7da02d42", 00:25:20.032 "is_configured": true, 00:25:20.032 "data_offset": 2048, 00:25:20.032 "data_size": 63488 00:25:20.032 } 00:25:20.032 ] 00:25:20.032 }' 00:25:20.032 20:39:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:20.291 20:39:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:20.291 20:39:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:20.291 20:39:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:20.291 20:39:12 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:20.291 20:39:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:20.291 20:39:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:20.291 20:39:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:20.291 20:39:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:20.291 20:39:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:20.291 20:39:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:20.291 20:39:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:20.291 20:39:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:20.291 20:39:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:20.291 20:39:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:20.291 20:39:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:20.549 20:39:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:20.549 "name": "raid_bdev1", 00:25:20.549 "uuid": "513de924-6959-42a2-bb2d-1b138dd1a5d5", 00:25:20.549 "strip_size_kb": 0, 00:25:20.549 "state": "online", 00:25:20.549 "raid_level": "raid1", 00:25:20.549 "superblock": true, 00:25:20.549 "num_base_bdevs": 2, 00:25:20.549 "num_base_bdevs_discovered": 2, 00:25:20.549 "num_base_bdevs_operational": 2, 00:25:20.549 "base_bdevs_list": [ 00:25:20.549 { 00:25:20.549 "name": "spare", 00:25:20.549 "uuid": 
"5102b499-f668-5cb1-8f0e-4ea73208ab67", 00:25:20.549 "is_configured": true, 00:25:20.549 "data_offset": 2048, 00:25:20.549 "data_size": 63488 00:25:20.549 }, 00:25:20.549 { 00:25:20.549 "name": "BaseBdev2", 00:25:20.549 "uuid": "dbd6c153-6d34-54f4-97ef-b25f7da02d42", 00:25:20.549 "is_configured": true, 00:25:20.549 "data_offset": 2048, 00:25:20.549 "data_size": 63488 00:25:20.549 } 00:25:20.549 ] 00:25:20.549 }' 00:25:20.549 20:39:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:20.549 20:39:12 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:21.114 20:39:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:25:21.371 [2024-07-15 20:39:13.562711] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:21.371 [2024-07-15 20:39:13.562747] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:25:21.371 00:25:21.371 Latency(us) 00:25:21.371 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:21.371 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:25:21.371 raid_bdev1 : 11.96 95.77 287.31 0.00 0.00 14303.20 288.50 119446.48 00:25:21.371 =================================================================================================================== 00:25:21.371 Total : 95.77 287.31 0.00 0.00 14303.20 288.50 119446.48 00:25:21.371 [2024-07-15 20:39:13.626900] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:21.371 [2024-07-15 20:39:13.626939] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:21.371 [2024-07-15 20:39:13.627014] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:21.371 [2024-07-15 
20:39:13.627027] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c0e070 name raid_bdev1, state offline 00:25:21.371 0 00:25:21.371 20:39:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # jq length 00:25:21.371 20:39:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:21.630 20:39:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:25:21.630 20:39:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:25:21.630 20:39:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:25:21.630 20:39:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:25:21.630 20:39:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:21.630 20:39:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:25:21.630 20:39:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:21.630 20:39:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:25:21.630 20:39:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:21.630 20:39:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:25:21.630 20:39:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:21.630 20:39:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:21.630 20:39:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:25:21.889 /dev/nbd0 00:25:21.889 20:39:14 
bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:25:21.889 20:39:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:25:21.889 20:39:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:25:21.889 20:39:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:25:21.889 20:39:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:21.889 20:39:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:21.889 20:39:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:25:21.889 20:39:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:25:21.889 20:39:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:21.889 20:39:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:21.889 20:39:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:21.889 1+0 records in 00:25:21.889 1+0 records out 00:25:21.889 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000290696 s, 14.1 MB/s 00:25:21.889 20:39:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:21.889 20:39:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:25:21.889 20:39:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:21.889 20:39:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:21.889 20:39:14 
bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:25:21.889 20:39:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:21.889 20:39:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:21.889 20:39:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:25:21.889 20:39:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev2 ']' 00:25:21.889 20:39:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev2 /dev/nbd1 00:25:21.889 20:39:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:21.889 20:39:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev2') 00:25:21.889 20:39:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:21.889 20:39:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:25:21.889 20:39:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:21.889 20:39:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:25:21.889 20:39:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:21.889 20:39:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:21.889 20:39:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev2 /dev/nbd1 00:25:22.147 /dev/nbd1 00:25:22.148 20:39:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:25:22.148 20:39:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:25:22.148 20:39:14 bdev_raid.raid_rebuild_test_sb_io -- 
common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:25:22.148 20:39:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:25:22.148 20:39:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:22.148 20:39:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:22.148 20:39:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:25:22.148 20:39:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:25:22.148 20:39:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:22.148 20:39:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:22.148 20:39:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:22.148 1+0 records in 00:25:22.148 1+0 records out 00:25:22.148 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000273692 s, 15.0 MB/s 00:25:22.148 20:39:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:22.148 20:39:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:25:22.148 20:39:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:22.148 20:39:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:22.148 20:39:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:25:22.148 20:39:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:22.148 20:39:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 
1 )) 00:25:22.148 20:39:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:25:22.406 20:39:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:25:22.406 20:39:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:22.406 20:39:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:25:22.406 20:39:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:22.406 20:39:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:25:22.406 20:39:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:22.406 20:39:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:25:22.406 20:39:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:25:22.406 20:39:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:25:22.406 20:39:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:25:22.406 20:39:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:22.406 20:39:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:22.406 20:39:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:25:22.406 20:39:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:25:22.406 20:39:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:25:22.406 20:39:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:25:22.406 20:39:14 
bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:22.406 20:39:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:25:22.406 20:39:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:22.406 20:39:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:25:22.406 20:39:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:22.406 20:39:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:25:22.665 20:39:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:25:22.665 20:39:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:25:22.665 20:39:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:25:22.665 20:39:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:22.665 20:39:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:22.665 20:39:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:25:22.665 20:39:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:25:22.665 20:39:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:25:22.665 20:39:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:25:22.665 20:39:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:25:22.923 20:39:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@745 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:23.181 [2024-07-15 20:39:15.492271] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:23.181 [2024-07-15 20:39:15.492321] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:23.181 [2024-07-15 20:39:15.492342] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a5d490 00:25:23.181 [2024-07-15 20:39:15.492355] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:23.181 [2024-07-15 20:39:15.494006] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:23.181 [2024-07-15 20:39:15.494035] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:23.181 [2024-07-15 20:39:15.494115] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:25:23.181 [2024-07-15 20:39:15.494143] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:23.181 [2024-07-15 20:39:15.494243] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:25:23.181 spare 00:25:23.181 20:39:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:23.181 20:39:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:23.181 20:39:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:23.181 20:39:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:23.181 20:39:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:23.181 20:39:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:23.181 20:39:15 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:23.181 20:39:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:23.181 20:39:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:23.181 20:39:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:23.181 20:39:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:23.181 20:39:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:23.438 [2024-07-15 20:39:15.594561] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1a5cf70 00:25:23.438 [2024-07-15 20:39:15.594580] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:25:23.438 [2024-07-15 20:39:15.594777] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c0a670 00:25:23.438 [2024-07-15 20:39:15.594937] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1a5cf70 00:25:23.438 [2024-07-15 20:39:15.594953] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1a5cf70 00:25:23.438 [2024-07-15 20:39:15.595065] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:23.438 20:39:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:23.438 "name": "raid_bdev1", 00:25:23.438 "uuid": "513de924-6959-42a2-bb2d-1b138dd1a5d5", 00:25:23.438 "strip_size_kb": 0, 00:25:23.438 "state": "online", 00:25:23.438 "raid_level": "raid1", 00:25:23.438 "superblock": true, 00:25:23.438 "num_base_bdevs": 2, 00:25:23.438 "num_base_bdevs_discovered": 2, 00:25:23.438 "num_base_bdevs_operational": 2, 00:25:23.438 "base_bdevs_list": [ 00:25:23.438 { 
00:25:23.438 "name": "spare", 00:25:23.438 "uuid": "5102b499-f668-5cb1-8f0e-4ea73208ab67", 00:25:23.438 "is_configured": true, 00:25:23.438 "data_offset": 2048, 00:25:23.438 "data_size": 63488 00:25:23.438 }, 00:25:23.438 { 00:25:23.438 "name": "BaseBdev2", 00:25:23.438 "uuid": "dbd6c153-6d34-54f4-97ef-b25f7da02d42", 00:25:23.438 "is_configured": true, 00:25:23.438 "data_offset": 2048, 00:25:23.438 "data_size": 63488 00:25:23.438 } 00:25:23.438 ] 00:25:23.438 }' 00:25:23.438 20:39:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:23.438 20:39:15 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:24.002 20:39:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:24.002 20:39:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:24.002 20:39:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:24.002 20:39:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:24.002 20:39:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:24.002 20:39:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:24.002 20:39:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:24.259 20:39:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:24.259 "name": "raid_bdev1", 00:25:24.259 "uuid": "513de924-6959-42a2-bb2d-1b138dd1a5d5", 00:25:24.259 "strip_size_kb": 0, 00:25:24.259 "state": "online", 00:25:24.259 "raid_level": "raid1", 00:25:24.259 "superblock": true, 00:25:24.259 "num_base_bdevs": 2, 00:25:24.259 "num_base_bdevs_discovered": 2, 00:25:24.259 
"num_base_bdevs_operational": 2, 00:25:24.259 "base_bdevs_list": [ 00:25:24.259 { 00:25:24.259 "name": "spare", 00:25:24.259 "uuid": "5102b499-f668-5cb1-8f0e-4ea73208ab67", 00:25:24.259 "is_configured": true, 00:25:24.259 "data_offset": 2048, 00:25:24.259 "data_size": 63488 00:25:24.259 }, 00:25:24.259 { 00:25:24.259 "name": "BaseBdev2", 00:25:24.259 "uuid": "dbd6c153-6d34-54f4-97ef-b25f7da02d42", 00:25:24.259 "is_configured": true, 00:25:24.259 "data_offset": 2048, 00:25:24.259 "data_size": 63488 00:25:24.259 } 00:25:24.259 ] 00:25:24.259 }' 00:25:24.259 20:39:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:24.516 20:39:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:24.516 20:39:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:24.516 20:39:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:24.516 20:39:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:24.516 20:39:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:25:24.773 20:39:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:25:24.773 20:39:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:25:25.031 [2024-07-15 20:39:17.197287] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:25.031 20:39:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:25.031 20:39:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local 
raid_bdev_name=raid_bdev1 00:25:25.031 20:39:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:25.031 20:39:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:25.031 20:39:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:25.031 20:39:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:25.031 20:39:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:25.031 20:39:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:25.031 20:39:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:25.031 20:39:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:25.031 20:39:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:25.031 20:39:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:25.288 20:39:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:25.288 "name": "raid_bdev1", 00:25:25.288 "uuid": "513de924-6959-42a2-bb2d-1b138dd1a5d5", 00:25:25.288 "strip_size_kb": 0, 00:25:25.288 "state": "online", 00:25:25.288 "raid_level": "raid1", 00:25:25.288 "superblock": true, 00:25:25.288 "num_base_bdevs": 2, 00:25:25.288 "num_base_bdevs_discovered": 1, 00:25:25.288 "num_base_bdevs_operational": 1, 00:25:25.288 "base_bdevs_list": [ 00:25:25.288 { 00:25:25.288 "name": null, 00:25:25.288 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:25.288 "is_configured": false, 00:25:25.288 "data_offset": 2048, 00:25:25.288 "data_size": 63488 00:25:25.288 }, 00:25:25.288 { 00:25:25.288 "name": "BaseBdev2", 
00:25:25.288 "uuid": "dbd6c153-6d34-54f4-97ef-b25f7da02d42", 00:25:25.288 "is_configured": true, 00:25:25.288 "data_offset": 2048, 00:25:25.288 "data_size": 63488 00:25:25.288 } 00:25:25.288 ] 00:25:25.288 }' 00:25:25.288 20:39:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:25.288 20:39:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:25.852 20:39:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:26.109 [2024-07-15 20:39:18.296530] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:26.109 [2024-07-15 20:39:18.296689] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:25:26.109 [2024-07-15 20:39:18.296708] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:25:26.109 [2024-07-15 20:39:18.296737] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:26.109 [2024-07-15 20:39:18.302041] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1764fa0 00:25:26.109 [2024-07-15 20:39:18.304382] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:26.109 20:39:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@755 -- # sleep 1 00:25:27.044 20:39:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:27.044 20:39:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:27.044 20:39:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:27.044 20:39:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:27.044 20:39:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:27.044 20:39:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:27.044 20:39:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:27.302 20:39:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:27.302 "name": "raid_bdev1", 00:25:27.302 "uuid": "513de924-6959-42a2-bb2d-1b138dd1a5d5", 00:25:27.302 "strip_size_kb": 0, 00:25:27.302 "state": "online", 00:25:27.302 "raid_level": "raid1", 00:25:27.302 "superblock": true, 00:25:27.302 "num_base_bdevs": 2, 00:25:27.302 "num_base_bdevs_discovered": 2, 00:25:27.302 "num_base_bdevs_operational": 2, 00:25:27.302 "process": { 00:25:27.302 "type": "rebuild", 00:25:27.302 "target": "spare", 00:25:27.302 "progress": { 00:25:27.302 "blocks": 24576, 
00:25:27.302 "percent": 38 00:25:27.302 } 00:25:27.302 }, 00:25:27.302 "base_bdevs_list": [ 00:25:27.302 { 00:25:27.302 "name": "spare", 00:25:27.302 "uuid": "5102b499-f668-5cb1-8f0e-4ea73208ab67", 00:25:27.302 "is_configured": true, 00:25:27.302 "data_offset": 2048, 00:25:27.302 "data_size": 63488 00:25:27.302 }, 00:25:27.302 { 00:25:27.302 "name": "BaseBdev2", 00:25:27.302 "uuid": "dbd6c153-6d34-54f4-97ef-b25f7da02d42", 00:25:27.302 "is_configured": true, 00:25:27.302 "data_offset": 2048, 00:25:27.302 "data_size": 63488 00:25:27.302 } 00:25:27.302 ] 00:25:27.302 }' 00:25:27.302 20:39:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:27.302 20:39:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:27.302 20:39:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:27.302 20:39:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:27.302 20:39:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:25:27.561 [2024-07-15 20:39:19.831151] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:27.561 [2024-07-15 20:39:19.917231] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:27.561 [2024-07-15 20:39:19.917280] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:27.561 [2024-07-15 20:39:19.917296] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:27.561 [2024-07-15 20:39:19.917305] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:25:27.820 20:39:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 
online raid1 0 1 00:25:27.820 20:39:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:27.820 20:39:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:27.820 20:39:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:27.820 20:39:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:27.820 20:39:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:27.820 20:39:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:27.820 20:39:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:27.820 20:39:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:27.820 20:39:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:27.820 20:39:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:27.820 20:39:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:27.820 20:39:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:27.820 "name": "raid_bdev1", 00:25:27.820 "uuid": "513de924-6959-42a2-bb2d-1b138dd1a5d5", 00:25:27.820 "strip_size_kb": 0, 00:25:27.820 "state": "online", 00:25:27.820 "raid_level": "raid1", 00:25:27.820 "superblock": true, 00:25:27.820 "num_base_bdevs": 2, 00:25:27.820 "num_base_bdevs_discovered": 1, 00:25:27.820 "num_base_bdevs_operational": 1, 00:25:27.820 "base_bdevs_list": [ 00:25:27.820 { 00:25:27.820 "name": null, 00:25:27.820 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:27.820 "is_configured": false, 00:25:27.820 
"data_offset": 2048, 00:25:27.820 "data_size": 63488 00:25:27.820 }, 00:25:27.820 { 00:25:27.820 "name": "BaseBdev2", 00:25:27.820 "uuid": "dbd6c153-6d34-54f4-97ef-b25f7da02d42", 00:25:27.820 "is_configured": true, 00:25:27.820 "data_offset": 2048, 00:25:27.820 "data_size": 63488 00:25:27.820 } 00:25:27.820 ] 00:25:27.820 }' 00:25:27.820 20:39:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:27.820 20:39:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:28.386 20:39:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:28.645 [2024-07-15 20:39:20.892915] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:28.645 [2024-07-15 20:39:20.892975] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:28.645 [2024-07-15 20:39:20.892999] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c0f140 00:25:28.645 [2024-07-15 20:39:20.893012] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:28.645 [2024-07-15 20:39:20.893401] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:28.645 [2024-07-15 20:39:20.893420] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:28.645 [2024-07-15 20:39:20.893504] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:25:28.645 [2024-07-15 20:39:20.893516] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:25:28.645 [2024-07-15 20:39:20.893528] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:25:28.645 [2024-07-15 20:39:20.893546] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:28.645 [2024-07-15 20:39:20.898845] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c0cbe0 00:25:28.645 spare 00:25:28.645 [2024-07-15 20:39:20.900311] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:28.645 20:39:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@762 -- # sleep 1 00:25:29.580 20:39:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:29.580 20:39:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:29.580 20:39:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:29.580 20:39:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:29.580 20:39:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:29.580 20:39:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:29.580 20:39:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:29.839 20:39:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:29.839 "name": "raid_bdev1", 00:25:29.839 "uuid": "513de924-6959-42a2-bb2d-1b138dd1a5d5", 00:25:29.839 "strip_size_kb": 0, 00:25:29.839 "state": "online", 00:25:29.839 "raid_level": "raid1", 00:25:29.839 "superblock": true, 00:25:29.839 "num_base_bdevs": 2, 00:25:29.839 "num_base_bdevs_discovered": 2, 00:25:29.839 "num_base_bdevs_operational": 2, 00:25:29.839 "process": { 00:25:29.839 "type": "rebuild", 00:25:29.839 "target": "spare", 00:25:29.839 "progress": { 00:25:29.839 
"blocks": 22528, 00:25:29.839 "percent": 35 00:25:29.839 } 00:25:29.839 }, 00:25:29.839 "base_bdevs_list": [ 00:25:29.839 { 00:25:29.839 "name": "spare", 00:25:29.839 "uuid": "5102b499-f668-5cb1-8f0e-4ea73208ab67", 00:25:29.839 "is_configured": true, 00:25:29.839 "data_offset": 2048, 00:25:29.839 "data_size": 63488 00:25:29.839 }, 00:25:29.839 { 00:25:29.839 "name": "BaseBdev2", 00:25:29.839 "uuid": "dbd6c153-6d34-54f4-97ef-b25f7da02d42", 00:25:29.839 "is_configured": true, 00:25:29.839 "data_offset": 2048, 00:25:29.839 "data_size": 63488 00:25:29.839 } 00:25:29.839 ] 00:25:29.839 }' 00:25:29.839 20:39:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:29.839 20:39:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:29.839 20:39:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:29.839 20:39:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:29.839 20:39:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:25:30.098 [2024-07-15 20:39:22.423409] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:30.357 [2024-07-15 20:39:22.512963] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:30.357 [2024-07-15 20:39:22.513017] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:30.357 [2024-07-15 20:39:22.513032] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:30.357 [2024-07-15 20:39:22.513041] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:25:30.357 20:39:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state 
raid_bdev1 online raid1 0 1 00:25:30.357 20:39:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:30.357 20:39:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:30.357 20:39:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:30.357 20:39:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:30.357 20:39:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:30.357 20:39:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:30.357 20:39:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:30.357 20:39:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:30.357 20:39:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:30.357 20:39:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:30.357 20:39:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:30.616 20:39:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:30.616 "name": "raid_bdev1", 00:25:30.616 "uuid": "513de924-6959-42a2-bb2d-1b138dd1a5d5", 00:25:30.616 "strip_size_kb": 0, 00:25:30.616 "state": "online", 00:25:30.616 "raid_level": "raid1", 00:25:30.616 "superblock": true, 00:25:30.616 "num_base_bdevs": 2, 00:25:30.616 "num_base_bdevs_discovered": 1, 00:25:30.616 "num_base_bdevs_operational": 1, 00:25:30.616 "base_bdevs_list": [ 00:25:30.616 { 00:25:30.616 "name": null, 00:25:30.616 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:30.616 "is_configured": false, 00:25:30.616 
"data_offset": 2048, 00:25:30.616 "data_size": 63488 00:25:30.616 }, 00:25:30.616 { 00:25:30.616 "name": "BaseBdev2", 00:25:30.616 "uuid": "dbd6c153-6d34-54f4-97ef-b25f7da02d42", 00:25:30.616 "is_configured": true, 00:25:30.616 "data_offset": 2048, 00:25:30.616 "data_size": 63488 00:25:30.616 } 00:25:30.616 ] 00:25:30.616 }' 00:25:30.616 20:39:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:30.616 20:39:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:31.183 20:39:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:31.183 20:39:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:31.183 20:39:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:31.183 20:39:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:31.183 20:39:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:31.183 20:39:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:31.183 20:39:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:31.440 20:39:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:31.440 "name": "raid_bdev1", 00:25:31.440 "uuid": "513de924-6959-42a2-bb2d-1b138dd1a5d5", 00:25:31.440 "strip_size_kb": 0, 00:25:31.440 "state": "online", 00:25:31.440 "raid_level": "raid1", 00:25:31.440 "superblock": true, 00:25:31.440 "num_base_bdevs": 2, 00:25:31.440 "num_base_bdevs_discovered": 1, 00:25:31.440 "num_base_bdevs_operational": 1, 00:25:31.440 "base_bdevs_list": [ 00:25:31.440 { 00:25:31.441 "name": null, 00:25:31.441 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:25:31.441 "is_configured": false, 00:25:31.441 "data_offset": 2048, 00:25:31.441 "data_size": 63488 00:25:31.441 }, 00:25:31.441 { 00:25:31.441 "name": "BaseBdev2", 00:25:31.441 "uuid": "dbd6c153-6d34-54f4-97ef-b25f7da02d42", 00:25:31.441 "is_configured": true, 00:25:31.441 "data_offset": 2048, 00:25:31.441 "data_size": 63488 00:25:31.441 } 00:25:31.441 ] 00:25:31.441 }' 00:25:31.441 20:39:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:31.441 20:39:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:31.441 20:39:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:31.441 20:39:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:31.441 20:39:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:25:31.699 20:39:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:25:31.958 [2024-07-15 20:39:24.210401] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:25:31.958 [2024-07-15 20:39:24.210450] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:31.958 [2024-07-15 20:39:24.210471] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c0bda0 00:25:31.958 [2024-07-15 20:39:24.210484] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:31.958 [2024-07-15 20:39:24.210841] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:31.958 [2024-07-15 20:39:24.210858] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:25:31.958 [2024-07-15 20:39:24.210924] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:25:31.958 [2024-07-15 20:39:24.210949] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:25:31.958 [2024-07-15 20:39:24.210960] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:25:31.958 BaseBdev1 00:25:31.958 20:39:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@773 -- # sleep 1 00:25:32.893 20:39:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:32.893 20:39:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:32.893 20:39:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:32.893 20:39:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:32.893 20:39:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:32.893 20:39:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:32.893 20:39:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:32.893 20:39:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:32.893 20:39:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:32.893 20:39:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:32.893 20:39:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:32.893 20:39:25 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:33.151 20:39:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:33.151 "name": "raid_bdev1", 00:25:33.151 "uuid": "513de924-6959-42a2-bb2d-1b138dd1a5d5", 00:25:33.151 "strip_size_kb": 0, 00:25:33.151 "state": "online", 00:25:33.151 "raid_level": "raid1", 00:25:33.151 "superblock": true, 00:25:33.151 "num_base_bdevs": 2, 00:25:33.151 "num_base_bdevs_discovered": 1, 00:25:33.151 "num_base_bdevs_operational": 1, 00:25:33.151 "base_bdevs_list": [ 00:25:33.151 { 00:25:33.151 "name": null, 00:25:33.151 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:33.151 "is_configured": false, 00:25:33.151 "data_offset": 2048, 00:25:33.151 "data_size": 63488 00:25:33.151 }, 00:25:33.151 { 00:25:33.151 "name": "BaseBdev2", 00:25:33.151 "uuid": "dbd6c153-6d34-54f4-97ef-b25f7da02d42", 00:25:33.151 "is_configured": true, 00:25:33.151 "data_offset": 2048, 00:25:33.151 "data_size": 63488 00:25:33.151 } 00:25:33.151 ] 00:25:33.151 }' 00:25:33.151 20:39:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:33.151 20:39:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:34.088 20:39:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:34.088 20:39:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:34.088 20:39:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:34.088 20:39:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:34.088 20:39:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:34.088 20:39:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:34.088 20:39:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:34.088 20:39:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:34.088 "name": "raid_bdev1", 00:25:34.088 "uuid": "513de924-6959-42a2-bb2d-1b138dd1a5d5", 00:25:34.088 "strip_size_kb": 0, 00:25:34.088 "state": "online", 00:25:34.088 "raid_level": "raid1", 00:25:34.088 "superblock": true, 00:25:34.088 "num_base_bdevs": 2, 00:25:34.088 "num_base_bdevs_discovered": 1, 00:25:34.088 "num_base_bdevs_operational": 1, 00:25:34.088 "base_bdevs_list": [ 00:25:34.088 { 00:25:34.088 "name": null, 00:25:34.088 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:34.088 "is_configured": false, 00:25:34.088 "data_offset": 2048, 00:25:34.088 "data_size": 63488 00:25:34.088 }, 00:25:34.088 { 00:25:34.088 "name": "BaseBdev2", 00:25:34.088 "uuid": "dbd6c153-6d34-54f4-97ef-b25f7da02d42", 00:25:34.088 "is_configured": true, 00:25:34.088 "data_offset": 2048, 00:25:34.088 "data_size": 63488 00:25:34.088 } 00:25:34.088 ] 00:25:34.088 }' 00:25:34.088 20:39:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:34.088 20:39:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:34.088 20:39:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:34.346 20:39:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:34.346 20:39:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:25:34.346 20:39:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@648 -- # local 
es=0 00:25:34.346 20:39:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:25:34.346 20:39:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:34.346 20:39:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:25:34.346 20:39:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:34.346 20:39:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:25:34.346 20:39:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:34.346 20:39:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:25:34.346 20:39:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:34.346 20:39:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:25:34.346 20:39:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:25:34.605 [2024-07-15 20:39:26.962098] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:34.605 [2024-07-15 20:39:26.962238] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:25:34.605 
[2024-07-15 20:39:26.962260] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:25:34.605 request: 00:25:34.605 { 00:25:34.605 "base_bdev": "BaseBdev1", 00:25:34.605 "raid_bdev": "raid_bdev1", 00:25:34.605 "method": "bdev_raid_add_base_bdev", 00:25:34.605 "req_id": 1 00:25:34.605 } 00:25:34.605 Got JSON-RPC error response 00:25:34.605 response: 00:25:34.605 { 00:25:34.605 "code": -22, 00:25:34.605 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:25:34.605 } 00:25:34.868 20:39:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # es=1 00:25:34.868 20:39:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:25:34.868 20:39:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:25:34.868 20:39:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:25:34.868 20:39:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@777 -- # sleep 1 00:25:35.802 20:39:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:35.802 20:39:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:35.802 20:39:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:35.802 20:39:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:35.802 20:39:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:35.802 20:39:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:35.802 20:39:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:35.802 20:39:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:35.802 20:39:28 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:35.802 20:39:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:35.802 20:39:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:35.802 20:39:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:36.061 20:39:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:36.061 "name": "raid_bdev1", 00:25:36.061 "uuid": "513de924-6959-42a2-bb2d-1b138dd1a5d5", 00:25:36.061 "strip_size_kb": 0, 00:25:36.061 "state": "online", 00:25:36.061 "raid_level": "raid1", 00:25:36.061 "superblock": true, 00:25:36.061 "num_base_bdevs": 2, 00:25:36.061 "num_base_bdevs_discovered": 1, 00:25:36.061 "num_base_bdevs_operational": 1, 00:25:36.061 "base_bdevs_list": [ 00:25:36.061 { 00:25:36.061 "name": null, 00:25:36.061 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:36.061 "is_configured": false, 00:25:36.061 "data_offset": 2048, 00:25:36.061 "data_size": 63488 00:25:36.061 }, 00:25:36.061 { 00:25:36.061 "name": "BaseBdev2", 00:25:36.061 "uuid": "dbd6c153-6d34-54f4-97ef-b25f7da02d42", 00:25:36.061 "is_configured": true, 00:25:36.061 "data_offset": 2048, 00:25:36.061 "data_size": 63488 00:25:36.061 } 00:25:36.061 ] 00:25:36.061 }' 00:25:36.061 20:39:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:36.061 20:39:28 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:36.694 20:39:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:36.694 20:39:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:36.694 20:39:28 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:36.694 20:39:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:36.694 20:39:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:36.694 20:39:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:36.694 20:39:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:36.953 20:39:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:36.953 "name": "raid_bdev1", 00:25:36.953 "uuid": "513de924-6959-42a2-bb2d-1b138dd1a5d5", 00:25:36.953 "strip_size_kb": 0, 00:25:36.953 "state": "online", 00:25:36.953 "raid_level": "raid1", 00:25:36.953 "superblock": true, 00:25:36.953 "num_base_bdevs": 2, 00:25:36.953 "num_base_bdevs_discovered": 1, 00:25:36.953 "num_base_bdevs_operational": 1, 00:25:36.953 "base_bdevs_list": [ 00:25:36.953 { 00:25:36.953 "name": null, 00:25:36.953 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:36.953 "is_configured": false, 00:25:36.953 "data_offset": 2048, 00:25:36.953 "data_size": 63488 00:25:36.953 }, 00:25:36.953 { 00:25:36.953 "name": "BaseBdev2", 00:25:36.953 "uuid": "dbd6c153-6d34-54f4-97ef-b25f7da02d42", 00:25:36.953 "is_configured": true, 00:25:36.953 "data_offset": 2048, 00:25:36.953 "data_size": 63488 00:25:36.953 } 00:25:36.953 ] 00:25:36.953 }' 00:25:36.953 20:39:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:36.953 20:39:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:36.953 20:39:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:36.953 20:39:29 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:36.953 20:39:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@782 -- # killprocess 1471578 00:25:36.953 20:39:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@948 -- # '[' -z 1471578 ']' 00:25:36.953 20:39:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@952 -- # kill -0 1471578 00:25:36.953 20:39:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # uname 00:25:36.953 20:39:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:25:36.953 20:39:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1471578 00:25:36.953 20:39:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:25:36.953 20:39:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:25:36.953 20:39:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1471578' 00:25:36.953 killing process with pid 1471578 00:25:36.953 20:39:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@967 -- # kill 1471578 00:25:36.953 Received shutdown signal, test time was about 27.609272 seconds 00:25:36.953 00:25:36.953 Latency(us) 00:25:36.953 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:36.953 =================================================================================================================== 00:25:36.953 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:25:36.953 [2024-07-15 20:39:29.314469] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:25:36.953 [2024-07-15 20:39:29.314567] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:36.953 [2024-07-15 20:39:29.314616] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev 
base bdevs is 0, going to free all in destruct 00:25:36.953 20:39:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@972 -- # wait 1471578 00:25:36.953 [2024-07-15 20:39:29.314630] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a5cf70 name raid_bdev1, state offline 00:25:37.212 [2024-07-15 20:39:29.335852] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:25:37.212 20:39:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@784 -- # return 0 00:25:37.212 00:25:37.212 real 0m32.343s 00:25:37.212 user 0m50.524s 00:25:37.212 sys 0m4.712s 00:25:37.212 20:39:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:25:37.212 20:39:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:37.212 ************************************ 00:25:37.212 END TEST raid_rebuild_test_sb_io 00:25:37.212 ************************************ 00:25:37.471 20:39:29 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:25:37.471 20:39:29 bdev_raid -- bdev/bdev_raid.sh@876 -- # for n in 2 4 00:25:37.471 20:39:29 bdev_raid -- bdev/bdev_raid.sh@877 -- # run_test raid_rebuild_test raid_rebuild_test raid1 4 false false true 00:25:37.471 20:39:29 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:25:37.471 20:39:29 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:25:37.471 20:39:29 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:25:37.471 ************************************ 00:25:37.471 START TEST raid_rebuild_test 00:25:37.471 ************************************ 00:25:37.471 20:39:29 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 4 false false true 00:25:37.471 20:39:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:25:37.471 20:39:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:25:37.471 20:39:29 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:25:37.471 20:39:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:25:37.471 20:39:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@572 -- # local verify=true 00:25:37.471 20:39:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:25:37.471 20:39:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:37.471 20:39:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:25:37.471 20:39:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:37.471 20:39:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:37.471 20:39:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:25:37.471 20:39:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:37.471 20:39:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:37.471 20:39:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:25:37.471 20:39:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:37.471 20:39:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:37.471 20:39:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:25:37.471 20:39:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:37.471 20:39:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:37.471 20:39:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:25:37.471 20:39:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:25:37.471 20:39:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 
00:25:37.471 20:39:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # local strip_size 00:25:37.471 20:39:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@576 -- # local create_arg 00:25:37.471 20:39:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:25:37.471 20:39:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@578 -- # local data_offset 00:25:37.471 20:39:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:25:37.471 20:39:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:25:37.471 20:39:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:25:37.471 20:39:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@596 -- # raid_pid=1476630 00:25:37.471 20:39:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@597 -- # waitforlisten 1476630 /var/tmp/spdk-raid.sock 00:25:37.471 20:39:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:25:37.471 20:39:29 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@829 -- # '[' -z 1476630 ']' 00:25:37.471 20:39:29 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:25:37.471 20:39:29 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:25:37.471 20:39:29 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:25:37.471 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:25:37.471 20:39:29 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:25:37.471 20:39:29 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:25:37.471 [2024-07-15 20:39:29.709539] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 00:25:37.471 [2024-07-15 20:39:29.709609] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1476630 ] 00:25:37.471 I/O size of 3145728 is greater than zero copy threshold (65536). 00:25:37.471 Zero copy mechanism will not be used. 00:25:37.731 [2024-07-15 20:39:29.850614] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:37.731 [2024-07-15 20:39:29.987287] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:25:37.731 [2024-07-15 20:39:30.060046] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:37.731 [2024-07-15 20:39:30.060083] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:38.668 20:39:30 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:25:38.668 20:39:30 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@862 -- # return 0 00:25:38.668 20:39:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:38.668 20:39:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:25:38.668 BaseBdev1_malloc 00:25:38.668 20:39:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:25:38.928 [2024-07-15 
20:39:31.219271] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:25:38.928 [2024-07-15 20:39:31.219323] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:38.928 [2024-07-15 20:39:31.219348] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ed8d40 00:25:38.928 [2024-07-15 20:39:31.219361] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:38.928 [2024-07-15 20:39:31.221049] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:38.928 [2024-07-15 20:39:31.221079] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:25:38.928 BaseBdev1 00:25:38.928 20:39:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:38.928 20:39:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:25:39.186 BaseBdev2_malloc 00:25:39.187 20:39:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:25:39.445 [2024-07-15 20:39:31.665484] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:25:39.445 [2024-07-15 20:39:31.665533] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:39.445 [2024-07-15 20:39:31.665557] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ed9860 00:25:39.445 [2024-07-15 20:39:31.665570] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:39.445 [2024-07-15 20:39:31.667096] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:39.445 [2024-07-15 20:39:31.667125] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:25:39.445 BaseBdev2 00:25:39.445 20:39:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:39.445 20:39:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:25:39.703 BaseBdev3_malloc 00:25:39.703 20:39:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:25:39.961 [2024-07-15 20:39:32.091265] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:25:39.961 [2024-07-15 20:39:32.091315] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:39.961 [2024-07-15 20:39:32.091342] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20868f0 00:25:39.961 [2024-07-15 20:39:32.091355] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:39.961 [2024-07-15 20:39:32.092815] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:39.961 [2024-07-15 20:39:32.092845] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:25:39.961 BaseBdev3 00:25:39.961 20:39:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:39.961 20:39:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:25:40.219 BaseBdev4_malloc 00:25:40.219 20:39:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc 
-p BaseBdev4 00:25:40.219 [2024-07-15 20:39:32.593196] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:25:40.219 [2024-07-15 20:39:32.593244] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:40.219 [2024-07-15 20:39:32.593265] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2085ad0 00:25:40.219 [2024-07-15 20:39:32.593277] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:40.219 [2024-07-15 20:39:32.594694] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:40.219 [2024-07-15 20:39:32.594723] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:25:40.476 BaseBdev4 00:25:40.476 20:39:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:25:40.476 spare_malloc 00:25:40.735 20:39:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:25:40.735 spare_delay 00:25:40.994 20:39:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:40.994 [2024-07-15 20:39:33.343828] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:40.994 [2024-07-15 20:39:33.343873] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:40.994 [2024-07-15 20:39:33.343892] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x208a5b0 00:25:40.994 [2024-07-15 20:39:33.343905] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:40.994 
[2024-07-15 20:39:33.345356] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:40.994 [2024-07-15 20:39:33.345382] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:40.994 spare 00:25:40.994 20:39:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:25:41.596 [2024-07-15 20:39:33.849182] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:41.596 [2024-07-15 20:39:33.850540] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:25:41.596 [2024-07-15 20:39:33.850597] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:25:41.596 [2024-07-15 20:39:33.850642] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:25:41.596 [2024-07-15 20:39:33.850728] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x20098a0 00:25:41.596 [2024-07-15 20:39:33.850738] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:25:41.596 [2024-07-15 20:39:33.850975] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2083e10 00:25:41.596 [2024-07-15 20:39:33.851130] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x20098a0 00:25:41.596 [2024-07-15 20:39:33.851147] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x20098a0 00:25:41.596 [2024-07-15 20:39:33.851270] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:41.596 20:39:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:25:41.596 20:39:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local 
raid_bdev_name=raid_bdev1 00:25:41.596 20:39:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:41.596 20:39:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:41.596 20:39:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:41.596 20:39:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:25:41.596 20:39:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:41.596 20:39:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:41.596 20:39:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:41.596 20:39:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:41.596 20:39:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:41.596 20:39:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:41.856 20:39:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:41.856 "name": "raid_bdev1", 00:25:41.856 "uuid": "1a55c57d-4318-41e7-879f-014a86e4a3ee", 00:25:41.856 "strip_size_kb": 0, 00:25:41.856 "state": "online", 00:25:41.856 "raid_level": "raid1", 00:25:41.856 "superblock": false, 00:25:41.856 "num_base_bdevs": 4, 00:25:41.856 "num_base_bdevs_discovered": 4, 00:25:41.856 "num_base_bdevs_operational": 4, 00:25:41.856 "base_bdevs_list": [ 00:25:41.856 { 00:25:41.856 "name": "BaseBdev1", 00:25:41.856 "uuid": "58212623-fd31-5701-ba63-a24b422e9c1d", 00:25:41.856 "is_configured": true, 00:25:41.856 "data_offset": 0, 00:25:41.856 "data_size": 65536 00:25:41.856 }, 00:25:41.856 { 00:25:41.856 "name": "BaseBdev2", 00:25:41.856 "uuid": "40f0eb3c-da39-5fbe-b49d-129c99e5800e", 
00:25:41.856 "is_configured": true, 00:25:41.856 "data_offset": 0, 00:25:41.856 "data_size": 65536 00:25:41.856 }, 00:25:41.856 { 00:25:41.856 "name": "BaseBdev3", 00:25:41.856 "uuid": "7796fc89-b2da-59c0-ab4d-fbd4f6d516c4", 00:25:41.856 "is_configured": true, 00:25:41.856 "data_offset": 0, 00:25:41.856 "data_size": 65536 00:25:41.856 }, 00:25:41.856 { 00:25:41.856 "name": "BaseBdev4", 00:25:41.856 "uuid": "dc71cacf-66af-525e-8244-4a737fe56fb4", 00:25:41.856 "is_configured": true, 00:25:41.856 "data_offset": 0, 00:25:41.856 "data_size": 65536 00:25:41.856 } 00:25:41.856 ] 00:25:41.856 }' 00:25:41.856 20:39:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:41.856 20:39:34 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:25:42.421 20:39:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:25:42.421 20:39:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:25:42.678 [2024-07-15 20:39:34.964417] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:42.678 20:39:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:25:42.678 20:39:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:42.678 20:39:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:25:42.937 20:39:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:25:42.937 20:39:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:25:42.937 20:39:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:25:42.937 20:39:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@624 -- # 
local write_unit_size 00:25:42.937 20:39:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:25:42.937 20:39:35 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:42.937 20:39:35 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:25:42.937 20:39:35 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:42.937 20:39:35 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:25:42.937 20:39:35 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:42.937 20:39:35 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:25:42.937 20:39:35 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:42.937 20:39:35 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:42.937 20:39:35 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:25:43.196 [2024-07-15 20:39:35.469502] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2083e10 00:25:43.196 /dev/nbd0 00:25:43.196 20:39:35 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:25:43.196 20:39:35 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:25:43.196 20:39:35 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:25:43.196 20:39:35 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:25:43.196 20:39:35 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:43.196 20:39:35 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:43.196 20:39:35 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q 
-w nbd0 /proc/partitions 00:25:43.196 20:39:35 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:25:43.196 20:39:35 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:43.196 20:39:35 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:43.196 20:39:35 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:43.196 1+0 records in 00:25:43.196 1+0 records out 00:25:43.196 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000259886 s, 15.8 MB/s 00:25:43.196 20:39:35 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:43.196 20:39:35 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:25:43.196 20:39:35 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:43.196 20:39:35 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:43.196 20:39:35 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:25:43.196 20:39:35 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:43.196 20:39:35 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:43.196 20:39:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:25:43.196 20:39:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:25:43.196 20:39:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=65536 oflag=direct 00:25:53.170 65536+0 records in 00:25:53.170 65536+0 records out 00:25:53.170 33554432 bytes (34 MB, 32 MiB) copied, 8.21885 s, 4.1 MB/s 00:25:53.170 20:39:43 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:25:53.170 20:39:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:53.170 20:39:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:25:53.170 20:39:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:53.170 20:39:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:25:53.170 20:39:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:53.170 20:39:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:25:53.170 [2024-07-15 20:39:44.029947] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:53.170 20:39:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:25:53.170 20:39:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:25:53.170 20:39:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:25:53.170 20:39:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:53.170 20:39:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:53.170 20:39:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:25:53.170 20:39:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:25:53.170 20:39:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:25:53.170 20:39:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:25:53.170 [2024-07-15 20:39:44.278637] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:25:53.170 
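The `dd` run above pushed 65536 records of 512 bytes through `/dev/nbd0` in 8.21885 s, and dd reported 33554432 bytes at 4.1 MB/s. A quick check that the reported rate follows from the byte count (values copied from the log; dd reports decimal megabytes per second):

```python
records, block_size = 65536, 512           # dd: bs=512 count=65536
elapsed_s = 8.21885                        # elapsed time reported by dd
total_bytes = records * block_size         # 33554432 bytes = 32 MiB
rate_mb_s = total_bytes / elapsed_s / 1e6  # decimal MB/s, as dd prints it

print(total_bytes, round(rate_mb_s, 1))    # 33554432 4.1, matching the log
```

The ~4 MB/s figure is dominated by the delay bdev under the spare-free RAID1, not by the malloc base bdevs.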
20:39:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:25:53.170 20:39:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:53.170 20:39:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:53.170 20:39:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:53.170 20:39:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:53.170 20:39:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:53.170 20:39:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:53.170 20:39:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:53.170 20:39:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:53.170 20:39:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:53.170 20:39:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:53.170 20:39:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:53.170 20:39:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:53.171 "name": "raid_bdev1", 00:25:53.171 "uuid": "1a55c57d-4318-41e7-879f-014a86e4a3ee", 00:25:53.171 "strip_size_kb": 0, 00:25:53.171 "state": "online", 00:25:53.171 "raid_level": "raid1", 00:25:53.171 "superblock": false, 00:25:53.171 "num_base_bdevs": 4, 00:25:53.171 "num_base_bdevs_discovered": 3, 00:25:53.171 "num_base_bdevs_operational": 3, 00:25:53.171 "base_bdevs_list": [ 00:25:53.171 { 00:25:53.171 "name": null, 00:25:53.171 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:53.171 "is_configured": 
false, 00:25:53.171 "data_offset": 0, 00:25:53.171 "data_size": 65536 00:25:53.171 }, 00:25:53.171 { 00:25:53.171 "name": "BaseBdev2", 00:25:53.171 "uuid": "40f0eb3c-da39-5fbe-b49d-129c99e5800e", 00:25:53.171 "is_configured": true, 00:25:53.171 "data_offset": 0, 00:25:53.171 "data_size": 65536 00:25:53.171 }, 00:25:53.171 { 00:25:53.171 "name": "BaseBdev3", 00:25:53.171 "uuid": "7796fc89-b2da-59c0-ab4d-fbd4f6d516c4", 00:25:53.171 "is_configured": true, 00:25:53.171 "data_offset": 0, 00:25:53.171 "data_size": 65536 00:25:53.171 }, 00:25:53.171 { 00:25:53.171 "name": "BaseBdev4", 00:25:53.171 "uuid": "dc71cacf-66af-525e-8244-4a737fe56fb4", 00:25:53.171 "is_configured": true, 00:25:53.171 "data_offset": 0, 00:25:53.171 "data_size": 65536 00:25:53.171 } 00:25:53.171 ] 00:25:53.171 }' 00:25:53.171 20:39:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:53.171 20:39:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:25:53.171 20:39:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:53.171 [2024-07-15 20:39:45.357507] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:53.171 [2024-07-15 20:39:45.361662] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x200f6b0 00:25:53.171 [2024-07-15 20:39:45.364042] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:53.171 20:39:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@646 -- # sleep 1 00:25:54.106 20:39:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:54.106 20:39:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:54.106 20:39:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local 
process_type=rebuild 00:25:54.106 20:39:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:54.106 20:39:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:54.106 20:39:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:54.106 20:39:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:54.365 20:39:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:54.365 "name": "raid_bdev1", 00:25:54.365 "uuid": "1a55c57d-4318-41e7-879f-014a86e4a3ee", 00:25:54.365 "strip_size_kb": 0, 00:25:54.365 "state": "online", 00:25:54.365 "raid_level": "raid1", 00:25:54.365 "superblock": false, 00:25:54.365 "num_base_bdevs": 4, 00:25:54.365 "num_base_bdevs_discovered": 4, 00:25:54.365 "num_base_bdevs_operational": 4, 00:25:54.365 "process": { 00:25:54.365 "type": "rebuild", 00:25:54.365 "target": "spare", 00:25:54.365 "progress": { 00:25:54.365 "blocks": 22528, 00:25:54.365 "percent": 34 00:25:54.365 } 00:25:54.365 }, 00:25:54.365 "base_bdevs_list": [ 00:25:54.365 { 00:25:54.365 "name": "spare", 00:25:54.365 "uuid": "deb239ae-f5e7-5f6e-9156-2da77fa1827f", 00:25:54.365 "is_configured": true, 00:25:54.365 "data_offset": 0, 00:25:54.365 "data_size": 65536 00:25:54.365 }, 00:25:54.365 { 00:25:54.365 "name": "BaseBdev2", 00:25:54.365 "uuid": "40f0eb3c-da39-5fbe-b49d-129c99e5800e", 00:25:54.365 "is_configured": true, 00:25:54.365 "data_offset": 0, 00:25:54.365 "data_size": 65536 00:25:54.365 }, 00:25:54.365 { 00:25:54.365 "name": "BaseBdev3", 00:25:54.365 "uuid": "7796fc89-b2da-59c0-ab4d-fbd4f6d516c4", 00:25:54.365 "is_configured": true, 00:25:54.365 "data_offset": 0, 00:25:54.365 "data_size": 65536 00:25:54.365 }, 00:25:54.365 { 00:25:54.365 "name": "BaseBdev4", 00:25:54.365 "uuid": 
"dc71cacf-66af-525e-8244-4a737fe56fb4", 00:25:54.365 "is_configured": true, 00:25:54.365 "data_offset": 0, 00:25:54.365 "data_size": 65536 00:25:54.365 } 00:25:54.365 ] 00:25:54.365 }' 00:25:54.365 20:39:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:54.365 20:39:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:54.365 20:39:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:54.365 20:39:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:54.365 20:39:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:25:54.623 [2024-07-15 20:39:46.911602] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:54.623 [2024-07-15 20:39:46.976828] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:54.623 [2024-07-15 20:39:46.976874] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:54.623 [2024-07-15 20:39:46.976891] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:54.623 [2024-07-15 20:39:46.976901] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:25:54.882 20:39:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:25:54.882 20:39:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:54.882 20:39:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:54.882 20:39:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:54.882 20:39:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # 
local strip_size=0 00:25:54.882 20:39:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:54.882 20:39:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:54.882 20:39:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:54.882 20:39:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:54.882 20:39:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:54.882 20:39:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:54.882 20:39:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:55.140 20:39:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:55.140 "name": "raid_bdev1", 00:25:55.140 "uuid": "1a55c57d-4318-41e7-879f-014a86e4a3ee", 00:25:55.140 "strip_size_kb": 0, 00:25:55.140 "state": "online", 00:25:55.140 "raid_level": "raid1", 00:25:55.140 "superblock": false, 00:25:55.140 "num_base_bdevs": 4, 00:25:55.140 "num_base_bdevs_discovered": 3, 00:25:55.140 "num_base_bdevs_operational": 3, 00:25:55.140 "base_bdevs_list": [ 00:25:55.140 { 00:25:55.140 "name": null, 00:25:55.140 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:55.140 "is_configured": false, 00:25:55.140 "data_offset": 0, 00:25:55.140 "data_size": 65536 00:25:55.140 }, 00:25:55.140 { 00:25:55.140 "name": "BaseBdev2", 00:25:55.140 "uuid": "40f0eb3c-da39-5fbe-b49d-129c99e5800e", 00:25:55.140 "is_configured": true, 00:25:55.140 "data_offset": 0, 00:25:55.140 "data_size": 65536 00:25:55.140 }, 00:25:55.140 { 00:25:55.140 "name": "BaseBdev3", 00:25:55.140 "uuid": "7796fc89-b2da-59c0-ab4d-fbd4f6d516c4", 00:25:55.140 "is_configured": true, 00:25:55.140 "data_offset": 0, 00:25:55.140 "data_size": 65536 
00:25:55.140 }, 00:25:55.140 { 00:25:55.140 "name": "BaseBdev4", 00:25:55.140 "uuid": "dc71cacf-66af-525e-8244-4a737fe56fb4", 00:25:55.140 "is_configured": true, 00:25:55.140 "data_offset": 0, 00:25:55.140 "data_size": 65536 00:25:55.140 } 00:25:55.140 ] 00:25:55.140 }' 00:25:55.140 20:39:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:55.140 20:39:47 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:25:55.706 20:39:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:55.706 20:39:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:55.706 20:39:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:55.706 20:39:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:55.706 20:39:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:55.706 20:39:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:55.706 20:39:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:55.964 20:39:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:55.964 "name": "raid_bdev1", 00:25:55.964 "uuid": "1a55c57d-4318-41e7-879f-014a86e4a3ee", 00:25:55.964 "strip_size_kb": 0, 00:25:55.964 "state": "online", 00:25:55.964 "raid_level": "raid1", 00:25:55.964 "superblock": false, 00:25:55.964 "num_base_bdevs": 4, 00:25:55.964 "num_base_bdevs_discovered": 3, 00:25:55.964 "num_base_bdevs_operational": 3, 00:25:55.964 "base_bdevs_list": [ 00:25:55.964 { 00:25:55.964 "name": null, 00:25:55.964 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:55.964 "is_configured": false, 00:25:55.964 "data_offset": 0, 00:25:55.964 
"data_size": 65536 00:25:55.964 }, 00:25:55.964 { 00:25:55.964 "name": "BaseBdev2", 00:25:55.964 "uuid": "40f0eb3c-da39-5fbe-b49d-129c99e5800e", 00:25:55.964 "is_configured": true, 00:25:55.964 "data_offset": 0, 00:25:55.964 "data_size": 65536 00:25:55.964 }, 00:25:55.964 { 00:25:55.964 "name": "BaseBdev3", 00:25:55.964 "uuid": "7796fc89-b2da-59c0-ab4d-fbd4f6d516c4", 00:25:55.964 "is_configured": true, 00:25:55.964 "data_offset": 0, 00:25:55.964 "data_size": 65536 00:25:55.964 }, 00:25:55.964 { 00:25:55.964 "name": "BaseBdev4", 00:25:55.964 "uuid": "dc71cacf-66af-525e-8244-4a737fe56fb4", 00:25:55.964 "is_configured": true, 00:25:55.964 "data_offset": 0, 00:25:55.964 "data_size": 65536 00:25:55.964 } 00:25:55.964 ] 00:25:55.964 }' 00:25:55.964 20:39:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:55.964 20:39:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:55.964 20:39:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:55.964 20:39:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:55.965 20:39:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:56.223 [2024-07-15 20:39:48.425375] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:56.223 [2024-07-15 20:39:48.430043] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x200f6b0 00:25:56.223 [2024-07-15 20:39:48.431604] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:56.223 20:39:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@662 -- # sleep 1 00:25:57.160 20:39:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:57.160 
20:39:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:57.160 20:39:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:57.160 20:39:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:57.160 20:39:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:57.160 20:39:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:57.160 20:39:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:57.417 20:39:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:57.417 "name": "raid_bdev1", 00:25:57.417 "uuid": "1a55c57d-4318-41e7-879f-014a86e4a3ee", 00:25:57.417 "strip_size_kb": 0, 00:25:57.417 "state": "online", 00:25:57.417 "raid_level": "raid1", 00:25:57.417 "superblock": false, 00:25:57.417 "num_base_bdevs": 4, 00:25:57.417 "num_base_bdevs_discovered": 4, 00:25:57.417 "num_base_bdevs_operational": 4, 00:25:57.417 "process": { 00:25:57.417 "type": "rebuild", 00:25:57.417 "target": "spare", 00:25:57.417 "progress": { 00:25:57.417 "blocks": 24576, 00:25:57.417 "percent": 37 00:25:57.417 } 00:25:57.417 }, 00:25:57.417 "base_bdevs_list": [ 00:25:57.417 { 00:25:57.417 "name": "spare", 00:25:57.418 "uuid": "deb239ae-f5e7-5f6e-9156-2da77fa1827f", 00:25:57.418 "is_configured": true, 00:25:57.418 "data_offset": 0, 00:25:57.418 "data_size": 65536 00:25:57.418 }, 00:25:57.418 { 00:25:57.418 "name": "BaseBdev2", 00:25:57.418 "uuid": "40f0eb3c-da39-5fbe-b49d-129c99e5800e", 00:25:57.418 "is_configured": true, 00:25:57.418 "data_offset": 0, 00:25:57.418 "data_size": 65536 00:25:57.418 }, 00:25:57.418 { 00:25:57.418 "name": "BaseBdev3", 00:25:57.418 "uuid": "7796fc89-b2da-59c0-ab4d-fbd4f6d516c4", 00:25:57.418 
"is_configured": true, 00:25:57.418 "data_offset": 0, 00:25:57.418 "data_size": 65536 00:25:57.418 }, 00:25:57.418 { 00:25:57.418 "name": "BaseBdev4", 00:25:57.418 "uuid": "dc71cacf-66af-525e-8244-4a737fe56fb4", 00:25:57.418 "is_configured": true, 00:25:57.418 "data_offset": 0, 00:25:57.418 "data_size": 65536 00:25:57.418 } 00:25:57.418 ] 00:25:57.418 }' 00:25:57.418 20:39:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:57.418 20:39:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:57.418 20:39:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:57.418 20:39:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:57.418 20:39:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:25:57.418 20:39:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:25:57.418 20:39:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:25:57.418 20:39:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:25:57.418 20:39:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:25:57.675 [2024-07-15 20:39:50.024669] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:25:57.675 [2024-07-15 20:39:50.044668] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x200f6b0 00:25:57.933 20:39:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:25:57.933 20:39:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:25:57.933 20:39:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 
00:25:57.933 20:39:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:57.933 20:39:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:57.933 20:39:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:57.933 20:39:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:57.933 20:39:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:57.933 20:39:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:57.933 20:39:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:57.933 "name": "raid_bdev1", 00:25:57.933 "uuid": "1a55c57d-4318-41e7-879f-014a86e4a3ee", 00:25:57.933 "strip_size_kb": 0, 00:25:57.933 "state": "online", 00:25:57.933 "raid_level": "raid1", 00:25:57.933 "superblock": false, 00:25:57.933 "num_base_bdevs": 4, 00:25:57.933 "num_base_bdevs_discovered": 3, 00:25:57.933 "num_base_bdevs_operational": 3, 00:25:57.933 "process": { 00:25:57.933 "type": "rebuild", 00:25:57.933 "target": "spare", 00:25:57.933 "progress": { 00:25:57.933 "blocks": 34816, 00:25:57.933 "percent": 53 00:25:57.933 } 00:25:57.933 }, 00:25:57.933 "base_bdevs_list": [ 00:25:57.933 { 00:25:57.933 "name": "spare", 00:25:57.933 "uuid": "deb239ae-f5e7-5f6e-9156-2da77fa1827f", 00:25:57.933 "is_configured": true, 00:25:57.933 "data_offset": 0, 00:25:57.933 "data_size": 65536 00:25:57.933 }, 00:25:57.933 { 00:25:57.933 "name": null, 00:25:57.933 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:57.933 "is_configured": false, 00:25:57.933 "data_offset": 0, 00:25:57.933 "data_size": 65536 00:25:57.933 }, 00:25:57.933 { 00:25:57.933 "name": "BaseBdev3", 00:25:57.933 "uuid": "7796fc89-b2da-59c0-ab4d-fbd4f6d516c4", 00:25:57.933 
"is_configured": true, 00:25:57.933 "data_offset": 0, 00:25:57.933 "data_size": 65536 00:25:57.933 }, 00:25:57.933 { 00:25:57.933 "name": "BaseBdev4", 00:25:57.933 "uuid": "dc71cacf-66af-525e-8244-4a737fe56fb4", 00:25:57.933 "is_configured": true, 00:25:57.933 "data_offset": 0, 00:25:57.933 "data_size": 65536 00:25:57.933 } 00:25:57.933 ] 00:25:57.933 }' 00:25:57.933 20:39:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:57.933 20:39:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:57.933 20:39:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:58.192 20:39:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:58.192 20:39:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@705 -- # local timeout=928 00:25:58.192 20:39:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:58.192 20:39:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:58.192 20:39:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:58.192 20:39:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:58.192 20:39:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:58.192 20:39:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:58.192 20:39:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:58.192 20:39:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:58.451 20:39:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:58.451 "name": 
"raid_bdev1", 00:25:58.451 "uuid": "1a55c57d-4318-41e7-879f-014a86e4a3ee", 00:25:58.451 "strip_size_kb": 0, 00:25:58.451 "state": "online", 00:25:58.451 "raid_level": "raid1", 00:25:58.451 "superblock": false, 00:25:58.451 "num_base_bdevs": 4, 00:25:58.451 "num_base_bdevs_discovered": 3, 00:25:58.451 "num_base_bdevs_operational": 3, 00:25:58.451 "process": { 00:25:58.451 "type": "rebuild", 00:25:58.451 "target": "spare", 00:25:58.451 "progress": { 00:25:58.451 "blocks": 43008, 00:25:58.451 "percent": 65 00:25:58.451 } 00:25:58.451 }, 00:25:58.451 "base_bdevs_list": [ 00:25:58.451 { 00:25:58.451 "name": "spare", 00:25:58.451 "uuid": "deb239ae-f5e7-5f6e-9156-2da77fa1827f", 00:25:58.451 "is_configured": true, 00:25:58.451 "data_offset": 0, 00:25:58.451 "data_size": 65536 00:25:58.451 }, 00:25:58.451 { 00:25:58.451 "name": null, 00:25:58.451 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:58.451 "is_configured": false, 00:25:58.451 "data_offset": 0, 00:25:58.451 "data_size": 65536 00:25:58.451 }, 00:25:58.451 { 00:25:58.451 "name": "BaseBdev3", 00:25:58.451 "uuid": "7796fc89-b2da-59c0-ab4d-fbd4f6d516c4", 00:25:58.451 "is_configured": true, 00:25:58.451 "data_offset": 0, 00:25:58.451 "data_size": 65536 00:25:58.451 }, 00:25:58.451 { 00:25:58.451 "name": "BaseBdev4", 00:25:58.451 "uuid": "dc71cacf-66af-525e-8244-4a737fe56fb4", 00:25:58.451 "is_configured": true, 00:25:58.451 "data_offset": 0, 00:25:58.451 "data_size": 65536 00:25:58.451 } 00:25:58.451 ] 00:25:58.451 }' 00:25:58.451 20:39:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:58.451 20:39:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:58.451 20:39:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:58.451 20:39:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:58.451 20:39:50 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@710 -- # sleep 1 00:25:59.386 [2024-07-15 20:39:51.657085] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:25:59.386 [2024-07-15 20:39:51.657148] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:25:59.386 [2024-07-15 20:39:51.657186] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:59.386 20:39:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:59.386 20:39:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:59.386 20:39:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:59.386 20:39:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:59.386 20:39:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:59.386 20:39:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:59.386 20:39:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:59.386 20:39:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:59.644 20:39:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:59.644 "name": "raid_bdev1", 00:25:59.644 "uuid": "1a55c57d-4318-41e7-879f-014a86e4a3ee", 00:25:59.644 "strip_size_kb": 0, 00:25:59.644 "state": "online", 00:25:59.644 "raid_level": "raid1", 00:25:59.644 "superblock": false, 00:25:59.644 "num_base_bdevs": 4, 00:25:59.644 "num_base_bdevs_discovered": 3, 00:25:59.644 "num_base_bdevs_operational": 3, 00:25:59.644 "base_bdevs_list": [ 00:25:59.644 { 00:25:59.644 "name": "spare", 00:25:59.644 "uuid": "deb239ae-f5e7-5f6e-9156-2da77fa1827f", 00:25:59.644 
"is_configured": true, 00:25:59.644 "data_offset": 0, 00:25:59.644 "data_size": 65536 00:25:59.644 }, 00:25:59.644 { 00:25:59.644 "name": null, 00:25:59.644 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:59.644 "is_configured": false, 00:25:59.644 "data_offset": 0, 00:25:59.644 "data_size": 65536 00:25:59.644 }, 00:25:59.644 { 00:25:59.644 "name": "BaseBdev3", 00:25:59.644 "uuid": "7796fc89-b2da-59c0-ab4d-fbd4f6d516c4", 00:25:59.644 "is_configured": true, 00:25:59.644 "data_offset": 0, 00:25:59.644 "data_size": 65536 00:25:59.644 }, 00:25:59.644 { 00:25:59.644 "name": "BaseBdev4", 00:25:59.644 "uuid": "dc71cacf-66af-525e-8244-4a737fe56fb4", 00:25:59.644 "is_configured": true, 00:25:59.644 "data_offset": 0, 00:25:59.644 "data_size": 65536 00:25:59.644 } 00:25:59.644 ] 00:25:59.644 }' 00:25:59.644 20:39:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:59.644 20:39:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:25:59.644 20:39:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:59.902 20:39:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:25:59.902 20:39:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@708 -- # break 00:25:59.902 20:39:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:59.902 20:39:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:59.902 20:39:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:59.902 20:39:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:59.902 20:39:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:59.902 20:39:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 
00:25:59.902 20:39:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:59.902 20:39:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:59.902 "name": "raid_bdev1", 00:25:59.902 "uuid": "1a55c57d-4318-41e7-879f-014a86e4a3ee", 00:25:59.902 "strip_size_kb": 0, 00:25:59.902 "state": "online", 00:25:59.902 "raid_level": "raid1", 00:25:59.902 "superblock": false, 00:25:59.902 "num_base_bdevs": 4, 00:25:59.902 "num_base_bdevs_discovered": 3, 00:25:59.902 "num_base_bdevs_operational": 3, 00:25:59.902 "base_bdevs_list": [ 00:25:59.902 { 00:25:59.902 "name": "spare", 00:25:59.902 "uuid": "deb239ae-f5e7-5f6e-9156-2da77fa1827f", 00:25:59.902 "is_configured": true, 00:25:59.902 "data_offset": 0, 00:25:59.902 "data_size": 65536 00:25:59.902 }, 00:25:59.902 { 00:25:59.902 "name": null, 00:25:59.902 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:59.902 "is_configured": false, 00:25:59.903 "data_offset": 0, 00:25:59.903 "data_size": 65536 00:25:59.903 }, 00:25:59.903 { 00:25:59.903 "name": "BaseBdev3", 00:25:59.903 "uuid": "7796fc89-b2da-59c0-ab4d-fbd4f6d516c4", 00:25:59.903 "is_configured": true, 00:25:59.903 "data_offset": 0, 00:25:59.903 "data_size": 65536 00:25:59.903 }, 00:25:59.903 { 00:25:59.903 "name": "BaseBdev4", 00:25:59.903 "uuid": "dc71cacf-66af-525e-8244-4a737fe56fb4", 00:25:59.903 "is_configured": true, 00:25:59.903 "data_offset": 0, 00:25:59.903 "data_size": 65536 00:25:59.903 } 00:25:59.903 ] 00:25:59.903 }' 00:25:59.903 20:39:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:00.170 20:39:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:00.170 20:39:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:00.170 20:39:52 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:00.170 20:39:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:26:00.170 20:39:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:00.170 20:39:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:00.170 20:39:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:00.170 20:39:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:00.170 20:39:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:00.170 20:39:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:00.170 20:39:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:00.170 20:39:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:00.170 20:39:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:00.170 20:39:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:00.170 20:39:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:00.457 20:39:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:00.457 "name": "raid_bdev1", 00:26:00.457 "uuid": "1a55c57d-4318-41e7-879f-014a86e4a3ee", 00:26:00.457 "strip_size_kb": 0, 00:26:00.457 "state": "online", 00:26:00.457 "raid_level": "raid1", 00:26:00.457 "superblock": false, 00:26:00.457 "num_base_bdevs": 4, 00:26:00.457 "num_base_bdevs_discovered": 3, 00:26:00.457 "num_base_bdevs_operational": 3, 00:26:00.457 "base_bdevs_list": [ 00:26:00.457 { 00:26:00.457 "name": "spare", 00:26:00.457 "uuid": 
"deb239ae-f5e7-5f6e-9156-2da77fa1827f", 00:26:00.457 "is_configured": true, 00:26:00.457 "data_offset": 0, 00:26:00.457 "data_size": 65536 00:26:00.457 }, 00:26:00.457 { 00:26:00.457 "name": null, 00:26:00.457 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:00.457 "is_configured": false, 00:26:00.457 "data_offset": 0, 00:26:00.457 "data_size": 65536 00:26:00.457 }, 00:26:00.457 { 00:26:00.457 "name": "BaseBdev3", 00:26:00.457 "uuid": "7796fc89-b2da-59c0-ab4d-fbd4f6d516c4", 00:26:00.457 "is_configured": true, 00:26:00.457 "data_offset": 0, 00:26:00.457 "data_size": 65536 00:26:00.457 }, 00:26:00.457 { 00:26:00.457 "name": "BaseBdev4", 00:26:00.457 "uuid": "dc71cacf-66af-525e-8244-4a737fe56fb4", 00:26:00.457 "is_configured": true, 00:26:00.457 "data_offset": 0, 00:26:00.457 "data_size": 65536 00:26:00.457 } 00:26:00.457 ] 00:26:00.457 }' 00:26:00.457 20:39:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:00.457 20:39:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:26:01.041 20:39:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:26:01.298 [2024-07-15 20:39:53.449482] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:26:01.298 [2024-07-15 20:39:53.449512] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:26:01.298 [2024-07-15 20:39:53.449573] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:01.298 [2024-07-15 20:39:53.449650] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:01.298 [2024-07-15 20:39:53.449663] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x20098a0 name raid_bdev1, state offline 00:26:01.298 20:39:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:01.298 20:39:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # jq length 00:26:01.556 20:39:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:26:01.556 20:39:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:26:01.556 20:39:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:26:01.556 20:39:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:26:01.556 20:39:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:01.556 20:39:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:26:01.556 20:39:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:26:01.556 20:39:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:26:01.556 20:39:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:26:01.556 20:39:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:26:01.556 20:39:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:26:01.556 20:39:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:26:01.556 20:39:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:26:01.813 /dev/nbd0 00:26:01.813 20:39:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:26:01.813 20:39:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:26:01.813 20:39:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 
00:26:01.813 20:39:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:26:01.813 20:39:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:26:01.813 20:39:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:26:01.813 20:39:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:26:01.813 20:39:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:26:01.813 20:39:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:26:01.813 20:39:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:26:01.813 20:39:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:01.813 1+0 records in 00:26:01.813 1+0 records out 00:26:01.813 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000270073 s, 15.2 MB/s 00:26:01.813 20:39:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:01.813 20:39:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:26:01.813 20:39:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:01.813 20:39:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:26:01.813 20:39:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:26:01.813 20:39:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:01.813 20:39:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:26:01.813 20:39:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:26:02.071 /dev/nbd1 00:26:02.071 20:39:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:26:02.071 20:39:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:26:02.071 20:39:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:26:02.071 20:39:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:26:02.071 20:39:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:26:02.071 20:39:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:26:02.071 20:39:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:26:02.071 20:39:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:26:02.071 20:39:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:26:02.071 20:39:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:26:02.071 20:39:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:02.071 1+0 records in 00:26:02.071 1+0 records out 00:26:02.071 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000287661 s, 14.2 MB/s 00:26:02.071 20:39:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:02.071 20:39:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:26:02.071 20:39:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:02.071 20:39:54 bdev_raid.raid_rebuild_test -- 
common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:26:02.071 20:39:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:26:02.071 20:39:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:02.071 20:39:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:26:02.071 20:39:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@737 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:26:02.071 20:39:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:26:02.071 20:39:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:02.071 20:39:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:26:02.071 20:39:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:02.071 20:39:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:26:02.071 20:39:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:02.071 20:39:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:26:02.329 20:39:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:26:02.329 20:39:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:26:02.329 20:39:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:26:02.329 20:39:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:02.329 20:39:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:02.329 20:39:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:26:02.329 20:39:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 
00:26:02.329 20:39:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:26:02.329 20:39:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:02.329 20:39:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:26:02.587 20:39:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:26:02.587 20:39:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:26:02.587 20:39:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:26:02.587 20:39:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:02.587 20:39:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:02.587 20:39:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:26:02.587 20:39:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:26:02.587 20:39:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:26:02.587 20:39:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:26:02.587 20:39:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@782 -- # killprocess 1476630 00:26:02.587 20:39:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@948 -- # '[' -z 1476630 ']' 00:26:02.587 20:39:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@952 -- # kill -0 1476630 00:26:02.587 20:39:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # uname 00:26:02.587 20:39:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:26:02.587 20:39:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1476630 00:26:02.845 20:39:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # 
process_name=reactor_0 00:26:02.846 20:39:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:26:02.846 20:39:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1476630' 00:26:02.846 killing process with pid 1476630 00:26:02.846 20:39:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@967 -- # kill 1476630 00:26:02.846 Received shutdown signal, test time was about 60.000000 seconds 00:26:02.846 00:26:02.846 Latency(us) 00:26:02.846 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:02.846 =================================================================================================================== 00:26:02.846 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:26:02.846 [2024-07-15 20:39:54.975317] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:26:02.846 20:39:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@972 -- # wait 1476630 00:26:02.846 [2024-07-15 20:39:55.024853] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:26:03.105 20:39:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@784 -- # return 0 00:26:03.105 00:26:03.105 real 0m25.614s 00:26:03.105 user 0m33.681s 00:26:03.105 sys 0m6.054s 00:26:03.105 20:39:55 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:26:03.105 20:39:55 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:26:03.105 ************************************ 00:26:03.105 END TEST raid_rebuild_test 00:26:03.105 ************************************ 00:26:03.105 20:39:55 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:26:03.105 20:39:55 bdev_raid -- bdev/bdev_raid.sh@878 -- # run_test raid_rebuild_test_sb raid_rebuild_test raid1 4 true false true 00:26:03.105 20:39:55 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:26:03.105 20:39:55 bdev_raid -- 
common/autotest_common.sh@1105 -- # xtrace_disable 00:26:03.105 20:39:55 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:26:03.105 ************************************ 00:26:03.105 START TEST raid_rebuild_test_sb 00:26:03.105 ************************************ 00:26:03.105 20:39:55 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 4 true false true 00:26:03.105 20:39:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:26:03.105 20:39:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:26:03.105 20:39:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:26:03.105 20:39:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:26:03.105 20:39:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@572 -- # local verify=true 00:26:03.105 20:39:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:26:03.105 20:39:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:03.105 20:39:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:26:03.105 20:39:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:26:03.105 20:39:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:03.105 20:39:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:26:03.105 20:39:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:26:03.105 20:39:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:03.105 20:39:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:26:03.105 20:39:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:26:03.105 20:39:55 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:03.105 20:39:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:26:03.105 20:39:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:26:03.105 20:39:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:03.105 20:39:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:26:03.105 20:39:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:26:03.105 20:39:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:26:03.105 20:39:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # local strip_size 00:26:03.105 20:39:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@576 -- # local create_arg 00:26:03.105 20:39:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:26:03.105 20:39:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@578 -- # local data_offset 00:26:03.105 20:39:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:26:03.105 20:39:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:26:03.105 20:39:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:26:03.105 20:39:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:26:03.105 20:39:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@596 -- # raid_pid=1480190 00:26:03.105 20:39:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@597 -- # waitforlisten 1480190 /var/tmp/spdk-raid.sock 00:26:03.105 20:39:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L 
bdev_raid 00:26:03.105 20:39:55 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@829 -- # '[' -z 1480190 ']' 00:26:03.105 20:39:55 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:26:03.105 20:39:55 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:26:03.105 20:39:55 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:26:03.105 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:26:03.105 20:39:55 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:26:03.105 20:39:55 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:26:03.105 [2024-07-15 20:39:55.417631] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 00:26:03.105 [2024-07-15 20:39:55.417704] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1480190 ] 00:26:03.105 I/O size of 3145728 is greater than zero copy threshold (65536). 00:26:03.105 Zero copy mechanism will not be used. 
00:26:03.364 [2024-07-15 20:39:55.549051] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:03.364 [2024-07-15 20:39:55.656886] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:03.364 [2024-07-15 20:39:55.718125] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:03.364 [2024-07-15 20:39:55.718163] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:04.320 20:39:56 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:26:04.320 20:39:56 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@862 -- # return 0 00:26:04.320 20:39:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:26:04.320 20:39:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:26:04.320 BaseBdev1_malloc 00:26:04.320 20:39:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:26:04.579 [2024-07-15 20:39:56.834987] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:26:04.579 [2024-07-15 20:39:56.835039] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:04.579 [2024-07-15 20:39:56.835061] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18d4d40 00:26:04.579 [2024-07-15 20:39:56.835074] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:04.579 [2024-07-15 20:39:56.836767] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:04.579 [2024-07-15 20:39:56.836797] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:26:04.579 BaseBdev1 
00:26:04.579 20:39:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:26:04.579 20:39:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:26:04.838 BaseBdev2_malloc 00:26:04.838 20:39:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:26:05.096 [2024-07-15 20:39:57.337176] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:26:05.096 [2024-07-15 20:39:57.337222] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:05.096 [2024-07-15 20:39:57.337244] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18d5860 00:26:05.096 [2024-07-15 20:39:57.337257] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:05.096 [2024-07-15 20:39:57.338629] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:05.096 [2024-07-15 20:39:57.338656] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:26:05.096 BaseBdev2 00:26:05.096 20:39:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:26:05.096 20:39:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:26:05.356 BaseBdev3_malloc 00:26:05.356 20:39:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:26:05.615 [2024-07-15 20:39:57.839138] vbdev_passthru.c: 
607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:26:05.615 [2024-07-15 20:39:57.839182] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:05.615 [2024-07-15 20:39:57.839201] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a828f0 00:26:05.615 [2024-07-15 20:39:57.839214] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:05.615 [2024-07-15 20:39:57.840582] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:05.615 [2024-07-15 20:39:57.840609] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:26:05.615 BaseBdev3 00:26:05.615 20:39:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:26:05.615 20:39:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:26:05.874 BaseBdev4_malloc 00:26:05.874 20:39:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:26:06.133 [2024-07-15 20:39:58.340975] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:26:06.133 [2024-07-15 20:39:58.341020] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:06.133 [2024-07-15 20:39:58.341040] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a81ad0 00:26:06.133 [2024-07-15 20:39:58.341052] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:06.133 [2024-07-15 20:39:58.342464] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:06.133 [2024-07-15 20:39:58.342491] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created 
pt_bdev for: BaseBdev4 00:26:06.133 BaseBdev4 00:26:06.133 20:39:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:26:06.393 spare_malloc 00:26:06.393 20:39:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:26:06.651 spare_delay 00:26:06.651 20:39:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:26:06.910 [2024-07-15 20:39:59.087628] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:26:06.910 [2024-07-15 20:39:59.087669] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:06.910 [2024-07-15 20:39:59.087688] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a865b0 00:26:06.910 [2024-07-15 20:39:59.087701] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:06.910 [2024-07-15 20:39:59.089187] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:06.910 [2024-07-15 20:39:59.089216] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:26:06.910 spare 00:26:06.910 20:39:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:26:07.168 [2024-07-15 20:39:59.336322] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:26:07.168 [2024-07-15 20:39:59.337503] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:26:07.168 [2024-07-15 20:39:59.337557] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:26:07.168 [2024-07-15 20:39:59.337603] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:26:07.168 [2024-07-15 20:39:59.337802] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1a058a0 00:26:07.168 [2024-07-15 20:39:59.337814] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:26:07.168 [2024-07-15 20:39:59.338014] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a7fe10 00:26:07.168 [2024-07-15 20:39:59.338161] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1a058a0 00:26:07.168 [2024-07-15 20:39:59.338171] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1a058a0 00:26:07.168 [2024-07-15 20:39:59.338262] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:07.168 20:39:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:26:07.168 20:39:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:07.168 20:39:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:07.168 20:39:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:07.168 20:39:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:07.168 20:39:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:26:07.168 20:39:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:07.168 20:39:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:07.168 
20:39:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:07.168 20:39:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:07.169 20:39:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:07.169 20:39:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:07.427 20:39:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:07.427 "name": "raid_bdev1", 00:26:07.427 "uuid": "62627587-d038-443c-a350-58f4ebaa4cdb", 00:26:07.427 "strip_size_kb": 0, 00:26:07.427 "state": "online", 00:26:07.427 "raid_level": "raid1", 00:26:07.427 "superblock": true, 00:26:07.427 "num_base_bdevs": 4, 00:26:07.427 "num_base_bdevs_discovered": 4, 00:26:07.427 "num_base_bdevs_operational": 4, 00:26:07.427 "base_bdevs_list": [ 00:26:07.427 { 00:26:07.427 "name": "BaseBdev1", 00:26:07.427 "uuid": "cf7634b9-eac5-54f6-a841-d3e142a789e5", 00:26:07.427 "is_configured": true, 00:26:07.427 "data_offset": 2048, 00:26:07.427 "data_size": 63488 00:26:07.427 }, 00:26:07.427 { 00:26:07.427 "name": "BaseBdev2", 00:26:07.427 "uuid": "276f8e66-c32c-51c4-a0a0-2bf336c18758", 00:26:07.427 "is_configured": true, 00:26:07.427 "data_offset": 2048, 00:26:07.427 "data_size": 63488 00:26:07.427 }, 00:26:07.427 { 00:26:07.427 "name": "BaseBdev3", 00:26:07.427 "uuid": "49ee1eab-63f6-5820-848a-6c054f2eafdb", 00:26:07.427 "is_configured": true, 00:26:07.427 "data_offset": 2048, 00:26:07.427 "data_size": 63488 00:26:07.427 }, 00:26:07.427 { 00:26:07.427 "name": "BaseBdev4", 00:26:07.427 "uuid": "07437687-d447-5c6a-98b6-41792065394e", 00:26:07.427 "is_configured": true, 00:26:07.427 "data_offset": 2048, 00:26:07.427 "data_size": 63488 00:26:07.427 } 00:26:07.427 ] 00:26:07.427 }' 00:26:07.427 20:39:59 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:07.427 20:39:59 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:26:07.995 20:40:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:26:07.995 20:40:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:26:08.254 [2024-07-15 20:40:00.439637] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:08.254 20:40:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:26:08.254 20:40:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:08.254 20:40:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:26:08.514 20:40:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:26:08.514 20:40:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:26:08.514 20:40:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:26:08.514 20:40:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:26:08.514 20:40:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:26:08.514 20:40:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:08.514 20:40:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:26:08.514 20:40:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:26:08.514 20:40:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # 
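The `raid_bdev_size=63488` and `data_offset=2048` values read back here are consistent with the base bdevs created earlier (`bdev_malloc_create 32 512`, i.e. 32 MiB in 512-byte blocks) once a superblock region is reserved. A minimal arithmetic sketch, assuming RAID1 capacity equals one base bdev minus the superblock offset (an assumption inferred from these log values, not stated in the log itself):

```python
# Assumed geometry from the log: 32 MiB malloc base bdevs, 512-byte blocks.
base_blocks = 32 * 1024 * 1024 // 512   # 65536 blocks per base bdev
data_offset = 2048                      # blocks reserved when -s (superblock) is used

# RAID1 exposes the capacity of a single base bdev, minus the reserved region.
raid_bdev_size = base_blocks - data_offset
print(raid_bdev_size)  # 63488, matching raid_bdev_size in the log
```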
nbd_list=('/dev/nbd0') 00:26:08.514 20:40:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:26:08.514 20:40:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:26:08.514 20:40:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:26:08.514 20:40:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:08.514 20:40:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:26:08.773 [2024-07-15 20:40:00.940703] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a7fe10 00:26:08.773 /dev/nbd0 00:26:08.773 20:40:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:26:08.773 20:40:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:26:08.773 20:40:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:26:08.773 20:40:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:26:08.773 20:40:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:26:08.773 20:40:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:26:08.773 20:40:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:26:08.773 20:40:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:26:08.773 20:40:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:26:08.773 20:40:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:26:08.773 20:40:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 
iflag=direct 00:26:08.774 1+0 records in 00:26:08.774 1+0 records out 00:26:08.774 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000197684 s, 20.7 MB/s 00:26:08.774 20:40:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:08.774 20:40:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:26:08.774 20:40:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:08.774 20:40:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:26:08.774 20:40:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:26:08.774 20:40:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:08.774 20:40:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:08.774 20:40:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:26:08.774 20:40:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:26:08.774 20:40:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=63488 oflag=direct 00:26:16.888 63488+0 records in 00:26:16.888 63488+0 records out 00:26:16.888 32505856 bytes (33 MB, 31 MiB) copied, 7.93547 s, 4.1 MB/s 00:26:16.888 20:40:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:26:16.888 20:40:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:16.888 20:40:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:26:16.888 20:40:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:16.888 20:40:08 bdev_raid.raid_rebuild_test_sb -- 
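The `dd` fill above copies exactly one RAID-device's worth of data: 63488 records of 512 bytes. The byte total and the MiB figure that `dd` reports follow directly:

```python
# Values taken from the dd output in the log.
blocks = 63488      # count=63488
block_size = 512    # bs=512

total_bytes = blocks * block_size
print(total_bytes)                 # 32505856 bytes, as dd reports
print(total_bytes / (1024 ** 2))   # 31.0 (dd prints this as "31 MiB")
```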
bdev/nbd_common.sh@51 -- # local i 00:26:16.888 20:40:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:16.888 20:40:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:26:16.888 [2024-07-15 20:40:09.209233] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:16.888 20:40:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:26:16.888 20:40:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:26:16.888 20:40:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:26:16.888 20:40:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:16.888 20:40:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:16.888 20:40:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:26:16.888 20:40:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:26:16.888 20:40:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:26:16.888 20:40:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:26:17.146 [2024-07-15 20:40:09.449917] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:26:17.146 20:40:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:26:17.146 20:40:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:17.146 20:40:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:17.146 20:40:09 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:17.146 20:40:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:17.146 20:40:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:17.146 20:40:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:17.146 20:40:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:17.146 20:40:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:17.146 20:40:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:17.146 20:40:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:17.146 20:40:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:17.404 20:40:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:17.404 "name": "raid_bdev1", 00:26:17.404 "uuid": "62627587-d038-443c-a350-58f4ebaa4cdb", 00:26:17.404 "strip_size_kb": 0, 00:26:17.404 "state": "online", 00:26:17.404 "raid_level": "raid1", 00:26:17.404 "superblock": true, 00:26:17.404 "num_base_bdevs": 4, 00:26:17.404 "num_base_bdevs_discovered": 3, 00:26:17.404 "num_base_bdevs_operational": 3, 00:26:17.404 "base_bdevs_list": [ 00:26:17.404 { 00:26:17.404 "name": null, 00:26:17.404 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:17.404 "is_configured": false, 00:26:17.404 "data_offset": 2048, 00:26:17.404 "data_size": 63488 00:26:17.404 }, 00:26:17.404 { 00:26:17.404 "name": "BaseBdev2", 00:26:17.404 "uuid": "276f8e66-c32c-51c4-a0a0-2bf336c18758", 00:26:17.404 "is_configured": true, 00:26:17.404 "data_offset": 2048, 00:26:17.404 "data_size": 63488 00:26:17.404 }, 00:26:17.404 { 00:26:17.404 "name": "BaseBdev3", 
00:26:17.404 "uuid": "49ee1eab-63f6-5820-848a-6c054f2eafdb", 00:26:17.404 "is_configured": true, 00:26:17.404 "data_offset": 2048, 00:26:17.404 "data_size": 63488 00:26:17.404 }, 00:26:17.404 { 00:26:17.404 "name": "BaseBdev4", 00:26:17.404 "uuid": "07437687-d447-5c6a-98b6-41792065394e", 00:26:17.404 "is_configured": true, 00:26:17.404 "data_offset": 2048, 00:26:17.404 "data_size": 63488 00:26:17.404 } 00:26:17.404 ] 00:26:17.404 }' 00:26:17.404 20:40:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:17.404 20:40:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:26:17.971 20:40:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:26:18.230 [2024-07-15 20:40:10.561000] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:18.230 [2024-07-15 20:40:10.565124] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a7fe10 00:26:18.230 [2024-07-15 20:40:10.567505] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:18.230 20:40:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@646 -- # sleep 1 00:26:19.607 20:40:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:19.607 20:40:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:19.607 20:40:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:19.607 20:40:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:19.607 20:40:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:19.607 20:40:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:19.607 20:40:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:19.607 20:40:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:19.607 "name": "raid_bdev1", 00:26:19.607 "uuid": "62627587-d038-443c-a350-58f4ebaa4cdb", 00:26:19.607 "strip_size_kb": 0, 00:26:19.607 "state": "online", 00:26:19.607 "raid_level": "raid1", 00:26:19.607 "superblock": true, 00:26:19.607 "num_base_bdevs": 4, 00:26:19.607 "num_base_bdevs_discovered": 4, 00:26:19.607 "num_base_bdevs_operational": 4, 00:26:19.607 "process": { 00:26:19.607 "type": "rebuild", 00:26:19.607 "target": "spare", 00:26:19.607 "progress": { 00:26:19.607 "blocks": 24576, 00:26:19.607 "percent": 38 00:26:19.607 } 00:26:19.607 }, 00:26:19.607 "base_bdevs_list": [ 00:26:19.607 { 00:26:19.607 "name": "spare", 00:26:19.607 "uuid": "d586d9af-8f57-5741-86de-9b22cdc3a1bd", 00:26:19.607 "is_configured": true, 00:26:19.607 "data_offset": 2048, 00:26:19.607 "data_size": 63488 00:26:19.607 }, 00:26:19.607 { 00:26:19.607 "name": "BaseBdev2", 00:26:19.607 "uuid": "276f8e66-c32c-51c4-a0a0-2bf336c18758", 00:26:19.607 "is_configured": true, 00:26:19.607 "data_offset": 2048, 00:26:19.607 "data_size": 63488 00:26:19.607 }, 00:26:19.607 { 00:26:19.607 "name": "BaseBdev3", 00:26:19.607 "uuid": "49ee1eab-63f6-5820-848a-6c054f2eafdb", 00:26:19.607 "is_configured": true, 00:26:19.607 "data_offset": 2048, 00:26:19.607 "data_size": 63488 00:26:19.607 }, 00:26:19.607 { 00:26:19.607 "name": "BaseBdev4", 00:26:19.607 "uuid": "07437687-d447-5c6a-98b6-41792065394e", 00:26:19.607 "is_configured": true, 00:26:19.607 "data_offset": 2048, 00:26:19.607 "data_size": 63488 00:26:19.607 } 00:26:19.607 ] 00:26:19.607 }' 00:26:19.607 20:40:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 
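The rebuild progress reported by the RPC (`"blocks": 24576, "percent": 38`) is consistent with integer truncation of blocks-done over total RAID blocks; the truncation behavior is an assumption inferred from these values, not documented in the log:

```python
# Progress values from the bdev_raid_get_bdevs output above.
progress_blocks = 24576
raid_size_blocks = 63488   # raid_bdev_size established earlier in the test

# 24576 / 63488 = 38.7%; truncating to an integer reproduces the reported value.
percent = progress_blocks * 100 // raid_size_blocks
print(percent)  # 38
```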
00:26:19.607 20:40:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:19.607 20:40:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:19.607 20:40:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:19.607 20:40:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:26:19.866 [2024-07-15 20:40:12.149896] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:19.866 [2024-07-15 20:40:12.180156] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:26:19.866 [2024-07-15 20:40:12.180202] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:19.866 [2024-07-15 20:40:12.180219] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:19.866 [2024-07-15 20:40:12.180228] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:26:19.866 20:40:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:26:19.866 20:40:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:19.866 20:40:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:19.866 20:40:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:19.866 20:40:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:19.866 20:40:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:19.866 20:40:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:19.866 20:40:12 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:19.866 20:40:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:19.866 20:40:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:19.866 20:40:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:19.866 20:40:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:20.432 20:40:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:20.432 "name": "raid_bdev1", 00:26:20.432 "uuid": "62627587-d038-443c-a350-58f4ebaa4cdb", 00:26:20.432 "strip_size_kb": 0, 00:26:20.432 "state": "online", 00:26:20.432 "raid_level": "raid1", 00:26:20.433 "superblock": true, 00:26:20.433 "num_base_bdevs": 4, 00:26:20.433 "num_base_bdevs_discovered": 3, 00:26:20.433 "num_base_bdevs_operational": 3, 00:26:20.433 "base_bdevs_list": [ 00:26:20.433 { 00:26:20.433 "name": null, 00:26:20.433 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:20.433 "is_configured": false, 00:26:20.433 "data_offset": 2048, 00:26:20.433 "data_size": 63488 00:26:20.433 }, 00:26:20.433 { 00:26:20.433 "name": "BaseBdev2", 00:26:20.433 "uuid": "276f8e66-c32c-51c4-a0a0-2bf336c18758", 00:26:20.433 "is_configured": true, 00:26:20.433 "data_offset": 2048, 00:26:20.433 "data_size": 63488 00:26:20.433 }, 00:26:20.433 { 00:26:20.433 "name": "BaseBdev3", 00:26:20.433 "uuid": "49ee1eab-63f6-5820-848a-6c054f2eafdb", 00:26:20.433 "is_configured": true, 00:26:20.433 "data_offset": 2048, 00:26:20.433 "data_size": 63488 00:26:20.433 }, 00:26:20.433 { 00:26:20.433 "name": "BaseBdev4", 00:26:20.433 "uuid": "07437687-d447-5c6a-98b6-41792065394e", 00:26:20.433 "is_configured": true, 00:26:20.433 "data_offset": 2048, 00:26:20.433 "data_size": 63488 
00:26:20.433 } 00:26:20.433 ] 00:26:20.433 }' 00:26:20.433 20:40:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:20.433 20:40:12 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:26:20.998 20:40:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:20.998 20:40:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:20.998 20:40:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:20.998 20:40:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:20.998 20:40:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:20.998 20:40:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:20.998 20:40:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:21.255 20:40:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:21.255 "name": "raid_bdev1", 00:26:21.255 "uuid": "62627587-d038-443c-a350-58f4ebaa4cdb", 00:26:21.255 "strip_size_kb": 0, 00:26:21.255 "state": "online", 00:26:21.255 "raid_level": "raid1", 00:26:21.255 "superblock": true, 00:26:21.255 "num_base_bdevs": 4, 00:26:21.255 "num_base_bdevs_discovered": 3, 00:26:21.255 "num_base_bdevs_operational": 3, 00:26:21.255 "base_bdevs_list": [ 00:26:21.255 { 00:26:21.255 "name": null, 00:26:21.255 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:21.255 "is_configured": false, 00:26:21.255 "data_offset": 2048, 00:26:21.255 "data_size": 63488 00:26:21.255 }, 00:26:21.255 { 00:26:21.255 "name": "BaseBdev2", 00:26:21.255 "uuid": "276f8e66-c32c-51c4-a0a0-2bf336c18758", 00:26:21.255 "is_configured": true, 00:26:21.255 
"data_offset": 2048, 00:26:21.255 "data_size": 63488 00:26:21.255 }, 00:26:21.255 { 00:26:21.255 "name": "BaseBdev3", 00:26:21.255 "uuid": "49ee1eab-63f6-5820-848a-6c054f2eafdb", 00:26:21.255 "is_configured": true, 00:26:21.255 "data_offset": 2048, 00:26:21.255 "data_size": 63488 00:26:21.255 }, 00:26:21.255 { 00:26:21.255 "name": "BaseBdev4", 00:26:21.255 "uuid": "07437687-d447-5c6a-98b6-41792065394e", 00:26:21.255 "is_configured": true, 00:26:21.255 "data_offset": 2048, 00:26:21.255 "data_size": 63488 00:26:21.255 } 00:26:21.255 ] 00:26:21.255 }' 00:26:21.255 20:40:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:21.255 20:40:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:21.255 20:40:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:21.255 20:40:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:21.255 20:40:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:26:21.513 [2024-07-15 20:40:13.712280] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:21.513 [2024-07-15 20:40:13.716366] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a05e90 00:26:21.513 [2024-07-15 20:40:13.717873] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:21.513 20:40:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@662 -- # sleep 1 00:26:22.447 20:40:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:22.447 20:40:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:22.447 20:40:14 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:22.447 20:40:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:22.447 20:40:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:22.447 20:40:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:22.447 20:40:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:22.785 20:40:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:22.785 "name": "raid_bdev1", 00:26:22.785 "uuid": "62627587-d038-443c-a350-58f4ebaa4cdb", 00:26:22.785 "strip_size_kb": 0, 00:26:22.785 "state": "online", 00:26:22.785 "raid_level": "raid1", 00:26:22.785 "superblock": true, 00:26:22.785 "num_base_bdevs": 4, 00:26:22.785 "num_base_bdevs_discovered": 4, 00:26:22.785 "num_base_bdevs_operational": 4, 00:26:22.785 "process": { 00:26:22.785 "type": "rebuild", 00:26:22.785 "target": "spare", 00:26:22.785 "progress": { 00:26:22.785 "blocks": 24576, 00:26:22.785 "percent": 38 00:26:22.785 } 00:26:22.785 }, 00:26:22.785 "base_bdevs_list": [ 00:26:22.785 { 00:26:22.786 "name": "spare", 00:26:22.786 "uuid": "d586d9af-8f57-5741-86de-9b22cdc3a1bd", 00:26:22.786 "is_configured": true, 00:26:22.786 "data_offset": 2048, 00:26:22.786 "data_size": 63488 00:26:22.786 }, 00:26:22.786 { 00:26:22.786 "name": "BaseBdev2", 00:26:22.786 "uuid": "276f8e66-c32c-51c4-a0a0-2bf336c18758", 00:26:22.786 "is_configured": true, 00:26:22.786 "data_offset": 2048, 00:26:22.786 "data_size": 63488 00:26:22.786 }, 00:26:22.786 { 00:26:22.786 "name": "BaseBdev3", 00:26:22.786 "uuid": "49ee1eab-63f6-5820-848a-6c054f2eafdb", 00:26:22.786 "is_configured": true, 00:26:22.786 "data_offset": 2048, 00:26:22.786 "data_size": 63488 00:26:22.786 }, 00:26:22.786 { 00:26:22.786 "name": 
"BaseBdev4", 00:26:22.786 "uuid": "07437687-d447-5c6a-98b6-41792065394e", 00:26:22.786 "is_configured": true, 00:26:22.786 "data_offset": 2048, 00:26:22.786 "data_size": 63488 00:26:22.786 } 00:26:22.786 ] 00:26:22.786 }' 00:26:22.786 20:40:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:22.786 20:40:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:22.786 20:40:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:22.786 20:40:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:22.786 20:40:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:26:22.786 20:40:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:26:22.786 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:26:22.786 20:40:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:26:22.786 20:40:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:26:22.786 20:40:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:26:22.786 20:40:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:26:23.052 [2024-07-15 20:40:15.305467] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:26:23.052 [2024-07-15 20:40:15.430843] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x1a05e90 00:26:23.311 20:40:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:26:23.311 20:40:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 
00:26:23.311 20:40:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:23.311 20:40:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:23.311 20:40:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:23.311 20:40:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:23.311 20:40:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:23.311 20:40:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:23.311 20:40:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:23.570 20:40:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:23.570 "name": "raid_bdev1", 00:26:23.570 "uuid": "62627587-d038-443c-a350-58f4ebaa4cdb", 00:26:23.570 "strip_size_kb": 0, 00:26:23.570 "state": "online", 00:26:23.570 "raid_level": "raid1", 00:26:23.570 "superblock": true, 00:26:23.570 "num_base_bdevs": 4, 00:26:23.570 "num_base_bdevs_discovered": 3, 00:26:23.570 "num_base_bdevs_operational": 3, 00:26:23.570 "process": { 00:26:23.570 "type": "rebuild", 00:26:23.570 "target": "spare", 00:26:23.570 "progress": { 00:26:23.570 "blocks": 36864, 00:26:23.570 "percent": 58 00:26:23.570 } 00:26:23.570 }, 00:26:23.570 "base_bdevs_list": [ 00:26:23.570 { 00:26:23.570 "name": "spare", 00:26:23.570 "uuid": "d586d9af-8f57-5741-86de-9b22cdc3a1bd", 00:26:23.570 "is_configured": true, 00:26:23.570 "data_offset": 2048, 00:26:23.570 "data_size": 63488 00:26:23.570 }, 00:26:23.570 { 00:26:23.570 "name": null, 00:26:23.570 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:23.570 "is_configured": false, 00:26:23.570 "data_offset": 2048, 00:26:23.570 
"data_size": 63488 00:26:23.570 }, 00:26:23.570 { 00:26:23.570 "name": "BaseBdev3", 00:26:23.570 "uuid": "49ee1eab-63f6-5820-848a-6c054f2eafdb", 00:26:23.570 "is_configured": true, 00:26:23.570 "data_offset": 2048, 00:26:23.570 "data_size": 63488 00:26:23.570 }, 00:26:23.570 { 00:26:23.570 "name": "BaseBdev4", 00:26:23.570 "uuid": "07437687-d447-5c6a-98b6-41792065394e", 00:26:23.570 "is_configured": true, 00:26:23.570 "data_offset": 2048, 00:26:23.570 "data_size": 63488 00:26:23.570 } 00:26:23.570 ] 00:26:23.570 }' 00:26:23.570 20:40:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:23.570 20:40:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:23.570 20:40:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:23.570 20:40:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:23.570 20:40:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@705 -- # local timeout=953 00:26:23.570 20:40:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:26:23.570 20:40:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:23.570 20:40:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:23.570 20:40:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:23.570 20:40:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:23.570 20:40:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:23.570 20:40:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:23.570 20:40:15 bdev_raid.raid_rebuild_test_sb 
-- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:23.829 20:40:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:23.829 "name": "raid_bdev1", 00:26:23.829 "uuid": "62627587-d038-443c-a350-58f4ebaa4cdb", 00:26:23.829 "strip_size_kb": 0, 00:26:23.829 "state": "online", 00:26:23.829 "raid_level": "raid1", 00:26:23.829 "superblock": true, 00:26:23.829 "num_base_bdevs": 4, 00:26:23.829 "num_base_bdevs_discovered": 3, 00:26:23.830 "num_base_bdevs_operational": 3, 00:26:23.830 "process": { 00:26:23.830 "type": "rebuild", 00:26:23.830 "target": "spare", 00:26:23.830 "progress": { 00:26:23.830 "blocks": 45056, 00:26:23.830 "percent": 70 00:26:23.830 } 00:26:23.830 }, 00:26:23.830 "base_bdevs_list": [ 00:26:23.830 { 00:26:23.830 "name": "spare", 00:26:23.830 "uuid": "d586d9af-8f57-5741-86de-9b22cdc3a1bd", 00:26:23.830 "is_configured": true, 00:26:23.830 "data_offset": 2048, 00:26:23.830 "data_size": 63488 00:26:23.830 }, 00:26:23.830 { 00:26:23.830 "name": null, 00:26:23.830 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:23.830 "is_configured": false, 00:26:23.830 "data_offset": 2048, 00:26:23.830 "data_size": 63488 00:26:23.830 }, 00:26:23.830 { 00:26:23.830 "name": "BaseBdev3", 00:26:23.830 "uuid": "49ee1eab-63f6-5820-848a-6c054f2eafdb", 00:26:23.830 "is_configured": true, 00:26:23.830 "data_offset": 2048, 00:26:23.830 "data_size": 63488 00:26:23.830 }, 00:26:23.830 { 00:26:23.830 "name": "BaseBdev4", 00:26:23.830 "uuid": "07437687-d447-5c6a-98b6-41792065394e", 00:26:23.830 "is_configured": true, 00:26:23.830 "data_offset": 2048, 00:26:23.830 "data_size": 63488 00:26:23.830 } 00:26:23.830 ] 00:26:23.830 }' 00:26:23.830 20:40:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:23.830 20:40:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:23.830 20:40:16 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:23.830 20:40:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:23.830 20:40:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@710 -- # sleep 1 00:26:24.766 [2024-07-15 20:40:16.942635] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:26:24.766 [2024-07-15 20:40:16.942701] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:26:24.766 [2024-07-15 20:40:16.942806] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:25.025 20:40:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:26:25.025 20:40:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:25.025 20:40:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:25.025 20:40:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:25.025 20:40:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:25.025 20:40:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:25.025 20:40:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:25.025 20:40:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:25.284 20:40:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:25.284 "name": "raid_bdev1", 00:26:25.284 "uuid": "62627587-d038-443c-a350-58f4ebaa4cdb", 00:26:25.284 "strip_size_kb": 0, 00:26:25.284 "state": "online", 00:26:25.284 "raid_level": "raid1", 00:26:25.284 "superblock": true, 00:26:25.284 "num_base_bdevs": 
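The `verify_raid_bdev_process` checks traced above lean on jq's alternative operator: `.process.type // "none"` yields the field while a rebuild process is present, and the literal `none` once the rebuild finishes and the `process` object disappears from the `bdev_raid_get_bdevs` output. A small standalone illustration (hypothetical inline JSON, not taken from this run):

```shell
# While a rebuild is running, the bdev info carries a process object.
running='{"name":"raid_bdev1","process":{"type":"rebuild","target":"spare"}}'
echo "$running" | jq -r '.process.type // "none"'     # prints: rebuild

# After completion the process key is gone; // supplies the default.
finished='{"name":"raid_bdev1"}'
echo "$finished" | jq -r '.process.type // "none"'    # prints: none
```

Because `//` substitutes for both `null` and missing paths, the test script can compare the jq output against `rebuild`/`spare` or `none` with a plain `[[ ... == ... ]]` and never has to special-case an absent `process` object.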
4, 00:26:25.284 "num_base_bdevs_discovered": 3, 00:26:25.284 "num_base_bdevs_operational": 3, 00:26:25.284 "base_bdevs_list": [ 00:26:25.284 { 00:26:25.284 "name": "spare", 00:26:25.284 "uuid": "d586d9af-8f57-5741-86de-9b22cdc3a1bd", 00:26:25.284 "is_configured": true, 00:26:25.284 "data_offset": 2048, 00:26:25.284 "data_size": 63488 00:26:25.284 }, 00:26:25.284 { 00:26:25.284 "name": null, 00:26:25.284 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:25.284 "is_configured": false, 00:26:25.284 "data_offset": 2048, 00:26:25.284 "data_size": 63488 00:26:25.284 }, 00:26:25.284 { 00:26:25.284 "name": "BaseBdev3", 00:26:25.284 "uuid": "49ee1eab-63f6-5820-848a-6c054f2eafdb", 00:26:25.284 "is_configured": true, 00:26:25.284 "data_offset": 2048, 00:26:25.284 "data_size": 63488 00:26:25.284 }, 00:26:25.284 { 00:26:25.284 "name": "BaseBdev4", 00:26:25.284 "uuid": "07437687-d447-5c6a-98b6-41792065394e", 00:26:25.284 "is_configured": true, 00:26:25.284 "data_offset": 2048, 00:26:25.284 "data_size": 63488 00:26:25.284 } 00:26:25.284 ] 00:26:25.284 }' 00:26:25.284 20:40:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:25.285 20:40:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:26:25.285 20:40:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:25.285 20:40:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:26:25.285 20:40:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@708 -- # break 00:26:25.285 20:40:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:25.285 20:40:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:25.285 20:40:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:25.285 20:40:17 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:25.285 20:40:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:25.285 20:40:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:25.285 20:40:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:25.544 20:40:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:25.544 "name": "raid_bdev1", 00:26:25.544 "uuid": "62627587-d038-443c-a350-58f4ebaa4cdb", 00:26:25.544 "strip_size_kb": 0, 00:26:25.544 "state": "online", 00:26:25.544 "raid_level": "raid1", 00:26:25.544 "superblock": true, 00:26:25.544 "num_base_bdevs": 4, 00:26:25.544 "num_base_bdevs_discovered": 3, 00:26:25.544 "num_base_bdevs_operational": 3, 00:26:25.544 "base_bdevs_list": [ 00:26:25.544 { 00:26:25.544 "name": "spare", 00:26:25.544 "uuid": "d586d9af-8f57-5741-86de-9b22cdc3a1bd", 00:26:25.544 "is_configured": true, 00:26:25.544 "data_offset": 2048, 00:26:25.544 "data_size": 63488 00:26:25.544 }, 00:26:25.544 { 00:26:25.544 "name": null, 00:26:25.544 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:25.544 "is_configured": false, 00:26:25.544 "data_offset": 2048, 00:26:25.544 "data_size": 63488 00:26:25.544 }, 00:26:25.544 { 00:26:25.544 "name": "BaseBdev3", 00:26:25.544 "uuid": "49ee1eab-63f6-5820-848a-6c054f2eafdb", 00:26:25.544 "is_configured": true, 00:26:25.544 "data_offset": 2048, 00:26:25.545 "data_size": 63488 00:26:25.545 }, 00:26:25.545 { 00:26:25.545 "name": "BaseBdev4", 00:26:25.545 "uuid": "07437687-d447-5c6a-98b6-41792065394e", 00:26:25.545 "is_configured": true, 00:26:25.545 "data_offset": 2048, 00:26:25.545 "data_size": 63488 00:26:25.545 } 00:26:25.545 ] 00:26:25.545 }' 00:26:25.545 20:40:17 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:25.545 20:40:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:25.545 20:40:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:25.545 20:40:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:25.545 20:40:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:26:25.545 20:40:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:25.545 20:40:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:25.545 20:40:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:25.545 20:40:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:25.545 20:40:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:25.545 20:40:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:25.545 20:40:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:25.545 20:40:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:25.545 20:40:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:25.545 20:40:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:25.545 20:40:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:25.804 20:40:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:25.804 "name": "raid_bdev1", 00:26:25.804 "uuid": 
"62627587-d038-443c-a350-58f4ebaa4cdb", 00:26:25.804 "strip_size_kb": 0, 00:26:25.804 "state": "online", 00:26:25.804 "raid_level": "raid1", 00:26:25.804 "superblock": true, 00:26:25.804 "num_base_bdevs": 4, 00:26:25.804 "num_base_bdevs_discovered": 3, 00:26:25.804 "num_base_bdevs_operational": 3, 00:26:25.804 "base_bdevs_list": [ 00:26:25.804 { 00:26:25.804 "name": "spare", 00:26:25.804 "uuid": "d586d9af-8f57-5741-86de-9b22cdc3a1bd", 00:26:25.804 "is_configured": true, 00:26:25.804 "data_offset": 2048, 00:26:25.804 "data_size": 63488 00:26:25.804 }, 00:26:25.804 { 00:26:25.804 "name": null, 00:26:25.804 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:25.804 "is_configured": false, 00:26:25.804 "data_offset": 2048, 00:26:25.804 "data_size": 63488 00:26:25.804 }, 00:26:25.804 { 00:26:25.804 "name": "BaseBdev3", 00:26:25.804 "uuid": "49ee1eab-63f6-5820-848a-6c054f2eafdb", 00:26:25.804 "is_configured": true, 00:26:25.804 "data_offset": 2048, 00:26:25.804 "data_size": 63488 00:26:25.804 }, 00:26:25.804 { 00:26:25.804 "name": "BaseBdev4", 00:26:25.804 "uuid": "07437687-d447-5c6a-98b6-41792065394e", 00:26:25.804 "is_configured": true, 00:26:25.804 "data_offset": 2048, 00:26:25.804 "data_size": 63488 00:26:25.804 } 00:26:25.804 ] 00:26:25.804 }' 00:26:25.804 20:40:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:25.804 20:40:18 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:26:26.740 20:40:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:26:26.740 [2024-07-15 20:40:19.012680] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:26:26.740 [2024-07-15 20:40:19.012709] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:26:26.740 [2024-07-15 20:40:19.012769] bdev_raid.c: 474:_raid_bdev_destruct: 
*DEBUG*: raid_bdev_destruct 00:26:26.740 [2024-07-15 20:40:19.012846] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:26.740 [2024-07-15 20:40:19.012858] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a058a0 name raid_bdev1, state offline 00:26:26.740 20:40:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:26.740 20:40:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # jq length 00:26:26.998 20:40:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:26:26.998 20:40:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:26:26.998 20:40:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:26:26.998 20:40:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:26:26.998 20:40:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:26.998 20:40:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:26:26.998 20:40:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:26:26.998 20:40:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:26:26.998 20:40:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:26:26.998 20:40:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:26:26.998 20:40:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:26:26.998 20:40:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:26:26.998 20:40:19 bdev_raid.raid_rebuild_test_sb -- 
bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:26:27.256 /dev/nbd0 00:26:27.256 20:40:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:26:27.256 20:40:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:26:27.256 20:40:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:26:27.256 20:40:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:26:27.256 20:40:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:26:27.256 20:40:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:26:27.256 20:40:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:26:27.256 20:40:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:26:27.256 20:40:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:26:27.256 20:40:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:26:27.256 20:40:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:27.256 1+0 records in 00:26:27.256 1+0 records out 00:26:27.257 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000260017 s, 15.8 MB/s 00:26:27.257 20:40:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:27.257 20:40:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:26:27.257 20:40:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 
00:26:27.257 20:40:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:26:27.257 20:40:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:26:27.257 20:40:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:27.257 20:40:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:26:27.257 20:40:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:26:27.515 /dev/nbd1 00:26:27.515 20:40:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:26:27.515 20:40:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:26:27.515 20:40:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:26:27.515 20:40:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:26:27.515 20:40:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:26:27.515 20:40:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:26:27.515 20:40:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:26:27.515 20:40:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:26:27.515 20:40:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:26:27.515 20:40:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:26:27.515 20:40:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:27.515 1+0 records in 00:26:27.515 1+0 records out 00:26:27.515 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000357744 
s, 11.4 MB/s 00:26:27.515 20:40:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:27.515 20:40:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:26:27.515 20:40:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:27.515 20:40:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:26:27.515 20:40:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:26:27.515 20:40:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:27.515 20:40:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:26:27.515 20:40:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:26:27.773 20:40:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:26:27.773 20:40:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:27.774 20:40:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:26:27.774 20:40:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:27.774 20:40:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:26:27.774 20:40:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:27.774 20:40:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:26:28.032 20:40:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:26:28.032 20:40:20 bdev_raid.raid_rebuild_test_sb 
-- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:26:28.032 20:40:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:26:28.032 20:40:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:28.032 20:40:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:28.032 20:40:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:26:28.032 20:40:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:26:28.032 20:40:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:26:28.032 20:40:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:28.032 20:40:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:26:28.290 20:40:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:26:28.290 20:40:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:26:28.290 20:40:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:26:28.290 20:40:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:28.290 20:40:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:28.290 20:40:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:26:28.290 20:40:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:26:28.290 20:40:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:26:28.290 20:40:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:26:28.290 20:40:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:26:28.549 20:40:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:26:28.808 [2024-07-15 20:40:20.980197] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:26:28.808 [2024-07-15 20:40:20.980245] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:28.808 [2024-07-15 20:40:20.980267] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a7fb40 00:26:28.808 [2024-07-15 20:40:20.980280] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:28.808 [2024-07-15 20:40:20.981940] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:28.808 [2024-07-15 20:40:20.981967] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:26:28.808 [2024-07-15 20:40:20.982048] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:26:28.808 [2024-07-15 20:40:20.982077] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:28.808 [2024-07-15 20:40:20.982182] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:26:28.808 [2024-07-15 20:40:20.982258] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:26:28.808 spare 00:26:28.808 20:40:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:26:28.808 20:40:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:28.808 20:40:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:28.808 20:40:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:28.808 20:40:21 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:28.808 20:40:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:28.808 20:40:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:28.808 20:40:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:28.808 20:40:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:28.808 20:40:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:28.808 20:40:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:28.808 20:40:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:28.808 [2024-07-15 20:40:21.082576] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1a09ec0 00:26:28.809 [2024-07-15 20:40:21.082594] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:26:28.809 [2024-07-15 20:40:21.082804] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a7f430 00:26:28.809 [2024-07-15 20:40:21.082967] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1a09ec0 00:26:28.809 [2024-07-15 20:40:21.082978] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1a09ec0 00:26:28.809 [2024-07-15 20:40:21.083085] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:29.068 20:40:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:29.068 "name": "raid_bdev1", 00:26:29.068 "uuid": "62627587-d038-443c-a350-58f4ebaa4cdb", 00:26:29.068 "strip_size_kb": 0, 00:26:29.068 "state": "online", 00:26:29.068 "raid_level": "raid1", 
00:26:29.068 "superblock": true, 00:26:29.068 "num_base_bdevs": 4, 00:26:29.068 "num_base_bdevs_discovered": 3, 00:26:29.068 "num_base_bdevs_operational": 3, 00:26:29.068 "base_bdevs_list": [ 00:26:29.068 { 00:26:29.068 "name": "spare", 00:26:29.068 "uuid": "d586d9af-8f57-5741-86de-9b22cdc3a1bd", 00:26:29.068 "is_configured": true, 00:26:29.068 "data_offset": 2048, 00:26:29.068 "data_size": 63488 00:26:29.068 }, 00:26:29.068 { 00:26:29.068 "name": null, 00:26:29.068 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:29.068 "is_configured": false, 00:26:29.068 "data_offset": 2048, 00:26:29.068 "data_size": 63488 00:26:29.068 }, 00:26:29.068 { 00:26:29.068 "name": "BaseBdev3", 00:26:29.068 "uuid": "49ee1eab-63f6-5820-848a-6c054f2eafdb", 00:26:29.068 "is_configured": true, 00:26:29.068 "data_offset": 2048, 00:26:29.068 "data_size": 63488 00:26:29.068 }, 00:26:29.068 { 00:26:29.068 "name": "BaseBdev4", 00:26:29.068 "uuid": "07437687-d447-5c6a-98b6-41792065394e", 00:26:29.068 "is_configured": true, 00:26:29.068 "data_offset": 2048, 00:26:29.068 "data_size": 63488 00:26:29.068 } 00:26:29.068 ] 00:26:29.068 }' 00:26:29.068 20:40:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:29.068 20:40:21 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:26:29.635 20:40:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:29.635 20:40:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:29.635 20:40:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:29.635 20:40:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:29.635 20:40:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:29.635 20:40:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:29.635 20:40:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:29.893 20:40:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:29.893 "name": "raid_bdev1", 00:26:29.893 "uuid": "62627587-d038-443c-a350-58f4ebaa4cdb", 00:26:29.893 "strip_size_kb": 0, 00:26:29.893 "state": "online", 00:26:29.893 "raid_level": "raid1", 00:26:29.893 "superblock": true, 00:26:29.893 "num_base_bdevs": 4, 00:26:29.893 "num_base_bdevs_discovered": 3, 00:26:29.893 "num_base_bdevs_operational": 3, 00:26:29.893 "base_bdevs_list": [ 00:26:29.893 { 00:26:29.893 "name": "spare", 00:26:29.893 "uuid": "d586d9af-8f57-5741-86de-9b22cdc3a1bd", 00:26:29.893 "is_configured": true, 00:26:29.893 "data_offset": 2048, 00:26:29.893 "data_size": 63488 00:26:29.893 }, 00:26:29.893 { 00:26:29.893 "name": null, 00:26:29.893 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:29.893 "is_configured": false, 00:26:29.893 "data_offset": 2048, 00:26:29.893 "data_size": 63488 00:26:29.893 }, 00:26:29.893 { 00:26:29.893 "name": "BaseBdev3", 00:26:29.893 "uuid": "49ee1eab-63f6-5820-848a-6c054f2eafdb", 00:26:29.893 "is_configured": true, 00:26:29.893 "data_offset": 2048, 00:26:29.893 "data_size": 63488 00:26:29.893 }, 00:26:29.893 { 00:26:29.893 "name": "BaseBdev4", 00:26:29.893 "uuid": "07437687-d447-5c6a-98b6-41792065394e", 00:26:29.893 "is_configured": true, 00:26:29.893 "data_offset": 2048, 00:26:29.893 "data_size": 63488 00:26:29.893 } 00:26:29.893 ] 00:26:29.893 }' 00:26:29.893 20:40:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:29.893 20:40:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:29.893 20:40:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // 
"none"' 00:26:29.893 20:40:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:29.893 20:40:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:29.893 20:40:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:26:30.152 20:40:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:26:30.152 20:40:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:26:30.412 [2024-07-15 20:40:22.640724] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:30.412 20:40:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:30.412 20:40:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:30.412 20:40:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:30.412 20:40:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:30.412 20:40:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:30.412 20:40:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:30.412 20:40:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:30.412 20:40:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:30.412 20:40:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:30.412 20:40:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:30.412 20:40:22 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:30.412 20:40:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:30.671 20:40:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:30.672 "name": "raid_bdev1", 00:26:30.672 "uuid": "62627587-d038-443c-a350-58f4ebaa4cdb", 00:26:30.672 "strip_size_kb": 0, 00:26:30.672 "state": "online", 00:26:30.672 "raid_level": "raid1", 00:26:30.672 "superblock": true, 00:26:30.672 "num_base_bdevs": 4, 00:26:30.672 "num_base_bdevs_discovered": 2, 00:26:30.672 "num_base_bdevs_operational": 2, 00:26:30.672 "base_bdevs_list": [ 00:26:30.672 { 00:26:30.672 "name": null, 00:26:30.672 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:30.672 "is_configured": false, 00:26:30.672 "data_offset": 2048, 00:26:30.672 "data_size": 63488 00:26:30.672 }, 00:26:30.672 { 00:26:30.672 "name": null, 00:26:30.672 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:30.672 "is_configured": false, 00:26:30.672 "data_offset": 2048, 00:26:30.672 "data_size": 63488 00:26:30.672 }, 00:26:30.672 { 00:26:30.672 "name": "BaseBdev3", 00:26:30.672 "uuid": "49ee1eab-63f6-5820-848a-6c054f2eafdb", 00:26:30.672 "is_configured": true, 00:26:30.672 "data_offset": 2048, 00:26:30.672 "data_size": 63488 00:26:30.672 }, 00:26:30.672 { 00:26:30.672 "name": "BaseBdev4", 00:26:30.672 "uuid": "07437687-d447-5c6a-98b6-41792065394e", 00:26:30.672 "is_configured": true, 00:26:30.672 "data_offset": 2048, 00:26:30.672 "data_size": 63488 00:26:30.672 } 00:26:30.672 ] 00:26:30.672 }' 00:26:30.672 20:40:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:30.672 20:40:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:26:31.239 20:40:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@754 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:26:31.497 [2024-07-15 20:40:23.723620] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:31.497 [2024-07-15 20:40:23.723779] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:26:31.497 [2024-07-15 20:40:23.723797] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:26:31.497 [2024-07-15 20:40:23.723826] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:31.497 [2024-07-15 20:40:23.727833] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a856d0 00:26:31.497 [2024-07-15 20:40:23.730242] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:31.497 20:40:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@755 -- # sleep 1 00:26:32.432 20:40:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:32.432 20:40:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:32.432 20:40:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:32.432 20:40:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:32.432 20:40:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:32.432 20:40:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:32.432 20:40:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:32.691 20:40:25 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:32.691 "name": "raid_bdev1", 00:26:32.691 "uuid": "62627587-d038-443c-a350-58f4ebaa4cdb", 00:26:32.691 "strip_size_kb": 0, 00:26:32.691 "state": "online", 00:26:32.691 "raid_level": "raid1", 00:26:32.691 "superblock": true, 00:26:32.691 "num_base_bdevs": 4, 00:26:32.691 "num_base_bdevs_discovered": 3, 00:26:32.691 "num_base_bdevs_operational": 3, 00:26:32.691 "process": { 00:26:32.691 "type": "rebuild", 00:26:32.691 "target": "spare", 00:26:32.691 "progress": { 00:26:32.691 "blocks": 24576, 00:26:32.691 "percent": 38 00:26:32.691 } 00:26:32.691 }, 00:26:32.691 "base_bdevs_list": [ 00:26:32.691 { 00:26:32.691 "name": "spare", 00:26:32.691 "uuid": "d586d9af-8f57-5741-86de-9b22cdc3a1bd", 00:26:32.691 "is_configured": true, 00:26:32.691 "data_offset": 2048, 00:26:32.691 "data_size": 63488 00:26:32.691 }, 00:26:32.691 { 00:26:32.691 "name": null, 00:26:32.691 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:32.691 "is_configured": false, 00:26:32.691 "data_offset": 2048, 00:26:32.691 "data_size": 63488 00:26:32.691 }, 00:26:32.691 { 00:26:32.691 "name": "BaseBdev3", 00:26:32.691 "uuid": "49ee1eab-63f6-5820-848a-6c054f2eafdb", 00:26:32.691 "is_configured": true, 00:26:32.691 "data_offset": 2048, 00:26:32.691 "data_size": 63488 00:26:32.691 }, 00:26:32.691 { 00:26:32.691 "name": "BaseBdev4", 00:26:32.691 "uuid": "07437687-d447-5c6a-98b6-41792065394e", 00:26:32.691 "is_configured": true, 00:26:32.691 "data_offset": 2048, 00:26:32.691 "data_size": 63488 00:26:32.691 } 00:26:32.691 ] 00:26:32.691 }' 00:26:32.691 20:40:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:32.692 20:40:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:32.692 20:40:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:32.951 20:40:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- 
# [[ spare == \s\p\a\r\e ]] 00:26:32.951 20:40:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:26:32.951 [2024-07-15 20:40:25.325550] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:33.210 [2024-07-15 20:40:25.343029] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:26:33.210 [2024-07-15 20:40:25.343074] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:33.210 [2024-07-15 20:40:25.343090] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:33.210 [2024-07-15 20:40:25.343099] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:26:33.210 20:40:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:33.210 20:40:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:33.210 20:40:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:33.210 20:40:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:33.210 20:40:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:33.210 20:40:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:33.210 20:40:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:33.210 20:40:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:33.210 20:40:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:33.210 20:40:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:33.210 20:40:25 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:33.210 20:40:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:33.469 20:40:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:33.469 "name": "raid_bdev1", 00:26:33.469 "uuid": "62627587-d038-443c-a350-58f4ebaa4cdb", 00:26:33.469 "strip_size_kb": 0, 00:26:33.469 "state": "online", 00:26:33.469 "raid_level": "raid1", 00:26:33.469 "superblock": true, 00:26:33.469 "num_base_bdevs": 4, 00:26:33.469 "num_base_bdevs_discovered": 2, 00:26:33.469 "num_base_bdevs_operational": 2, 00:26:33.469 "base_bdevs_list": [ 00:26:33.469 { 00:26:33.469 "name": null, 00:26:33.469 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:33.469 "is_configured": false, 00:26:33.469 "data_offset": 2048, 00:26:33.469 "data_size": 63488 00:26:33.469 }, 00:26:33.469 { 00:26:33.469 "name": null, 00:26:33.469 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:33.469 "is_configured": false, 00:26:33.469 "data_offset": 2048, 00:26:33.469 "data_size": 63488 00:26:33.469 }, 00:26:33.469 { 00:26:33.469 "name": "BaseBdev3", 00:26:33.469 "uuid": "49ee1eab-63f6-5820-848a-6c054f2eafdb", 00:26:33.469 "is_configured": true, 00:26:33.469 "data_offset": 2048, 00:26:33.469 "data_size": 63488 00:26:33.469 }, 00:26:33.469 { 00:26:33.469 "name": "BaseBdev4", 00:26:33.469 "uuid": "07437687-d447-5c6a-98b6-41792065394e", 00:26:33.469 "is_configured": true, 00:26:33.469 "data_offset": 2048, 00:26:33.469 "data_size": 63488 00:26:33.469 } 00:26:33.469 ] 00:26:33.469 }' 00:26:33.469 20:40:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:33.469 20:40:25 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:26:34.037 20:40:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@761 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:26:34.296 [2024-07-15 20:40:26.454614] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:26:34.296 [2024-07-15 20:40:26.454670] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:34.296 [2024-07-15 20:40:26.454692] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a07dc0 00:26:34.296 [2024-07-15 20:40:26.454705] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:34.296 [2024-07-15 20:40:26.455105] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:34.296 [2024-07-15 20:40:26.455124] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:26:34.296 [2024-07-15 20:40:26.455206] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:26:34.296 [2024-07-15 20:40:26.455220] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:26:34.296 [2024-07-15 20:40:26.455231] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:26:34.296 [2024-07-15 20:40:26.455250] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:34.296 [2024-07-15 20:40:26.459313] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a7fdd0 00:26:34.296 spare 00:26:34.296 [2024-07-15 20:40:26.460717] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:34.296 20:40:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@762 -- # sleep 1 00:26:35.233 20:40:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:35.233 20:40:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:35.233 20:40:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:35.233 20:40:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:35.233 20:40:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:35.233 20:40:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:35.233 20:40:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:35.491 20:40:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:35.491 "name": "raid_bdev1", 00:26:35.491 "uuid": "62627587-d038-443c-a350-58f4ebaa4cdb", 00:26:35.491 "strip_size_kb": 0, 00:26:35.491 "state": "online", 00:26:35.491 "raid_level": "raid1", 00:26:35.491 "superblock": true, 00:26:35.491 "num_base_bdevs": 4, 00:26:35.491 "num_base_bdevs_discovered": 3, 00:26:35.491 "num_base_bdevs_operational": 3, 00:26:35.491 "process": { 00:26:35.491 "type": "rebuild", 00:26:35.491 "target": "spare", 00:26:35.491 "progress": { 00:26:35.491 "blocks": 24576, 00:26:35.491 
"percent": 38 00:26:35.491 } 00:26:35.491 }, 00:26:35.491 "base_bdevs_list": [ 00:26:35.491 { 00:26:35.491 "name": "spare", 00:26:35.491 "uuid": "d586d9af-8f57-5741-86de-9b22cdc3a1bd", 00:26:35.491 "is_configured": true, 00:26:35.491 "data_offset": 2048, 00:26:35.491 "data_size": 63488 00:26:35.491 }, 00:26:35.491 { 00:26:35.491 "name": null, 00:26:35.491 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:35.491 "is_configured": false, 00:26:35.491 "data_offset": 2048, 00:26:35.491 "data_size": 63488 00:26:35.491 }, 00:26:35.491 { 00:26:35.491 "name": "BaseBdev3", 00:26:35.491 "uuid": "49ee1eab-63f6-5820-848a-6c054f2eafdb", 00:26:35.491 "is_configured": true, 00:26:35.491 "data_offset": 2048, 00:26:35.491 "data_size": 63488 00:26:35.491 }, 00:26:35.491 { 00:26:35.491 "name": "BaseBdev4", 00:26:35.491 "uuid": "07437687-d447-5c6a-98b6-41792065394e", 00:26:35.491 "is_configured": true, 00:26:35.491 "data_offset": 2048, 00:26:35.491 "data_size": 63488 00:26:35.491 } 00:26:35.491 ] 00:26:35.491 }' 00:26:35.491 20:40:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:35.491 20:40:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:35.491 20:40:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:35.491 20:40:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:35.491 20:40:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:26:35.750 [2024-07-15 20:40:28.041300] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:35.750 [2024-07-15 20:40:28.073332] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:26:35.750 [2024-07-15 20:40:28.073377] bdev_raid.c: 
331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:35.750 [2024-07-15 20:40:28.073393] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:35.750 [2024-07-15 20:40:28.073402] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:26:35.750 20:40:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:35.750 20:40:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:35.750 20:40:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:35.750 20:40:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:35.750 20:40:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:35.750 20:40:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:35.750 20:40:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:35.750 20:40:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:35.750 20:40:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:35.750 20:40:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:35.750 20:40:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:35.750 20:40:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:36.008 20:40:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:36.008 "name": "raid_bdev1", 00:26:36.008 "uuid": "62627587-d038-443c-a350-58f4ebaa4cdb", 00:26:36.008 "strip_size_kb": 0, 00:26:36.008 "state": 
"online", 00:26:36.008 "raid_level": "raid1", 00:26:36.008 "superblock": true, 00:26:36.008 "num_base_bdevs": 4, 00:26:36.008 "num_base_bdevs_discovered": 2, 00:26:36.008 "num_base_bdevs_operational": 2, 00:26:36.008 "base_bdevs_list": [ 00:26:36.008 { 00:26:36.008 "name": null, 00:26:36.008 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:36.008 "is_configured": false, 00:26:36.008 "data_offset": 2048, 00:26:36.008 "data_size": 63488 00:26:36.008 }, 00:26:36.008 { 00:26:36.008 "name": null, 00:26:36.008 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:36.008 "is_configured": false, 00:26:36.008 "data_offset": 2048, 00:26:36.009 "data_size": 63488 00:26:36.009 }, 00:26:36.009 { 00:26:36.009 "name": "BaseBdev3", 00:26:36.009 "uuid": "49ee1eab-63f6-5820-848a-6c054f2eafdb", 00:26:36.009 "is_configured": true, 00:26:36.009 "data_offset": 2048, 00:26:36.009 "data_size": 63488 00:26:36.009 }, 00:26:36.009 { 00:26:36.009 "name": "BaseBdev4", 00:26:36.009 "uuid": "07437687-d447-5c6a-98b6-41792065394e", 00:26:36.009 "is_configured": true, 00:26:36.009 "data_offset": 2048, 00:26:36.009 "data_size": 63488 00:26:36.009 } 00:26:36.009 ] 00:26:36.009 }' 00:26:36.009 20:40:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:36.009 20:40:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:26:36.941 20:40:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:36.941 20:40:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:36.941 20:40:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:36.941 20:40:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:36.941 20:40:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:36.941 20:40:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:36.941 20:40:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:36.941 20:40:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:36.941 "name": "raid_bdev1", 00:26:36.941 "uuid": "62627587-d038-443c-a350-58f4ebaa4cdb", 00:26:36.941 "strip_size_kb": 0, 00:26:36.941 "state": "online", 00:26:36.941 "raid_level": "raid1", 00:26:36.941 "superblock": true, 00:26:36.941 "num_base_bdevs": 4, 00:26:36.941 "num_base_bdevs_discovered": 2, 00:26:36.941 "num_base_bdevs_operational": 2, 00:26:36.941 "base_bdevs_list": [ 00:26:36.941 { 00:26:36.941 "name": null, 00:26:36.941 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:36.941 "is_configured": false, 00:26:36.941 "data_offset": 2048, 00:26:36.941 "data_size": 63488 00:26:36.941 }, 00:26:36.941 { 00:26:36.941 "name": null, 00:26:36.941 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:36.941 "is_configured": false, 00:26:36.941 "data_offset": 2048, 00:26:36.941 "data_size": 63488 00:26:36.941 }, 00:26:36.941 { 00:26:36.941 "name": "BaseBdev3", 00:26:36.941 "uuid": "49ee1eab-63f6-5820-848a-6c054f2eafdb", 00:26:36.941 "is_configured": true, 00:26:36.941 "data_offset": 2048, 00:26:36.942 "data_size": 63488 00:26:36.942 }, 00:26:36.942 { 00:26:36.942 "name": "BaseBdev4", 00:26:36.942 "uuid": "07437687-d447-5c6a-98b6-41792065394e", 00:26:36.942 "is_configured": true, 00:26:36.942 "data_offset": 2048, 00:26:36.942 "data_size": 63488 00:26:36.942 } 00:26:36.942 ] 00:26:36.942 }' 00:26:36.942 20:40:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:36.942 20:40:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:36.942 20:40:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // 
"none"' 00:26:36.942 20:40:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:36.942 20:40:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:26:37.200 20:40:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:26:37.457 [2024-07-15 20:40:29.753855] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:26:37.457 [2024-07-15 20:40:29.753904] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:37.457 [2024-07-15 20:40:29.753923] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18cb930 00:26:37.457 [2024-07-15 20:40:29.753942] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:37.457 [2024-07-15 20:40:29.754295] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:37.457 [2024-07-15 20:40:29.754314] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:26:37.457 [2024-07-15 20:40:29.754380] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:26:37.457 [2024-07-15 20:40:29.754393] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:26:37.457 [2024-07-15 20:40:29.754404] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:26:37.457 BaseBdev1 00:26:37.457 20:40:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@773 -- # sleep 1 00:26:38.828 20:40:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:38.828 
20:40:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:38.828 20:40:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:38.828 20:40:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:38.828 20:40:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:38.828 20:40:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:38.828 20:40:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:38.828 20:40:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:38.828 20:40:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:38.828 20:40:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:38.828 20:40:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:38.828 20:40:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:38.828 20:40:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:38.828 "name": "raid_bdev1", 00:26:38.828 "uuid": "62627587-d038-443c-a350-58f4ebaa4cdb", 00:26:38.828 "strip_size_kb": 0, 00:26:38.828 "state": "online", 00:26:38.828 "raid_level": "raid1", 00:26:38.828 "superblock": true, 00:26:38.828 "num_base_bdevs": 4, 00:26:38.828 "num_base_bdevs_discovered": 2, 00:26:38.828 "num_base_bdevs_operational": 2, 00:26:38.828 "base_bdevs_list": [ 00:26:38.828 { 00:26:38.828 "name": null, 00:26:38.828 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:38.828 "is_configured": false, 00:26:38.828 "data_offset": 2048, 00:26:38.828 "data_size": 63488 00:26:38.828 }, 
00:26:38.828 { 00:26:38.828 "name": null, 00:26:38.828 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:38.828 "is_configured": false, 00:26:38.828 "data_offset": 2048, 00:26:38.828 "data_size": 63488 00:26:38.828 }, 00:26:38.828 { 00:26:38.828 "name": "BaseBdev3", 00:26:38.828 "uuid": "49ee1eab-63f6-5820-848a-6c054f2eafdb", 00:26:38.828 "is_configured": true, 00:26:38.828 "data_offset": 2048, 00:26:38.828 "data_size": 63488 00:26:38.828 }, 00:26:38.828 { 00:26:38.828 "name": "BaseBdev4", 00:26:38.828 "uuid": "07437687-d447-5c6a-98b6-41792065394e", 00:26:38.828 "is_configured": true, 00:26:38.828 "data_offset": 2048, 00:26:38.828 "data_size": 63488 00:26:38.828 } 00:26:38.828 ] 00:26:38.828 }' 00:26:38.828 20:40:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:38.828 20:40:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:26:39.393 20:40:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:39.393 20:40:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:39.393 20:40:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:39.393 20:40:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:39.393 20:40:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:39.393 20:40:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:39.393 20:40:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:39.652 20:40:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:39.652 "name": "raid_bdev1", 00:26:39.652 "uuid": "62627587-d038-443c-a350-58f4ebaa4cdb", 00:26:39.652 
"strip_size_kb": 0, 00:26:39.652 "state": "online", 00:26:39.652 "raid_level": "raid1", 00:26:39.652 "superblock": true, 00:26:39.652 "num_base_bdevs": 4, 00:26:39.652 "num_base_bdevs_discovered": 2, 00:26:39.652 "num_base_bdevs_operational": 2, 00:26:39.652 "base_bdevs_list": [ 00:26:39.652 { 00:26:39.652 "name": null, 00:26:39.652 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:39.652 "is_configured": false, 00:26:39.652 "data_offset": 2048, 00:26:39.652 "data_size": 63488 00:26:39.652 }, 00:26:39.652 { 00:26:39.652 "name": null, 00:26:39.652 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:39.652 "is_configured": false, 00:26:39.652 "data_offset": 2048, 00:26:39.652 "data_size": 63488 00:26:39.652 }, 00:26:39.652 { 00:26:39.652 "name": "BaseBdev3", 00:26:39.652 "uuid": "49ee1eab-63f6-5820-848a-6c054f2eafdb", 00:26:39.652 "is_configured": true, 00:26:39.652 "data_offset": 2048, 00:26:39.652 "data_size": 63488 00:26:39.652 }, 00:26:39.652 { 00:26:39.652 "name": "BaseBdev4", 00:26:39.652 "uuid": "07437687-d447-5c6a-98b6-41792065394e", 00:26:39.652 "is_configured": true, 00:26:39.652 "data_offset": 2048, 00:26:39.652 "data_size": 63488 00:26:39.652 } 00:26:39.652 ] 00:26:39.652 }' 00:26:39.652 20:40:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:39.652 20:40:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:39.652 20:40:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:39.652 20:40:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:39.652 20:40:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:26:39.652 20:40:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@648 -- # local es=0 00:26:39.652 20:40:31 
bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:26:39.652 20:40:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:39.652 20:40:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:26:39.652 20:40:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:39.652 20:40:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:26:39.652 20:40:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:39.652 20:40:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:26:39.652 20:40:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:39.652 20:40:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:26:39.652 20:40:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:26:39.911 [2024-07-15 20:40:32.120185] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:26:39.911 [2024-07-15 20:40:32.120328] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:26:39.911 [2024-07-15 20:40:32.120345] 
bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:26:39.911 request: 00:26:39.911 { 00:26:39.911 "base_bdev": "BaseBdev1", 00:26:39.911 "raid_bdev": "raid_bdev1", 00:26:39.911 "method": "bdev_raid_add_base_bdev", 00:26:39.911 "req_id": 1 00:26:39.911 } 00:26:39.912 Got JSON-RPC error response 00:26:39.912 response: 00:26:39.912 { 00:26:39.912 "code": -22, 00:26:39.912 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:26:39.912 } 00:26:39.912 20:40:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # es=1 00:26:39.912 20:40:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:26:39.912 20:40:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:26:39.912 20:40:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:26:39.912 20:40:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@777 -- # sleep 1 00:26:40.883 20:40:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:40.883 20:40:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:40.883 20:40:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:40.883 20:40:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:40.883 20:40:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:40.883 20:40:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:40.883 20:40:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:40.883 20:40:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:40.883 20:40:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:26:40.883 20:40:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:40.883 20:40:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:40.883 20:40:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:41.141 20:40:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:41.141 "name": "raid_bdev1", 00:26:41.141 "uuid": "62627587-d038-443c-a350-58f4ebaa4cdb", 00:26:41.142 "strip_size_kb": 0, 00:26:41.142 "state": "online", 00:26:41.142 "raid_level": "raid1", 00:26:41.142 "superblock": true, 00:26:41.142 "num_base_bdevs": 4, 00:26:41.142 "num_base_bdevs_discovered": 2, 00:26:41.142 "num_base_bdevs_operational": 2, 00:26:41.142 "base_bdevs_list": [ 00:26:41.142 { 00:26:41.142 "name": null, 00:26:41.142 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:41.142 "is_configured": false, 00:26:41.142 "data_offset": 2048, 00:26:41.142 "data_size": 63488 00:26:41.142 }, 00:26:41.142 { 00:26:41.142 "name": null, 00:26:41.142 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:41.142 "is_configured": false, 00:26:41.142 "data_offset": 2048, 00:26:41.142 "data_size": 63488 00:26:41.142 }, 00:26:41.142 { 00:26:41.142 "name": "BaseBdev3", 00:26:41.142 "uuid": "49ee1eab-63f6-5820-848a-6c054f2eafdb", 00:26:41.142 "is_configured": true, 00:26:41.142 "data_offset": 2048, 00:26:41.142 "data_size": 63488 00:26:41.142 }, 00:26:41.142 { 00:26:41.142 "name": "BaseBdev4", 00:26:41.142 "uuid": "07437687-d447-5c6a-98b6-41792065394e", 00:26:41.142 "is_configured": true, 00:26:41.142 "data_offset": 2048, 00:26:41.142 "data_size": 63488 00:26:41.142 } 00:26:41.142 ] 00:26:41.142 }' 00:26:41.142 20:40:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:41.142 20:40:33 
bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:26:41.710 20:40:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:41.710 20:40:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:41.710 20:40:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:41.710 20:40:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:41.710 20:40:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:41.710 20:40:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:41.710 20:40:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:41.970 20:40:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:41.970 "name": "raid_bdev1", 00:26:41.970 "uuid": "62627587-d038-443c-a350-58f4ebaa4cdb", 00:26:41.970 "strip_size_kb": 0, 00:26:41.970 "state": "online", 00:26:41.970 "raid_level": "raid1", 00:26:41.970 "superblock": true, 00:26:41.970 "num_base_bdevs": 4, 00:26:41.970 "num_base_bdevs_discovered": 2, 00:26:41.970 "num_base_bdevs_operational": 2, 00:26:41.970 "base_bdevs_list": [ 00:26:41.970 { 00:26:41.970 "name": null, 00:26:41.970 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:41.970 "is_configured": false, 00:26:41.970 "data_offset": 2048, 00:26:41.970 "data_size": 63488 00:26:41.970 }, 00:26:41.970 { 00:26:41.970 "name": null, 00:26:41.970 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:41.970 "is_configured": false, 00:26:41.970 "data_offset": 2048, 00:26:41.970 "data_size": 63488 00:26:41.970 }, 00:26:41.970 { 00:26:41.970 "name": "BaseBdev3", 00:26:41.970 "uuid": "49ee1eab-63f6-5820-848a-6c054f2eafdb", 
00:26:41.970 "is_configured": true, 00:26:41.970 "data_offset": 2048, 00:26:41.970 "data_size": 63488 00:26:41.970 }, 00:26:41.970 { 00:26:41.970 "name": "BaseBdev4", 00:26:41.970 "uuid": "07437687-d447-5c6a-98b6-41792065394e", 00:26:41.970 "is_configured": true, 00:26:41.970 "data_offset": 2048, 00:26:41.970 "data_size": 63488 00:26:41.970 } 00:26:41.970 ] 00:26:41.970 }' 00:26:41.970 20:40:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:42.229 20:40:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:42.229 20:40:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:42.229 20:40:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:42.229 20:40:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@782 -- # killprocess 1480190 00:26:42.229 20:40:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@948 -- # '[' -z 1480190 ']' 00:26:42.229 20:40:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@952 -- # kill -0 1480190 00:26:42.229 20:40:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # uname 00:26:42.229 20:40:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:26:42.229 20:40:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1480190 00:26:42.229 20:40:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:26:42.229 20:40:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:26:42.229 20:40:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1480190' 00:26:42.229 killing process with pid 1480190 00:26:42.229 20:40:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@967 -- # kill 1480190 00:26:42.229 
Received shutdown signal, test time was about 60.000000 seconds 00:26:42.229 00:26:42.229 Latency(us) 00:26:42.229 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:42.229 =================================================================================================================== 00:26:42.229 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:26:42.229 [2024-07-15 20:40:34.456604] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:26:42.229 [2024-07-15 20:40:34.456705] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:42.229 20:40:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@972 -- # wait 1480190 00:26:42.229 [2024-07-15 20:40:34.456770] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:42.229 [2024-07-15 20:40:34.456783] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a09ec0 name raid_bdev1, state offline 00:26:42.229 [2024-07-15 20:40:34.504945] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:26:42.489 20:40:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@784 -- # return 0 00:26:42.489 00:26:42.489 real 0m39.386s 00:26:42.489 user 0m56.465s 00:26:42.489 sys 0m7.520s 00:26:42.489 20:40:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:26:42.489 20:40:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:26:42.489 ************************************ 00:26:42.489 END TEST raid_rebuild_test_sb 00:26:42.489 ************************************ 00:26:42.489 20:40:34 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:26:42.489 20:40:34 bdev_raid -- bdev/bdev_raid.sh@879 -- # run_test raid_rebuild_test_io raid_rebuild_test raid1 4 false true true 00:26:42.489 20:40:34 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:26:42.489 20:40:34 bdev_raid -- 
common/autotest_common.sh@1105 -- # xtrace_disable 00:26:42.489 20:40:34 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:26:42.489 ************************************ 00:26:42.489 START TEST raid_rebuild_test_io 00:26:42.489 ************************************ 00:26:42.489 20:40:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 4 false true true 00:26:42.489 20:40:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:26:42.489 20:40:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:26:42.489 20:40:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:26:42.489 20:40:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:26:42.489 20:40:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:26:42.489 20:40:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:26:42.489 20:40:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:42.489 20:40:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:26:42.489 20:40:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:26:42.489 20:40:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:42.489 20:40:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:26:42.489 20:40:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:26:42.489 20:40:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:42.489 20:40:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:26:42.489 20:40:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:26:42.489 20:40:34 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:42.489 20:40:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:26:42.489 20:40:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:26:42.489 20:40:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:42.489 20:40:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:26:42.489 20:40:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:26:42.489 20:40:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:26:42.489 20:40:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:26:42.489 20:40:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:26:42.489 20:40:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:26:42.489 20:40:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:26:42.489 20:40:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:26:42.489 20:40:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:26:42.489 20:40:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:26:42.489 20:40:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@596 -- # raid_pid=1485598 00:26:42.489 20:40:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 1485598 /var/tmp/spdk-raid.sock 00:26:42.489 20:40:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:26:42.489 20:40:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@829 -- # '[' -z 
1485598 ']' 00:26:42.489 20:40:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:26:42.489 20:40:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@834 -- # local max_retries=100 00:26:42.489 20:40:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:26:42.489 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:26:42.489 20:40:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@838 -- # xtrace_disable 00:26:42.489 20:40:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:26:42.748 [2024-07-15 20:40:34.893276] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 00:26:42.748 [2024-07-15 20:40:34.893350] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1485598 ] 00:26:42.748 I/O size of 3145728 is greater than zero copy threshold (65536). 00:26:42.748 Zero copy mechanism will not be used. 
00:26:42.748 [2024-07-15 20:40:35.023554] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:43.007 [2024-07-15 20:40:35.131481] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:43.007 [2024-07-15 20:40:35.193772] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:43.007 [2024-07-15 20:40:35.193802] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:43.574 20:40:35 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:26:43.574 20:40:35 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@862 -- # return 0 00:26:43.574 20:40:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:26:43.574 20:40:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:26:43.833 BaseBdev1_malloc 00:26:43.833 20:40:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:26:44.091 [2024-07-15 20:40:36.246315] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:26:44.091 [2024-07-15 20:40:36.246362] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:44.091 [2024-07-15 20:40:36.246386] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12c0d40 00:26:44.091 [2024-07-15 20:40:36.246398] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:44.091 [2024-07-15 20:40:36.248152] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:44.091 [2024-07-15 20:40:36.248181] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:26:44.091 BaseBdev1 
00:26:44.091 20:40:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:26:44.091 20:40:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:26:44.350 BaseBdev2_malloc 00:26:44.350 20:40:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:26:44.608 [2024-07-15 20:40:36.744525] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:26:44.608 [2024-07-15 20:40:36.744571] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:44.609 [2024-07-15 20:40:36.744594] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12c1860 00:26:44.609 [2024-07-15 20:40:36.744606] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:44.609 [2024-07-15 20:40:36.746000] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:44.609 [2024-07-15 20:40:36.746029] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:26:44.609 BaseBdev2 00:26:44.609 20:40:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:26:44.609 20:40:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:26:44.609 BaseBdev3_malloc 00:26:44.609 20:40:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:26:44.867 [2024-07-15 20:40:37.182235] vbdev_passthru.c: 
607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:26:44.867 [2024-07-15 20:40:37.182280] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:44.867 [2024-07-15 20:40:37.182300] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x146e8f0 00:26:44.867 [2024-07-15 20:40:37.182312] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:44.867 [2024-07-15 20:40:37.183694] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:44.867 [2024-07-15 20:40:37.183721] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:26:44.867 BaseBdev3 00:26:44.867 20:40:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:26:44.867 20:40:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:26:45.125 BaseBdev4_malloc 00:26:45.125 20:40:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:26:45.384 [2024-07-15 20:40:37.680175] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:26:45.384 [2024-07-15 20:40:37.680220] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:45.384 [2024-07-15 20:40:37.680240] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x146dad0 00:26:45.384 [2024-07-15 20:40:37.680252] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:45.384 [2024-07-15 20:40:37.681629] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:45.384 [2024-07-15 20:40:37.681658] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created 
pt_bdev for: BaseBdev4 00:26:45.384 BaseBdev4 00:26:45.384 20:40:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:26:45.643 spare_malloc 00:26:45.643 20:40:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:26:45.902 spare_delay 00:26:45.902 20:40:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:26:46.160 [2024-07-15 20:40:38.438801] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:26:46.160 [2024-07-15 20:40:38.438846] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:46.160 [2024-07-15 20:40:38.438866] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14725b0 00:26:46.160 [2024-07-15 20:40:38.438879] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:46.160 [2024-07-15 20:40:38.440326] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:46.160 [2024-07-15 20:40:38.440354] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:26:46.160 spare 00:26:46.160 20:40:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:26:46.418 [2024-07-15 20:40:38.683469] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:26:46.418 [2024-07-15 20:40:38.684709] bdev_raid.c:3198:raid_bdev_configure_base_bdev: 
*DEBUG*: bdev BaseBdev2 is claimed 00:26:46.418 [2024-07-15 20:40:38.684763] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:26:46.418 [2024-07-15 20:40:38.684808] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:26:46.418 [2024-07-15 20:40:38.684889] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x13f18a0 00:26:46.418 [2024-07-15 20:40:38.684899] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:26:46.418 [2024-07-15 20:40:38.685116] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x146be10 00:26:46.418 [2024-07-15 20:40:38.685266] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x13f18a0 00:26:46.418 [2024-07-15 20:40:38.685277] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x13f18a0 00:26:46.418 [2024-07-15 20:40:38.685385] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:46.418 20:40:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:26:46.419 20:40:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:46.419 20:40:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:46.419 20:40:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:46.419 20:40:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:46.419 20:40:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:26:46.419 20:40:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:46.419 20:40:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:46.419 20:40:38 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:46.419 20:40:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:46.419 20:40:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:46.419 20:40:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:46.676 20:40:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:46.676 "name": "raid_bdev1", 00:26:46.676 "uuid": "0898e87d-be84-4312-b32b-82353db450ab", 00:26:46.676 "strip_size_kb": 0, 00:26:46.676 "state": "online", 00:26:46.676 "raid_level": "raid1", 00:26:46.676 "superblock": false, 00:26:46.676 "num_base_bdevs": 4, 00:26:46.676 "num_base_bdevs_discovered": 4, 00:26:46.676 "num_base_bdevs_operational": 4, 00:26:46.676 "base_bdevs_list": [ 00:26:46.676 { 00:26:46.676 "name": "BaseBdev1", 00:26:46.676 "uuid": "f8990144-67c2-5bd1-bbdc-570d1511d44b", 00:26:46.676 "is_configured": true, 00:26:46.676 "data_offset": 0, 00:26:46.676 "data_size": 65536 00:26:46.676 }, 00:26:46.676 { 00:26:46.676 "name": "BaseBdev2", 00:26:46.676 "uuid": "b283f718-d8d9-56a8-b247-d3fb7fc606b3", 00:26:46.676 "is_configured": true, 00:26:46.676 "data_offset": 0, 00:26:46.676 "data_size": 65536 00:26:46.676 }, 00:26:46.676 { 00:26:46.676 "name": "BaseBdev3", 00:26:46.676 "uuid": "f44c9038-dd6a-5709-b0a1-c420354394b7", 00:26:46.676 "is_configured": true, 00:26:46.676 "data_offset": 0, 00:26:46.676 "data_size": 65536 00:26:46.676 }, 00:26:46.676 { 00:26:46.676 "name": "BaseBdev4", 00:26:46.676 "uuid": "997d3376-a1fc-5135-880a-3b4245f75e2b", 00:26:46.676 "is_configured": true, 00:26:46.676 "data_offset": 0, 00:26:46.676 "data_size": 65536 00:26:46.676 } 00:26:46.676 ] 00:26:46.676 }' 00:26:46.676 20:40:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 
00:26:46.676 20:40:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:26:47.243 20:40:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:26:47.243 20:40:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:26:47.502 [2024-07-15 20:40:39.646303] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:47.502 20:40:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:26:47.502 20:40:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:47.502 20:40:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:26:47.502 20:40:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:26:47.502 20:40:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:26:47.502 20:40:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:26:47.502 20:40:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:26:47.761 [2024-07-15 20:40:39.936867] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x13f7970 00:26:47.761 I/O size of 3145728 is greater than zero copy threshold (65536). 00:26:47.761 Zero copy mechanism will not be used. 00:26:47.761 Running I/O for 60 seconds... 
00:26:47.761 [2024-07-15 20:40:40.080029] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:26:47.761 [2024-07-15 20:40:40.088253] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x13f7970 00:26:47.761 20:40:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:26:47.761 20:40:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:47.761 20:40:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:47.761 20:40:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:47.761 20:40:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:47.761 20:40:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:47.761 20:40:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:47.761 20:40:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:47.761 20:40:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:47.761 20:40:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:47.761 20:40:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:47.761 20:40:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:48.330 20:40:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:48.330 "name": "raid_bdev1", 00:26:48.330 "uuid": "0898e87d-be84-4312-b32b-82353db450ab", 00:26:48.330 "strip_size_kb": 0, 00:26:48.330 "state": "online", 00:26:48.330 "raid_level": "raid1", 00:26:48.330 "superblock": false, 
00:26:48.330 "num_base_bdevs": 4, 00:26:48.330 "num_base_bdevs_discovered": 3, 00:26:48.330 "num_base_bdevs_operational": 3, 00:26:48.330 "base_bdevs_list": [ 00:26:48.330 { 00:26:48.330 "name": null, 00:26:48.330 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:48.330 "is_configured": false, 00:26:48.330 "data_offset": 0, 00:26:48.330 "data_size": 65536 00:26:48.330 }, 00:26:48.330 { 00:26:48.330 "name": "BaseBdev2", 00:26:48.330 "uuid": "b283f718-d8d9-56a8-b247-d3fb7fc606b3", 00:26:48.330 "is_configured": true, 00:26:48.330 "data_offset": 0, 00:26:48.330 "data_size": 65536 00:26:48.330 }, 00:26:48.330 { 00:26:48.330 "name": "BaseBdev3", 00:26:48.330 "uuid": "f44c9038-dd6a-5709-b0a1-c420354394b7", 00:26:48.330 "is_configured": true, 00:26:48.330 "data_offset": 0, 00:26:48.330 "data_size": 65536 00:26:48.330 }, 00:26:48.330 { 00:26:48.330 "name": "BaseBdev4", 00:26:48.330 "uuid": "997d3376-a1fc-5135-880a-3b4245f75e2b", 00:26:48.330 "is_configured": true, 00:26:48.330 "data_offset": 0, 00:26:48.330 "data_size": 65536 00:26:48.330 } 00:26:48.330 ] 00:26:48.330 }' 00:26:48.330 20:40:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:48.330 20:40:40 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:26:48.897 20:40:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:26:48.898 [2024-07-15 20:40:41.208003] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:48.898 20:40:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:26:48.898 [2024-07-15 20:40:41.273552] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xfc7fa0 00:26:48.898 [2024-07-15 20:40:41.275974] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:49.156 [2024-07-15 
20:40:41.404798] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:26:49.156 [2024-07-15 20:40:41.406065] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:26:49.415 [2024-07-15 20:40:41.626634] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:26:49.415 [2024-07-15 20:40:41.627229] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:26:49.674 [2024-07-15 20:40:41.981634] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:26:49.933 [2024-07-15 20:40:42.215062] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:26:49.933 20:40:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:49.933 20:40:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:49.933 20:40:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:49.933 20:40:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:49.933 20:40:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:49.933 20:40:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:49.933 20:40:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:50.192 20:40:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:50.192 "name": "raid_bdev1", 00:26:50.192 "uuid": 
"0898e87d-be84-4312-b32b-82353db450ab", 00:26:50.192 "strip_size_kb": 0, 00:26:50.192 "state": "online", 00:26:50.192 "raid_level": "raid1", 00:26:50.192 "superblock": false, 00:26:50.192 "num_base_bdevs": 4, 00:26:50.192 "num_base_bdevs_discovered": 4, 00:26:50.192 "num_base_bdevs_operational": 4, 00:26:50.192 "process": { 00:26:50.192 "type": "rebuild", 00:26:50.192 "target": "spare", 00:26:50.192 "progress": { 00:26:50.192 "blocks": 12288, 00:26:50.192 "percent": 18 00:26:50.192 } 00:26:50.192 }, 00:26:50.192 "base_bdevs_list": [ 00:26:50.192 { 00:26:50.192 "name": "spare", 00:26:50.192 "uuid": "5c01eb47-70e8-5fb9-9e08-4146d265f22e", 00:26:50.192 "is_configured": true, 00:26:50.192 "data_offset": 0, 00:26:50.192 "data_size": 65536 00:26:50.192 }, 00:26:50.192 { 00:26:50.192 "name": "BaseBdev2", 00:26:50.192 "uuid": "b283f718-d8d9-56a8-b247-d3fb7fc606b3", 00:26:50.192 "is_configured": true, 00:26:50.192 "data_offset": 0, 00:26:50.192 "data_size": 65536 00:26:50.192 }, 00:26:50.192 { 00:26:50.192 "name": "BaseBdev3", 00:26:50.192 "uuid": "f44c9038-dd6a-5709-b0a1-c420354394b7", 00:26:50.192 "is_configured": true, 00:26:50.192 "data_offset": 0, 00:26:50.192 "data_size": 65536 00:26:50.192 }, 00:26:50.192 { 00:26:50.192 "name": "BaseBdev4", 00:26:50.192 "uuid": "997d3376-a1fc-5135-880a-3b4245f75e2b", 00:26:50.192 "is_configured": true, 00:26:50.192 "data_offset": 0, 00:26:50.192 "data_size": 65536 00:26:50.192 } 00:26:50.192 ] 00:26:50.192 }' 00:26:50.192 20:40:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:50.192 [2024-07-15 20:40:42.475088] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:26:50.192 20:40:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:50.192 20:40:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:50.192 20:40:42 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:50.192 20:40:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:26:50.450 [2024-07-15 20:40:42.671057] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:50.450 [2024-07-15 20:40:42.687486] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:26:50.450 [2024-07-15 20:40:42.687653] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:26:50.451 [2024-07-15 20:40:42.798186] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:26:50.451 [2024-07-15 20:40:42.812161] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:50.451 [2024-07-15 20:40:42.812198] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:50.451 [2024-07-15 20:40:42.812210] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:26:50.709 [2024-07-15 20:40:42.844729] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x13f7970 00:26:50.709 20:40:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:26:50.709 20:40:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:50.709 20:40:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:50.709 20:40:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:50.709 20:40:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:50.709 20:40:42 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:50.709 20:40:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:50.709 20:40:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:50.709 20:40:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:50.709 20:40:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:50.709 20:40:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:50.709 20:40:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:50.709 20:40:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:50.709 "name": "raid_bdev1", 00:26:50.709 "uuid": "0898e87d-be84-4312-b32b-82353db450ab", 00:26:50.709 "strip_size_kb": 0, 00:26:50.709 "state": "online", 00:26:50.709 "raid_level": "raid1", 00:26:50.709 "superblock": false, 00:26:50.709 "num_base_bdevs": 4, 00:26:50.709 "num_base_bdevs_discovered": 3, 00:26:50.709 "num_base_bdevs_operational": 3, 00:26:50.709 "base_bdevs_list": [ 00:26:50.709 { 00:26:50.709 "name": null, 00:26:50.709 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:50.709 "is_configured": false, 00:26:50.709 "data_offset": 0, 00:26:50.709 "data_size": 65536 00:26:50.709 }, 00:26:50.709 { 00:26:50.709 "name": "BaseBdev2", 00:26:50.709 "uuid": "b283f718-d8d9-56a8-b247-d3fb7fc606b3", 00:26:50.709 "is_configured": true, 00:26:50.709 "data_offset": 0, 00:26:50.709 "data_size": 65536 00:26:50.709 }, 00:26:50.709 { 00:26:50.709 "name": "BaseBdev3", 00:26:50.709 "uuid": "f44c9038-dd6a-5709-b0a1-c420354394b7", 00:26:50.709 "is_configured": true, 00:26:50.709 "data_offset": 0, 00:26:50.709 "data_size": 65536 00:26:50.709 }, 
00:26:50.709 { 00:26:50.709 "name": "BaseBdev4", 00:26:50.709 "uuid": "997d3376-a1fc-5135-880a-3b4245f75e2b", 00:26:50.709 "is_configured": true, 00:26:50.709 "data_offset": 0, 00:26:50.709 "data_size": 65536 00:26:50.709 } 00:26:50.709 ] 00:26:50.709 }' 00:26:50.709 20:40:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:50.709 20:40:43 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:26:51.644 20:40:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:51.644 20:40:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:51.644 20:40:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:51.644 20:40:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:51.644 20:40:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:51.644 20:40:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:51.644 20:40:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:51.644 20:40:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:51.644 "name": "raid_bdev1", 00:26:51.644 "uuid": "0898e87d-be84-4312-b32b-82353db450ab", 00:26:51.644 "strip_size_kb": 0, 00:26:51.644 "state": "online", 00:26:51.644 "raid_level": "raid1", 00:26:51.644 "superblock": false, 00:26:51.644 "num_base_bdevs": 4, 00:26:51.644 "num_base_bdevs_discovered": 3, 00:26:51.644 "num_base_bdevs_operational": 3, 00:26:51.644 "base_bdevs_list": [ 00:26:51.644 { 00:26:51.644 "name": null, 00:26:51.644 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:51.644 "is_configured": false, 00:26:51.644 "data_offset": 0, 
00:26:51.644 "data_size": 65536 00:26:51.644 }, 00:26:51.644 { 00:26:51.644 "name": "BaseBdev2", 00:26:51.644 "uuid": "b283f718-d8d9-56a8-b247-d3fb7fc606b3", 00:26:51.644 "is_configured": true, 00:26:51.644 "data_offset": 0, 00:26:51.644 "data_size": 65536 00:26:51.644 }, 00:26:51.644 { 00:26:51.644 "name": "BaseBdev3", 00:26:51.644 "uuid": "f44c9038-dd6a-5709-b0a1-c420354394b7", 00:26:51.644 "is_configured": true, 00:26:51.644 "data_offset": 0, 00:26:51.644 "data_size": 65536 00:26:51.644 }, 00:26:51.644 { 00:26:51.644 "name": "BaseBdev4", 00:26:51.644 "uuid": "997d3376-a1fc-5135-880a-3b4245f75e2b", 00:26:51.644 "is_configured": true, 00:26:51.644 "data_offset": 0, 00:26:51.644 "data_size": 65536 00:26:51.644 } 00:26:51.644 ] 00:26:51.644 }' 00:26:51.645 20:40:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:51.645 20:40:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:51.645 20:40:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:51.902 20:40:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:51.902 20:40:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:26:52.161 [2024-07-15 20:40:44.535509] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:52.419 20:40:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:26:52.419 [2024-07-15 20:40:44.602779] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x14658f0 00:26:52.419 [2024-07-15 20:40:44.604363] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:52.419 [2024-07-15 20:40:44.713407] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: 
process_offset: 2048 offset_begin: 0 offset_end: 6144 00:26:52.419 [2024-07-15 20:40:44.713720] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:26:52.678 [2024-07-15 20:40:44.928138] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:26:52.678 [2024-07-15 20:40:44.928811] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:26:53.244 [2024-07-15 20:40:45.386194] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:26:53.244 20:40:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:53.244 20:40:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:53.244 20:40:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:53.244 20:40:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:53.244 20:40:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:53.244 20:40:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:53.244 20:40:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:53.503 [2024-07-15 20:40:45.770470] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:26:53.503 [2024-07-15 20:40:45.770695] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:26:53.503 20:40:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # 
raid_bdev_info='{ 00:26:53.503 "name": "raid_bdev1", 00:26:53.503 "uuid": "0898e87d-be84-4312-b32b-82353db450ab", 00:26:53.503 "strip_size_kb": 0, 00:26:53.503 "state": "online", 00:26:53.503 "raid_level": "raid1", 00:26:53.503 "superblock": false, 00:26:53.503 "num_base_bdevs": 4, 00:26:53.503 "num_base_bdevs_discovered": 4, 00:26:53.503 "num_base_bdevs_operational": 4, 00:26:53.503 "process": { 00:26:53.503 "type": "rebuild", 00:26:53.503 "target": "spare", 00:26:53.503 "progress": { 00:26:53.503 "blocks": 16384, 00:26:53.503 "percent": 25 00:26:53.503 } 00:26:53.503 }, 00:26:53.503 "base_bdevs_list": [ 00:26:53.503 { 00:26:53.503 "name": "spare", 00:26:53.503 "uuid": "5c01eb47-70e8-5fb9-9e08-4146d265f22e", 00:26:53.503 "is_configured": true, 00:26:53.503 "data_offset": 0, 00:26:53.503 "data_size": 65536 00:26:53.503 }, 00:26:53.503 { 00:26:53.503 "name": "BaseBdev2", 00:26:53.503 "uuid": "b283f718-d8d9-56a8-b247-d3fb7fc606b3", 00:26:53.503 "is_configured": true, 00:26:53.503 "data_offset": 0, 00:26:53.503 "data_size": 65536 00:26:53.503 }, 00:26:53.503 { 00:26:53.503 "name": "BaseBdev3", 00:26:53.503 "uuid": "f44c9038-dd6a-5709-b0a1-c420354394b7", 00:26:53.503 "is_configured": true, 00:26:53.503 "data_offset": 0, 00:26:53.503 "data_size": 65536 00:26:53.503 }, 00:26:53.503 { 00:26:53.503 "name": "BaseBdev4", 00:26:53.503 "uuid": "997d3376-a1fc-5135-880a-3b4245f75e2b", 00:26:53.503 "is_configured": true, 00:26:53.503 "data_offset": 0, 00:26:53.503 "data_size": 65536 00:26:53.503 } 00:26:53.503 ] 00:26:53.503 }' 00:26:53.503 20:40:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:53.762 20:40:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:53.762 20:40:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:53.762 [2024-07-15 20:40:45.984873] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: 
process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:26:53.762 20:40:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:53.762 20:40:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:26:53.762 20:40:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:26:53.762 20:40:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:26:53.762 20:40:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:26:53.762 20:40:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:26:54.019 [2024-07-15 20:40:46.199165] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:26:54.019 [2024-07-15 20:40:46.199406] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:26:54.019 [2024-07-15 20:40:46.222138] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:26:54.277 [2024-07-15 20:40:46.526200] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x13f7970 00:26:54.277 [2024-07-15 20:40:46.526235] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x14658f0 00:26:54.277 20:40:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:26:54.277 20:40:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:26:54.277 20:40:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:54.277 20:40:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:54.277 20:40:46 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:54.277 20:40:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:54.277 20:40:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:54.277 20:40:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:54.277 20:40:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:54.534 [2024-07-15 20:40:46.803567] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:26:54.534 20:40:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:54.535 "name": "raid_bdev1", 00:26:54.535 "uuid": "0898e87d-be84-4312-b32b-82353db450ab", 00:26:54.535 "strip_size_kb": 0, 00:26:54.535 "state": "online", 00:26:54.535 "raid_level": "raid1", 00:26:54.535 "superblock": false, 00:26:54.535 "num_base_bdevs": 4, 00:26:54.535 "num_base_bdevs_discovered": 3, 00:26:54.535 "num_base_bdevs_operational": 3, 00:26:54.535 "process": { 00:26:54.535 "type": "rebuild", 00:26:54.535 "target": "spare", 00:26:54.535 "progress": { 00:26:54.535 "blocks": 26624, 00:26:54.535 "percent": 40 00:26:54.535 } 00:26:54.535 }, 00:26:54.535 "base_bdevs_list": [ 00:26:54.535 { 00:26:54.535 "name": "spare", 00:26:54.535 "uuid": "5c01eb47-70e8-5fb9-9e08-4146d265f22e", 00:26:54.535 "is_configured": true, 00:26:54.535 "data_offset": 0, 00:26:54.535 "data_size": 65536 00:26:54.535 }, 00:26:54.535 { 00:26:54.535 "name": null, 00:26:54.535 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:54.535 "is_configured": false, 00:26:54.535 "data_offset": 0, 00:26:54.535 "data_size": 65536 00:26:54.535 }, 00:26:54.535 { 00:26:54.535 "name": "BaseBdev3", 00:26:54.535 "uuid": 
"f44c9038-dd6a-5709-b0a1-c420354394b7", 00:26:54.535 "is_configured": true, 00:26:54.535 "data_offset": 0, 00:26:54.535 "data_size": 65536 00:26:54.535 }, 00:26:54.535 { 00:26:54.535 "name": "BaseBdev4", 00:26:54.535 "uuid": "997d3376-a1fc-5135-880a-3b4245f75e2b", 00:26:54.535 "is_configured": true, 00:26:54.535 "data_offset": 0, 00:26:54.535 "data_size": 65536 00:26:54.535 } 00:26:54.535 ] 00:26:54.535 }' 00:26:54.535 20:40:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:54.535 20:40:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:54.535 20:40:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:54.792 20:40:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:54.792 20:40:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@705 -- # local timeout=984 00:26:54.792 20:40:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:26:54.792 20:40:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:54.792 20:40:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:54.792 20:40:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:54.792 20:40:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:54.792 20:40:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:54.792 20:40:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:54.792 20:40:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:54.792 20:40:47 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:54.792 "name": "raid_bdev1", 00:26:54.792 "uuid": "0898e87d-be84-4312-b32b-82353db450ab", 00:26:54.792 "strip_size_kb": 0, 00:26:54.792 "state": "online", 00:26:54.792 "raid_level": "raid1", 00:26:54.792 "superblock": false, 00:26:54.792 "num_base_bdevs": 4, 00:26:54.792 "num_base_bdevs_discovered": 3, 00:26:54.792 "num_base_bdevs_operational": 3, 00:26:54.792 "process": { 00:26:54.792 "type": "rebuild", 00:26:54.792 "target": "spare", 00:26:54.792 "progress": { 00:26:54.792 "blocks": 30720, 00:26:54.792 "percent": 46 00:26:54.792 } 00:26:54.792 }, 00:26:54.792 "base_bdevs_list": [ 00:26:54.792 { 00:26:54.792 "name": "spare", 00:26:54.792 "uuid": "5c01eb47-70e8-5fb9-9e08-4146d265f22e", 00:26:54.792 "is_configured": true, 00:26:54.792 "data_offset": 0, 00:26:54.792 "data_size": 65536 00:26:54.792 }, 00:26:54.792 { 00:26:54.792 "name": null, 00:26:54.792 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:54.792 "is_configured": false, 00:26:54.792 "data_offset": 0, 00:26:54.792 "data_size": 65536 00:26:54.792 }, 00:26:54.792 { 00:26:54.792 "name": "BaseBdev3", 00:26:54.792 "uuid": "f44c9038-dd6a-5709-b0a1-c420354394b7", 00:26:54.792 "is_configured": true, 00:26:54.792 "data_offset": 0, 00:26:54.792 "data_size": 65536 00:26:54.792 }, 00:26:54.792 { 00:26:54.792 "name": "BaseBdev4", 00:26:54.792 "uuid": "997d3376-a1fc-5135-880a-3b4245f75e2b", 00:26:54.792 "is_configured": true, 00:26:54.792 "data_offset": 0, 00:26:54.792 "data_size": 65536 00:26:54.792 } 00:26:54.792 ] 00:26:54.792 }' 00:26:54.792 20:40:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:54.792 20:40:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:54.792 20:40:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:55.050 20:40:47 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:55.050 20:40:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:26:55.308 [2024-07-15 20:40:47.538210] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 38912 offset_begin: 36864 offset_end: 43008 00:26:55.875 20:40:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:26:55.875 20:40:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:55.875 20:40:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:55.875 20:40:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:55.875 20:40:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:55.876 20:40:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:55.876 20:40:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:55.876 20:40:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:55.876 [2024-07-15 20:40:48.238231] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 51200 offset_begin: 49152 offset_end: 55296 00:26:56.135 20:40:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:56.135 "name": "raid_bdev1", 00:26:56.135 "uuid": "0898e87d-be84-4312-b32b-82353db450ab", 00:26:56.135 "strip_size_kb": 0, 00:26:56.135 "state": "online", 00:26:56.135 "raid_level": "raid1", 00:26:56.135 "superblock": false, 00:26:56.135 "num_base_bdevs": 4, 00:26:56.135 "num_base_bdevs_discovered": 3, 00:26:56.135 "num_base_bdevs_operational": 3, 00:26:56.135 "process": { 00:26:56.135 "type": "rebuild", 00:26:56.135 
"target": "spare", 00:26:56.135 "progress": { 00:26:56.135 "blocks": 51200, 00:26:56.135 "percent": 78 00:26:56.135 } 00:26:56.135 }, 00:26:56.135 "base_bdevs_list": [ 00:26:56.135 { 00:26:56.135 "name": "spare", 00:26:56.135 "uuid": "5c01eb47-70e8-5fb9-9e08-4146d265f22e", 00:26:56.135 "is_configured": true, 00:26:56.135 "data_offset": 0, 00:26:56.135 "data_size": 65536 00:26:56.135 }, 00:26:56.135 { 00:26:56.135 "name": null, 00:26:56.135 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:56.135 "is_configured": false, 00:26:56.135 "data_offset": 0, 00:26:56.135 "data_size": 65536 00:26:56.135 }, 00:26:56.135 { 00:26:56.135 "name": "BaseBdev3", 00:26:56.135 "uuid": "f44c9038-dd6a-5709-b0a1-c420354394b7", 00:26:56.135 "is_configured": true, 00:26:56.135 "data_offset": 0, 00:26:56.135 "data_size": 65536 00:26:56.135 }, 00:26:56.135 { 00:26:56.135 "name": "BaseBdev4", 00:26:56.135 "uuid": "997d3376-a1fc-5135-880a-3b4245f75e2b", 00:26:56.135 "is_configured": true, 00:26:56.135 "data_offset": 0, 00:26:56.135 "data_size": 65536 00:26:56.135 } 00:26:56.135 ] 00:26:56.135 }' 00:26:56.135 20:40:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:56.135 20:40:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:56.135 20:40:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:56.135 [2024-07-15 20:40:48.451274] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 53248 offset_begin: 49152 offset_end: 55296 00:26:56.135 20:40:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:56.135 20:40:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:26:57.104 [2024-07-15 20:40:49.259096] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:26:57.104 [2024-07-15 20:40:49.359325] 
bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:26:57.104 [2024-07-15 20:40:49.361691] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:57.104 20:40:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:26:57.104 20:40:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:57.104 20:40:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:57.104 20:40:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:57.104 20:40:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:57.104 20:40:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:57.104 20:40:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:57.104 20:40:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:57.363 20:40:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:57.363 "name": "raid_bdev1", 00:26:57.363 "uuid": "0898e87d-be84-4312-b32b-82353db450ab", 00:26:57.363 "strip_size_kb": 0, 00:26:57.363 "state": "online", 00:26:57.363 "raid_level": "raid1", 00:26:57.363 "superblock": false, 00:26:57.363 "num_base_bdevs": 4, 00:26:57.363 "num_base_bdevs_discovered": 3, 00:26:57.363 "num_base_bdevs_operational": 3, 00:26:57.363 "base_bdevs_list": [ 00:26:57.363 { 00:26:57.363 "name": "spare", 00:26:57.363 "uuid": "5c01eb47-70e8-5fb9-9e08-4146d265f22e", 00:26:57.363 "is_configured": true, 00:26:57.363 "data_offset": 0, 00:26:57.363 "data_size": 65536 00:26:57.363 }, 00:26:57.363 { 00:26:57.363 "name": null, 00:26:57.363 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:26:57.363 "is_configured": false, 00:26:57.363 "data_offset": 0, 00:26:57.363 "data_size": 65536 00:26:57.363 }, 00:26:57.363 { 00:26:57.363 "name": "BaseBdev3", 00:26:57.363 "uuid": "f44c9038-dd6a-5709-b0a1-c420354394b7", 00:26:57.363 "is_configured": true, 00:26:57.363 "data_offset": 0, 00:26:57.363 "data_size": 65536 00:26:57.363 }, 00:26:57.363 { 00:26:57.363 "name": "BaseBdev4", 00:26:57.363 "uuid": "997d3376-a1fc-5135-880a-3b4245f75e2b", 00:26:57.363 "is_configured": true, 00:26:57.363 "data_offset": 0, 00:26:57.363 "data_size": 65536 00:26:57.363 } 00:26:57.363 ] 00:26:57.363 }' 00:26:57.363 20:40:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:57.363 20:40:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:26:57.363 20:40:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:57.363 20:40:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:26:57.363 20:40:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@708 -- # break 00:26:57.363 20:40:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:57.363 20:40:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:57.363 20:40:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:57.363 20:40:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:57.363 20:40:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:57.363 20:40:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:57.363 20:40:49 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:57.622 20:40:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:57.622 "name": "raid_bdev1", 00:26:57.622 "uuid": "0898e87d-be84-4312-b32b-82353db450ab", 00:26:57.622 "strip_size_kb": 0, 00:26:57.622 "state": "online", 00:26:57.622 "raid_level": "raid1", 00:26:57.622 "superblock": false, 00:26:57.622 "num_base_bdevs": 4, 00:26:57.622 "num_base_bdevs_discovered": 3, 00:26:57.622 "num_base_bdevs_operational": 3, 00:26:57.622 "base_bdevs_list": [ 00:26:57.622 { 00:26:57.622 "name": "spare", 00:26:57.622 "uuid": "5c01eb47-70e8-5fb9-9e08-4146d265f22e", 00:26:57.622 "is_configured": true, 00:26:57.622 "data_offset": 0, 00:26:57.622 "data_size": 65536 00:26:57.622 }, 00:26:57.622 { 00:26:57.622 "name": null, 00:26:57.622 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:57.622 "is_configured": false, 00:26:57.622 "data_offset": 0, 00:26:57.622 "data_size": 65536 00:26:57.622 }, 00:26:57.622 { 00:26:57.622 "name": "BaseBdev3", 00:26:57.622 "uuid": "f44c9038-dd6a-5709-b0a1-c420354394b7", 00:26:57.622 "is_configured": true, 00:26:57.622 "data_offset": 0, 00:26:57.622 "data_size": 65536 00:26:57.622 }, 00:26:57.622 { 00:26:57.622 "name": "BaseBdev4", 00:26:57.622 "uuid": "997d3376-a1fc-5135-880a-3b4245f75e2b", 00:26:57.622 "is_configured": true, 00:26:57.622 "data_offset": 0, 00:26:57.622 "data_size": 65536 00:26:57.622 } 00:26:57.622 ] 00:26:57.622 }' 00:26:57.622 20:40:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:57.622 20:40:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:57.622 20:40:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:57.880 20:40:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:57.880 20:40:50 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:26:57.880 20:40:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:57.880 20:40:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:57.880 20:40:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:57.880 20:40:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:57.880 20:40:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:57.880 20:40:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:57.880 20:40:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:57.880 20:40:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:57.881 20:40:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:57.881 20:40:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:57.881 20:40:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:58.140 20:40:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:58.140 "name": "raid_bdev1", 00:26:58.140 "uuid": "0898e87d-be84-4312-b32b-82353db450ab", 00:26:58.140 "strip_size_kb": 0, 00:26:58.140 "state": "online", 00:26:58.140 "raid_level": "raid1", 00:26:58.140 "superblock": false, 00:26:58.140 "num_base_bdevs": 4, 00:26:58.140 "num_base_bdevs_discovered": 3, 00:26:58.140 "num_base_bdevs_operational": 3, 00:26:58.140 "base_bdevs_list": [ 00:26:58.140 { 00:26:58.140 "name": "spare", 00:26:58.140 "uuid": "5c01eb47-70e8-5fb9-9e08-4146d265f22e", 00:26:58.140 "is_configured": 
true, 00:26:58.140 "data_offset": 0, 00:26:58.140 "data_size": 65536 00:26:58.140 }, 00:26:58.140 { 00:26:58.140 "name": null, 00:26:58.140 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:58.140 "is_configured": false, 00:26:58.140 "data_offset": 0, 00:26:58.140 "data_size": 65536 00:26:58.140 }, 00:26:58.140 { 00:26:58.140 "name": "BaseBdev3", 00:26:58.140 "uuid": "f44c9038-dd6a-5709-b0a1-c420354394b7", 00:26:58.140 "is_configured": true, 00:26:58.140 "data_offset": 0, 00:26:58.140 "data_size": 65536 00:26:58.140 }, 00:26:58.140 { 00:26:58.140 "name": "BaseBdev4", 00:26:58.140 "uuid": "997d3376-a1fc-5135-880a-3b4245f75e2b", 00:26:58.140 "is_configured": true, 00:26:58.140 "data_offset": 0, 00:26:58.140 "data_size": 65536 00:26:58.140 } 00:26:58.140 ] 00:26:58.140 }' 00:26:58.140 20:40:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:58.140 20:40:50 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:26:58.708 20:40:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:26:58.966 [2024-07-15 20:40:51.125213] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:26:58.966 [2024-07-15 20:40:51.125250] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:26:58.966 00:26:58.966 Latency(us) 00:26:58.966 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:58.966 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:26:58.966 raid_bdev1 : 11.26 92.37 277.12 0.00 0.00 14787.35 279.60 123093.70 00:26:58.966 =================================================================================================================== 00:26:58.966 Total : 92.37 277.12 0.00 0.00 14787.35 279.60 123093.70 00:26:58.966 [2024-07-15 20:40:51.229456] bdev_raid.c: 
331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:58.966 [2024-07-15 20:40:51.229488] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:58.966 [2024-07-15 20:40:51.229584] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:58.966 [2024-07-15 20:40:51.229596] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x13f18a0 name raid_bdev1, state offline 00:26:58.966 0 00:26:58.966 20:40:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:58.966 20:40:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # jq length 00:26:59.225 20:40:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:26:59.225 20:40:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:26:59.225 20:40:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:26:59.225 20:40:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:26:59.225 20:40:51 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:59.225 20:40:51 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:26:59.225 20:40:51 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:26:59.225 20:40:51 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:26:59.225 20:40:51 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:26:59.225 20:40:51 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:26:59.225 20:40:51 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:26:59.225 20:40:51 bdev_raid.raid_rebuild_test_io -- 
bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:59.225 20:40:51 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:26:59.793 /dev/nbd0 00:26:59.793 20:40:51 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:26:59.793 20:40:51 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:26:59.793 20:40:51 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:26:59.793 20:40:51 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:26:59.793 20:40:51 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:26:59.793 20:40:51 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:26:59.793 20:40:51 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:26:59.793 20:40:51 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:26:59.793 20:40:51 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:26:59.793 20:40:51 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:26:59.793 20:40:51 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:59.793 1+0 records in 00:26:59.793 1+0 records out 00:26:59.793 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000271307 s, 15.1 MB/s 00:26:59.793 20:40:51 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:59.793 20:40:51 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:26:59.793 20:40:51 bdev_raid.raid_rebuild_test_io -- 
common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:59.793 20:40:51 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:26:59.793 20:40:51 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:26:59.793 20:40:51 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:59.793 20:40:51 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:59.793 20:40:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:26:59.793 20:40:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z '' ']' 00:26:59.793 20:40:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@727 -- # continue 00:26:59.793 20:40:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:26:59.793 20:40:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev3 ']' 00:26:59.793 20:40:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev3 /dev/nbd1 00:26:59.793 20:40:51 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:59.793 20:40:51 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev3') 00:26:59.793 20:40:51 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:26:59.793 20:40:51 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:26:59.793 20:40:51 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:26:59.793 20:40:51 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:26:59.793 20:40:51 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:26:59.793 20:40:51 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 
00:26:59.793 20:40:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev3 /dev/nbd1 00:26:59.793 /dev/nbd1 00:27:00.052 20:40:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:27:00.052 20:40:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:27:00.052 20:40:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:27:00.052 20:40:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:27:00.052 20:40:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:27:00.052 20:40:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:27:00.052 20:40:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:27:00.052 20:40:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:27:00.052 20:40:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:27:00.052 20:40:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:27:00.052 20:40:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:00.052 1+0 records in 00:27:00.052 1+0 records out 00:27:00.052 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000252477 s, 16.2 MB/s 00:27:00.052 20:40:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:00.052 20:40:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:27:00.052 20:40:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:00.052 20:40:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:27:00.052 20:40:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:27:00.052 20:40:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:27:00.052 20:40:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:27:00.052 20:40:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:27:00.052 20:40:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:27:00.052 20:40:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:00.052 20:40:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:27:00.052 20:40:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:27:00.052 20:40:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:27:00.052 20:40:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:00.052 20:40:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:27:00.311 20:40:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:27:00.311 20:40:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:27:00.311 20:40:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:27:00.311 20:40:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:00.311 20:40:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:00.311 20:40:52 bdev_raid.raid_rebuild_test_io -- 
bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:27:00.311 20:40:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:27:00.311 20:40:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:27:00.311 20:40:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:27:00.311 20:40:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev4 ']' 00:27:00.311 20:40:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev4 /dev/nbd1 00:27:00.311 20:40:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:00.311 20:40:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev4') 00:27:00.311 20:40:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:27:00.311 20:40:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:27:00.311 20:40:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:27:00.311 20:40:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:27:00.311 20:40:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:27:00.311 20:40:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:27:00.311 20:40:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev4 /dev/nbd1 00:27:00.570 /dev/nbd1 00:27:00.570 20:40:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:27:00.570 20:40:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:27:00.570 20:40:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:27:00.570 20:40:52 
bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:27:00.570 20:40:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:27:00.570 20:40:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:27:00.570 20:40:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:27:00.570 20:40:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:27:00.570 20:40:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:27:00.570 20:40:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:27:00.570 20:40:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:00.570 1+0 records in 00:27:00.570 1+0 records out 00:27:00.570 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000319427 s, 12.8 MB/s 00:27:00.571 20:40:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:00.571 20:40:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:27:00.571 20:40:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:00.571 20:40:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:27:00.571 20:40:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:27:00.571 20:40:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:27:00.571 20:40:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:27:00.571 20:40:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 
00:27:00.571 20:40:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:27:00.571 20:40:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:00.571 20:40:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:27:00.571 20:40:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:27:00.571 20:40:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:27:00.571 20:40:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:00.571 20:40:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:27:00.829 20:40:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:27:00.829 20:40:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:27:00.829 20:40:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:27:00.829 20:40:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:00.829 20:40:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:00.829 20:40:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:27:00.829 20:40:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:27:00.829 20:40:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:27:00.829 20:40:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:27:00.829 20:40:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:00.829 20:40:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # 
nbd_list=('/dev/nbd0') 00:27:00.829 20:40:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:27:00.829 20:40:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:27:00.830 20:40:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:00.830 20:40:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:27:01.088 20:40:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:27:01.088 20:40:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:27:01.088 20:40:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:27:01.088 20:40:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:01.088 20:40:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:01.088 20:40:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:27:01.088 20:40:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:27:01.088 20:40:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:27:01.088 20:40:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:27:01.088 20:40:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@782 -- # killprocess 1485598 00:27:01.088 20:40:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@948 -- # '[' -z 1485598 ']' 00:27:01.088 20:40:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@952 -- # kill -0 1485598 00:27:01.088 20:40:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- # uname 00:27:01.088 20:40:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:27:01.088 20:40:53 
bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1485598 00:27:01.088 20:40:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:27:01.088 20:40:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:27:01.088 20:40:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1485598' 00:27:01.088 killing process with pid 1485598 00:27:01.088 20:40:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@967 -- # kill 1485598 00:27:01.088 Received shutdown signal, test time was about 13.412166 seconds 00:27:01.088 00:27:01.088 Latency(us) 00:27:01.088 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:01.088 =================================================================================================================== 00:27:01.088 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:27:01.088 [2024-07-15 20:40:53.384201] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:27:01.088 20:40:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@972 -- # wait 1485598 00:27:01.088 [2024-07-15 20:40:53.427797] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:27:01.347 20:40:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@784 -- # return 0 00:27:01.347 00:27:01.347 real 0m18.841s 00:27:01.347 user 0m29.002s 00:27:01.347 sys 0m3.349s 00:27:01.347 20:40:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:27:01.347 20:40:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:27:01.347 ************************************ 00:27:01.347 END TEST raid_rebuild_test_io 00:27:01.347 ************************************ 00:27:01.347 20:40:53 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:27:01.347 20:40:53 bdev_raid -- bdev/bdev_raid.sh@880 -- # run_test 
raid_rebuild_test_sb_io raid_rebuild_test raid1 4 true true true 00:27:01.347 20:40:53 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:27:01.347 20:40:53 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:27:01.347 20:40:53 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:27:01.605 ************************************ 00:27:01.605 START TEST raid_rebuild_test_sb_io 00:27:01.605 ************************************ 00:27:01.605 20:40:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 4 true true true 00:27:01.605 20:40:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:27:01.605 20:40:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:27:01.605 20:40:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:27:01.605 20:40:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:27:01.605 20:40:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:27:01.605 20:40:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:27:01.605 20:40:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:27:01.605 20:40:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:27:01.605 20:40:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:27:01.605 20:40:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:27:01.605 20:40:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:27:01.605 20:40:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:27:01.605 20:40:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:27:01.605 20:40:53 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:27:01.605 20:40:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:27:01.605 20:40:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:27:01.605 20:40:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:27:01.605 20:40:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:27:01.605 20:40:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:27:01.605 20:40:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:27:01.605 20:40:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:27:01.605 20:40:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:27:01.605 20:40:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:27:01.605 20:40:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:27:01.605 20:40:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:27:01.605 20:40:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:27:01.605 20:40:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:27:01.605 20:40:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:27:01.605 20:40:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:27:01.605 20:40:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:27:01.605 20:40:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@596 -- # raid_pid=1488256 00:27:01.605 20:40:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@597 -- # 
waitforlisten 1488256 /var/tmp/spdk-raid.sock 00:27:01.605 20:40:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:27:01.605 20:40:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@829 -- # '[' -z 1488256 ']' 00:27:01.606 20:40:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:27:01.606 20:40:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@834 -- # local max_retries=100 00:27:01.606 20:40:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:27:01.606 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:27:01.606 20:40:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@838 -- # xtrace_disable 00:27:01.606 20:40:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:27:01.606 [2024-07-15 20:40:53.815554] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 00:27:01.606 [2024-07-15 20:40:53.815602] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1488256 ] 00:27:01.606 I/O size of 3145728 is greater than zero copy threshold (65536). 00:27:01.606 Zero copy mechanism will not be used. 
00:27:01.606 [2024-07-15 20:40:53.918432] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:01.864 [2024-07-15 20:40:54.020884] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:27:01.864 [2024-07-15 20:40:54.074475] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:01.864 [2024-07-15 20:40:54.074512] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:01.864 20:40:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:27:01.864 20:40:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@862 -- # return 0 00:27:01.864 20:40:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:27:01.864 20:40:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:27:02.122 BaseBdev1_malloc 00:27:02.122 20:40:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:27:02.380 [2024-07-15 20:40:54.554587] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:27:02.380 [2024-07-15 20:40:54.554636] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:02.380 [2024-07-15 20:40:54.554658] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f6ad40 00:27:02.380 [2024-07-15 20:40:54.554670] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:02.380 [2024-07-15 20:40:54.556244] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:02.380 [2024-07-15 20:40:54.556273] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:27:02.380 
BaseBdev1 00:27:02.380 20:40:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:27:02.380 20:40:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:27:02.380 BaseBdev2_malloc 00:27:02.381 20:40:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:27:02.639 [2024-07-15 20:40:54.900484] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:27:02.639 [2024-07-15 20:40:54.900529] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:02.639 [2024-07-15 20:40:54.900552] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f6b860 00:27:02.639 [2024-07-15 20:40:54.900565] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:02.639 [2024-07-15 20:40:54.902019] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:02.639 [2024-07-15 20:40:54.902046] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:27:02.639 BaseBdev2 00:27:02.639 20:40:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:27:02.639 20:40:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:27:02.898 BaseBdev3_malloc 00:27:02.898 20:40:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:27:02.898 [2024-07-15 
20:40:55.242029] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:27:02.898 [2024-07-15 20:40:55.242076] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:02.898 [2024-07-15 20:40:55.242095] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21188f0 00:27:02.898 [2024-07-15 20:40:55.242113] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:02.898 [2024-07-15 20:40:55.243481] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:02.898 [2024-07-15 20:40:55.243508] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:27:02.898 BaseBdev3 00:27:02.898 20:40:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:27:02.898 20:40:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:27:03.157 BaseBdev4_malloc 00:27:03.157 20:40:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:27:03.416 [2024-07-15 20:40:55.599582] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:27:03.416 [2024-07-15 20:40:55.599625] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:03.416 [2024-07-15 20:40:55.599643] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2117ad0 00:27:03.416 [2024-07-15 20:40:55.599656] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:03.416 [2024-07-15 20:40:55.601025] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:03.416 [2024-07-15 20:40:55.601052] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:27:03.416 BaseBdev4 00:27:03.416 20:40:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:27:03.674 spare_malloc 00:27:03.674 20:40:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:27:03.674 spare_delay 00:27:03.933 20:40:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:27:03.933 [2024-07-15 20:40:56.209816] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:27:03.933 [2024-07-15 20:40:56.209865] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:03.933 [2024-07-15 20:40:56.209886] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x211c5b0 00:27:03.933 [2024-07-15 20:40:56.209898] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:03.933 [2024-07-15 20:40:56.211373] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:03.933 [2024-07-15 20:40:56.211403] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:27:03.933 spare 00:27:03.933 20:40:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:27:04.191 [2024-07-15 20:40:56.386320] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:27:04.191 [2024-07-15 
20:40:56.387508] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:27:04.191 [2024-07-15 20:40:56.387561] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:27:04.191 [2024-07-15 20:40:56.387606] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:27:04.191 [2024-07-15 20:40:56.387801] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x209b8a0 00:27:04.191 [2024-07-15 20:40:56.387812] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:27:04.191 [2024-07-15 20:40:56.388017] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2115e10 00:27:04.191 [2024-07-15 20:40:56.388167] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x209b8a0 00:27:04.191 [2024-07-15 20:40:56.388177] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x209b8a0 00:27:04.191 [2024-07-15 20:40:56.388274] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:04.191 20:40:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:27:04.191 20:40:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:04.191 20:40:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:04.191 20:40:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:04.191 20:40:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:04.191 20:40:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:27:04.191 20:40:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:04.191 20:40:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 
-- # local num_base_bdevs 00:27:04.191 20:40:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:04.191 20:40:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:04.191 20:40:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:04.192 20:40:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:04.450 20:40:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:04.450 "name": "raid_bdev1", 00:27:04.450 "uuid": "def3905d-a02e-4e4d-9f27-65e275e658fa", 00:27:04.450 "strip_size_kb": 0, 00:27:04.450 "state": "online", 00:27:04.450 "raid_level": "raid1", 00:27:04.450 "superblock": true, 00:27:04.450 "num_base_bdevs": 4, 00:27:04.450 "num_base_bdevs_discovered": 4, 00:27:04.451 "num_base_bdevs_operational": 4, 00:27:04.451 "base_bdevs_list": [ 00:27:04.451 { 00:27:04.451 "name": "BaseBdev1", 00:27:04.451 "uuid": "94c2ecf7-c00b-5f27-af58-d4d66ceb770b", 00:27:04.451 "is_configured": true, 00:27:04.451 "data_offset": 2048, 00:27:04.451 "data_size": 63488 00:27:04.451 }, 00:27:04.451 { 00:27:04.451 "name": "BaseBdev2", 00:27:04.451 "uuid": "ff15e32d-1926-57aa-9d48-b2dffa00cb04", 00:27:04.451 "is_configured": true, 00:27:04.451 "data_offset": 2048, 00:27:04.451 "data_size": 63488 00:27:04.451 }, 00:27:04.451 { 00:27:04.451 "name": "BaseBdev3", 00:27:04.451 "uuid": "7909ef61-a74c-5dc1-bdb6-0a581b9ffa17", 00:27:04.451 "is_configured": true, 00:27:04.451 "data_offset": 2048, 00:27:04.451 "data_size": 63488 00:27:04.451 }, 00:27:04.451 { 00:27:04.451 "name": "BaseBdev4", 00:27:04.451 "uuid": "9673f532-83c5-59e8-9256-b2534b3ba1e1", 00:27:04.451 "is_configured": true, 00:27:04.451 "data_offset": 2048, 00:27:04.451 "data_size": 63488 00:27:04.451 } 00:27:04.451 ] 
00:27:04.451 }' 00:27:04.451 20:40:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:04.451 20:40:56 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:27:05.018 20:40:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:05.018 20:40:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:27:05.277 [2024-07-15 20:40:57.417347] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:05.277 20:40:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:27:05.277 20:40:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:05.277 20:40:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:27:05.535 20:40:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:27:05.535 20:40:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:27:05.535 20:40:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:27:05.535 20:40:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:27:05.535 [2024-07-15 20:40:57.808172] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f6a670 00:27:05.535 I/O size of 3145728 is greater than zero copy threshold (65536). 00:27:05.535 Zero copy mechanism will not be used. 
00:27:05.535 Running I/O for 60 seconds... 00:27:05.794 [2024-07-15 20:40:57.929440] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:27:05.794 [2024-07-15 20:40:57.945667] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1f6a670 00:27:05.794 20:40:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:27:05.794 20:40:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:05.794 20:40:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:05.794 20:40:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:05.794 20:40:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:05.794 20:40:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:27:05.794 20:40:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:05.794 20:40:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:05.794 20:40:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:05.794 20:40:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:05.794 20:40:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:05.794 20:40:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:06.052 20:40:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:06.052 "name": "raid_bdev1", 00:27:06.052 "uuid": "def3905d-a02e-4e4d-9f27-65e275e658fa", 00:27:06.052 "strip_size_kb": 0, 00:27:06.052 "state": 
"online", 00:27:06.052 "raid_level": "raid1", 00:27:06.052 "superblock": true, 00:27:06.052 "num_base_bdevs": 4, 00:27:06.052 "num_base_bdevs_discovered": 3, 00:27:06.052 "num_base_bdevs_operational": 3, 00:27:06.052 "base_bdevs_list": [ 00:27:06.052 { 00:27:06.052 "name": null, 00:27:06.052 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:06.052 "is_configured": false, 00:27:06.052 "data_offset": 2048, 00:27:06.052 "data_size": 63488 00:27:06.052 }, 00:27:06.052 { 00:27:06.052 "name": "BaseBdev2", 00:27:06.052 "uuid": "ff15e32d-1926-57aa-9d48-b2dffa00cb04", 00:27:06.052 "is_configured": true, 00:27:06.052 "data_offset": 2048, 00:27:06.052 "data_size": 63488 00:27:06.052 }, 00:27:06.052 { 00:27:06.052 "name": "BaseBdev3", 00:27:06.052 "uuid": "7909ef61-a74c-5dc1-bdb6-0a581b9ffa17", 00:27:06.052 "is_configured": true, 00:27:06.052 "data_offset": 2048, 00:27:06.052 "data_size": 63488 00:27:06.052 }, 00:27:06.052 { 00:27:06.052 "name": "BaseBdev4", 00:27:06.052 "uuid": "9673f532-83c5-59e8-9256-b2534b3ba1e1", 00:27:06.052 "is_configured": true, 00:27:06.052 "data_offset": 2048, 00:27:06.052 "data_size": 63488 00:27:06.052 } 00:27:06.052 ] 00:27:06.052 }' 00:27:06.052 20:40:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:06.052 20:40:58 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:27:06.620 20:40:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:27:06.879 [2024-07-15 20:40:59.038865] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:06.879 20:40:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:27:06.879 [2024-07-15 20:40:59.122117] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x209db40 00:27:06.879 [2024-07-15 20:40:59.124529] 
bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:27:07.138 [2024-07-15 20:40:59.282021] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:27:07.138 [2024-07-15 20:40:59.282528] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:27:07.138 [2024-07-15 20:40:59.487461] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:27:07.138 [2024-07-15 20:40:59.488150] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:27:07.706 [2024-07-15 20:40:59.864295] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:27:07.706 [2024-07-15 20:41:00.005681] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:27:07.966 20:41:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:07.966 20:41:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:07.966 20:41:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:07.966 20:41:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:07.966 20:41:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:07.966 20:41:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:07.966 20:41:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:08.224 [2024-07-15 
20:41:00.364406] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:27:08.224 [2024-07-15 20:41:00.365602] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:27:08.224 20:41:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:08.224 "name": "raid_bdev1", 00:27:08.224 "uuid": "def3905d-a02e-4e4d-9f27-65e275e658fa", 00:27:08.224 "strip_size_kb": 0, 00:27:08.224 "state": "online", 00:27:08.224 "raid_level": "raid1", 00:27:08.224 "superblock": true, 00:27:08.224 "num_base_bdevs": 4, 00:27:08.224 "num_base_bdevs_discovered": 4, 00:27:08.224 "num_base_bdevs_operational": 4, 00:27:08.224 "process": { 00:27:08.224 "type": "rebuild", 00:27:08.224 "target": "spare", 00:27:08.224 "progress": { 00:27:08.224 "blocks": 12288, 00:27:08.224 "percent": 19 00:27:08.224 } 00:27:08.224 }, 00:27:08.224 "base_bdevs_list": [ 00:27:08.224 { 00:27:08.224 "name": "spare", 00:27:08.224 "uuid": "3bb25211-764a-50ee-9c93-e0d747556981", 00:27:08.224 "is_configured": true, 00:27:08.224 "data_offset": 2048, 00:27:08.224 "data_size": 63488 00:27:08.224 }, 00:27:08.224 { 00:27:08.224 "name": "BaseBdev2", 00:27:08.224 "uuid": "ff15e32d-1926-57aa-9d48-b2dffa00cb04", 00:27:08.224 "is_configured": true, 00:27:08.224 "data_offset": 2048, 00:27:08.224 "data_size": 63488 00:27:08.224 }, 00:27:08.224 { 00:27:08.224 "name": "BaseBdev3", 00:27:08.224 "uuid": "7909ef61-a74c-5dc1-bdb6-0a581b9ffa17", 00:27:08.224 "is_configured": true, 00:27:08.224 "data_offset": 2048, 00:27:08.224 "data_size": 63488 00:27:08.224 }, 00:27:08.224 { 00:27:08.224 "name": "BaseBdev4", 00:27:08.224 "uuid": "9673f532-83c5-59e8-9256-b2534b3ba1e1", 00:27:08.224 "is_configured": true, 00:27:08.224 "data_offset": 2048, 00:27:08.224 "data_size": 63488 00:27:08.224 } 00:27:08.224 ] 00:27:08.224 }' 00:27:08.224 20:41:00 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:08.224 20:41:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:08.224 20:41:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:08.224 20:41:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:08.224 20:41:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:27:08.224 [2024-07-15 20:41:00.602228] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:08.483 [2024-07-15 20:41:00.637564] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:27:08.483 [2024-07-15 20:41:00.738170] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:27:08.483 [2024-07-15 20:41:00.749399] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:08.483 [2024-07-15 20:41:00.749432] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:08.483 [2024-07-15 20:41:00.749444] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:27:08.483 [2024-07-15 20:41:00.781382] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1f6a670 00:27:08.483 20:41:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:27:08.483 20:41:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:08.483 20:41:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:08.483 20:41:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # 
local raid_level=raid1 00:27:08.483 20:41:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:08.483 20:41:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:27:08.483 20:41:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:08.483 20:41:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:08.483 20:41:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:08.483 20:41:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:08.483 20:41:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:08.483 20:41:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:08.741 20:41:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:08.741 "name": "raid_bdev1", 00:27:08.741 "uuid": "def3905d-a02e-4e4d-9f27-65e275e658fa", 00:27:08.741 "strip_size_kb": 0, 00:27:08.741 "state": "online", 00:27:08.741 "raid_level": "raid1", 00:27:08.741 "superblock": true, 00:27:08.742 "num_base_bdevs": 4, 00:27:08.742 "num_base_bdevs_discovered": 3, 00:27:08.742 "num_base_bdevs_operational": 3, 00:27:08.742 "base_bdevs_list": [ 00:27:08.742 { 00:27:08.742 "name": null, 00:27:08.742 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:08.742 "is_configured": false, 00:27:08.742 "data_offset": 2048, 00:27:08.742 "data_size": 63488 00:27:08.742 }, 00:27:08.742 { 00:27:08.742 "name": "BaseBdev2", 00:27:08.742 "uuid": "ff15e32d-1926-57aa-9d48-b2dffa00cb04", 00:27:08.742 "is_configured": true, 00:27:08.742 "data_offset": 2048, 00:27:08.742 "data_size": 63488 00:27:08.742 }, 00:27:08.742 { 00:27:08.742 "name": "BaseBdev3", 
00:27:08.742 "uuid": "7909ef61-a74c-5dc1-bdb6-0a581b9ffa17", 00:27:08.742 "is_configured": true, 00:27:08.742 "data_offset": 2048, 00:27:08.742 "data_size": 63488 00:27:08.742 }, 00:27:08.742 { 00:27:08.742 "name": "BaseBdev4", 00:27:08.742 "uuid": "9673f532-83c5-59e8-9256-b2534b3ba1e1", 00:27:08.742 "is_configured": true, 00:27:08.742 "data_offset": 2048, 00:27:08.742 "data_size": 63488 00:27:08.742 } 00:27:08.742 ] 00:27:08.742 }' 00:27:08.742 20:41:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:08.742 20:41:01 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:27:09.309 20:41:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:09.309 20:41:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:09.309 20:41:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:09.309 20:41:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:09.309 20:41:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:09.309 20:41:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:09.309 20:41:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:09.568 20:41:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:09.568 "name": "raid_bdev1", 00:27:09.568 "uuid": "def3905d-a02e-4e4d-9f27-65e275e658fa", 00:27:09.568 "strip_size_kb": 0, 00:27:09.568 "state": "online", 00:27:09.568 "raid_level": "raid1", 00:27:09.568 "superblock": true, 00:27:09.568 "num_base_bdevs": 4, 00:27:09.568 "num_base_bdevs_discovered": 3, 00:27:09.568 "num_base_bdevs_operational": 3, 
00:27:09.568 "base_bdevs_list": [ 00:27:09.568 { 00:27:09.568 "name": null, 00:27:09.568 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:09.568 "is_configured": false, 00:27:09.568 "data_offset": 2048, 00:27:09.568 "data_size": 63488 00:27:09.568 }, 00:27:09.568 { 00:27:09.568 "name": "BaseBdev2", 00:27:09.568 "uuid": "ff15e32d-1926-57aa-9d48-b2dffa00cb04", 00:27:09.568 "is_configured": true, 00:27:09.568 "data_offset": 2048, 00:27:09.568 "data_size": 63488 00:27:09.568 }, 00:27:09.568 { 00:27:09.568 "name": "BaseBdev3", 00:27:09.568 "uuid": "7909ef61-a74c-5dc1-bdb6-0a581b9ffa17", 00:27:09.568 "is_configured": true, 00:27:09.568 "data_offset": 2048, 00:27:09.568 "data_size": 63488 00:27:09.568 }, 00:27:09.568 { 00:27:09.568 "name": "BaseBdev4", 00:27:09.568 "uuid": "9673f532-83c5-59e8-9256-b2534b3ba1e1", 00:27:09.568 "is_configured": true, 00:27:09.568 "data_offset": 2048, 00:27:09.568 "data_size": 63488 00:27:09.568 } 00:27:09.568 ] 00:27:09.568 }' 00:27:09.568 20:41:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:09.568 20:41:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:09.568 20:41:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:09.827 20:41:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:09.827 20:41:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:27:10.085 [2024-07-15 20:41:02.219936] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:10.085 [2024-07-15 20:41:02.257790] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x209f8f0 00:27:10.085 [2024-07-15 20:41:02.259312] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild 
on raid bdev raid_bdev1 00:27:10.085 20:41:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:27:10.085 [2024-07-15 20:41:02.396986] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:27:10.085 [2024-07-15 20:41:02.398253] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:27:10.343 [2024-07-15 20:41:02.620954] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:27:10.343 [2024-07-15 20:41:02.621191] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:27:10.911 [2024-07-15 20:41:02.992016] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:27:10.911 [2024-07-15 20:41:02.993302] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:27:10.911 [2024-07-15 20:41:03.255520] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:27:10.911 [2024-07-15 20:41:03.255708] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:27:10.911 20:41:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:10.911 20:41:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:10.911 20:41:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:10.911 20:41:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:10.911 20:41:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:10.911 
20:41:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:10.911 20:41:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:11.170 20:41:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:11.170 "name": "raid_bdev1", 00:27:11.170 "uuid": "def3905d-a02e-4e4d-9f27-65e275e658fa", 00:27:11.170 "strip_size_kb": 0, 00:27:11.170 "state": "online", 00:27:11.170 "raid_level": "raid1", 00:27:11.170 "superblock": true, 00:27:11.170 "num_base_bdevs": 4, 00:27:11.170 "num_base_bdevs_discovered": 4, 00:27:11.170 "num_base_bdevs_operational": 4, 00:27:11.170 "process": { 00:27:11.170 "type": "rebuild", 00:27:11.170 "target": "spare", 00:27:11.170 "progress": { 00:27:11.170 "blocks": 12288, 00:27:11.170 "percent": 19 00:27:11.170 } 00:27:11.170 }, 00:27:11.170 "base_bdevs_list": [ 00:27:11.170 { 00:27:11.170 "name": "spare", 00:27:11.170 "uuid": "3bb25211-764a-50ee-9c93-e0d747556981", 00:27:11.170 "is_configured": true, 00:27:11.170 "data_offset": 2048, 00:27:11.170 "data_size": 63488 00:27:11.170 }, 00:27:11.170 { 00:27:11.170 "name": "BaseBdev2", 00:27:11.170 "uuid": "ff15e32d-1926-57aa-9d48-b2dffa00cb04", 00:27:11.170 "is_configured": true, 00:27:11.170 "data_offset": 2048, 00:27:11.170 "data_size": 63488 00:27:11.170 }, 00:27:11.170 { 00:27:11.170 "name": "BaseBdev3", 00:27:11.170 "uuid": "7909ef61-a74c-5dc1-bdb6-0a581b9ffa17", 00:27:11.170 "is_configured": true, 00:27:11.170 "data_offset": 2048, 00:27:11.170 "data_size": 63488 00:27:11.170 }, 00:27:11.170 { 00:27:11.170 "name": "BaseBdev4", 00:27:11.170 "uuid": "9673f532-83c5-59e8-9256-b2534b3ba1e1", 00:27:11.170 "is_configured": true, 00:27:11.170 "data_offset": 2048, 00:27:11.170 "data_size": 63488 00:27:11.170 } 00:27:11.170 ] 00:27:11.170 }' 00:27:11.170 20:41:03 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:11.429 20:41:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:11.429 20:41:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:11.429 20:41:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:11.429 20:41:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:27:11.429 20:41:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:27:11.429 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:27:11.429 20:41:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:27:11.429 20:41:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:27:11.429 20:41:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:27:11.429 20:41:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:27:11.687 [2024-07-15 20:41:03.825216] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:27:11.687 [2024-07-15 20:41:04.048673] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x1f6a670 00:27:11.687 [2024-07-15 20:41:04.048706] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x209f8f0 00:27:11.946 20:41:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:27:11.946 20:41:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:27:11.946 20:41:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@701 -- # 
verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:11.946 20:41:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:11.946 20:41:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:11.946 20:41:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:11.946 20:41:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:11.946 20:41:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:11.946 20:41:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:12.205 20:41:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:12.205 "name": "raid_bdev1", 00:27:12.205 "uuid": "def3905d-a02e-4e4d-9f27-65e275e658fa", 00:27:12.205 "strip_size_kb": 0, 00:27:12.205 "state": "online", 00:27:12.205 "raid_level": "raid1", 00:27:12.205 "superblock": true, 00:27:12.205 "num_base_bdevs": 4, 00:27:12.205 "num_base_bdevs_discovered": 3, 00:27:12.205 "num_base_bdevs_operational": 3, 00:27:12.205 "process": { 00:27:12.205 "type": "rebuild", 00:27:12.205 "target": "spare", 00:27:12.205 "progress": { 00:27:12.205 "blocks": 22528, 00:27:12.205 "percent": 35 00:27:12.205 } 00:27:12.205 }, 00:27:12.205 "base_bdevs_list": [ 00:27:12.205 { 00:27:12.205 "name": "spare", 00:27:12.205 "uuid": "3bb25211-764a-50ee-9c93-e0d747556981", 00:27:12.205 "is_configured": true, 00:27:12.205 "data_offset": 2048, 00:27:12.205 "data_size": 63488 00:27:12.205 }, 00:27:12.205 { 00:27:12.205 "name": null, 00:27:12.205 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:12.205 "is_configured": false, 00:27:12.205 "data_offset": 2048, 00:27:12.205 "data_size": 63488 00:27:12.205 }, 00:27:12.205 { 00:27:12.205 
"name": "BaseBdev3", 00:27:12.205 "uuid": "7909ef61-a74c-5dc1-bdb6-0a581b9ffa17", 00:27:12.205 "is_configured": true, 00:27:12.205 "data_offset": 2048, 00:27:12.205 "data_size": 63488 00:27:12.205 }, 00:27:12.205 { 00:27:12.205 "name": "BaseBdev4", 00:27:12.205 "uuid": "9673f532-83c5-59e8-9256-b2534b3ba1e1", 00:27:12.205 "is_configured": true, 00:27:12.205 "data_offset": 2048, 00:27:12.205 "data_size": 63488 00:27:12.205 } 00:27:12.205 ] 00:27:12.205 }' 00:27:12.205 20:41:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:12.205 20:41:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:12.205 20:41:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:12.205 20:41:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:12.205 20:41:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@705 -- # local timeout=1002 00:27:12.205 20:41:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:27:12.205 20:41:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:12.205 20:41:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:12.205 20:41:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:12.205 20:41:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:12.205 20:41:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:12.205 20:41:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:12.205 20:41:04 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:12.205 [2024-07-15 20:41:04.528779] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 26624 offset_begin: 24576 offset_end: 30720 00:27:12.464 [2024-07-15 20:41:04.631333] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:27:12.464 20:41:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:12.464 "name": "raid_bdev1", 00:27:12.464 "uuid": "def3905d-a02e-4e4d-9f27-65e275e658fa", 00:27:12.464 "strip_size_kb": 0, 00:27:12.464 "state": "online", 00:27:12.464 "raid_level": "raid1", 00:27:12.464 "superblock": true, 00:27:12.464 "num_base_bdevs": 4, 00:27:12.464 "num_base_bdevs_discovered": 3, 00:27:12.464 "num_base_bdevs_operational": 3, 00:27:12.464 "process": { 00:27:12.464 "type": "rebuild", 00:27:12.464 "target": "spare", 00:27:12.464 "progress": { 00:27:12.464 "blocks": 28672, 00:27:12.464 "percent": 45 00:27:12.464 } 00:27:12.464 }, 00:27:12.464 "base_bdevs_list": [ 00:27:12.464 { 00:27:12.464 "name": "spare", 00:27:12.464 "uuid": "3bb25211-764a-50ee-9c93-e0d747556981", 00:27:12.464 "is_configured": true, 00:27:12.464 "data_offset": 2048, 00:27:12.464 "data_size": 63488 00:27:12.464 }, 00:27:12.464 { 00:27:12.464 "name": null, 00:27:12.464 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:12.464 "is_configured": false, 00:27:12.464 "data_offset": 2048, 00:27:12.464 "data_size": 63488 00:27:12.464 }, 00:27:12.464 { 00:27:12.464 "name": "BaseBdev3", 00:27:12.464 "uuid": "7909ef61-a74c-5dc1-bdb6-0a581b9ffa17", 00:27:12.464 "is_configured": true, 00:27:12.464 "data_offset": 2048, 00:27:12.464 "data_size": 63488 00:27:12.464 }, 00:27:12.464 { 00:27:12.464 "name": "BaseBdev4", 00:27:12.464 "uuid": "9673f532-83c5-59e8-9256-b2534b3ba1e1", 00:27:12.464 "is_configured": true, 00:27:12.464 "data_offset": 2048, 00:27:12.464 "data_size": 63488 00:27:12.464 } 
00:27:12.464 ] 00:27:12.464 }' 00:27:12.464 20:41:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:12.464 20:41:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:12.464 20:41:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:12.464 20:41:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:12.464 20:41:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:27:12.723 [2024-07-15 20:41:04.890272] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 32768 offset_begin: 30720 offset_end: 36864 00:27:12.982 [2024-07-15 20:41:05.317490] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 40960 offset_begin: 36864 offset_end: 43008 00:27:12.982 [2024-07-15 20:41:05.317966] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 40960 offset_begin: 36864 offset_end: 43008 00:27:13.580 [2024-07-15 20:41:05.668272] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 45056 offset_begin: 43008 offset_end: 49152 00:27:13.580 20:41:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:27:13.580 20:41:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:13.580 20:41:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:13.580 20:41:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:13.580 20:41:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:13.580 20:41:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:13.580 20:41:05 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:13.580 20:41:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:13.580 [2024-07-15 20:41:05.899635] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 47104 offset_begin: 43008 offset_end: 49152 00:27:13.839 20:41:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:13.839 "name": "raid_bdev1", 00:27:13.839 "uuid": "def3905d-a02e-4e4d-9f27-65e275e658fa", 00:27:13.839 "strip_size_kb": 0, 00:27:13.839 "state": "online", 00:27:13.839 "raid_level": "raid1", 00:27:13.839 "superblock": true, 00:27:13.839 "num_base_bdevs": 4, 00:27:13.839 "num_base_bdevs_discovered": 3, 00:27:13.839 "num_base_bdevs_operational": 3, 00:27:13.839 "process": { 00:27:13.839 "type": "rebuild", 00:27:13.839 "target": "spare", 00:27:13.839 "progress": { 00:27:13.839 "blocks": 47104, 00:27:13.839 "percent": 74 00:27:13.839 } 00:27:13.839 }, 00:27:13.839 "base_bdevs_list": [ 00:27:13.839 { 00:27:13.839 "name": "spare", 00:27:13.839 "uuid": "3bb25211-764a-50ee-9c93-e0d747556981", 00:27:13.839 "is_configured": true, 00:27:13.839 "data_offset": 2048, 00:27:13.839 "data_size": 63488 00:27:13.839 }, 00:27:13.839 { 00:27:13.839 "name": null, 00:27:13.839 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:13.839 "is_configured": false, 00:27:13.839 "data_offset": 2048, 00:27:13.839 "data_size": 63488 00:27:13.839 }, 00:27:13.839 { 00:27:13.839 "name": "BaseBdev3", 00:27:13.839 "uuid": "7909ef61-a74c-5dc1-bdb6-0a581b9ffa17", 00:27:13.839 "is_configured": true, 00:27:13.839 "data_offset": 2048, 00:27:13.839 "data_size": 63488 00:27:13.839 }, 00:27:13.839 { 00:27:13.839 "name": "BaseBdev4", 00:27:13.839 "uuid": "9673f532-83c5-59e8-9256-b2534b3ba1e1", 00:27:13.839 "is_configured": true, 00:27:13.839 "data_offset": 2048, 00:27:13.839 
"data_size": 63488 00:27:13.839 } 00:27:13.839 ] 00:27:13.839 }' 00:27:13.839 20:41:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:13.839 20:41:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:13.839 20:41:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:13.839 20:41:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:13.839 20:41:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:27:14.406 [2024-07-15 20:41:06.568201] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 59392 offset_begin: 55296 offset_end: 61440 00:27:14.664 [2024-07-15 20:41:06.901820] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:27:14.664 [2024-07-15 20:41:07.010051] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:27:14.664 [2024-07-15 20:41:07.012263] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:14.922 20:41:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:27:14.922 20:41:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:14.922 20:41:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:14.922 20:41:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:14.922 20:41:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:14.922 20:41:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:14.922 20:41:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:14.922 20:41:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:15.180 20:41:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:15.180 "name": "raid_bdev1", 00:27:15.180 "uuid": "def3905d-a02e-4e4d-9f27-65e275e658fa", 00:27:15.180 "strip_size_kb": 0, 00:27:15.180 "state": "online", 00:27:15.180 "raid_level": "raid1", 00:27:15.180 "superblock": true, 00:27:15.180 "num_base_bdevs": 4, 00:27:15.180 "num_base_bdevs_discovered": 3, 00:27:15.180 "num_base_bdevs_operational": 3, 00:27:15.180 "base_bdevs_list": [ 00:27:15.180 { 00:27:15.180 "name": "spare", 00:27:15.180 "uuid": "3bb25211-764a-50ee-9c93-e0d747556981", 00:27:15.180 "is_configured": true, 00:27:15.180 "data_offset": 2048, 00:27:15.180 "data_size": 63488 00:27:15.180 }, 00:27:15.180 { 00:27:15.180 "name": null, 00:27:15.180 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:15.180 "is_configured": false, 00:27:15.180 "data_offset": 2048, 00:27:15.180 "data_size": 63488 00:27:15.180 }, 00:27:15.180 { 00:27:15.180 "name": "BaseBdev3", 00:27:15.180 "uuid": "7909ef61-a74c-5dc1-bdb6-0a581b9ffa17", 00:27:15.180 "is_configured": true, 00:27:15.180 "data_offset": 2048, 00:27:15.180 "data_size": 63488 00:27:15.180 }, 00:27:15.180 { 00:27:15.180 "name": "BaseBdev4", 00:27:15.180 "uuid": "9673f532-83c5-59e8-9256-b2534b3ba1e1", 00:27:15.180 "is_configured": true, 00:27:15.180 "data_offset": 2048, 00:27:15.180 "data_size": 63488 00:27:15.180 } 00:27:15.180 ] 00:27:15.180 }' 00:27:15.180 20:41:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:15.180 20:41:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:27:15.180 20:41:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r 
'.process.target // "none"' 00:27:15.180 20:41:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:27:15.180 20:41:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@708 -- # break 00:27:15.180 20:41:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:15.180 20:41:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:15.180 20:41:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:15.180 20:41:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:15.180 20:41:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:15.180 20:41:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:15.180 20:41:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:15.437 20:41:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:15.437 "name": "raid_bdev1", 00:27:15.437 "uuid": "def3905d-a02e-4e4d-9f27-65e275e658fa", 00:27:15.437 "strip_size_kb": 0, 00:27:15.437 "state": "online", 00:27:15.437 "raid_level": "raid1", 00:27:15.437 "superblock": true, 00:27:15.437 "num_base_bdevs": 4, 00:27:15.437 "num_base_bdevs_discovered": 3, 00:27:15.437 "num_base_bdevs_operational": 3, 00:27:15.437 "base_bdevs_list": [ 00:27:15.437 { 00:27:15.437 "name": "spare", 00:27:15.437 "uuid": "3bb25211-764a-50ee-9c93-e0d747556981", 00:27:15.437 "is_configured": true, 00:27:15.437 "data_offset": 2048, 00:27:15.438 "data_size": 63488 00:27:15.438 }, 00:27:15.438 { 00:27:15.438 "name": null, 00:27:15.438 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:15.438 "is_configured": false, 
00:27:15.438 "data_offset": 2048, 00:27:15.438 "data_size": 63488 00:27:15.438 }, 00:27:15.438 { 00:27:15.438 "name": "BaseBdev3", 00:27:15.438 "uuid": "7909ef61-a74c-5dc1-bdb6-0a581b9ffa17", 00:27:15.438 "is_configured": true, 00:27:15.438 "data_offset": 2048, 00:27:15.438 "data_size": 63488 00:27:15.438 }, 00:27:15.438 { 00:27:15.438 "name": "BaseBdev4", 00:27:15.438 "uuid": "9673f532-83c5-59e8-9256-b2534b3ba1e1", 00:27:15.438 "is_configured": true, 00:27:15.438 "data_offset": 2048, 00:27:15.438 "data_size": 63488 00:27:15.438 } 00:27:15.438 ] 00:27:15.438 }' 00:27:15.438 20:41:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:15.438 20:41:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:15.438 20:41:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:15.438 20:41:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:15.438 20:41:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:27:15.438 20:41:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:15.438 20:41:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:15.438 20:41:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:15.438 20:41:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:15.438 20:41:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:27:15.438 20:41:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:15.438 20:41:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:15.438 20:41:07 bdev_raid.raid_rebuild_test_sb_io 
-- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:15.438 20:41:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:15.438 20:41:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:15.438 20:41:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:15.696 20:41:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:15.696 "name": "raid_bdev1", 00:27:15.696 "uuid": "def3905d-a02e-4e4d-9f27-65e275e658fa", 00:27:15.696 "strip_size_kb": 0, 00:27:15.696 "state": "online", 00:27:15.696 "raid_level": "raid1", 00:27:15.696 "superblock": true, 00:27:15.696 "num_base_bdevs": 4, 00:27:15.696 "num_base_bdevs_discovered": 3, 00:27:15.696 "num_base_bdevs_operational": 3, 00:27:15.696 "base_bdevs_list": [ 00:27:15.696 { 00:27:15.696 "name": "spare", 00:27:15.696 "uuid": "3bb25211-764a-50ee-9c93-e0d747556981", 00:27:15.696 "is_configured": true, 00:27:15.696 "data_offset": 2048, 00:27:15.696 "data_size": 63488 00:27:15.696 }, 00:27:15.696 { 00:27:15.696 "name": null, 00:27:15.696 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:15.696 "is_configured": false, 00:27:15.696 "data_offset": 2048, 00:27:15.696 "data_size": 63488 00:27:15.696 }, 00:27:15.696 { 00:27:15.696 "name": "BaseBdev3", 00:27:15.696 "uuid": "7909ef61-a74c-5dc1-bdb6-0a581b9ffa17", 00:27:15.696 "is_configured": true, 00:27:15.696 "data_offset": 2048, 00:27:15.696 "data_size": 63488 00:27:15.696 }, 00:27:15.696 { 00:27:15.696 "name": "BaseBdev4", 00:27:15.696 "uuid": "9673f532-83c5-59e8-9256-b2534b3ba1e1", 00:27:15.696 "is_configured": true, 00:27:15.696 "data_offset": 2048, 00:27:15.696 "data_size": 63488 00:27:15.696 } 00:27:15.696 ] 00:27:15.696 }' 00:27:15.696 20:41:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # 
xtrace_disable 00:27:15.696 20:41:08 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:27:16.631 20:41:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:27:16.631 [2024-07-15 20:41:08.868699] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:16.631 [2024-07-15 20:41:08.868733] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:27:16.631 00:27:16.631 Latency(us) 00:27:16.631 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:16.631 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:27:16.631 raid_bdev1 : 11.06 82.22 246.67 0.00 0.00 16172.59 304.53 124005.51 00:27:16.631 =================================================================================================================== 00:27:16.631 Total : 82.22 246.67 0.00 0.00 16172.59 304.53 124005.51 00:27:16.631 [2024-07-15 20:41:08.896739] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:16.631 [2024-07-15 20:41:08.896769] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:16.631 [2024-07-15 20:41:08.896863] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:16.631 [2024-07-15 20:41:08.896881] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x209b8a0 name raid_bdev1, state offline 00:27:16.631 0 00:27:16.631 20:41:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:16.631 20:41:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # jq length 00:27:16.890 20:41:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 
-- # [[ 0 == 0 ]] 00:27:16.890 20:41:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:27:16.890 20:41:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:27:16.890 20:41:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:27:16.890 20:41:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:16.890 20:41:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:27:16.890 20:41:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:27:16.890 20:41:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:27:16.890 20:41:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:27:16.890 20:41:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:27:16.890 20:41:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:27:16.890 20:41:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:27:16.890 20:41:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:27:17.148 /dev/nbd0 00:27:17.148 20:41:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:27:17.148 20:41:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:27:17.148 20:41:09 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:27:17.148 20:41:09 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:27:17.148 20:41:09 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:27:17.148 20:41:09 
bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:27:17.148 20:41:09 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:27:17.148 20:41:09 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:27:17.148 20:41:09 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:27:17.148 20:41:09 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:27:17.148 20:41:09 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:17.148 1+0 records in 00:27:17.148 1+0 records out 00:27:17.148 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00029273 s, 14.0 MB/s 00:27:17.148 20:41:09 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:17.148 20:41:09 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:27:17.148 20:41:09 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:17.149 20:41:09 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:27:17.149 20:41:09 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:27:17.149 20:41:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:27:17.149 20:41:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:27:17.149 20:41:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:27:17.149 20:41:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z '' ']' 00:27:17.149 20:41:09 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@727 -- # continue 00:27:17.149 20:41:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:27:17.149 20:41:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev3 ']' 00:27:17.149 20:41:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev3 /dev/nbd1 00:27:17.149 20:41:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:17.149 20:41:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev3') 00:27:17.149 20:41:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:27:17.149 20:41:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:27:17.149 20:41:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:27:17.149 20:41:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:27:17.149 20:41:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:27:17.149 20:41:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:27:17.149 20:41:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev3 /dev/nbd1 00:27:17.408 /dev/nbd1 00:27:17.408 20:41:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:27:17.408 20:41:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:27:17.408 20:41:09 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:27:17.408 20:41:09 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:27:17.408 20:41:09 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( 
i = 1 )) 00:27:17.408 20:41:09 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:27:17.408 20:41:09 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:27:17.408 20:41:09 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:27:17.408 20:41:09 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:27:17.408 20:41:09 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:27:17.408 20:41:09 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:17.408 1+0 records in 00:27:17.408 1+0 records out 00:27:17.408 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000201994 s, 20.3 MB/s 00:27:17.408 20:41:09 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:17.408 20:41:09 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:27:17.408 20:41:09 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:17.408 20:41:09 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:27:17.408 20:41:09 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:27:17.408 20:41:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:27:17.408 20:41:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:27:17.408 20:41:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:27:17.408 20:41:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock 
/dev/nbd1 00:27:17.408 20:41:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:17.408 20:41:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:27:17.408 20:41:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:27:17.408 20:41:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:27:17.408 20:41:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:17.408 20:41:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:27:17.667 20:41:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:27:17.667 20:41:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:27:17.667 20:41:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:27:17.667 20:41:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:17.667 20:41:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:17.667 20:41:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:27:17.667 20:41:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:27:17.667 20:41:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:27:17.667 20:41:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:27:17.667 20:41:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev4 ']' 00:27:17.667 20:41:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev4 /dev/nbd1 00:27:17.667 20:41:10 
bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:17.667 20:41:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev4') 00:27:17.667 20:41:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:27:17.667 20:41:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:27:17.667 20:41:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:27:17.667 20:41:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:27:17.667 20:41:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:27:17.667 20:41:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:27:17.667 20:41:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev4 /dev/nbd1 00:27:17.925 /dev/nbd1 00:27:17.925 20:41:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:27:18.183 20:41:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:27:18.183 20:41:10 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:27:18.183 20:41:10 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:27:18.183 20:41:10 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:27:18.183 20:41:10 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:27:18.183 20:41:10 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:27:18.183 20:41:10 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:27:18.183 20:41:10 bdev_raid.raid_rebuild_test_sb_io -- 
common/autotest_common.sh@882 -- # (( i = 1 )) 00:27:18.183 20:41:10 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:27:18.183 20:41:10 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:18.183 1+0 records in 00:27:18.183 1+0 records out 00:27:18.183 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000287076 s, 14.3 MB/s 00:27:18.183 20:41:10 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:18.183 20:41:10 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:27:18.183 20:41:10 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:18.183 20:41:10 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:27:18.183 20:41:10 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:27:18.183 20:41:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:27:18.183 20:41:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:27:18.183 20:41:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:27:18.183 20:41:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:27:18.183 20:41:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:18.183 20:41:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:27:18.183 20:41:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:27:18.183 20:41:10 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/nbd_common.sh@51 -- # local i 00:27:18.183 20:41:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:18.183 20:41:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:27:18.442 20:41:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:27:18.442 20:41:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:27:18.442 20:41:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:27:18.442 20:41:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:18.442 20:41:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:18.442 20:41:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:27:18.442 20:41:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:27:18.442 20:41:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:27:18.442 20:41:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:27:18.442 20:41:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:18.442 20:41:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:27:18.442 20:41:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:27:18.442 20:41:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:27:18.442 20:41:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:18.442 20:41:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:27:18.701 20:41:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:27:18.701 20:41:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:27:18.701 20:41:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:27:18.701 20:41:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:18.701 20:41:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:18.701 20:41:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:27:18.701 20:41:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:27:18.701 20:41:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:27:18.701 20:41:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:27:18.701 20:41:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:27:18.960 20:41:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:27:19.220 [2024-07-15 20:41:11.411597] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:27:19.220 [2024-07-15 20:41:11.411644] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:19.220 [2024-07-15 20:41:11.411666] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x211b0e0 00:27:19.220 [2024-07-15 20:41:11.411684] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:19.220 [2024-07-15 20:41:11.413329] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 
00:27:19.220 [2024-07-15 20:41:11.413358] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:27:19.220 [2024-07-15 20:41:11.413442] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:27:19.220 [2024-07-15 20:41:11.413473] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:19.220 [2024-07-15 20:41:11.413579] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:27:19.220 [2024-07-15 20:41:11.413656] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:27:19.220 spare 00:27:19.220 20:41:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:27:19.220 20:41:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:19.220 20:41:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:19.220 20:41:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:19.220 20:41:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:19.220 20:41:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:27:19.220 20:41:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:19.220 20:41:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:19.220 20:41:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:19.220 20:41:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:19.220 20:41:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:19.220 20:41:11 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:19.220 [2024-07-15 20:41:11.513972] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x209cb10 00:27:19.220 [2024-07-15 20:41:11.513992] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:27:19.220 [2024-07-15 20:41:11.514198] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x209f890 00:27:19.220 [2024-07-15 20:41:11.514347] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x209cb10 00:27:19.220 [2024-07-15 20:41:11.514358] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x209cb10 00:27:19.220 [2024-07-15 20:41:11.514467] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:19.510 20:41:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:19.510 "name": "raid_bdev1", 00:27:19.510 "uuid": "def3905d-a02e-4e4d-9f27-65e275e658fa", 00:27:19.510 "strip_size_kb": 0, 00:27:19.510 "state": "online", 00:27:19.510 "raid_level": "raid1", 00:27:19.510 "superblock": true, 00:27:19.510 "num_base_bdevs": 4, 00:27:19.510 "num_base_bdevs_discovered": 3, 00:27:19.510 "num_base_bdevs_operational": 3, 00:27:19.510 "base_bdevs_list": [ 00:27:19.510 { 00:27:19.510 "name": "spare", 00:27:19.510 "uuid": "3bb25211-764a-50ee-9c93-e0d747556981", 00:27:19.510 "is_configured": true, 00:27:19.510 "data_offset": 2048, 00:27:19.510 "data_size": 63488 00:27:19.510 }, 00:27:19.510 { 00:27:19.510 "name": null, 00:27:19.510 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:19.510 "is_configured": false, 00:27:19.510 "data_offset": 2048, 00:27:19.510 "data_size": 63488 00:27:19.510 }, 00:27:19.510 { 00:27:19.510 "name": "BaseBdev3", 00:27:19.510 "uuid": "7909ef61-a74c-5dc1-bdb6-0a581b9ffa17", 00:27:19.510 "is_configured": true, 00:27:19.510 "data_offset": 2048, 00:27:19.510 
"data_size": 63488 00:27:19.510 }, 00:27:19.510 { 00:27:19.510 "name": "BaseBdev4", 00:27:19.510 "uuid": "9673f532-83c5-59e8-9256-b2534b3ba1e1", 00:27:19.510 "is_configured": true, 00:27:19.510 "data_offset": 2048, 00:27:19.510 "data_size": 63488 00:27:19.510 } 00:27:19.510 ] 00:27:19.510 }' 00:27:19.510 20:41:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:19.510 20:41:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:27:20.078 20:41:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:20.078 20:41:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:20.078 20:41:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:20.078 20:41:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:20.078 20:41:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:20.078 20:41:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:20.078 20:41:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:20.337 20:41:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:20.337 "name": "raid_bdev1", 00:27:20.337 "uuid": "def3905d-a02e-4e4d-9f27-65e275e658fa", 00:27:20.337 "strip_size_kb": 0, 00:27:20.337 "state": "online", 00:27:20.337 "raid_level": "raid1", 00:27:20.337 "superblock": true, 00:27:20.337 "num_base_bdevs": 4, 00:27:20.337 "num_base_bdevs_discovered": 3, 00:27:20.337 "num_base_bdevs_operational": 3, 00:27:20.337 "base_bdevs_list": [ 00:27:20.337 { 00:27:20.337 "name": "spare", 00:27:20.337 "uuid": "3bb25211-764a-50ee-9c93-e0d747556981", 
00:27:20.337 "is_configured": true, 00:27:20.337 "data_offset": 2048, 00:27:20.337 "data_size": 63488 00:27:20.337 }, 00:27:20.337 { 00:27:20.337 "name": null, 00:27:20.337 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:20.337 "is_configured": false, 00:27:20.337 "data_offset": 2048, 00:27:20.337 "data_size": 63488 00:27:20.337 }, 00:27:20.337 { 00:27:20.337 "name": "BaseBdev3", 00:27:20.337 "uuid": "7909ef61-a74c-5dc1-bdb6-0a581b9ffa17", 00:27:20.337 "is_configured": true, 00:27:20.337 "data_offset": 2048, 00:27:20.337 "data_size": 63488 00:27:20.337 }, 00:27:20.337 { 00:27:20.337 "name": "BaseBdev4", 00:27:20.337 "uuid": "9673f532-83c5-59e8-9256-b2534b3ba1e1", 00:27:20.337 "is_configured": true, 00:27:20.337 "data_offset": 2048, 00:27:20.337 "data_size": 63488 00:27:20.337 } 00:27:20.337 ] 00:27:20.337 }' 00:27:20.337 20:41:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:20.337 20:41:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:20.337 20:41:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:20.337 20:41:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:20.337 20:41:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:20.337 20:41:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:27:20.596 20:41:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:27:20.596 20:41:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:27:20.854 [2024-07-15 20:41:13.024178] 
bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:20.854 20:41:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:20.854 20:41:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:20.854 20:41:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:20.854 20:41:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:20.854 20:41:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:20.855 20:41:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:20.855 20:41:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:20.855 20:41:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:20.855 20:41:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:20.855 20:41:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:20.855 20:41:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:20.855 20:41:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:21.112 20:41:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:21.112 "name": "raid_bdev1", 00:27:21.112 "uuid": "def3905d-a02e-4e4d-9f27-65e275e658fa", 00:27:21.112 "strip_size_kb": 0, 00:27:21.112 "state": "online", 00:27:21.112 "raid_level": "raid1", 00:27:21.112 "superblock": true, 00:27:21.112 "num_base_bdevs": 4, 00:27:21.112 "num_base_bdevs_discovered": 2, 00:27:21.112 "num_base_bdevs_operational": 2, 00:27:21.112 
"base_bdevs_list": [ 00:27:21.112 { 00:27:21.112 "name": null, 00:27:21.112 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:21.112 "is_configured": false, 00:27:21.112 "data_offset": 2048, 00:27:21.112 "data_size": 63488 00:27:21.112 }, 00:27:21.112 { 00:27:21.112 "name": null, 00:27:21.112 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:21.112 "is_configured": false, 00:27:21.112 "data_offset": 2048, 00:27:21.112 "data_size": 63488 00:27:21.112 }, 00:27:21.112 { 00:27:21.112 "name": "BaseBdev3", 00:27:21.112 "uuid": "7909ef61-a74c-5dc1-bdb6-0a581b9ffa17", 00:27:21.112 "is_configured": true, 00:27:21.112 "data_offset": 2048, 00:27:21.112 "data_size": 63488 00:27:21.112 }, 00:27:21.112 { 00:27:21.112 "name": "BaseBdev4", 00:27:21.112 "uuid": "9673f532-83c5-59e8-9256-b2534b3ba1e1", 00:27:21.112 "is_configured": true, 00:27:21.112 "data_offset": 2048, 00:27:21.112 "data_size": 63488 00:27:21.112 } 00:27:21.112 ] 00:27:21.112 }' 00:27:21.112 20:41:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:21.112 20:41:13 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:27:21.677 20:41:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:27:21.935 [2024-07-15 20:41:14.123267] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:21.935 [2024-07-15 20:41:14.123421] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:27:21.935 [2024-07-15 20:41:14.123436] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:27:21.935 [2024-07-15 20:41:14.123463] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:21.935 [2024-07-15 20:41:14.127885] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x209ecc0 00:27:21.935 [2024-07-15 20:41:14.130130] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:27:21.935 20:41:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@755 -- # sleep 1 00:27:22.870 20:41:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:22.870 20:41:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:22.870 20:41:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:22.870 20:41:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:22.870 20:41:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:22.870 20:41:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:22.870 20:41:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:23.129 20:41:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:23.129 "name": "raid_bdev1", 00:27:23.129 "uuid": "def3905d-a02e-4e4d-9f27-65e275e658fa", 00:27:23.129 "strip_size_kb": 0, 00:27:23.129 "state": "online", 00:27:23.129 "raid_level": "raid1", 00:27:23.129 "superblock": true, 00:27:23.129 "num_base_bdevs": 4, 00:27:23.129 "num_base_bdevs_discovered": 3, 00:27:23.129 "num_base_bdevs_operational": 3, 00:27:23.129 "process": { 00:27:23.129 "type": "rebuild", 00:27:23.129 "target": "spare", 00:27:23.129 "progress": { 00:27:23.129 "blocks": 24576, 
00:27:23.129 "percent": 38 00:27:23.129 } 00:27:23.129 }, 00:27:23.129 "base_bdevs_list": [ 00:27:23.129 { 00:27:23.129 "name": "spare", 00:27:23.129 "uuid": "3bb25211-764a-50ee-9c93-e0d747556981", 00:27:23.129 "is_configured": true, 00:27:23.129 "data_offset": 2048, 00:27:23.129 "data_size": 63488 00:27:23.129 }, 00:27:23.129 { 00:27:23.129 "name": null, 00:27:23.129 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:23.129 "is_configured": false, 00:27:23.129 "data_offset": 2048, 00:27:23.129 "data_size": 63488 00:27:23.129 }, 00:27:23.129 { 00:27:23.129 "name": "BaseBdev3", 00:27:23.129 "uuid": "7909ef61-a74c-5dc1-bdb6-0a581b9ffa17", 00:27:23.129 "is_configured": true, 00:27:23.129 "data_offset": 2048, 00:27:23.129 "data_size": 63488 00:27:23.129 }, 00:27:23.129 { 00:27:23.129 "name": "BaseBdev4", 00:27:23.129 "uuid": "9673f532-83c5-59e8-9256-b2534b3ba1e1", 00:27:23.129 "is_configured": true, 00:27:23.129 "data_offset": 2048, 00:27:23.129 "data_size": 63488 00:27:23.129 } 00:27:23.129 ] 00:27:23.129 }' 00:27:23.129 20:41:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:23.129 20:41:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:23.129 20:41:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:23.129 20:41:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:23.129 20:41:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:27:23.388 [2024-07-15 20:41:15.706847] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:23.388 [2024-07-15 20:41:15.742896] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:27:23.388 [2024-07-15 20:41:15.742948] 
bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:23.388 [2024-07-15 20:41:15.742965] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:23.388 [2024-07-15 20:41:15.742973] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:27:23.646 20:41:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:23.646 20:41:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:23.646 20:41:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:23.646 20:41:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:23.646 20:41:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:23.646 20:41:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:23.646 20:41:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:23.646 20:41:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:23.646 20:41:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:23.646 20:41:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:23.646 20:41:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:23.646 20:41:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:23.646 20:41:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:23.646 "name": "raid_bdev1", 00:27:23.646 "uuid": "def3905d-a02e-4e4d-9f27-65e275e658fa", 
00:27:23.646 "strip_size_kb": 0, 00:27:23.646 "state": "online", 00:27:23.646 "raid_level": "raid1", 00:27:23.646 "superblock": true, 00:27:23.646 "num_base_bdevs": 4, 00:27:23.647 "num_base_bdevs_discovered": 2, 00:27:23.647 "num_base_bdevs_operational": 2, 00:27:23.647 "base_bdevs_list": [ 00:27:23.647 { 00:27:23.647 "name": null, 00:27:23.647 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:23.647 "is_configured": false, 00:27:23.647 "data_offset": 2048, 00:27:23.647 "data_size": 63488 00:27:23.647 }, 00:27:23.647 { 00:27:23.647 "name": null, 00:27:23.647 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:23.647 "is_configured": false, 00:27:23.647 "data_offset": 2048, 00:27:23.647 "data_size": 63488 00:27:23.647 }, 00:27:23.647 { 00:27:23.647 "name": "BaseBdev3", 00:27:23.647 "uuid": "7909ef61-a74c-5dc1-bdb6-0a581b9ffa17", 00:27:23.647 "is_configured": true, 00:27:23.647 "data_offset": 2048, 00:27:23.647 "data_size": 63488 00:27:23.647 }, 00:27:23.647 { 00:27:23.647 "name": "BaseBdev4", 00:27:23.647 "uuid": "9673f532-83c5-59e8-9256-b2534b3ba1e1", 00:27:23.647 "is_configured": true, 00:27:23.647 "data_offset": 2048, 00:27:23.647 "data_size": 63488 00:27:23.647 } 00:27:23.647 ] 00:27:23.647 }' 00:27:23.647 20:41:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:23.647 20:41:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:27:24.583 20:41:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:27:24.583 [2024-07-15 20:41:16.842229] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:27:24.583 [2024-07-15 20:41:16.842284] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:24.583 [2024-07-15 20:41:16.842309] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created 
at: 0x0x209da90 00:27:24.583 [2024-07-15 20:41:16.842322] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:24.583 [2024-07-15 20:41:16.842701] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:24.583 [2024-07-15 20:41:16.842720] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:27:24.583 [2024-07-15 20:41:16.842801] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:27:24.583 [2024-07-15 20:41:16.842814] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:27:24.583 [2024-07-15 20:41:16.842825] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:27:24.583 [2024-07-15 20:41:16.842844] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:24.583 [2024-07-15 20:41:16.847311] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f69d50 00:27:24.583 spare 00:27:24.583 [2024-07-15 20:41:16.848797] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:27:24.583 20:41:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@762 -- # sleep 1 00:27:25.518 20:41:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:25.518 20:41:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:25.518 20:41:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:25.518 20:41:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:25.518 20:41:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:25.519 20:41:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:25.519 20:41:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:25.777 20:41:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:25.777 "name": "raid_bdev1", 00:27:25.777 "uuid": "def3905d-a02e-4e4d-9f27-65e275e658fa", 00:27:25.777 "strip_size_kb": 0, 00:27:25.777 "state": "online", 00:27:25.777 "raid_level": "raid1", 00:27:25.777 "superblock": true, 00:27:25.777 "num_base_bdevs": 4, 00:27:25.777 "num_base_bdevs_discovered": 3, 00:27:25.777 "num_base_bdevs_operational": 3, 00:27:25.777 "process": { 00:27:25.777 "type": "rebuild", 00:27:25.777 "target": "spare", 00:27:25.777 "progress": { 00:27:25.777 "blocks": 24576, 00:27:25.777 "percent": 38 00:27:25.777 } 00:27:25.777 }, 00:27:25.777 "base_bdevs_list": [ 00:27:25.777 { 00:27:25.777 "name": "spare", 00:27:25.777 "uuid": "3bb25211-764a-50ee-9c93-e0d747556981", 00:27:25.777 "is_configured": true, 00:27:25.777 "data_offset": 2048, 00:27:25.777 "data_size": 63488 00:27:25.777 }, 00:27:25.777 { 00:27:25.777 "name": null, 00:27:25.777 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:25.777 "is_configured": false, 00:27:25.777 "data_offset": 2048, 00:27:25.777 "data_size": 63488 00:27:25.777 }, 00:27:25.777 { 00:27:25.777 "name": "BaseBdev3", 00:27:25.777 "uuid": "7909ef61-a74c-5dc1-bdb6-0a581b9ffa17", 00:27:25.777 "is_configured": true, 00:27:25.777 "data_offset": 2048, 00:27:25.777 "data_size": 63488 00:27:25.777 }, 00:27:25.777 { 00:27:25.777 "name": "BaseBdev4", 00:27:25.777 "uuid": "9673f532-83c5-59e8-9256-b2534b3ba1e1", 00:27:25.777 "is_configured": true, 00:27:25.777 "data_offset": 2048, 00:27:25.777 "data_size": 63488 00:27:25.777 } 00:27:25.777 ] 00:27:25.777 }' 00:27:25.777 20:41:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 
00:27:26.036 20:41:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:26.036 20:41:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:26.036 20:41:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:26.036 20:41:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:27:26.294 [2024-07-15 20:41:18.445363] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:26.295 [2024-07-15 20:41:18.461348] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:27:26.295 [2024-07-15 20:41:18.461392] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:26.295 [2024-07-15 20:41:18.461409] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:26.295 [2024-07-15 20:41:18.461417] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:27:26.295 20:41:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:26.295 20:41:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:26.295 20:41:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:26.295 20:41:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:26.295 20:41:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:26.295 20:41:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:26.295 20:41:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 
00:27:26.295 20:41:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:26.295 20:41:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:26.295 20:41:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:26.295 20:41:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:26.295 20:41:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:26.553 20:41:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:26.553 "name": "raid_bdev1", 00:27:26.553 "uuid": "def3905d-a02e-4e4d-9f27-65e275e658fa", 00:27:26.553 "strip_size_kb": 0, 00:27:26.553 "state": "online", 00:27:26.553 "raid_level": "raid1", 00:27:26.553 "superblock": true, 00:27:26.553 "num_base_bdevs": 4, 00:27:26.553 "num_base_bdevs_discovered": 2, 00:27:26.553 "num_base_bdevs_operational": 2, 00:27:26.553 "base_bdevs_list": [ 00:27:26.553 { 00:27:26.553 "name": null, 00:27:26.553 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:26.553 "is_configured": false, 00:27:26.553 "data_offset": 2048, 00:27:26.553 "data_size": 63488 00:27:26.553 }, 00:27:26.553 { 00:27:26.553 "name": null, 00:27:26.553 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:26.553 "is_configured": false, 00:27:26.553 "data_offset": 2048, 00:27:26.553 "data_size": 63488 00:27:26.553 }, 00:27:26.553 { 00:27:26.553 "name": "BaseBdev3", 00:27:26.553 "uuid": "7909ef61-a74c-5dc1-bdb6-0a581b9ffa17", 00:27:26.553 "is_configured": true, 00:27:26.553 "data_offset": 2048, 00:27:26.553 "data_size": 63488 00:27:26.553 }, 00:27:26.553 { 00:27:26.553 "name": "BaseBdev4", 00:27:26.553 "uuid": "9673f532-83c5-59e8-9256-b2534b3ba1e1", 00:27:26.553 "is_configured": true, 00:27:26.553 "data_offset": 2048, 
00:27:26.553 "data_size": 63488 00:27:26.553 } 00:27:26.553 ] 00:27:26.553 }' 00:27:26.554 20:41:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:26.554 20:41:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:27:27.120 20:41:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:27.120 20:41:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:27.120 20:41:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:27.120 20:41:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:27.120 20:41:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:27.120 20:41:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:27.120 20:41:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:27.379 20:41:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:27.379 "name": "raid_bdev1", 00:27:27.379 "uuid": "def3905d-a02e-4e4d-9f27-65e275e658fa", 00:27:27.379 "strip_size_kb": 0, 00:27:27.379 "state": "online", 00:27:27.379 "raid_level": "raid1", 00:27:27.379 "superblock": true, 00:27:27.379 "num_base_bdevs": 4, 00:27:27.379 "num_base_bdevs_discovered": 2, 00:27:27.379 "num_base_bdevs_operational": 2, 00:27:27.379 "base_bdevs_list": [ 00:27:27.379 { 00:27:27.379 "name": null, 00:27:27.379 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:27.379 "is_configured": false, 00:27:27.379 "data_offset": 2048, 00:27:27.379 "data_size": 63488 00:27:27.379 }, 00:27:27.379 { 00:27:27.379 "name": null, 00:27:27.379 "uuid": "00000000-0000-0000-0000-000000000000", 
00:27:27.379 "is_configured": false, 00:27:27.379 "data_offset": 2048, 00:27:27.379 "data_size": 63488 00:27:27.379 }, 00:27:27.379 { 00:27:27.379 "name": "BaseBdev3", 00:27:27.379 "uuid": "7909ef61-a74c-5dc1-bdb6-0a581b9ffa17", 00:27:27.379 "is_configured": true, 00:27:27.379 "data_offset": 2048, 00:27:27.379 "data_size": 63488 00:27:27.379 }, 00:27:27.379 { 00:27:27.379 "name": "BaseBdev4", 00:27:27.379 "uuid": "9673f532-83c5-59e8-9256-b2534b3ba1e1", 00:27:27.379 "is_configured": true, 00:27:27.379 "data_offset": 2048, 00:27:27.379 "data_size": 63488 00:27:27.379 } 00:27:27.379 ] 00:27:27.379 }' 00:27:27.379 20:41:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:27.379 20:41:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:27.379 20:41:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:27.379 20:41:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:27.379 20:41:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:27:27.638 20:41:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:27:27.896 [2024-07-15 20:41:20.162973] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:27:27.896 [2024-07-15 20:41:20.163028] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:27.896 [2024-07-15 20:41:20.163051] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x209e670 00:27:27.896 [2024-07-15 20:41:20.163064] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:27.896 
[2024-07-15 20:41:20.163426] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:27.896 [2024-07-15 20:41:20.163443] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:27:27.896 [2024-07-15 20:41:20.163511] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:27:27.896 [2024-07-15 20:41:20.163525] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:27:27.896 [2024-07-15 20:41:20.163535] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:27:27.896 BaseBdev1 00:27:27.896 20:41:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@773 -- # sleep 1 00:27:28.830 20:41:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:28.830 20:41:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:28.830 20:41:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:28.830 20:41:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:28.830 20:41:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:28.830 20:41:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:28.830 20:41:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:28.830 20:41:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:28.830 20:41:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:28.830 20:41:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:28.830 20:41:21 bdev_raid.raid_rebuild_test_sb_io 
-- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:28.830 20:41:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:29.088 20:41:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:29.088 "name": "raid_bdev1", 00:27:29.088 "uuid": "def3905d-a02e-4e4d-9f27-65e275e658fa", 00:27:29.088 "strip_size_kb": 0, 00:27:29.088 "state": "online", 00:27:29.088 "raid_level": "raid1", 00:27:29.088 "superblock": true, 00:27:29.088 "num_base_bdevs": 4, 00:27:29.088 "num_base_bdevs_discovered": 2, 00:27:29.088 "num_base_bdevs_operational": 2, 00:27:29.088 "base_bdevs_list": [ 00:27:29.088 { 00:27:29.088 "name": null, 00:27:29.088 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:29.088 "is_configured": false, 00:27:29.088 "data_offset": 2048, 00:27:29.088 "data_size": 63488 00:27:29.088 }, 00:27:29.088 { 00:27:29.088 "name": null, 00:27:29.088 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:29.088 "is_configured": false, 00:27:29.088 "data_offset": 2048, 00:27:29.088 "data_size": 63488 00:27:29.088 }, 00:27:29.088 { 00:27:29.088 "name": "BaseBdev3", 00:27:29.088 "uuid": "7909ef61-a74c-5dc1-bdb6-0a581b9ffa17", 00:27:29.088 "is_configured": true, 00:27:29.088 "data_offset": 2048, 00:27:29.088 "data_size": 63488 00:27:29.088 }, 00:27:29.088 { 00:27:29.088 "name": "BaseBdev4", 00:27:29.088 "uuid": "9673f532-83c5-59e8-9256-b2534b3ba1e1", 00:27:29.088 "is_configured": true, 00:27:29.088 "data_offset": 2048, 00:27:29.088 "data_size": 63488 00:27:29.088 } 00:27:29.088 ] 00:27:29.088 }' 00:27:29.088 20:41:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:29.088 20:41:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:27:30.022 20:41:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@775 -- # 
verify_raid_bdev_process raid_bdev1 none none 00:27:30.023 20:41:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:30.023 20:41:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:30.023 20:41:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:30.023 20:41:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:30.023 20:41:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:30.023 20:41:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:30.023 20:41:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:30.023 "name": "raid_bdev1", 00:27:30.023 "uuid": "def3905d-a02e-4e4d-9f27-65e275e658fa", 00:27:30.023 "strip_size_kb": 0, 00:27:30.023 "state": "online", 00:27:30.023 "raid_level": "raid1", 00:27:30.023 "superblock": true, 00:27:30.023 "num_base_bdevs": 4, 00:27:30.023 "num_base_bdevs_discovered": 2, 00:27:30.023 "num_base_bdevs_operational": 2, 00:27:30.023 "base_bdevs_list": [ 00:27:30.023 { 00:27:30.023 "name": null, 00:27:30.023 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:30.023 "is_configured": false, 00:27:30.023 "data_offset": 2048, 00:27:30.023 "data_size": 63488 00:27:30.023 }, 00:27:30.023 { 00:27:30.023 "name": null, 00:27:30.023 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:30.023 "is_configured": false, 00:27:30.023 "data_offset": 2048, 00:27:30.023 "data_size": 63488 00:27:30.023 }, 00:27:30.023 { 00:27:30.023 "name": "BaseBdev3", 00:27:30.023 "uuid": "7909ef61-a74c-5dc1-bdb6-0a581b9ffa17", 00:27:30.023 "is_configured": true, 00:27:30.023 "data_offset": 2048, 00:27:30.023 "data_size": 63488 00:27:30.023 }, 00:27:30.023 { 
00:27:30.023 "name": "BaseBdev4", 00:27:30.023 "uuid": "9673f532-83c5-59e8-9256-b2534b3ba1e1", 00:27:30.023 "is_configured": true, 00:27:30.023 "data_offset": 2048, 00:27:30.023 "data_size": 63488 00:27:30.023 } 00:27:30.023 ] 00:27:30.023 }' 00:27:30.023 20:41:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:30.023 20:41:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:30.023 20:41:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:30.023 20:41:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:30.023 20:41:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:27:30.023 20:41:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@648 -- # local es=0 00:27:30.023 20:41:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:27:30.023 20:41:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:30.023 20:41:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:27:30.023 20:41:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:30.023 20:41:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:27:30.023 20:41:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # type -P 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:30.023 20:41:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:27:30.023 20:41:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:30.023 20:41:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:27:30.023 20:41:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:27:30.282 [2024-07-15 20:41:22.609779] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:27:30.282 [2024-07-15 20:41:22.609919] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:27:30.282 [2024-07-15 20:41:22.609941] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:27:30.282 request: 00:27:30.282 { 00:27:30.282 "base_bdev": "BaseBdev1", 00:27:30.282 "raid_bdev": "raid_bdev1", 00:27:30.282 "method": "bdev_raid_add_base_bdev", 00:27:30.282 "req_id": 1 00:27:30.282 } 00:27:30.282 Got JSON-RPC error response 00:27:30.282 response: 00:27:30.282 { 00:27:30.282 "code": -22, 00:27:30.282 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:27:30.282 } 00:27:30.282 20:41:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # es=1 00:27:30.282 20:41:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:27:30.282 20:41:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:27:30.282 20:41:22 bdev_raid.raid_rebuild_test_sb_io -- 
common/autotest_common.sh@675 -- # (( !es == 0 )) 00:27:30.282 20:41:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@777 -- # sleep 1 00:27:31.693 20:41:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:31.693 20:41:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:31.693 20:41:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:31.693 20:41:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:31.693 20:41:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:31.693 20:41:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:31.693 20:41:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:31.693 20:41:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:31.693 20:41:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:31.693 20:41:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:31.693 20:41:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:31.693 20:41:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:31.693 20:41:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:31.693 "name": "raid_bdev1", 00:27:31.693 "uuid": "def3905d-a02e-4e4d-9f27-65e275e658fa", 00:27:31.693 "strip_size_kb": 0, 00:27:31.693 "state": "online", 00:27:31.693 "raid_level": "raid1", 00:27:31.693 "superblock": true, 00:27:31.693 "num_base_bdevs": 4, 00:27:31.693 
"num_base_bdevs_discovered": 2, 00:27:31.693 "num_base_bdevs_operational": 2, 00:27:31.693 "base_bdevs_list": [ 00:27:31.693 { 00:27:31.693 "name": null, 00:27:31.693 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:31.693 "is_configured": false, 00:27:31.693 "data_offset": 2048, 00:27:31.693 "data_size": 63488 00:27:31.693 }, 00:27:31.693 { 00:27:31.693 "name": null, 00:27:31.693 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:31.693 "is_configured": false, 00:27:31.693 "data_offset": 2048, 00:27:31.693 "data_size": 63488 00:27:31.693 }, 00:27:31.693 { 00:27:31.693 "name": "BaseBdev3", 00:27:31.693 "uuid": "7909ef61-a74c-5dc1-bdb6-0a581b9ffa17", 00:27:31.693 "is_configured": true, 00:27:31.693 "data_offset": 2048, 00:27:31.693 "data_size": 63488 00:27:31.693 }, 00:27:31.693 { 00:27:31.693 "name": "BaseBdev4", 00:27:31.693 "uuid": "9673f532-83c5-59e8-9256-b2534b3ba1e1", 00:27:31.693 "is_configured": true, 00:27:31.693 "data_offset": 2048, 00:27:31.693 "data_size": 63488 00:27:31.693 } 00:27:31.693 ] 00:27:31.693 }' 00:27:31.693 20:41:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:31.693 20:41:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:27:32.261 20:41:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:32.261 20:41:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:32.261 20:41:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:32.261 20:41:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:32.261 20:41:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:32.261 20:41:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:27:32.261 20:41:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:32.520 20:41:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:32.520 "name": "raid_bdev1", 00:27:32.520 "uuid": "def3905d-a02e-4e4d-9f27-65e275e658fa", 00:27:32.520 "strip_size_kb": 0, 00:27:32.520 "state": "online", 00:27:32.520 "raid_level": "raid1", 00:27:32.520 "superblock": true, 00:27:32.520 "num_base_bdevs": 4, 00:27:32.520 "num_base_bdevs_discovered": 2, 00:27:32.520 "num_base_bdevs_operational": 2, 00:27:32.520 "base_bdevs_list": [ 00:27:32.520 { 00:27:32.520 "name": null, 00:27:32.520 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:32.520 "is_configured": false, 00:27:32.520 "data_offset": 2048, 00:27:32.520 "data_size": 63488 00:27:32.520 }, 00:27:32.520 { 00:27:32.520 "name": null, 00:27:32.520 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:32.520 "is_configured": false, 00:27:32.520 "data_offset": 2048, 00:27:32.520 "data_size": 63488 00:27:32.520 }, 00:27:32.520 { 00:27:32.520 "name": "BaseBdev3", 00:27:32.520 "uuid": "7909ef61-a74c-5dc1-bdb6-0a581b9ffa17", 00:27:32.520 "is_configured": true, 00:27:32.520 "data_offset": 2048, 00:27:32.520 "data_size": 63488 00:27:32.520 }, 00:27:32.520 { 00:27:32.520 "name": "BaseBdev4", 00:27:32.520 "uuid": "9673f532-83c5-59e8-9256-b2534b3ba1e1", 00:27:32.520 "is_configured": true, 00:27:32.520 "data_offset": 2048, 00:27:32.520 "data_size": 63488 00:27:32.520 } 00:27:32.520 ] 00:27:32.520 }' 00:27:32.520 20:41:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:32.520 20:41:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:32.520 20:41:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:32.520 20:41:24 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:32.520 20:41:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@782 -- # killprocess 1488256 00:27:32.520 20:41:24 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@948 -- # '[' -z 1488256 ']' 00:27:32.520 20:41:24 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@952 -- # kill -0 1488256 00:27:32.520 20:41:24 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # uname 00:27:32.520 20:41:24 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:27:32.520 20:41:24 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1488256 00:27:32.520 20:41:24 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:27:32.520 20:41:24 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:27:32.520 20:41:24 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1488256' 00:27:32.520 killing process with pid 1488256 00:27:32.520 20:41:24 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@967 -- # kill 1488256 00:27:32.520 Received shutdown signal, test time was about 26.938539 seconds 00:27:32.520 00:27:32.520 Latency(us) 00:27:32.520 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:32.520 =================================================================================================================== 00:27:32.520 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:27:32.520 [2024-07-15 20:41:24.814869] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:27:32.520 [2024-07-15 20:41:24.814982] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:32.520 20:41:24 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@972 -- # wait 1488256 00:27:32.520 [2024-07-15 20:41:24.815052] 
bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:32.520 [2024-07-15 20:41:24.815067] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x209cb10 name raid_bdev1, state offline 00:27:32.520 [2024-07-15 20:41:24.856079] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:27:32.781 20:41:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@784 -- # return 0 00:27:32.781 00:27:32.781 real 0m31.316s 00:27:32.781 user 0m49.520s 00:27:32.781 sys 0m4.949s 00:27:32.781 20:41:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:27:32.781 20:41:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:27:32.781 ************************************ 00:27:32.781 END TEST raid_rebuild_test_sb_io 00:27:32.781 ************************************ 00:27:32.781 20:41:25 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:27:32.781 20:41:25 bdev_raid -- bdev/bdev_raid.sh@884 -- # '[' n == y ']' 00:27:32.781 20:41:25 bdev_raid -- bdev/bdev_raid.sh@896 -- # base_blocklen=4096 00:27:32.781 20:41:25 bdev_raid -- bdev/bdev_raid.sh@898 -- # run_test raid_state_function_test_sb_4k raid_state_function_test raid1 2 true 00:27:32.781 20:41:25 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:27:32.781 20:41:25 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:27:32.781 20:41:25 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:27:32.781 ************************************ 00:27:32.781 START TEST raid_state_function_test_sb_4k 00:27:32.781 ************************************ 00:27:32.781 20:41:25 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 true 00:27:32.781 20:41:25 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:27:32.781 20:41:25 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:27:32.781 20:41:25 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:27:32.781 20:41:25 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:27:32.781 20:41:25 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:27:32.781 20:41:25 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:27:32.781 20:41:25 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:27:32.781 20:41:25 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:27:32.781 20:41:25 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:27:33.041 20:41:25 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:27:33.041 20:41:25 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:27:33.041 20:41:25 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:27:33.041 20:41:25 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:27:33.041 20:41:25 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:27:33.041 20:41:25 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:27:33.041 20:41:25 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # local strip_size 00:27:33.041 20:41:25 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:27:33.041 20:41:25 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:27:33.041 20:41:25 bdev_raid.raid_state_function_test_sb_4k -- 
bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:27:33.041 20:41:25 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:27:33.041 20:41:25 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:27:33.041 20:41:25 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:27:33.041 20:41:25 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@244 -- # raid_pid=1492755 00:27:33.041 20:41:25 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1492755' 00:27:33.041 Process raid pid: 1492755 00:27:33.041 20:41:25 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:27:33.041 20:41:25 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@246 -- # waitforlisten 1492755 /var/tmp/spdk-raid.sock 00:27:33.041 20:41:25 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@829 -- # '[' -z 1492755 ']' 00:27:33.041 20:41:25 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:27:33.041 20:41:25 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@834 -- # local max_retries=100 00:27:33.041 20:41:25 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:27:33.041 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:27:33.041 20:41:25 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@838 -- # xtrace_disable 00:27:33.041 20:41:25 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:33.041 [2024-07-15 20:41:25.225026] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 00:27:33.041 [2024-07-15 20:41:25.225096] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:27:33.041 [2024-07-15 20:41:25.355225] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:33.301 [2024-07-15 20:41:25.457976] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:27:33.301 [2024-07-15 20:41:25.517055] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:33.301 [2024-07-15 20:41:25.517112] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:33.868 20:41:26 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:27:33.868 20:41:26 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@862 -- # return 0 00:27:33.868 20:41:26 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:27:34.435 [2024-07-15 20:41:26.641967] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:27:34.435 [2024-07-15 20:41:26.642012] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:27:34.436 [2024-07-15 20:41:26.642024] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:27:34.436 [2024-07-15 20:41:26.642037] bdev_raid_rpc.c: 
311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:27:34.436 20:41:26 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:27:34.436 20:41:26 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:34.436 20:41:26 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:34.436 20:41:26 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:34.436 20:41:26 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:34.436 20:41:26 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:34.436 20:41:26 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:34.436 20:41:26 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:34.436 20:41:26 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:34.436 20:41:26 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:34.436 20:41:26 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:34.436 20:41:26 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:34.694 20:41:26 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:34.694 "name": "Existed_Raid", 00:27:34.694 "uuid": "649dcec1-efb0-4bff-b5b8-f8c408dd1d65", 00:27:34.694 "strip_size_kb": 0, 00:27:34.694 "state": "configuring", 00:27:34.694 "raid_level": "raid1", 00:27:34.694 "superblock": true, 00:27:34.694 
"num_base_bdevs": 2, 00:27:34.694 "num_base_bdevs_discovered": 0, 00:27:34.694 "num_base_bdevs_operational": 2, 00:27:34.694 "base_bdevs_list": [ 00:27:34.695 { 00:27:34.695 "name": "BaseBdev1", 00:27:34.695 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:34.695 "is_configured": false, 00:27:34.695 "data_offset": 0, 00:27:34.695 "data_size": 0 00:27:34.695 }, 00:27:34.695 { 00:27:34.695 "name": "BaseBdev2", 00:27:34.695 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:34.695 "is_configured": false, 00:27:34.695 "data_offset": 0, 00:27:34.695 "data_size": 0 00:27:34.695 } 00:27:34.695 ] 00:27:34.695 }' 00:27:34.695 20:41:26 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:34.695 20:41:26 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:35.268 20:41:27 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:27:35.532 [2024-07-15 20:41:27.740858] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:27:35.532 [2024-07-15 20:41:27.740887] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c6ba80 name Existed_Raid, state configuring 00:27:35.532 20:41:27 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:27:35.790 [2024-07-15 20:41:27.989546] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:27:35.790 [2024-07-15 20:41:27.989573] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:27:35.790 [2024-07-15 20:41:27.989582] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:27:35.790 [2024-07-15 
20:41:27.989594] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:27:35.790 20:41:28 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev1 00:27:36.049 [2024-07-15 20:41:28.248094] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:27:36.049 BaseBdev1 00:27:36.049 20:41:28 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:27:36.049 20:41:28 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:27:36.049 20:41:28 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:27:36.049 20:41:28 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@899 -- # local i 00:27:36.049 20:41:28 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:27:36.049 20:41:28 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:27:36.049 20:41:28 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:27:36.307 20:41:28 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:27:36.566 [ 00:27:36.566 { 00:27:36.566 "name": "BaseBdev1", 00:27:36.566 "aliases": [ 00:27:36.566 "513d80ad-9b96-43c6-b63c-1f344d95776a" 00:27:36.566 ], 00:27:36.566 "product_name": "Malloc disk", 00:27:36.566 "block_size": 4096, 00:27:36.566 "num_blocks": 8192, 00:27:36.566 "uuid": "513d80ad-9b96-43c6-b63c-1f344d95776a", 00:27:36.566 "assigned_rate_limits": { 
00:27:36.566 "rw_ios_per_sec": 0, 00:27:36.566 "rw_mbytes_per_sec": 0, 00:27:36.566 "r_mbytes_per_sec": 0, 00:27:36.566 "w_mbytes_per_sec": 0 00:27:36.566 }, 00:27:36.566 "claimed": true, 00:27:36.566 "claim_type": "exclusive_write", 00:27:36.566 "zoned": false, 00:27:36.566 "supported_io_types": { 00:27:36.566 "read": true, 00:27:36.566 "write": true, 00:27:36.566 "unmap": true, 00:27:36.566 "flush": true, 00:27:36.566 "reset": true, 00:27:36.566 "nvme_admin": false, 00:27:36.566 "nvme_io": false, 00:27:36.566 "nvme_io_md": false, 00:27:36.566 "write_zeroes": true, 00:27:36.566 "zcopy": true, 00:27:36.566 "get_zone_info": false, 00:27:36.566 "zone_management": false, 00:27:36.566 "zone_append": false, 00:27:36.566 "compare": false, 00:27:36.566 "compare_and_write": false, 00:27:36.566 "abort": true, 00:27:36.566 "seek_hole": false, 00:27:36.566 "seek_data": false, 00:27:36.566 "copy": true, 00:27:36.566 "nvme_iov_md": false 00:27:36.566 }, 00:27:36.566 "memory_domains": [ 00:27:36.566 { 00:27:36.566 "dma_device_id": "system", 00:27:36.566 "dma_device_type": 1 00:27:36.566 }, 00:27:36.566 { 00:27:36.566 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:36.566 "dma_device_type": 2 00:27:36.566 } 00:27:36.566 ], 00:27:36.566 "driver_specific": {} 00:27:36.566 } 00:27:36.566 ] 00:27:36.566 20:41:28 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@905 -- # return 0 00:27:36.566 20:41:28 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:27:36.566 20:41:28 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:36.566 20:41:28 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:36.566 20:41:28 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:36.566 20:41:28 bdev_raid.raid_state_function_test_sb_4k -- 
bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:36.566 20:41:28 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:36.566 20:41:28 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:36.566 20:41:28 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:36.566 20:41:28 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:36.566 20:41:28 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:36.566 20:41:28 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:36.566 20:41:28 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:36.566 20:41:28 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:36.566 "name": "Existed_Raid", 00:27:36.566 "uuid": "2f5da570-4d94-469c-b321-6e8231581078", 00:27:36.566 "strip_size_kb": 0, 00:27:36.566 "state": "configuring", 00:27:36.566 "raid_level": "raid1", 00:27:36.566 "superblock": true, 00:27:36.566 "num_base_bdevs": 2, 00:27:36.566 "num_base_bdevs_discovered": 1, 00:27:36.566 "num_base_bdevs_operational": 2, 00:27:36.566 "base_bdevs_list": [ 00:27:36.566 { 00:27:36.566 "name": "BaseBdev1", 00:27:36.566 "uuid": "513d80ad-9b96-43c6-b63c-1f344d95776a", 00:27:36.566 "is_configured": true, 00:27:36.566 "data_offset": 256, 00:27:36.566 "data_size": 7936 00:27:36.566 }, 00:27:36.566 { 00:27:36.566 "name": "BaseBdev2", 00:27:36.566 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:36.566 "is_configured": false, 00:27:36.566 "data_offset": 0, 00:27:36.566 "data_size": 0 00:27:36.566 } 00:27:36.566 ] 00:27:36.566 }' 00:27:36.566 20:41:28 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:36.566 20:41:28 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:37.503 20:41:29 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:27:37.503 [2024-07-15 20:41:29.763994] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:27:37.503 [2024-07-15 20:41:29.764033] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c6b350 name Existed_Raid, state configuring 00:27:37.503 20:41:29 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:27:37.762 [2024-07-15 20:41:30.004668] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:27:37.762 [2024-07-15 20:41:30.006168] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:27:37.762 [2024-07-15 20:41:30.006201] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:27:37.762 20:41:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:27:37.762 20:41:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:27:37.762 20:41:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:27:37.762 20:41:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:37.762 20:41:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:37.762 20:41:30 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:37.762 20:41:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:37.762 20:41:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:37.762 20:41:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:37.762 20:41:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:37.762 20:41:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:37.762 20:41:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:37.762 20:41:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:37.762 20:41:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:38.041 20:41:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:38.041 "name": "Existed_Raid", 00:27:38.041 "uuid": "824647e7-5008-4214-a776-bd6678b00c03", 00:27:38.041 "strip_size_kb": 0, 00:27:38.041 "state": "configuring", 00:27:38.041 "raid_level": "raid1", 00:27:38.041 "superblock": true, 00:27:38.041 "num_base_bdevs": 2, 00:27:38.041 "num_base_bdevs_discovered": 1, 00:27:38.041 "num_base_bdevs_operational": 2, 00:27:38.041 "base_bdevs_list": [ 00:27:38.041 { 00:27:38.041 "name": "BaseBdev1", 00:27:38.041 "uuid": "513d80ad-9b96-43c6-b63c-1f344d95776a", 00:27:38.041 "is_configured": true, 00:27:38.041 "data_offset": 256, 00:27:38.041 "data_size": 7936 00:27:38.041 }, 00:27:38.041 { 00:27:38.041 "name": "BaseBdev2", 00:27:38.041 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:38.041 
"is_configured": false, 00:27:38.041 "data_offset": 0, 00:27:38.041 "data_size": 0 00:27:38.041 } 00:27:38.041 ] 00:27:38.041 }' 00:27:38.041 20:41:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:38.041 20:41:30 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:38.608 20:41:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev2 00:27:38.868 [2024-07-15 20:41:31.058853] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:27:38.868 [2024-07-15 20:41:31.059022] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1c6c000 00:27:38.868 [2024-07-15 20:41:31.059037] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:27:38.868 [2024-07-15 20:41:31.059210] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b860c0 00:27:38.868 [2024-07-15 20:41:31.059331] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1c6c000 00:27:38.868 [2024-07-15 20:41:31.059341] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1c6c000 00:27:38.868 [2024-07-15 20:41:31.059432] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:38.868 BaseBdev2 00:27:38.868 20:41:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:27:38.868 20:41:31 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:27:38.868 20:41:31 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:27:38.868 20:41:31 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@899 -- # local i 00:27:38.868 20:41:31 
bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:27:38.868 20:41:31 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:27:38.868 20:41:31 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:27:39.127 20:41:31 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:27:39.386 [ 00:27:39.386 { 00:27:39.386 "name": "BaseBdev2", 00:27:39.386 "aliases": [ 00:27:39.386 "4b67dc92-4830-4c8f-a507-cf352aa8b699" 00:27:39.386 ], 00:27:39.386 "product_name": "Malloc disk", 00:27:39.386 "block_size": 4096, 00:27:39.386 "num_blocks": 8192, 00:27:39.386 "uuid": "4b67dc92-4830-4c8f-a507-cf352aa8b699", 00:27:39.386 "assigned_rate_limits": { 00:27:39.386 "rw_ios_per_sec": 0, 00:27:39.386 "rw_mbytes_per_sec": 0, 00:27:39.386 "r_mbytes_per_sec": 0, 00:27:39.386 "w_mbytes_per_sec": 0 00:27:39.386 }, 00:27:39.386 "claimed": true, 00:27:39.386 "claim_type": "exclusive_write", 00:27:39.387 "zoned": false, 00:27:39.387 "supported_io_types": { 00:27:39.387 "read": true, 00:27:39.387 "write": true, 00:27:39.387 "unmap": true, 00:27:39.387 "flush": true, 00:27:39.387 "reset": true, 00:27:39.387 "nvme_admin": false, 00:27:39.387 "nvme_io": false, 00:27:39.387 "nvme_io_md": false, 00:27:39.387 "write_zeroes": true, 00:27:39.387 "zcopy": true, 00:27:39.387 "get_zone_info": false, 00:27:39.387 "zone_management": false, 00:27:39.387 "zone_append": false, 00:27:39.387 "compare": false, 00:27:39.387 "compare_and_write": false, 00:27:39.387 "abort": true, 00:27:39.387 "seek_hole": false, 00:27:39.387 "seek_data": false, 00:27:39.387 "copy": true, 00:27:39.387 "nvme_iov_md": false 00:27:39.387 }, 00:27:39.387 
"memory_domains": [ 00:27:39.387 { 00:27:39.387 "dma_device_id": "system", 00:27:39.387 "dma_device_type": 1 00:27:39.387 }, 00:27:39.387 { 00:27:39.387 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:39.387 "dma_device_type": 2 00:27:39.387 } 00:27:39.387 ], 00:27:39.387 "driver_specific": {} 00:27:39.387 } 00:27:39.387 ] 00:27:39.387 20:41:31 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@905 -- # return 0 00:27:39.387 20:41:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:27:39.387 20:41:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:27:39.387 20:41:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:27:39.387 20:41:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:39.387 20:41:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:39.387 20:41:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:39.387 20:41:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:39.387 20:41:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:39.387 20:41:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:39.387 20:41:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:39.387 20:41:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:39.387 20:41:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:39.387 20:41:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:39.387 20:41:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:39.646 20:41:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:39.646 "name": "Existed_Raid", 00:27:39.646 "uuid": "824647e7-5008-4214-a776-bd6678b00c03", 00:27:39.646 "strip_size_kb": 0, 00:27:39.646 "state": "online", 00:27:39.646 "raid_level": "raid1", 00:27:39.646 "superblock": true, 00:27:39.646 "num_base_bdevs": 2, 00:27:39.646 "num_base_bdevs_discovered": 2, 00:27:39.646 "num_base_bdevs_operational": 2, 00:27:39.646 "base_bdevs_list": [ 00:27:39.646 { 00:27:39.646 "name": "BaseBdev1", 00:27:39.646 "uuid": "513d80ad-9b96-43c6-b63c-1f344d95776a", 00:27:39.646 "is_configured": true, 00:27:39.646 "data_offset": 256, 00:27:39.646 "data_size": 7936 00:27:39.646 }, 00:27:39.646 { 00:27:39.646 "name": "BaseBdev2", 00:27:39.646 "uuid": "4b67dc92-4830-4c8f-a507-cf352aa8b699", 00:27:39.646 "is_configured": true, 00:27:39.646 "data_offset": 256, 00:27:39.646 "data_size": 7936 00:27:39.646 } 00:27:39.646 ] 00:27:39.646 }' 00:27:39.646 20:41:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:39.646 20:41:31 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:40.215 20:41:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:27:40.215 20:41:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:27:40.215 20:41:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:27:40.215 20:41:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:27:40.215 20:41:32 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:27:40.215 20:41:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@198 -- # local name 00:27:40.215 20:41:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:27:40.215 20:41:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:27:40.474 [2024-07-15 20:41:32.647351] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:40.474 20:41:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:27:40.474 "name": "Existed_Raid", 00:27:40.474 "aliases": [ 00:27:40.474 "824647e7-5008-4214-a776-bd6678b00c03" 00:27:40.474 ], 00:27:40.474 "product_name": "Raid Volume", 00:27:40.474 "block_size": 4096, 00:27:40.474 "num_blocks": 7936, 00:27:40.474 "uuid": "824647e7-5008-4214-a776-bd6678b00c03", 00:27:40.474 "assigned_rate_limits": { 00:27:40.474 "rw_ios_per_sec": 0, 00:27:40.474 "rw_mbytes_per_sec": 0, 00:27:40.474 "r_mbytes_per_sec": 0, 00:27:40.474 "w_mbytes_per_sec": 0 00:27:40.474 }, 00:27:40.474 "claimed": false, 00:27:40.474 "zoned": false, 00:27:40.474 "supported_io_types": { 00:27:40.474 "read": true, 00:27:40.474 "write": true, 00:27:40.474 "unmap": false, 00:27:40.474 "flush": false, 00:27:40.474 "reset": true, 00:27:40.474 "nvme_admin": false, 00:27:40.474 "nvme_io": false, 00:27:40.474 "nvme_io_md": false, 00:27:40.474 "write_zeroes": true, 00:27:40.474 "zcopy": false, 00:27:40.474 "get_zone_info": false, 00:27:40.474 "zone_management": false, 00:27:40.474 "zone_append": false, 00:27:40.474 "compare": false, 00:27:40.474 "compare_and_write": false, 00:27:40.474 "abort": false, 00:27:40.474 "seek_hole": false, 00:27:40.474 "seek_data": false, 00:27:40.474 "copy": false, 00:27:40.474 "nvme_iov_md": false 00:27:40.474 
}, 00:27:40.474 "memory_domains": [ 00:27:40.474 { 00:27:40.474 "dma_device_id": "system", 00:27:40.474 "dma_device_type": 1 00:27:40.474 }, 00:27:40.474 { 00:27:40.474 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:40.474 "dma_device_type": 2 00:27:40.474 }, 00:27:40.474 { 00:27:40.474 "dma_device_id": "system", 00:27:40.474 "dma_device_type": 1 00:27:40.474 }, 00:27:40.474 { 00:27:40.474 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:40.474 "dma_device_type": 2 00:27:40.474 } 00:27:40.474 ], 00:27:40.474 "driver_specific": { 00:27:40.474 "raid": { 00:27:40.474 "uuid": "824647e7-5008-4214-a776-bd6678b00c03", 00:27:40.474 "strip_size_kb": 0, 00:27:40.474 "state": "online", 00:27:40.474 "raid_level": "raid1", 00:27:40.474 "superblock": true, 00:27:40.474 "num_base_bdevs": 2, 00:27:40.474 "num_base_bdevs_discovered": 2, 00:27:40.474 "num_base_bdevs_operational": 2, 00:27:40.474 "base_bdevs_list": [ 00:27:40.474 { 00:27:40.474 "name": "BaseBdev1", 00:27:40.474 "uuid": "513d80ad-9b96-43c6-b63c-1f344d95776a", 00:27:40.474 "is_configured": true, 00:27:40.474 "data_offset": 256, 00:27:40.474 "data_size": 7936 00:27:40.474 }, 00:27:40.474 { 00:27:40.474 "name": "BaseBdev2", 00:27:40.474 "uuid": "4b67dc92-4830-4c8f-a507-cf352aa8b699", 00:27:40.474 "is_configured": true, 00:27:40.474 "data_offset": 256, 00:27:40.474 "data_size": 7936 00:27:40.474 } 00:27:40.474 ] 00:27:40.474 } 00:27:40.474 } 00:27:40.474 }' 00:27:40.474 20:41:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:27:40.474 20:41:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:27:40.474 BaseBdev2' 00:27:40.474 20:41:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:40.474 20:41:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:27:40.474 20:41:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:40.733 20:41:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:40.733 "name": "BaseBdev1", 00:27:40.733 "aliases": [ 00:27:40.733 "513d80ad-9b96-43c6-b63c-1f344d95776a" 00:27:40.733 ], 00:27:40.733 "product_name": "Malloc disk", 00:27:40.733 "block_size": 4096, 00:27:40.733 "num_blocks": 8192, 00:27:40.733 "uuid": "513d80ad-9b96-43c6-b63c-1f344d95776a", 00:27:40.733 "assigned_rate_limits": { 00:27:40.733 "rw_ios_per_sec": 0, 00:27:40.733 "rw_mbytes_per_sec": 0, 00:27:40.733 "r_mbytes_per_sec": 0, 00:27:40.733 "w_mbytes_per_sec": 0 00:27:40.733 }, 00:27:40.733 "claimed": true, 00:27:40.733 "claim_type": "exclusive_write", 00:27:40.733 "zoned": false, 00:27:40.733 "supported_io_types": { 00:27:40.733 "read": true, 00:27:40.733 "write": true, 00:27:40.733 "unmap": true, 00:27:40.733 "flush": true, 00:27:40.733 "reset": true, 00:27:40.733 "nvme_admin": false, 00:27:40.733 "nvme_io": false, 00:27:40.733 "nvme_io_md": false, 00:27:40.733 "write_zeroes": true, 00:27:40.733 "zcopy": true, 00:27:40.733 "get_zone_info": false, 00:27:40.733 "zone_management": false, 00:27:40.733 "zone_append": false, 00:27:40.733 "compare": false, 00:27:40.733 "compare_and_write": false, 00:27:40.733 "abort": true, 00:27:40.733 "seek_hole": false, 00:27:40.733 "seek_data": false, 00:27:40.733 "copy": true, 00:27:40.733 "nvme_iov_md": false 00:27:40.733 }, 00:27:40.733 "memory_domains": [ 00:27:40.733 { 00:27:40.733 "dma_device_id": "system", 00:27:40.733 "dma_device_type": 1 00:27:40.733 }, 00:27:40.733 { 00:27:40.733 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:40.733 "dma_device_type": 2 00:27:40.733 } 00:27:40.733 ], 00:27:40.733 "driver_specific": {} 00:27:40.733 }' 00:27:40.733 20:41:32 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:40.733 20:41:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:40.733 20:41:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:27:40.733 20:41:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:40.733 20:41:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:40.993 20:41:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:27:40.993 20:41:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:40.993 20:41:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:40.993 20:41:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:27:40.993 20:41:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:40.993 20:41:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:40.993 20:41:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:27:40.993 20:41:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:40.993 20:41:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:27:40.993 20:41:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:41.253 20:41:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:41.253 "name": "BaseBdev2", 00:27:41.253 "aliases": [ 00:27:41.253 "4b67dc92-4830-4c8f-a507-cf352aa8b699" 00:27:41.253 ], 00:27:41.253 "product_name": "Malloc 
disk", 00:27:41.253 "block_size": 4096, 00:27:41.253 "num_blocks": 8192, 00:27:41.253 "uuid": "4b67dc92-4830-4c8f-a507-cf352aa8b699", 00:27:41.253 "assigned_rate_limits": { 00:27:41.253 "rw_ios_per_sec": 0, 00:27:41.253 "rw_mbytes_per_sec": 0, 00:27:41.253 "r_mbytes_per_sec": 0, 00:27:41.253 "w_mbytes_per_sec": 0 00:27:41.253 }, 00:27:41.253 "claimed": true, 00:27:41.253 "claim_type": "exclusive_write", 00:27:41.253 "zoned": false, 00:27:41.253 "supported_io_types": { 00:27:41.253 "read": true, 00:27:41.253 "write": true, 00:27:41.253 "unmap": true, 00:27:41.253 "flush": true, 00:27:41.253 "reset": true, 00:27:41.253 "nvme_admin": false, 00:27:41.253 "nvme_io": false, 00:27:41.253 "nvme_io_md": false, 00:27:41.253 "write_zeroes": true, 00:27:41.253 "zcopy": true, 00:27:41.253 "get_zone_info": false, 00:27:41.253 "zone_management": false, 00:27:41.253 "zone_append": false, 00:27:41.253 "compare": false, 00:27:41.253 "compare_and_write": false, 00:27:41.253 "abort": true, 00:27:41.253 "seek_hole": false, 00:27:41.253 "seek_data": false, 00:27:41.253 "copy": true, 00:27:41.253 "nvme_iov_md": false 00:27:41.253 }, 00:27:41.253 "memory_domains": [ 00:27:41.253 { 00:27:41.253 "dma_device_id": "system", 00:27:41.253 "dma_device_type": 1 00:27:41.253 }, 00:27:41.253 { 00:27:41.253 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:41.253 "dma_device_type": 2 00:27:41.253 } 00:27:41.253 ], 00:27:41.253 "driver_specific": {} 00:27:41.253 }' 00:27:41.253 20:41:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:41.253 20:41:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:41.511 20:41:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:27:41.511 20:41:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:41.511 20:41:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 
00:27:41.511 20:41:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:27:41.511 20:41:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:41.511 20:41:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:41.511 20:41:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:27:41.511 20:41:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:41.770 20:41:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:41.770 20:41:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:27:41.770 20:41:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:27:42.029 [2024-07-15 20:41:34.163163] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:27:42.029 20:41:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@275 -- # local expected_state 00:27:42.029 20:41:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:27:42.029 20:41:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@213 -- # case $1 in 00:27:42.029 20:41:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@214 -- # return 0 00:27:42.029 20:41:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:27:42.029 20:41:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:27:42.029 20:41:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:42.029 20:41:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 
-- # local expected_state=online 00:27:42.029 20:41:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:42.029 20:41:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:42.029 20:41:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:42.029 20:41:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:42.029 20:41:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:42.029 20:41:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:42.029 20:41:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:42.029 20:41:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:42.029 20:41:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:42.288 20:41:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:42.288 "name": "Existed_Raid", 00:27:42.288 "uuid": "824647e7-5008-4214-a776-bd6678b00c03", 00:27:42.288 "strip_size_kb": 0, 00:27:42.288 "state": "online", 00:27:42.288 "raid_level": "raid1", 00:27:42.288 "superblock": true, 00:27:42.288 "num_base_bdevs": 2, 00:27:42.288 "num_base_bdevs_discovered": 1, 00:27:42.288 "num_base_bdevs_operational": 1, 00:27:42.288 "base_bdevs_list": [ 00:27:42.288 { 00:27:42.288 "name": null, 00:27:42.288 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:42.288 "is_configured": false, 00:27:42.288 "data_offset": 256, 00:27:42.288 "data_size": 7936 00:27:42.288 }, 00:27:42.288 { 00:27:42.288 "name": "BaseBdev2", 00:27:42.288 "uuid": 
"4b67dc92-4830-4c8f-a507-cf352aa8b699", 00:27:42.288 "is_configured": true, 00:27:42.288 "data_offset": 256, 00:27:42.288 "data_size": 7936 00:27:42.288 } 00:27:42.288 ] 00:27:42.288 }' 00:27:42.288 20:41:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:42.288 20:41:34 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:42.855 20:41:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:27:42.855 20:41:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:27:42.855 20:41:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:42.855 20:41:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:27:43.113 20:41:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:27:43.113 20:41:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:27:43.113 20:41:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:27:43.113 [2024-07-15 20:41:35.487758] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:27:43.113 [2024-07-15 20:41:35.487838] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:27:43.372 [2024-07-15 20:41:35.498798] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:43.372 [2024-07-15 20:41:35.498833] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:43.372 [2024-07-15 20:41:35.498844] bdev_raid.c: 366:raid_bdev_cleanup: 
*DEBUG*: raid_bdev_cleanup, 0x1c6c000 name Existed_Raid, state offline 00:27:43.372 20:41:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:27:43.372 20:41:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:27:43.372 20:41:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:43.372 20:41:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:27:43.631 20:41:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:27:43.631 20:41:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:27:43.631 20:41:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:27:43.631 20:41:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@341 -- # killprocess 1492755 00:27:43.631 20:41:35 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@948 -- # '[' -z 1492755 ']' 00:27:43.631 20:41:35 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@952 -- # kill -0 1492755 00:27:43.631 20:41:35 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@953 -- # uname 00:27:43.631 20:41:35 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:27:43.631 20:41:35 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1492755 00:27:43.631 20:41:35 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:27:43.631 20:41:35 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:27:43.631 20:41:35 bdev_raid.raid_state_function_test_sb_4k -- 
common/autotest_common.sh@966 -- # echo 'killing process with pid 1492755' 00:27:43.631 killing process with pid 1492755 00:27:43.631 20:41:35 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@967 -- # kill 1492755 00:27:43.631 [2024-07-15 20:41:35.826778] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:27:43.631 20:41:35 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@972 -- # wait 1492755 00:27:43.631 [2024-07-15 20:41:35.827741] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:27:43.890 20:41:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@343 -- # return 0 00:27:43.890 00:27:43.890 real 0m10.893s 00:27:43.890 user 0m19.399s 00:27:43.890 sys 0m2.017s 00:27:43.890 20:41:36 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@1124 -- # xtrace_disable 00:27:43.890 20:41:36 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:43.890 ************************************ 00:27:43.890 END TEST raid_state_function_test_sb_4k 00:27:43.890 ************************************ 00:27:43.890 20:41:36 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:27:43.890 20:41:36 bdev_raid -- bdev/bdev_raid.sh@899 -- # run_test raid_superblock_test_4k raid_superblock_test raid1 2 00:27:43.890 20:41:36 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:27:43.890 20:41:36 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:27:43.890 20:41:36 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:27:43.890 ************************************ 00:27:43.890 START TEST raid_superblock_test_4k 00:27:43.890 ************************************ 00:27:43.890 20:41:36 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 2 00:27:43.890 20:41:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:27:43.890 20:41:36 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:27:43.890 20:41:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:27:43.890 20:41:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:27:43.890 20:41:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:27:43.890 20:41:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:27:43.890 20:41:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:27:43.890 20:41:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:27:43.890 20:41:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:27:43.890 20:41:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@398 -- # local strip_size 00:27:43.890 20:41:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:27:43.890 20:41:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:27:43.890 20:41:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:27:43.890 20:41:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:27:43.890 20:41:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:27:43.890 20:41:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@411 -- # raid_pid=1494395 00:27:43.890 20:41:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@412 -- # waitforlisten 1494395 /var/tmp/spdk-raid.sock 00:27:43.890 20:41:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:27:43.890 20:41:36 bdev_raid.raid_superblock_test_4k -- 
common/autotest_common.sh@829 -- # '[' -z 1494395 ']' 00:27:43.890 20:41:36 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:27:43.890 20:41:36 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@834 -- # local max_retries=100 00:27:43.890 20:41:36 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:27:43.890 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:27:43.890 20:41:36 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@838 -- # xtrace_disable 00:27:43.890 20:41:36 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:27:43.890 [2024-07-15 20:41:36.206442] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 00:27:43.890 [2024-07-15 20:41:36.206516] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1494395 ] 00:27:44.150 [2024-07-15 20:41:36.340480] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:44.150 [2024-07-15 20:41:36.441903] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:27:44.150 [2024-07-15 20:41:36.506812] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:44.150 [2024-07-15 20:41:36.506852] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:45.086 20:41:37 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:27:45.086 20:41:37 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@862 -- # return 0 00:27:45.086 20:41:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:27:45.086 20:41:37 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:27:45.086 20:41:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:27:45.086 20:41:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:27:45.086 20:41:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:27:45.086 20:41:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:27:45.086 20:41:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:27:45.086 20:41:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:27:45.086 20:41:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b malloc1 00:27:45.086 malloc1 00:27:45.086 20:41:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:27:45.345 [2024-07-15 20:41:37.625500] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:27:45.345 [2024-07-15 20:41:37.625553] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:45.345 [2024-07-15 20:41:37.625573] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ec4570 00:27:45.345 [2024-07-15 20:41:37.625586] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:45.345 [2024-07-15 20:41:37.627217] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:45.345 [2024-07-15 20:41:37.627246] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created 
pt_bdev for: pt1 00:27:45.345 pt1 00:27:45.345 20:41:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:27:45.345 20:41:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:27:45.345 20:41:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:27:45.345 20:41:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:27:45.345 20:41:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:27:45.345 20:41:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:27:45.345 20:41:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:27:45.345 20:41:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:27:45.345 20:41:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b malloc2 00:27:45.603 malloc2 00:27:45.604 20:41:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:27:45.862 [2024-07-15 20:41:38.123613] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:27:45.862 [2024-07-15 20:41:38.123658] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:45.862 [2024-07-15 20:41:38.123676] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ec5970 00:27:45.862 [2024-07-15 20:41:38.123688] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:45.862 [2024-07-15 20:41:38.125268] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:45.862 [2024-07-15 20:41:38.125297] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:27:45.862 pt2 00:27:45.862 20:41:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:27:45.862 20:41:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:27:45.862 20:41:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:27:46.121 [2024-07-15 20:41:38.372298] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:27:46.121 [2024-07-15 20:41:38.373586] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:27:46.121 [2024-07-15 20:41:38.373736] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2068270 00:27:46.121 [2024-07-15 20:41:38.373748] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:27:46.121 [2024-07-15 20:41:38.373963] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1ebc0e0 00:27:46.121 [2024-07-15 20:41:38.374111] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2068270 00:27:46.121 [2024-07-15 20:41:38.374122] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2068270 00:27:46.121 [2024-07-15 20:41:38.374219] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:46.121 20:41:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:46.121 20:41:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:46.121 20:41:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local 
expected_state=online 00:27:46.121 20:41:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:46.121 20:41:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:46.121 20:41:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:46.121 20:41:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:46.121 20:41:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:46.121 20:41:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:46.121 20:41:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:46.121 20:41:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:46.121 20:41:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:46.380 20:41:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:46.380 "name": "raid_bdev1", 00:27:46.380 "uuid": "bb3dfb8b-8605-4bea-8d3e-b3bdcffe64a8", 00:27:46.380 "strip_size_kb": 0, 00:27:46.380 "state": "online", 00:27:46.380 "raid_level": "raid1", 00:27:46.380 "superblock": true, 00:27:46.380 "num_base_bdevs": 2, 00:27:46.380 "num_base_bdevs_discovered": 2, 00:27:46.380 "num_base_bdevs_operational": 2, 00:27:46.380 "base_bdevs_list": [ 00:27:46.380 { 00:27:46.380 "name": "pt1", 00:27:46.380 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:46.380 "is_configured": true, 00:27:46.380 "data_offset": 256, 00:27:46.380 "data_size": 7936 00:27:46.380 }, 00:27:46.380 { 00:27:46.380 "name": "pt2", 00:27:46.380 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:46.380 "is_configured": true, 00:27:46.380 "data_offset": 
256, 00:27:46.380 "data_size": 7936 00:27:46.380 } 00:27:46.380 ] 00:27:46.380 }' 00:27:46.380 20:41:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:46.380 20:41:38 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:27:46.947 20:41:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:27:46.947 20:41:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:27:46.947 20:41:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:27:46.947 20:41:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:27:46.947 20:41:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:27:46.947 20:41:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@198 -- # local name 00:27:46.947 20:41:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:46.947 20:41:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:27:47.204 [2024-07-15 20:41:39.471433] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:47.204 20:41:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:27:47.204 "name": "raid_bdev1", 00:27:47.204 "aliases": [ 00:27:47.204 "bb3dfb8b-8605-4bea-8d3e-b3bdcffe64a8" 00:27:47.204 ], 00:27:47.204 "product_name": "Raid Volume", 00:27:47.204 "block_size": 4096, 00:27:47.204 "num_blocks": 7936, 00:27:47.204 "uuid": "bb3dfb8b-8605-4bea-8d3e-b3bdcffe64a8", 00:27:47.204 "assigned_rate_limits": { 00:27:47.204 "rw_ios_per_sec": 0, 00:27:47.204 "rw_mbytes_per_sec": 0, 00:27:47.204 "r_mbytes_per_sec": 0, 00:27:47.204 "w_mbytes_per_sec": 0 00:27:47.204 }, 00:27:47.204 "claimed": false, 
00:27:47.204 "zoned": false, 00:27:47.204 "supported_io_types": { 00:27:47.204 "read": true, 00:27:47.204 "write": true, 00:27:47.204 "unmap": false, 00:27:47.204 "flush": false, 00:27:47.204 "reset": true, 00:27:47.204 "nvme_admin": false, 00:27:47.204 "nvme_io": false, 00:27:47.204 "nvme_io_md": false, 00:27:47.204 "write_zeroes": true, 00:27:47.204 "zcopy": false, 00:27:47.204 "get_zone_info": false, 00:27:47.204 "zone_management": false, 00:27:47.204 "zone_append": false, 00:27:47.204 "compare": false, 00:27:47.204 "compare_and_write": false, 00:27:47.204 "abort": false, 00:27:47.204 "seek_hole": false, 00:27:47.204 "seek_data": false, 00:27:47.204 "copy": false, 00:27:47.204 "nvme_iov_md": false 00:27:47.204 }, 00:27:47.204 "memory_domains": [ 00:27:47.204 { 00:27:47.204 "dma_device_id": "system", 00:27:47.204 "dma_device_type": 1 00:27:47.204 }, 00:27:47.205 { 00:27:47.205 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:47.205 "dma_device_type": 2 00:27:47.205 }, 00:27:47.205 { 00:27:47.205 "dma_device_id": "system", 00:27:47.205 "dma_device_type": 1 00:27:47.205 }, 00:27:47.205 { 00:27:47.205 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:47.205 "dma_device_type": 2 00:27:47.205 } 00:27:47.205 ], 00:27:47.205 "driver_specific": { 00:27:47.205 "raid": { 00:27:47.205 "uuid": "bb3dfb8b-8605-4bea-8d3e-b3bdcffe64a8", 00:27:47.205 "strip_size_kb": 0, 00:27:47.205 "state": "online", 00:27:47.205 "raid_level": "raid1", 00:27:47.205 "superblock": true, 00:27:47.205 "num_base_bdevs": 2, 00:27:47.205 "num_base_bdevs_discovered": 2, 00:27:47.205 "num_base_bdevs_operational": 2, 00:27:47.205 "base_bdevs_list": [ 00:27:47.205 { 00:27:47.205 "name": "pt1", 00:27:47.205 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:47.205 "is_configured": true, 00:27:47.205 "data_offset": 256, 00:27:47.205 "data_size": 7936 00:27:47.205 }, 00:27:47.205 { 00:27:47.205 "name": "pt2", 00:27:47.205 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:47.205 "is_configured": true, 
00:27:47.205 "data_offset": 256, 00:27:47.205 "data_size": 7936 00:27:47.205 } 00:27:47.205 ] 00:27:47.205 } 00:27:47.205 } 00:27:47.205 }' 00:27:47.205 20:41:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:27:47.205 20:41:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:27:47.205 pt2' 00:27:47.205 20:41:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:47.205 20:41:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:27:47.205 20:41:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:47.462 20:41:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:47.462 "name": "pt1", 00:27:47.462 "aliases": [ 00:27:47.462 "00000000-0000-0000-0000-000000000001" 00:27:47.462 ], 00:27:47.462 "product_name": "passthru", 00:27:47.462 "block_size": 4096, 00:27:47.462 "num_blocks": 8192, 00:27:47.462 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:47.462 "assigned_rate_limits": { 00:27:47.462 "rw_ios_per_sec": 0, 00:27:47.462 "rw_mbytes_per_sec": 0, 00:27:47.462 "r_mbytes_per_sec": 0, 00:27:47.462 "w_mbytes_per_sec": 0 00:27:47.462 }, 00:27:47.462 "claimed": true, 00:27:47.462 "claim_type": "exclusive_write", 00:27:47.462 "zoned": false, 00:27:47.462 "supported_io_types": { 00:27:47.462 "read": true, 00:27:47.462 "write": true, 00:27:47.462 "unmap": true, 00:27:47.462 "flush": true, 00:27:47.462 "reset": true, 00:27:47.462 "nvme_admin": false, 00:27:47.462 "nvme_io": false, 00:27:47.462 "nvme_io_md": false, 00:27:47.462 "write_zeroes": true, 00:27:47.462 "zcopy": true, 00:27:47.462 "get_zone_info": false, 00:27:47.462 "zone_management": false, 00:27:47.462 "zone_append": false, 
00:27:47.462 "compare": false, 00:27:47.462 "compare_and_write": false, 00:27:47.462 "abort": true, 00:27:47.462 "seek_hole": false, 00:27:47.462 "seek_data": false, 00:27:47.462 "copy": true, 00:27:47.462 "nvme_iov_md": false 00:27:47.462 }, 00:27:47.462 "memory_domains": [ 00:27:47.462 { 00:27:47.462 "dma_device_id": "system", 00:27:47.462 "dma_device_type": 1 00:27:47.462 }, 00:27:47.462 { 00:27:47.462 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:47.462 "dma_device_type": 2 00:27:47.462 } 00:27:47.462 ], 00:27:47.462 "driver_specific": { 00:27:47.462 "passthru": { 00:27:47.462 "name": "pt1", 00:27:47.462 "base_bdev_name": "malloc1" 00:27:47.462 } 00:27:47.462 } 00:27:47.462 }' 00:27:47.462 20:41:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:47.462 20:41:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:47.720 20:41:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:27:47.720 20:41:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:47.720 20:41:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:47.720 20:41:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:27:47.720 20:41:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:47.720 20:41:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:47.720 20:41:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:27:47.720 20:41:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:48.083 20:41:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:48.083 20:41:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:27:48.083 20:41:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- 
# for name in $base_bdev_names 00:27:48.083 20:41:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:27:48.083 20:41:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:48.083 20:41:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:48.083 "name": "pt2", 00:27:48.083 "aliases": [ 00:27:48.083 "00000000-0000-0000-0000-000000000002" 00:27:48.083 ], 00:27:48.083 "product_name": "passthru", 00:27:48.083 "block_size": 4096, 00:27:48.083 "num_blocks": 8192, 00:27:48.083 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:48.083 "assigned_rate_limits": { 00:27:48.083 "rw_ios_per_sec": 0, 00:27:48.083 "rw_mbytes_per_sec": 0, 00:27:48.083 "r_mbytes_per_sec": 0, 00:27:48.083 "w_mbytes_per_sec": 0 00:27:48.083 }, 00:27:48.083 "claimed": true, 00:27:48.083 "claim_type": "exclusive_write", 00:27:48.083 "zoned": false, 00:27:48.083 "supported_io_types": { 00:27:48.083 "read": true, 00:27:48.083 "write": true, 00:27:48.083 "unmap": true, 00:27:48.083 "flush": true, 00:27:48.083 "reset": true, 00:27:48.083 "nvme_admin": false, 00:27:48.083 "nvme_io": false, 00:27:48.083 "nvme_io_md": false, 00:27:48.083 "write_zeroes": true, 00:27:48.083 "zcopy": true, 00:27:48.083 "get_zone_info": false, 00:27:48.083 "zone_management": false, 00:27:48.083 "zone_append": false, 00:27:48.083 "compare": false, 00:27:48.083 "compare_and_write": false, 00:27:48.083 "abort": true, 00:27:48.083 "seek_hole": false, 00:27:48.083 "seek_data": false, 00:27:48.083 "copy": true, 00:27:48.083 "nvme_iov_md": false 00:27:48.083 }, 00:27:48.083 "memory_domains": [ 00:27:48.083 { 00:27:48.083 "dma_device_id": "system", 00:27:48.083 "dma_device_type": 1 00:27:48.083 }, 00:27:48.083 { 00:27:48.083 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:48.083 "dma_device_type": 2 00:27:48.083 } 00:27:48.083 ], 00:27:48.083 
"driver_specific": { 00:27:48.083 "passthru": { 00:27:48.083 "name": "pt2", 00:27:48.083 "base_bdev_name": "malloc2" 00:27:48.083 } 00:27:48.083 } 00:27:48.083 }' 00:27:48.083 20:41:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:48.342 20:41:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:48.342 20:41:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:27:48.342 20:41:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:48.342 20:41:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:48.342 20:41:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:27:48.342 20:41:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:48.342 20:41:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:48.342 20:41:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:27:48.342 20:41:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:48.342 20:41:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:48.342 20:41:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:27:48.342 20:41:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:48.342 20:41:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:27:48.600 [2024-07-15 20:41:40.895341] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:48.600 20:41:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=bb3dfb8b-8605-4bea-8d3e-b3bdcffe64a8 00:27:48.600 20:41:40 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@435 -- # '[' -z bb3dfb8b-8605-4bea-8d3e-b3bdcffe64a8 ']' 00:27:48.600 20:41:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:27:48.858 [2024-07-15 20:41:41.147779] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:48.858 [2024-07-15 20:41:41.147796] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:27:48.858 [2024-07-15 20:41:41.147850] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:48.858 [2024-07-15 20:41:41.147906] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:48.858 [2024-07-15 20:41:41.147918] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2068270 name raid_bdev1, state offline 00:27:48.858 20:41:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:48.858 20:41:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:27:49.115 20:41:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:27:49.115 20:41:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:27:49.115 20:41:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:27:49.115 20:41:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:27:49.374 20:41:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:27:49.374 20:41:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@448 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:27:49.631 20:41:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:27:49.631 20:41:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:27:49.890 20:41:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:27:49.890 20:41:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:27:49.890 20:41:42 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@648 -- # local es=0 00:27:49.890 20:41:42 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:27:49.890 20:41:42 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:49.890 20:41:42 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:27:49.890 20:41:42 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:49.890 20:41:42 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:27:49.890 20:41:42 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:49.890 20:41:42 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- 
# case "$(type -t "$arg")" in
00:27:49.890 20:41:42 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
00:27:49.890 20:41:42 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]]
00:27:49.890 20:41:42 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1
00:27:50.148 [2024-07-15 20:41:42.387029] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed
00:27:50.148 [2024-07-15 20:41:42.388435] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed
00:27:50.148 [2024-07-15 20:41:42.388495] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1
00:27:50.148 [2024-07-15 20:41:42.388539] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2
00:27:50.148 [2024-07-15 20:41:42.388558] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1
00:27:50.148 [2024-07-15 20:41:42.388568] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2067ff0 name raid_bdev1, state configuring
00:27:50.148 request:
00:27:50.148 {
00:27:50.148 "name": "raid_bdev1",
00:27:50.148 "raid_level": "raid1",
00:27:50.148 "base_bdevs": [
00:27:50.148 "malloc1",
00:27:50.148 "malloc2"
00:27:50.148 ],
00:27:50.148 "superblock": false,
00:27:50.148 "method": "bdev_raid_create",
00:27:50.148 "req_id": 1
00:27:50.148 }
00:27:50.148 Got JSON-RPC error response
00:27:50.148 response:
00:27:50.148 {
00:27:50.148 "code": -17,
00:27:50.148 "message": "Failed to create RAID bdev raid_bdev1: File exists"
00:27:50.148 }
00:27:50.148 20:41:42 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@651 -- # es=1
00:27:50.148 20:41:42 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@659 -- # (( es > 128 ))
00:27:50.148 20:41:42 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@670 -- # [[ -n '' ]]
00:27:50.148 20:41:42 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@675 -- # (( !es == 0 ))
00:27:50.148 20:41:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:27:50.148 20:41:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@458 -- # jq -r '.[]'
00:27:50.723 20:41:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@458 -- # raid_bdev=
00:27:50.723 20:41:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']'
00:27:50.723 20:41:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001
00:27:50.981 [2024-07-15 20:41:43.148950] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1
00:27:50.981 [2024-07-15 20:41:43.148994] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:27:50.981 [2024-07-15 20:41:43.149016] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ec47a0
00:27:50.981 [2024-07-15 20:41:43.149029] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:27:50.981 [2024-07-15 20:41:43.150645] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:27:50.981 [2024-07-15 20:41:43.150675] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1
00:27:50.981 [2024-07-15 20:41:43.150743] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1
00:27:50.981 [2024-07-15 20:41:43.150768] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed
00:27:50.981 pt1
00:27:50.982 20:41:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2
00:27:50.982 20:41:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:27:50.982 20:41:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:27:50.982 20:41:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:27:50.982 20:41:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:27:50.982 20:41:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2
00:27:50.982 20:41:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:27:50.982 20:41:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:27:50.982 20:41:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:27:50.982 20:41:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp
00:27:50.982 20:41:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:27:50.982 20:41:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:27:51.239 20:41:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:27:51.239 "name": "raid_bdev1",
00:27:51.239 "uuid": "bb3dfb8b-8605-4bea-8d3e-b3bdcffe64a8",
00:27:51.239 "strip_size_kb": 0,
00:27:51.239 "state": "configuring",
00:27:51.239 "raid_level": "raid1",
00:27:51.239 "superblock": true,
00:27:51.239 "num_base_bdevs": 2,
00:27:51.239 "num_base_bdevs_discovered": 1,
00:27:51.239 "num_base_bdevs_operational": 2,
00:27:51.239 "base_bdevs_list": [
00:27:51.239 {
00:27:51.239 "name": "pt1",
00:27:51.239 "uuid": "00000000-0000-0000-0000-000000000001",
00:27:51.239 "is_configured": true,
00:27:51.239 "data_offset": 256,
00:27:51.239 "data_size": 7936
00:27:51.239 },
00:27:51.239 {
00:27:51.239 "name": null,
00:27:51.239 "uuid": "00000000-0000-0000-0000-000000000002",
00:27:51.239 "is_configured": false,
00:27:51.239 "data_offset": 256,
00:27:51.239 "data_size": 7936
00:27:51.239 }
00:27:51.239 ]
00:27:51.239 }'
00:27:51.239 20:41:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:27:51.239 20:41:43 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x
00:27:51.804 20:41:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']'
00:27:51.804 20:41:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i = 1 ))
00:27:51.804 20:41:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs ))
00:27:51.804 20:41:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002
00:27:51.804 [2024-07-15 20:41:44.179822] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2
00:27:51.804 [2024-07-15 20:41:44.179874] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:27:51.804 [2024-07-15 20:41:44.179893] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x205c6f0
00:27:51.804 [2024-07-15 20:41:44.179913] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:27:51.804 [2024-07-15 20:41:44.180287] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:27:51.804 [2024-07-15 20:41:44.180306] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2
00:27:51.804 [2024-07-15 20:41:44.180371] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2
00:27:51.804 [2024-07-15 20:41:44.180390] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed
00:27:51.804 [2024-07-15 20:41:44.180491] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x205d590
00:27:51.804 [2024-07-15 20:41:44.180502] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096
00:27:51.804 [2024-07-15 20:41:44.180669] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1ebe540
00:27:51.804 [2024-07-15 20:41:44.180792] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x205d590
00:27:51.804 [2024-07-15 20:41:44.180802] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x205d590
00:27:51.804 [2024-07-15 20:41:44.180898] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:27:52.078 pt2
00:27:52.078 20:41:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i++ ))
00:27:52.078 20:41:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs ))
00:27:52.078 20:41:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2
00:27:52.078 20:41:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:27:52.078 20:41:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:27:52.078 20:41:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:27:52.078 20:41:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:27:52.078 20:41:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2
00:27:52.078 20:41:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:27:52.078 20:41:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:27:52.078 20:41:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:27:52.078 20:41:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp
00:27:52.078 20:41:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:27:52.078 20:41:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:27:52.078 20:41:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:27:52.078 "name": "raid_bdev1",
00:27:52.078 "uuid": "bb3dfb8b-8605-4bea-8d3e-b3bdcffe64a8",
00:27:52.078 "strip_size_kb": 0,
00:27:52.078 "state": "online",
00:27:52.078 "raid_level": "raid1",
00:27:52.078 "superblock": true,
00:27:52.079 "num_base_bdevs": 2,
00:27:52.079 "num_base_bdevs_discovered": 2,
00:27:52.079 "num_base_bdevs_operational": 2,
00:27:52.079 "base_bdevs_list": [
00:27:52.079 {
00:27:52.079 "name": "pt1",
00:27:52.079 "uuid": "00000000-0000-0000-0000-000000000001",
00:27:52.079 "is_configured": true,
00:27:52.079 "data_offset": 256,
00:27:52.079 "data_size": 7936
00:27:52.079 },
00:27:52.079 {
00:27:52.079 "name": "pt2",
00:27:52.079 "uuid": "00000000-0000-0000-0000-000000000002",
00:27:52.079 "is_configured": true,
00:27:52.079 "data_offset": 256,
00:27:52.079 "data_size": 7936
00:27:52.079 }
00:27:52.079 ]
00:27:52.079 }'
00:27:52.079 20:41:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:27:52.079 20:41:44 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x
00:27:52.645 20:41:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1
00:27:52.645 20:41:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1
00:27:52.645 20:41:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info
00:27:52.645 20:41:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@196 -- # local base_bdev_info
00:27:52.645 20:41:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_names
00:27:52.645 20:41:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@198 -- # local name
00:27:52.645 20:41:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1
00:27:52.645 20:41:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]'
00:27:52.904 [2024-07-15 20:41:45.154653] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json
00:27:52.905 20:41:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{
00:27:52.905 "name": "raid_bdev1",
00:27:52.905 "aliases": [
00:27:52.905 "bb3dfb8b-8605-4bea-8d3e-b3bdcffe64a8"
00:27:52.905 ],
00:27:52.905 "product_name": "Raid Volume",
00:27:52.905 "block_size": 4096,
00:27:52.905 "num_blocks": 7936,
00:27:52.905 "uuid": "bb3dfb8b-8605-4bea-8d3e-b3bdcffe64a8",
00:27:52.905 "assigned_rate_limits": {
00:27:52.905 "rw_ios_per_sec": 0,
00:27:52.905 "rw_mbytes_per_sec": 0,
00:27:52.905 "r_mbytes_per_sec": 0,
00:27:52.905 "w_mbytes_per_sec": 0
00:27:52.905 },
00:27:52.905 "claimed": false,
00:27:52.905 "zoned": false,
00:27:52.905 "supported_io_types": {
00:27:52.905 "read": true,
00:27:52.905 "write": true,
00:27:52.905 "unmap": false,
00:27:52.905 "flush": false,
00:27:52.905 "reset": true,
00:27:52.905 "nvme_admin": false,
00:27:52.905 "nvme_io": false,
00:27:52.905 "nvme_io_md": false,
00:27:52.905 "write_zeroes": true,
00:27:52.905 "zcopy": false,
00:27:52.905 "get_zone_info": false,
00:27:52.905 "zone_management": false,
00:27:52.905 "zone_append": false,
00:27:52.905 "compare": false,
00:27:52.905 "compare_and_write": false,
00:27:52.905 "abort": false,
00:27:52.905 "seek_hole": false,
00:27:52.905 "seek_data": false,
00:27:52.905 "copy": false,
00:27:52.905 "nvme_iov_md": false
00:27:52.905 },
00:27:52.905 "memory_domains": [
00:27:52.905 {
00:27:52.905 "dma_device_id": "system",
00:27:52.905 "dma_device_type": 1
00:27:52.905 },
00:27:52.905 {
00:27:52.905 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:27:52.905 "dma_device_type": 2
00:27:52.905 },
00:27:52.905 {
00:27:52.905 "dma_device_id": "system",
00:27:52.905 "dma_device_type": 1
00:27:52.905 },
00:27:52.905 {
00:27:52.905 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:27:52.905 "dma_device_type": 2
00:27:52.905 }
00:27:52.905 ],
00:27:52.905 "driver_specific": {
00:27:52.905 "raid": {
00:27:52.905 "uuid": "bb3dfb8b-8605-4bea-8d3e-b3bdcffe64a8",
00:27:52.905 "strip_size_kb": 0,
00:27:52.905 "state": "online",
00:27:52.905 "raid_level": "raid1",
00:27:52.905 "superblock": true,
00:27:52.905 "num_base_bdevs": 2,
00:27:52.905 "num_base_bdevs_discovered": 2,
00:27:52.905 "num_base_bdevs_operational": 2,
00:27:52.905 "base_bdevs_list": [
00:27:52.905 {
00:27:52.905 "name": "pt1",
00:27:52.905 "uuid": "00000000-0000-0000-0000-000000000001",
00:27:52.905 "is_configured": true,
00:27:52.905 "data_offset": 256,
00:27:52.905 "data_size": 7936
00:27:52.905 },
00:27:52.905 {
00:27:52.905 "name": "pt2",
00:27:52.905 "uuid": "00000000-0000-0000-0000-000000000002",
00:27:52.905 "is_configured": true,
00:27:52.905 "data_offset": 256,
00:27:52.905 "data_size": 7936
00:27:52.905 }
00:27:52.905 ]
00:27:52.905 }
00:27:52.905 }
00:27:52.905 }'
00:27:52.905 20:41:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name'
00:27:52.905 20:41:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1
00:27:52.905 pt2'
00:27:52.905 20:41:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names
00:27:52.905 20:41:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1
00:27:52.905 20:41:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]'
00:27:53.164 20:41:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{
00:27:53.164 "name": "pt1",
00:27:53.164 "aliases": [
00:27:53.164 "00000000-0000-0000-0000-000000000001"
00:27:53.164 ],
00:27:53.165 "product_name": "passthru",
00:27:53.165 "block_size": 4096,
00:27:53.165 "num_blocks": 8192,
00:27:53.165 "uuid": "00000000-0000-0000-0000-000000000001",
00:27:53.165 "assigned_rate_limits": {
00:27:53.165 "rw_ios_per_sec": 0,
00:27:53.165 "rw_mbytes_per_sec": 0,
00:27:53.165 "r_mbytes_per_sec": 0,
00:27:53.165 "w_mbytes_per_sec": 0
00:27:53.165 },
00:27:53.165 "claimed": true,
00:27:53.165 "claim_type": "exclusive_write",
00:27:53.165 "zoned": false,
00:27:53.165 "supported_io_types": {
00:27:53.165 "read": true,
00:27:53.165 "write": true,
00:27:53.165 "unmap": true,
00:27:53.165 "flush": true,
00:27:53.165 "reset": true,
00:27:53.165 "nvme_admin": false,
00:27:53.165 "nvme_io": false,
00:27:53.165 "nvme_io_md": false,
00:27:53.165 "write_zeroes": true,
00:27:53.165 "zcopy": true,
00:27:53.165 "get_zone_info": false,
00:27:53.165 "zone_management": false,
00:27:53.165 "zone_append": false,
00:27:53.165 "compare": false,
00:27:53.165 "compare_and_write": false,
00:27:53.165 "abort": true,
00:27:53.165 "seek_hole": false,
00:27:53.165 "seek_data": false,
00:27:53.165 "copy": true,
00:27:53.165 "nvme_iov_md": false
00:27:53.165 },
00:27:53.165 "memory_domains": [
00:27:53.165 {
00:27:53.165 "dma_device_id": "system",
00:27:53.165 "dma_device_type": 1
00:27:53.165 },
00:27:53.165 {
00:27:53.165 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:27:53.165 "dma_device_type": 2
00:27:53.165 }
00:27:53.165 ],
00:27:53.165 "driver_specific": {
00:27:53.165 "passthru": {
00:27:53.165 "name": "pt1",
00:27:53.165 "base_bdev_name": "malloc1"
00:27:53.165 }
00:27:53.165 }
00:27:53.165 }'
00:27:53.165 20:41:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size
00:27:53.165 20:41:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size
00:27:53.424 20:41:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]]
00:27:53.424 20:41:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size
00:27:53.424 20:41:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size
00:27:53.424 20:41:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]]
00:27:53.424 20:41:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave
00:27:53.424 20:41:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave
00:27:53.424 20:41:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]]
00:27:53.424 20:41:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type
00:27:53.424 20:41:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type
00:27:53.682 20:41:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]]
00:27:53.682 20:41:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names
00:27:53.682 20:41:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2
00:27:53.682 20:41:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]'
00:27:53.682 20:41:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{
00:27:53.682 "name": "pt2",
00:27:53.682 "aliases": [
00:27:53.682 "00000000-0000-0000-0000-000000000002"
00:27:53.682 ],
00:27:53.682 "product_name": "passthru",
00:27:53.682 "block_size": 4096,
00:27:53.682 "num_blocks": 8192,
00:27:53.682 "uuid": "00000000-0000-0000-0000-000000000002",
00:27:53.682 "assigned_rate_limits": {
00:27:53.682 "rw_ios_per_sec": 0,
00:27:53.682 "rw_mbytes_per_sec": 0,
00:27:53.682 "r_mbytes_per_sec": 0,
00:27:53.682 "w_mbytes_per_sec": 0
00:27:53.682 },
00:27:53.682 "claimed": true,
00:27:53.682 "claim_type": "exclusive_write",
00:27:53.682 "zoned": false,
00:27:53.682 "supported_io_types": {
00:27:53.682 "read": true,
00:27:53.682 "write": true,
00:27:53.682 "unmap": true,
00:27:53.682 "flush": true,
00:27:53.682 "reset": true,
00:27:53.682 "nvme_admin": false,
00:27:53.682 "nvme_io": false,
00:27:53.682 "nvme_io_md": false,
00:27:53.682 "write_zeroes": true,
00:27:53.682 "zcopy": true,
00:27:53.682 "get_zone_info": false,
00:27:53.682 "zone_management": false,
00:27:53.682 "zone_append": false,
00:27:53.682 "compare": false,
00:27:53.682 "compare_and_write": false,
00:27:53.682 "abort": true,
00:27:53.682 "seek_hole": false,
00:27:53.682 "seek_data": false,
00:27:53.682 "copy": true,
00:27:53.682 "nvme_iov_md": false
00:27:53.682 },
00:27:53.682 "memory_domains": [
00:27:53.682 {
00:27:53.682 "dma_device_id": "system",
00:27:53.682 "dma_device_type": 1
00:27:53.682 },
00:27:53.682 {
00:27:53.682 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:27:53.682 "dma_device_type": 2
00:27:53.682 }
00:27:53.682 ],
00:27:53.682 "driver_specific": {
00:27:53.682 "passthru": {
00:27:53.682 "name": "pt2",
00:27:53.682 "base_bdev_name": "malloc2"
00:27:53.682 }
00:27:53.682 }
00:27:53.682 }'
00:27:53.682 20:41:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size
00:27:53.682 20:41:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size
00:27:53.940 20:41:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]]
00:27:53.940 20:41:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size
00:27:53.940 20:41:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size
00:27:53.940 20:41:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]]
00:27:53.940 20:41:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave
00:27:53.940 20:41:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave
00:27:53.940 20:41:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]]
00:27:53.940 20:41:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type
00:27:53.940 20:41:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type
00:27:53.940 20:41:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]]
00:27:53.940 20:41:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1
00:27:53.940 20:41:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid'
00:27:54.199 [2024-07-15 20:41:46.470157] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json
00:27:54.199 20:41:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@486 -- # '[' bb3dfb8b-8605-4bea-8d3e-b3bdcffe64a8 '!=' bb3dfb8b-8605-4bea-8d3e-b3bdcffe64a8 ']'
00:27:54.199 20:41:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1
00:27:54.199 20:41:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@213 -- # case $1 in
00:27:54.199 20:41:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@214 -- # return 0
00:27:54.199 20:41:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1
00:27:54.458 [2024-07-15 20:41:46.722618] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1
00:27:54.458 20:41:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1
00:27:54.458 20:41:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:27:54.458 20:41:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:27:54.458 20:41:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:27:54.458 20:41:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:27:54.458 20:41:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1
00:27:54.458 20:41:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:27:54.458 20:41:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:27:54.458 20:41:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:27:54.458 20:41:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp
00:27:54.458 20:41:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:27:54.458 20:41:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:27:54.716 20:41:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:27:54.716 "name": "raid_bdev1",
00:27:54.716 "uuid": "bb3dfb8b-8605-4bea-8d3e-b3bdcffe64a8",
00:27:54.716 "strip_size_kb": 0,
00:27:54.716 "state": "online",
00:27:54.716 "raid_level": "raid1",
00:27:54.716 "superblock": true,
00:27:54.716 "num_base_bdevs": 2,
00:27:54.716 "num_base_bdevs_discovered": 1,
00:27:54.716 "num_base_bdevs_operational": 1,
00:27:54.716 "base_bdevs_list": [
00:27:54.716 {
00:27:54.716 "name": null,
00:27:54.716 "uuid": "00000000-0000-0000-0000-000000000000",
00:27:54.716 "is_configured": false,
00:27:54.716 "data_offset": 256,
00:27:54.716 "data_size": 7936
00:27:54.716 },
00:27:54.716 {
00:27:54.716 "name": "pt2",
00:27:54.716 "uuid": "00000000-0000-0000-0000-000000000002",
00:27:54.716 "is_configured": true,
00:27:54.716 "data_offset": 256,
00:27:54.716 "data_size": 7936
00:27:54.716 }
00:27:54.716 ]
00:27:54.716 }'
00:27:54.716 20:41:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:27:54.717 20:41:47 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x
00:27:55.284 20:41:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1
00:27:55.543 [2024-07-15 20:41:47.757349] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1
00:27:55.543 [2024-07-15 20:41:47.757376] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline
00:27:55.543 [2024-07-15 20:41:47.757429] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct
00:27:55.543 [2024-07-15 20:41:47.757469] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct
00:27:55.543 [2024-07-15 20:41:47.757487] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x205d590 name raid_bdev1, state offline
00:27:55.543 20:41:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:27:55.543 20:41:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@499 -- # jq -r '.[]'
00:27:55.802 20:41:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@499 -- # raid_bdev=
00:27:55.802 20:41:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']'
00:27:55.802 20:41:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i = 1 ))
00:27:55.802 20:41:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs ))
00:27:55.802 20:41:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2
00:27:56.060 20:41:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i++ ))
00:27:56.060 20:41:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs ))
00:27:56.060 20:41:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@510 -- # (( i = 1 ))
00:27:56.060 20:41:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 ))
00:27:56.060 20:41:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@518 -- # i=1
00:27:56.060 20:41:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002
00:27:56.319 [2024-07-15 20:41:48.443129] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2
00:27:56.319 [2024-07-15 20:41:48.443173] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:27:56.319 [2024-07-15 20:41:48.443190] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ec5160
00:27:56.319 [2024-07-15 20:41:48.443203] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:27:56.319 [2024-07-15 20:41:48.444833] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:27:56.319 [2024-07-15 20:41:48.444859] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2
00:27:56.319 [2024-07-15 20:41:48.444933] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2
00:27:56.319 [2024-07-15 20:41:48.444960] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed
00:27:56.319 [2024-07-15 20:41:48.445047] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1ebb380
00:27:56.319 [2024-07-15 20:41:48.445057] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096
00:27:56.319 [2024-07-15 20:41:48.445229] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1ebca80
00:27:56.319 [2024-07-15 20:41:48.445350] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1ebb380
00:27:56.319 [2024-07-15 20:41:48.445360] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1ebb380
00:27:56.319 [2024-07-15 20:41:48.445454] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:27:56.319 pt2
00:27:56.319 20:41:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1
00:27:56.319 20:41:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:27:56.319 20:41:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:27:56.319 20:41:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:27:56.319 20:41:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:27:56.319 20:41:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1
00:27:56.319 20:41:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:27:56.319 20:41:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:27:56.319 20:41:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:27:56.319 20:41:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp
00:27:56.319 20:41:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:27:56.319 20:41:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:27:56.319 20:41:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:27:56.319 "name": "raid_bdev1",
00:27:56.319 "uuid": "bb3dfb8b-8605-4bea-8d3e-b3bdcffe64a8",
00:27:56.319 "strip_size_kb": 0,
00:27:56.319 "state": "online",
00:27:56.319 "raid_level": "raid1",
00:27:56.319 "superblock": true,
00:27:56.319 "num_base_bdevs": 2,
00:27:56.319 "num_base_bdevs_discovered": 1,
00:27:56.319 "num_base_bdevs_operational": 1,
00:27:56.319 "base_bdevs_list": [
00:27:56.319 {
00:27:56.319 "name": null,
00:27:56.319 "uuid": "00000000-0000-0000-0000-000000000000",
00:27:56.319 "is_configured": false,
00:27:56.319 "data_offset": 256,
00:27:56.319 "data_size": 7936
00:27:56.319 },
00:27:56.319 {
00:27:56.319 "name": "pt2",
00:27:56.320 "uuid": "00000000-0000-0000-0000-000000000002",
00:27:56.320 "is_configured": true,
00:27:56.320 "data_offset": 256,
00:27:56.320 "data_size": 7936
00:27:56.320 }
00:27:56.320 ]
00:27:56.320 }'
00:27:56.320 20:41:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:27:56.320 20:41:48 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x
00:27:56.886 20:41:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1
00:27:57.145 [2024-07-15 20:41:49.461825] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1
00:27:57.145 [2024-07-15 20:41:49.461852] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline
00:27:57.145 [2024-07-15 20:41:49.461904] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct
00:27:57.145 [2024-07-15 20:41:49.461954] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct
00:27:57.145 [2024-07-15 20:41:49.461966] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ebb380 name raid_bdev1, state offline
00:27:57.145 20:41:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:27:57.145 20:41:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@526 -- # jq -r '.[]'
00:27:57.404 20:41:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@526 -- # raid_bdev=
00:27:57.404 20:41:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']'
00:27:57.404 20:41:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']'
00:27:57.404 20:41:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001
00:27:57.663 [2024-07-15 20:41:49.967148] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1
00:27:57.663 [2024-07-15 20:41:49.967191] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:27:57.663 [2024-07-15 20:41:49.967208] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2067520
00:27:57.663 [2024-07-15 20:41:49.967221] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:27:57.663 [2024-07-15 20:41:49.968831] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:27:57.663 [2024-07-15 20:41:49.968859] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1
00:27:57.663 [2024-07-15 20:41:49.968921] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1
00:27:57.663 [2024-07-15 20:41:49.968955] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed
00:27:57.663 [2024-07-15 20:41:49.969050] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2)
00:27:57.663 [2024-07-15 20:41:49.969063] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1
00:27:57.663 [2024-07-15 20:41:49.969076] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ebc3f0 name raid_bdev1, state configuring
00:27:57.663 [2024-07-15 20:41:49.969105] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed
00:27:57.663 [2024-07-15 20:41:49.969163] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1ebe2b0
00:27:57.663 [2024-07-15 20:41:49.969174] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096
00:27:57.663 [2024-07-15 20:41:49.969332] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1ebb350
00:27:57.663 [2024-07-15 20:41:49.969451] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1ebe2b0
00:27:57.663 [2024-07-15 20:41:49.969461] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1ebe2b0
00:27:57.663 [2024-07-15 20:41:49.969557] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:27:57.663 pt1
00:27:57.663 20:41:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']'
00:27:57.663 20:41:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1
00:27:57.663 20:41:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:27:57.663 20:41:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:27:57.663 20:41:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:27:57.663 20:41:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:27:57.663 20:41:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1
00:27:57.663 20:41:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:27:57.663 20:41:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:27:57.663 20:41:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:27:57.663 20:41:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp
00:27:57.663 20:41:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:27:57.663 20:41:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:27:57.922 20:41:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:27:57.922 "name": "raid_bdev1",
00:27:57.922 "uuid": "bb3dfb8b-8605-4bea-8d3e-b3bdcffe64a8",
00:27:57.922 "strip_size_kb": 0,
00:27:57.922 "state": "online",
00:27:57.922 "raid_level": "raid1",
00:27:57.922 "superblock": true,
00:27:57.922 "num_base_bdevs": 2,
00:27:57.922 "num_base_bdevs_discovered": 1,
00:27:57.922 "num_base_bdevs_operational": 1,
00:27:57.922 "base_bdevs_list": [
00:27:57.922 {
00:27:57.922 "name": null,
00:27:57.922 "uuid": "00000000-0000-0000-0000-000000000000",
00:27:57.922 "is_configured": false,
00:27:57.922 "data_offset": 256,
00:27:57.922 "data_size": 7936
00:27:57.922 },
00:27:57.922 {
00:27:57.922 "name": "pt2",
00:27:57.922 "uuid": "00000000-0000-0000-0000-000000000002",
00:27:57.922 "is_configured": true,
00:27:57.922 "data_offset": 256,
00:27:57.922 "data_size": 7936
00:27:57.922 }
00:27:57.922 ]
00:27:57.922 }'
00:27:57.922 20:41:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:27:57.922 20:41:50 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x
00:27:58.488 20:41:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online
00:27:58.488 20:41:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured'
00:27:58.747 20:41:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]]
00:27:58.747 20:41:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1
00:27:58.747 20:41:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid'
00:27:59.006 [2024-07-15 20:41:51.330996] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json
00:27:59.006 20:41:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@557 -- # '[' bb3dfb8b-8605-4bea-8d3e-b3bdcffe64a8 '!=' bb3dfb8b-8605-4bea-8d3e-b3bdcffe64a8 ']'
00:27:59.006 20:41:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@562 -- # killprocess 1494395
00:27:59.006 20:41:51 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@948 -- # '[' -z 1494395 ']'
00:27:59.006 20:41:51
bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@952 -- # kill -0 1494395 00:27:59.006 20:41:51 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@953 -- # uname 00:27:59.006 20:41:51 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:27:59.006 20:41:51 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1494395 00:27:59.265 20:41:51 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:27:59.265 20:41:51 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:27:59.265 20:41:51 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1494395' 00:27:59.265 killing process with pid 1494395 00:27:59.265 20:41:51 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@967 -- # kill 1494395 00:27:59.265 [2024-07-15 20:41:51.400319] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:27:59.265 [2024-07-15 20:41:51.400377] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:59.265 [2024-07-15 20:41:51.400421] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:59.265 [2024-07-15 20:41:51.400433] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ebe2b0 name raid_bdev1, state offline 00:27:59.265 20:41:51 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@972 -- # wait 1494395 00:27:59.265 [2024-07-15 20:41:51.419585] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:27:59.265 20:41:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@564 -- # return 0 00:27:59.265 00:27:59.265 real 0m15.501s 00:27:59.265 user 0m28.039s 00:27:59.265 sys 0m2.937s 00:27:59.265 20:41:51 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@1124 -- # xtrace_disable 00:27:59.265 
20:41:51 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:27:59.265 ************************************ 00:27:59.265 END TEST raid_superblock_test_4k 00:27:59.265 ************************************ 00:27:59.525 20:41:51 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:27:59.525 20:41:51 bdev_raid -- bdev/bdev_raid.sh@900 -- # '[' true = true ']' 00:27:59.525 20:41:51 bdev_raid -- bdev/bdev_raid.sh@901 -- # run_test raid_rebuild_test_sb_4k raid_rebuild_test raid1 2 true false true 00:27:59.525 20:41:51 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:27:59.525 20:41:51 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:27:59.525 20:41:51 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:27:59.525 ************************************ 00:27:59.525 START TEST raid_rebuild_test_sb_4k 00:27:59.525 ************************************ 00:27:59.525 20:41:51 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true false true 00:27:59.525 20:41:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:27:59.525 20:41:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:27:59.525 20:41:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:27:59.525 20:41:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:27:59.525 20:41:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@572 -- # local verify=true 00:27:59.525 20:41:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:27:59.525 20:41:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:27:59.525 20:41:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:27:59.525 20:41:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 
-- # (( i++ )) 00:27:59.525 20:41:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:27:59.525 20:41:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:27:59.525 20:41:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:27:59.525 20:41:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:27:59.525 20:41:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:27:59.525 20:41:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:27:59.525 20:41:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:27:59.525 20:41:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@575 -- # local strip_size 00:27:59.525 20:41:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@576 -- # local create_arg 00:27:59.525 20:41:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:27:59.525 20:41:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@578 -- # local data_offset 00:27:59.525 20:41:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:27:59.525 20:41:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:27:59.525 20:41:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:27:59.525 20:41:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:27:59.525 20:41:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@596 -- # raid_pid=1496641 00:27:59.525 20:41:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@597 -- # waitforlisten 1496641 /var/tmp/spdk-raid.sock 00:27:59.525 20:41:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@595 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:27:59.525 20:41:51 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@829 -- # '[' -z 1496641 ']' 00:27:59.525 20:41:51 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:27:59.525 20:41:51 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@834 -- # local max_retries=100 00:27:59.525 20:41:51 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:27:59.525 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:27:59.525 20:41:51 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@838 -- # xtrace_disable 00:27:59.525 20:41:51 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:59.525 [2024-07-15 20:41:51.798306] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 00:27:59.525 [2024-07-15 20:41:51.798373] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1496641 ] 00:27:59.525 I/O size of 3145728 is greater than zero copy threshold (65536). 00:27:59.525 Zero copy mechanism will not be used. 
00:27:59.783 [2024-07-15 20:41:51.919985] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:59.783 [2024-07-15 20:41:52.019555] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:27:59.783 [2024-07-15 20:41:52.076355] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:59.783 [2024-07-15 20:41:52.076383] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:00.717 20:41:52 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:28:00.717 20:41:52 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@862 -- # return 0 00:28:00.717 20:41:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:28:00.717 20:41:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev1_malloc 00:28:00.717 BaseBdev1_malloc 00:28:00.717 20:41:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:28:00.976 [2024-07-15 20:41:53.203389] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:28:00.976 [2024-07-15 20:41:53.203433] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:00.976 [2024-07-15 20:41:53.203467] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x146cd40 00:28:00.976 [2024-07-15 20:41:53.203480] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:00.976 [2024-07-15 20:41:53.205253] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:00.976 [2024-07-15 20:41:53.205281] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:28:00.976 
BaseBdev1 00:28:00.976 20:41:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:28:00.976 20:41:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev2_malloc 00:28:01.234 BaseBdev2_malloc 00:28:01.234 20:41:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:28:01.492 [2024-07-15 20:41:53.698466] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:28:01.492 [2024-07-15 20:41:53.698512] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:01.492 [2024-07-15 20:41:53.698537] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x146d860 00:28:01.492 [2024-07-15 20:41:53.698550] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:01.492 [2024-07-15 20:41:53.700080] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:01.492 [2024-07-15 20:41:53.700108] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:28:01.492 BaseBdev2 00:28:01.492 20:41:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b spare_malloc 00:28:01.750 spare_malloc 00:28:01.750 20:41:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:28:02.009 spare_delay 00:28:02.009 20:41:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@608 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:28:02.268 [2024-07-15 20:41:54.432973] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:28:02.268 [2024-07-15 20:41:54.433017] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:02.268 [2024-07-15 20:41:54.433039] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x161bec0 00:28:02.268 [2024-07-15 20:41:54.433052] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:02.268 [2024-07-15 20:41:54.434667] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:02.268 [2024-07-15 20:41:54.434694] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:28:02.268 spare 00:28:02.268 20:41:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:28:02.526 [2024-07-15 20:41:54.665607] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:28:02.526 [2024-07-15 20:41:54.666953] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:28:02.526 [2024-07-15 20:41:54.667123] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x161d070 00:28:02.526 [2024-07-15 20:41:54.667136] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:28:02.526 [2024-07-15 20:41:54.667334] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1616490 00:28:02.526 [2024-07-15 20:41:54.667475] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x161d070 00:28:02.526 [2024-07-15 20:41:54.667485] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, 
raid_bdev 0x161d070 00:28:02.526 [2024-07-15 20:41:54.667590] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:02.526 20:41:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:02.526 20:41:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:02.526 20:41:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:02.526 20:41:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:02.526 20:41:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:02.526 20:41:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:02.526 20:41:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:02.526 20:41:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:02.526 20:41:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:02.526 20:41:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:02.526 20:41:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:02.526 20:41:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:02.785 20:41:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:02.785 "name": "raid_bdev1", 00:28:02.785 "uuid": "662801c5-523b-41fe-aaa7-2c5dd04ee37f", 00:28:02.785 "strip_size_kb": 0, 00:28:02.785 "state": "online", 00:28:02.785 "raid_level": "raid1", 00:28:02.785 "superblock": true, 00:28:02.785 "num_base_bdevs": 2, 00:28:02.785 
"num_base_bdevs_discovered": 2, 00:28:02.785 "num_base_bdevs_operational": 2, 00:28:02.785 "base_bdevs_list": [ 00:28:02.785 { 00:28:02.785 "name": "BaseBdev1", 00:28:02.785 "uuid": "aec2811d-20ae-5e30-bf0b-af1694df6da3", 00:28:02.785 "is_configured": true, 00:28:02.785 "data_offset": 256, 00:28:02.785 "data_size": 7936 00:28:02.785 }, 00:28:02.785 { 00:28:02.785 "name": "BaseBdev2", 00:28:02.785 "uuid": "4aa63f0d-7c8f-5b03-a468-69e9fc1b36d5", 00:28:02.785 "is_configured": true, 00:28:02.785 "data_offset": 256, 00:28:02.785 "data_size": 7936 00:28:02.785 } 00:28:02.785 ] 00:28:02.785 }' 00:28:02.785 20:41:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:02.785 20:41:54 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:28:03.351 20:41:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:28:03.351 20:41:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:03.609 [2024-07-15 20:41:55.796829] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:03.609 20:41:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=7936 00:28:03.609 20:41:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:03.609 20:41:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:28:03.868 20:41:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@618 -- # data_offset=256 00:28:03.868 20:41:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:28:03.868 20:41:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:28:03.868 
20:41:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:28:03.868 20:41:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:28:03.868 20:41:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:28:03.868 20:41:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:28:03.868 20:41:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # local bdev_list 00:28:03.868 20:41:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:28:03.868 20:41:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # local nbd_list 00:28:03.868 20:41:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@12 -- # local i 00:28:03.868 20:41:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:28:03.868 20:41:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:28:03.868 20:41:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:28:04.126 [2024-07-15 20:41:56.289945] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1616490 00:28:04.126 /dev/nbd0 00:28:04.126 20:41:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:28:04.126 20:41:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:28:04.126 20:41:56 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:28:04.126 20:41:56 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@867 -- # local i 00:28:04.126 20:41:56 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:28:04.126 20:41:56 
bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:28:04.126 20:41:56 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:28:04.126 20:41:56 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # break 00:28:04.126 20:41:56 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:28:04.126 20:41:56 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:28:04.126 20:41:56 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:28:04.126 1+0 records in 00:28:04.126 1+0 records out 00:28:04.126 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000275797 s, 14.9 MB/s 00:28:04.126 20:41:56 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:04.126 20:41:56 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # size=4096 00:28:04.126 20:41:56 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:04.126 20:41:56 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:28:04.126 20:41:56 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # return 0 00:28:04.126 20:41:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:28:04.126 20:41:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:28:04.126 20:41:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:28:04.126 20:41:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:28:04.126 20:41:56 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=4096 count=7936 oflag=direct 00:28:04.734 7936+0 records in 00:28:04.734 7936+0 records out 00:28:04.734 32505856 bytes (33 MB, 31 MiB) copied, 0.757521 s, 42.9 MB/s 00:28:05.022 20:41:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:28:05.022 20:41:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:28:05.022 20:41:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:28:05.022 20:41:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # local nbd_list 00:28:05.022 20:41:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@51 -- # local i 00:28:05.022 20:41:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:05.022 20:41:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:28:05.022 [2024-07-15 20:41:57.388394] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:05.022 20:41:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:28:05.022 20:41:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:28:05.022 20:41:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:28:05.022 20:41:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:05.022 20:41:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:05.022 20:41:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:28:05.303 20:41:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:28:05.303 20:41:57 bdev_raid.raid_rebuild_test_sb_4k 
-- bdev/nbd_common.sh@45 -- # return 0 00:28:05.303 20:41:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:28:05.303 [2024-07-15 20:41:57.576608] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:28:05.303 20:41:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:05.303 20:41:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:05.303 20:41:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:05.303 20:41:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:05.303 20:41:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:05.303 20:41:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:05.303 20:41:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:05.303 20:41:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:05.303 20:41:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:05.303 20:41:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:05.303 20:41:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:05.303 20:41:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:05.561 20:41:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:05.562 "name": "raid_bdev1", 00:28:05.562 "uuid": 
"662801c5-523b-41fe-aaa7-2c5dd04ee37f", 00:28:05.562 "strip_size_kb": 0, 00:28:05.562 "state": "online", 00:28:05.562 "raid_level": "raid1", 00:28:05.562 "superblock": true, 00:28:05.562 "num_base_bdevs": 2, 00:28:05.562 "num_base_bdevs_discovered": 1, 00:28:05.562 "num_base_bdevs_operational": 1, 00:28:05.562 "base_bdevs_list": [ 00:28:05.562 { 00:28:05.562 "name": null, 00:28:05.562 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:05.562 "is_configured": false, 00:28:05.562 "data_offset": 256, 00:28:05.562 "data_size": 7936 00:28:05.562 }, 00:28:05.562 { 00:28:05.562 "name": "BaseBdev2", 00:28:05.562 "uuid": "4aa63f0d-7c8f-5b03-a468-69e9fc1b36d5", 00:28:05.562 "is_configured": true, 00:28:05.562 "data_offset": 256, 00:28:05.562 "data_size": 7936 00:28:05.562 } 00:28:05.562 ] 00:28:05.562 }' 00:28:05.562 20:41:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:05.562 20:41:57 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:28:06.125 20:41:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:28:06.383 [2024-07-15 20:41:58.675521] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:06.383 [2024-07-15 20:41:58.680544] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x161cce0 00:28:06.383 [2024-07-15 20:41:58.682767] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:28:06.383 20:41:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@646 -- # sleep 1 00:28:07.760 20:41:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:07.760 20:41:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:07.760 20:41:59 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:07.760 20:41:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:07.760 20:41:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:07.760 20:41:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:07.760 20:41:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:07.760 20:41:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:07.760 "name": "raid_bdev1", 00:28:07.760 "uuid": "662801c5-523b-41fe-aaa7-2c5dd04ee37f", 00:28:07.760 "strip_size_kb": 0, 00:28:07.760 "state": "online", 00:28:07.760 "raid_level": "raid1", 00:28:07.760 "superblock": true, 00:28:07.760 "num_base_bdevs": 2, 00:28:07.760 "num_base_bdevs_discovered": 2, 00:28:07.760 "num_base_bdevs_operational": 2, 00:28:07.760 "process": { 00:28:07.760 "type": "rebuild", 00:28:07.760 "target": "spare", 00:28:07.760 "progress": { 00:28:07.760 "blocks": 3072, 00:28:07.760 "percent": 38 00:28:07.760 } 00:28:07.760 }, 00:28:07.760 "base_bdevs_list": [ 00:28:07.760 { 00:28:07.760 "name": "spare", 00:28:07.760 "uuid": "3e9dcecd-56ec-51f7-83f9-3bafcff2d71d", 00:28:07.760 "is_configured": true, 00:28:07.760 "data_offset": 256, 00:28:07.760 "data_size": 7936 00:28:07.760 }, 00:28:07.760 { 00:28:07.760 "name": "BaseBdev2", 00:28:07.760 "uuid": "4aa63f0d-7c8f-5b03-a468-69e9fc1b36d5", 00:28:07.760 "is_configured": true, 00:28:07.760 "data_offset": 256, 00:28:07.760 "data_size": 7936 00:28:07.760 } 00:28:07.760 ] 00:28:07.760 }' 00:28:07.760 20:41:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:07.760 20:42:00 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:07.760 20:42:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:07.760 20:42:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:07.760 20:42:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:28:08.018 [2024-07-15 20:42:00.274406] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:08.018 [2024-07-15 20:42:00.295407] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:28:08.018 [2024-07-15 20:42:00.295454] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:08.018 [2024-07-15 20:42:00.295470] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:08.018 [2024-07-15 20:42:00.295479] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:28:08.018 20:42:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:08.018 20:42:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:08.018 20:42:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:08.018 20:42:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:08.018 20:42:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:08.018 20:42:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:08.018 20:42:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:08.019 20:42:00 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:08.019 20:42:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:08.019 20:42:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:08.019 20:42:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:08.019 20:42:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:08.277 20:42:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:08.277 "name": "raid_bdev1", 00:28:08.277 "uuid": "662801c5-523b-41fe-aaa7-2c5dd04ee37f", 00:28:08.277 "strip_size_kb": 0, 00:28:08.277 "state": "online", 00:28:08.277 "raid_level": "raid1", 00:28:08.277 "superblock": true, 00:28:08.277 "num_base_bdevs": 2, 00:28:08.277 "num_base_bdevs_discovered": 1, 00:28:08.277 "num_base_bdevs_operational": 1, 00:28:08.277 "base_bdevs_list": [ 00:28:08.277 { 00:28:08.277 "name": null, 00:28:08.277 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:08.277 "is_configured": false, 00:28:08.277 "data_offset": 256, 00:28:08.277 "data_size": 7936 00:28:08.277 }, 00:28:08.277 { 00:28:08.277 "name": "BaseBdev2", 00:28:08.277 "uuid": "4aa63f0d-7c8f-5b03-a468-69e9fc1b36d5", 00:28:08.277 "is_configured": true, 00:28:08.277 "data_offset": 256, 00:28:08.277 "data_size": 7936 00:28:08.277 } 00:28:08.277 ] 00:28:08.277 }' 00:28:08.277 20:42:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:08.277 20:42:00 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:28:08.844 20:42:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:08.844 20:42:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local 
raid_bdev_name=raid_bdev1 00:28:08.844 20:42:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:08.844 20:42:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:08.844 20:42:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:08.844 20:42:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:08.844 20:42:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:09.102 20:42:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:09.102 "name": "raid_bdev1", 00:28:09.102 "uuid": "662801c5-523b-41fe-aaa7-2c5dd04ee37f", 00:28:09.102 "strip_size_kb": 0, 00:28:09.102 "state": "online", 00:28:09.102 "raid_level": "raid1", 00:28:09.102 "superblock": true, 00:28:09.102 "num_base_bdevs": 2, 00:28:09.102 "num_base_bdevs_discovered": 1, 00:28:09.102 "num_base_bdevs_operational": 1, 00:28:09.102 "base_bdevs_list": [ 00:28:09.102 { 00:28:09.102 "name": null, 00:28:09.102 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:09.102 "is_configured": false, 00:28:09.102 "data_offset": 256, 00:28:09.102 "data_size": 7936 00:28:09.102 }, 00:28:09.102 { 00:28:09.103 "name": "BaseBdev2", 00:28:09.103 "uuid": "4aa63f0d-7c8f-5b03-a468-69e9fc1b36d5", 00:28:09.103 "is_configured": true, 00:28:09.103 "data_offset": 256, 00:28:09.103 "data_size": 7936 00:28:09.103 } 00:28:09.103 ] 00:28:09.103 }' 00:28:09.103 20:42:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:09.103 20:42:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:09.103 20:42:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 
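The `verify_raid_bdev_process` checks traced above fetch the raid bdev's JSON via `rpc.py bdev_raid_get_bdevs all` and compare `.process.type` and `.process.target` against the expected values with `jq -r '... // "none"'`. A minimal Python sketch of that same check logic, with the RPC output stubbed from the JSON captured in this log (the stub data and helper are illustrative only, not part of SPDK):

```python
import json

def verify_raid_bdev_process(raid_bdev_info: str, process_type: str, target: str) -> bool:
    """Mirror of bdev_raid.sh's verify_raid_bdev_process: compare the
    background-process type and target reported for a raid bdev."""
    info = json.loads(raid_bdev_info)
    # Equivalent of jq -r '.process.type // "none"' and '.process.target // "none"':
    # when no rebuild is running, the "process" object is absent entirely.
    proc = info.get("process") or {}
    return (proc.get("type", "none") == process_type
            and proc.get("target", "none") == target)

# Stub of the bdev_raid_get_bdevs output for raid_bdev1 mid-rebuild, as seen in the log
raid_bdev_info = json.dumps({
    "name": "raid_bdev1",
    "state": "online",
    "raid_level": "raid1",
    "process": {"type": "rebuild", "target": "spare",
                "progress": {"blocks": 3072, "percent": 38}},
})

assert verify_raid_bdev_process(raid_bdev_info, "rebuild", "spare")
# After the rebuild finishes, "process" disappears and both fields fall back to "none"
assert verify_raid_bdev_process('{"name": "raid_bdev1"}', "none", "none")
```

The `// "none"` fallback is what lets the same helper assert both "rebuild in progress" and "no process running", which the test does before and after `bdev_raid_remove_base_bdev`.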
00:28:09.361 20:42:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:09.361 20:42:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:28:09.620 [2024-07-15 20:42:01.744362] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:09.620 [2024-07-15 20:42:01.749828] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x161cce0 00:28:09.620 [2024-07-15 20:42:01.751340] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:28:09.620 20:42:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@662 -- # sleep 1 00:28:10.555 20:42:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:10.555 20:42:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:10.555 20:42:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:10.555 20:42:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:10.555 20:42:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:10.555 20:42:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:10.555 20:42:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:10.814 20:42:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:10.814 "name": "raid_bdev1", 00:28:10.814 "uuid": "662801c5-523b-41fe-aaa7-2c5dd04ee37f", 00:28:10.814 "strip_size_kb": 0, 00:28:10.814 "state": "online", 00:28:10.814 
"raid_level": "raid1", 00:28:10.814 "superblock": true, 00:28:10.814 "num_base_bdevs": 2, 00:28:10.814 "num_base_bdevs_discovered": 2, 00:28:10.814 "num_base_bdevs_operational": 2, 00:28:10.814 "process": { 00:28:10.814 "type": "rebuild", 00:28:10.814 "target": "spare", 00:28:10.814 "progress": { 00:28:10.814 "blocks": 3072, 00:28:10.814 "percent": 38 00:28:10.814 } 00:28:10.814 }, 00:28:10.814 "base_bdevs_list": [ 00:28:10.814 { 00:28:10.814 "name": "spare", 00:28:10.814 "uuid": "3e9dcecd-56ec-51f7-83f9-3bafcff2d71d", 00:28:10.814 "is_configured": true, 00:28:10.814 "data_offset": 256, 00:28:10.814 "data_size": 7936 00:28:10.814 }, 00:28:10.814 { 00:28:10.814 "name": "BaseBdev2", 00:28:10.814 "uuid": "4aa63f0d-7c8f-5b03-a468-69e9fc1b36d5", 00:28:10.814 "is_configured": true, 00:28:10.814 "data_offset": 256, 00:28:10.814 "data_size": 7936 00:28:10.814 } 00:28:10.814 ] 00:28:10.814 }' 00:28:10.814 20:42:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:10.814 20:42:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:10.814 20:42:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:10.814 20:42:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:10.814 20:42:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:28:10.814 20:42:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:28:10.814 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:28:10.814 20:42:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:28:10.814 20:42:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:28:10.814 20:42:03 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:28:10.814 20:42:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@705 -- # local timeout=1061 00:28:10.814 20:42:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:28:10.814 20:42:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:10.814 20:42:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:10.814 20:42:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:10.814 20:42:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:10.814 20:42:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:10.814 20:42:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:10.814 20:42:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:11.073 20:42:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:11.073 "name": "raid_bdev1", 00:28:11.073 "uuid": "662801c5-523b-41fe-aaa7-2c5dd04ee37f", 00:28:11.073 "strip_size_kb": 0, 00:28:11.073 "state": "online", 00:28:11.073 "raid_level": "raid1", 00:28:11.073 "superblock": true, 00:28:11.073 "num_base_bdevs": 2, 00:28:11.073 "num_base_bdevs_discovered": 2, 00:28:11.073 "num_base_bdevs_operational": 2, 00:28:11.073 "process": { 00:28:11.073 "type": "rebuild", 00:28:11.073 "target": "spare", 00:28:11.073 "progress": { 00:28:11.073 "blocks": 3840, 00:28:11.073 "percent": 48 00:28:11.074 } 00:28:11.074 }, 00:28:11.074 "base_bdevs_list": [ 00:28:11.074 { 00:28:11.074 "name": "spare", 00:28:11.074 "uuid": "3e9dcecd-56ec-51f7-83f9-3bafcff2d71d", 00:28:11.074 "is_configured": 
true, 00:28:11.074 "data_offset": 256, 00:28:11.074 "data_size": 7936 00:28:11.074 }, 00:28:11.074 { 00:28:11.074 "name": "BaseBdev2", 00:28:11.074 "uuid": "4aa63f0d-7c8f-5b03-a468-69e9fc1b36d5", 00:28:11.074 "is_configured": true, 00:28:11.074 "data_offset": 256, 00:28:11.074 "data_size": 7936 00:28:11.074 } 00:28:11.074 ] 00:28:11.074 }' 00:28:11.074 20:42:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:11.074 20:42:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:11.074 20:42:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:11.334 20:42:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:11.334 20:42:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@710 -- # sleep 1 00:28:12.271 20:42:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:28:12.271 20:42:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:12.271 20:42:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:12.271 20:42:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:12.271 20:42:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:12.271 20:42:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:12.271 20:42:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:12.271 20:42:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:12.530 [2024-07-15 20:42:04.875443] 
bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:28:12.530 [2024-07-15 20:42:04.875510] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:28:12.530 [2024-07-15 20:42:04.875591] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:12.789 20:42:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:12.789 "name": "raid_bdev1", 00:28:12.789 "uuid": "662801c5-523b-41fe-aaa7-2c5dd04ee37f", 00:28:12.789 "strip_size_kb": 0, 00:28:12.789 "state": "online", 00:28:12.789 "raid_level": "raid1", 00:28:12.789 "superblock": true, 00:28:12.789 "num_base_bdevs": 2, 00:28:12.789 "num_base_bdevs_discovered": 2, 00:28:12.789 "num_base_bdevs_operational": 2, 00:28:12.789 "base_bdevs_list": [ 00:28:12.789 { 00:28:12.789 "name": "spare", 00:28:12.789 "uuid": "3e9dcecd-56ec-51f7-83f9-3bafcff2d71d", 00:28:12.789 "is_configured": true, 00:28:12.789 "data_offset": 256, 00:28:12.789 "data_size": 7936 00:28:12.789 }, 00:28:12.789 { 00:28:12.789 "name": "BaseBdev2", 00:28:12.789 "uuid": "4aa63f0d-7c8f-5b03-a468-69e9fc1b36d5", 00:28:12.789 "is_configured": true, 00:28:12.789 "data_offset": 256, 00:28:12.789 "data_size": 7936 00:28:12.789 } 00:28:12.789 ] 00:28:12.789 }' 00:28:12.789 20:42:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:12.789 20:42:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:28:12.789 20:42:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:12.789 20:42:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:28:12.789 20:42:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@708 -- # break 00:28:12.789 20:42:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 
00:28:12.789 20:42:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:12.789 20:42:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:12.789 20:42:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:12.789 20:42:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:12.789 20:42:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:12.789 20:42:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:13.048 20:42:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:13.048 "name": "raid_bdev1", 00:28:13.048 "uuid": "662801c5-523b-41fe-aaa7-2c5dd04ee37f", 00:28:13.048 "strip_size_kb": 0, 00:28:13.048 "state": "online", 00:28:13.048 "raid_level": "raid1", 00:28:13.048 "superblock": true, 00:28:13.048 "num_base_bdevs": 2, 00:28:13.048 "num_base_bdevs_discovered": 2, 00:28:13.048 "num_base_bdevs_operational": 2, 00:28:13.048 "base_bdevs_list": [ 00:28:13.048 { 00:28:13.048 "name": "spare", 00:28:13.048 "uuid": "3e9dcecd-56ec-51f7-83f9-3bafcff2d71d", 00:28:13.048 "is_configured": true, 00:28:13.048 "data_offset": 256, 00:28:13.048 "data_size": 7936 00:28:13.048 }, 00:28:13.048 { 00:28:13.048 "name": "BaseBdev2", 00:28:13.048 "uuid": "4aa63f0d-7c8f-5b03-a468-69e9fc1b36d5", 00:28:13.048 "is_configured": true, 00:28:13.048 "data_offset": 256, 00:28:13.048 "data_size": 7936 00:28:13.048 } 00:28:13.048 ] 00:28:13.048 }' 00:28:13.048 20:42:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:13.048 20:42:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:13.048 20:42:05 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:13.048 20:42:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:13.048 20:42:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:13.048 20:42:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:13.048 20:42:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:13.048 20:42:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:13.048 20:42:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:13.048 20:42:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:13.048 20:42:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:13.048 20:42:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:13.048 20:42:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:13.048 20:42:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:13.307 20:42:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:13.307 20:42:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:13.307 20:42:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:13.307 "name": "raid_bdev1", 00:28:13.307 "uuid": "662801c5-523b-41fe-aaa7-2c5dd04ee37f", 00:28:13.307 "strip_size_kb": 0, 00:28:13.307 "state": "online", 00:28:13.307 "raid_level": "raid1", 00:28:13.307 "superblock": 
true, 00:28:13.307 "num_base_bdevs": 2, 00:28:13.307 "num_base_bdevs_discovered": 2, 00:28:13.307 "num_base_bdevs_operational": 2, 00:28:13.307 "base_bdevs_list": [ 00:28:13.307 { 00:28:13.307 "name": "spare", 00:28:13.307 "uuid": "3e9dcecd-56ec-51f7-83f9-3bafcff2d71d", 00:28:13.307 "is_configured": true, 00:28:13.307 "data_offset": 256, 00:28:13.307 "data_size": 7936 00:28:13.307 }, 00:28:13.307 { 00:28:13.307 "name": "BaseBdev2", 00:28:13.307 "uuid": "4aa63f0d-7c8f-5b03-a468-69e9fc1b36d5", 00:28:13.307 "is_configured": true, 00:28:13.307 "data_offset": 256, 00:28:13.307 "data_size": 7936 00:28:13.307 } 00:28:13.307 ] 00:28:13.307 }' 00:28:13.307 20:42:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:13.307 20:42:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:28:14.243 20:42:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:28:14.243 [2024-07-15 20:42:06.520337] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:14.243 [2024-07-15 20:42:06.520365] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:28:14.243 [2024-07-15 20:42:06.520425] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:14.243 [2024-07-15 20:42:06.520481] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:14.243 [2024-07-15 20:42:06.520493] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x161d070 name raid_bdev1, state offline 00:28:14.243 20:42:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:14.243 20:42:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@719 
-- # jq length 00:28:14.502 20:42:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:28:14.502 20:42:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:28:14.502 20:42:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:28:14.502 20:42:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:28:14.502 20:42:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:28:14.502 20:42:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:28:14.502 20:42:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # local bdev_list 00:28:14.502 20:42:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:28:14.502 20:42:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # local nbd_list 00:28:14.502 20:42:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@12 -- # local i 00:28:14.502 20:42:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:28:14.502 20:42:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:28:14.502 20:42:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:28:14.761 /dev/nbd0 00:28:14.761 20:42:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:28:14.761 20:42:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:28:14.761 20:42:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:28:14.761 20:42:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@867 -- # 
local i 00:28:14.761 20:42:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:28:14.761 20:42:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:28:14.761 20:42:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:28:14.761 20:42:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # break 00:28:14.761 20:42:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:28:14.761 20:42:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:28:14.761 20:42:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:28:14.761 1+0 records in 00:28:14.761 1+0 records out 00:28:14.761 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00025336 s, 16.2 MB/s 00:28:14.761 20:42:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:14.761 20:42:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # size=4096 00:28:14.761 20:42:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:14.761 20:42:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:28:14.761 20:42:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # return 0 00:28:14.761 20:42:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:28:14.761 20:42:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:28:14.761 20:42:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:28:15.020 /dev/nbd1 00:28:15.020 20:42:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:28:15.020 20:42:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:28:15.020 20:42:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:28:15.020 20:42:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@867 -- # local i 00:28:15.020 20:42:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:28:15.020 20:42:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:28:15.020 20:42:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:28:15.020 20:42:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # break 00:28:15.020 20:42:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:28:15.020 20:42:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:28:15.020 20:42:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:28:15.020 1+0 records in 00:28:15.020 1+0 records out 00:28:15.020 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000346479 s, 11.8 MB/s 00:28:15.020 20:42:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:15.020 20:42:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # size=4096 00:28:15.020 20:42:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:15.020 20:42:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:28:15.020 20:42:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # return 0 00:28:15.020 20:42:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:28:15.020 20:42:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:28:15.020 20:42:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:28:15.279 20:42:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:28:15.279 20:42:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:28:15.279 20:42:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:28:15.279 20:42:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # local nbd_list 00:28:15.279 20:42:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@51 -- # local i 00:28:15.279 20:42:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:15.279 20:42:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:28:15.279 20:42:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:28:15.279 20:42:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:28:15.279 20:42:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:28:15.279 20:42:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:15.279 20:42:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- 
# (( i <= 20 )) 00:28:15.279 20:42:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:28:15.279 20:42:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:28:15.279 20:42:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@45 -- # return 0 00:28:15.279 20:42:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:15.279 20:42:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:28:15.845 20:42:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:28:15.845 20:42:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:28:15.845 20:42:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:28:15.845 20:42:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:15.845 20:42:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:15.845 20:42:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:28:15.845 20:42:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:28:15.845 20:42:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@45 -- # return 0 00:28:15.845 20:42:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:28:15.845 20:42:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:28:15.845 20:42:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:28:16.104 
[2024-07-15 20:42:08.337311] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:28:16.104 [2024-07-15 20:42:08.337358] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:16.104 [2024-07-15 20:42:08.337384] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1619fe0 00:28:16.104 [2024-07-15 20:42:08.337398] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:16.104 [2024-07-15 20:42:08.339016] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:16.104 [2024-07-15 20:42:08.339045] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:28:16.104 [2024-07-15 20:42:08.339124] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:28:16.104 [2024-07-15 20:42:08.339150] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:16.104 [2024-07-15 20:42:08.339247] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:28:16.104 spare 00:28:16.104 20:42:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:16.104 20:42:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:16.104 20:42:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:16.104 20:42:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:16.104 20:42:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:16.104 20:42:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:16.104 20:42:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:16.104 20:42:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- 
# local num_base_bdevs 00:28:16.104 20:42:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:16.104 20:42:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:16.104 20:42:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:16.104 20:42:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:16.104 [2024-07-15 20:42:08.439561] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x161b260 00:28:16.104 [2024-07-15 20:42:08.439577] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:28:16.104 [2024-07-15 20:42:08.439772] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x161b540 00:28:16.104 [2024-07-15 20:42:08.439920] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x161b260 00:28:16.104 [2024-07-15 20:42:08.439937] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x161b260 00:28:16.105 [2024-07-15 20:42:08.440043] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:16.363 20:42:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:16.363 "name": "raid_bdev1", 00:28:16.363 "uuid": "662801c5-523b-41fe-aaa7-2c5dd04ee37f", 00:28:16.363 "strip_size_kb": 0, 00:28:16.363 "state": "online", 00:28:16.363 "raid_level": "raid1", 00:28:16.363 "superblock": true, 00:28:16.363 "num_base_bdevs": 2, 00:28:16.363 "num_base_bdevs_discovered": 2, 00:28:16.363 "num_base_bdevs_operational": 2, 00:28:16.363 "base_bdevs_list": [ 00:28:16.363 { 00:28:16.363 "name": "spare", 00:28:16.363 "uuid": "3e9dcecd-56ec-51f7-83f9-3bafcff2d71d", 00:28:16.363 "is_configured": true, 00:28:16.363 "data_offset": 256, 
00:28:16.363 "data_size": 7936 00:28:16.363 }, 00:28:16.363 { 00:28:16.363 "name": "BaseBdev2", 00:28:16.363 "uuid": "4aa63f0d-7c8f-5b03-a468-69e9fc1b36d5", 00:28:16.363 "is_configured": true, 00:28:16.363 "data_offset": 256, 00:28:16.363 "data_size": 7936 00:28:16.363 } 00:28:16.363 ] 00:28:16.363 }' 00:28:16.363 20:42:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:16.363 20:42:08 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:28:16.930 20:42:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:16.930 20:42:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:16.930 20:42:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:16.930 20:42:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:16.930 20:42:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:16.930 20:42:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:16.930 20:42:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:17.189 20:42:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:17.189 "name": "raid_bdev1", 00:28:17.189 "uuid": "662801c5-523b-41fe-aaa7-2c5dd04ee37f", 00:28:17.189 "strip_size_kb": 0, 00:28:17.189 "state": "online", 00:28:17.189 "raid_level": "raid1", 00:28:17.189 "superblock": true, 00:28:17.189 "num_base_bdevs": 2, 00:28:17.189 "num_base_bdevs_discovered": 2, 00:28:17.189 "num_base_bdevs_operational": 2, 00:28:17.189 "base_bdevs_list": [ 00:28:17.189 { 00:28:17.189 "name": "spare", 00:28:17.189 "uuid": 
"3e9dcecd-56ec-51f7-83f9-3bafcff2d71d", 00:28:17.189 "is_configured": true, 00:28:17.189 "data_offset": 256, 00:28:17.189 "data_size": 7936 00:28:17.189 }, 00:28:17.189 { 00:28:17.189 "name": "BaseBdev2", 00:28:17.189 "uuid": "4aa63f0d-7c8f-5b03-a468-69e9fc1b36d5", 00:28:17.189 "is_configured": true, 00:28:17.189 "data_offset": 256, 00:28:17.189 "data_size": 7936 00:28:17.189 } 00:28:17.189 ] 00:28:17.189 }' 00:28:17.189 20:42:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:17.189 20:42:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:17.189 20:42:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:17.189 20:42:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:17.189 20:42:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:17.189 20:42:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:28:17.448 20:42:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:28:17.448 20:42:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:28:17.707 [2024-07-15 20:42:09.921620] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:17.707 20:42:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:17.707 20:42:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:17.707 20:42:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:17.707 
20:42:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:17.707 20:42:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:17.707 20:42:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:17.707 20:42:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:17.707 20:42:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:17.707 20:42:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:17.707 20:42:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:17.707 20:42:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:17.707 20:42:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:17.966 20:42:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:17.966 "name": "raid_bdev1", 00:28:17.966 "uuid": "662801c5-523b-41fe-aaa7-2c5dd04ee37f", 00:28:17.966 "strip_size_kb": 0, 00:28:17.966 "state": "online", 00:28:17.966 "raid_level": "raid1", 00:28:17.966 "superblock": true, 00:28:17.966 "num_base_bdevs": 2, 00:28:17.966 "num_base_bdevs_discovered": 1, 00:28:17.966 "num_base_bdevs_operational": 1, 00:28:17.966 "base_bdevs_list": [ 00:28:17.966 { 00:28:17.966 "name": null, 00:28:17.966 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:17.966 "is_configured": false, 00:28:17.966 "data_offset": 256, 00:28:17.966 "data_size": 7936 00:28:17.966 }, 00:28:17.966 { 00:28:17.966 "name": "BaseBdev2", 00:28:17.966 "uuid": "4aa63f0d-7c8f-5b03-a468-69e9fc1b36d5", 00:28:17.966 "is_configured": true, 00:28:17.966 "data_offset": 256, 00:28:17.966 "data_size": 
7936 00:28:17.966 } 00:28:17.966 ] 00:28:17.967 }' 00:28:17.967 20:42:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:17.967 20:42:10 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:28:18.533 20:42:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:28:18.792 [2024-07-15 20:42:11.052642] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:18.792 [2024-07-15 20:42:11.052792] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:28:18.792 [2024-07-15 20:42:11.052809] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:28:18.792 [2024-07-15 20:42:11.052836] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:18.792 [2024-07-15 20:42:11.057690] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x161a270 00:28:18.792 [2024-07-15 20:42:11.060050] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:28:18.792 20:42:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@755 -- # sleep 1 00:28:19.728 20:42:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:19.728 20:42:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:19.728 20:42:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:19.728 20:42:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:19.728 20:42:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:19.728 20:42:12 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:19.728 20:42:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:19.987 20:42:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:19.987 "name": "raid_bdev1", 00:28:19.987 "uuid": "662801c5-523b-41fe-aaa7-2c5dd04ee37f", 00:28:19.987 "strip_size_kb": 0, 00:28:19.987 "state": "online", 00:28:19.987 "raid_level": "raid1", 00:28:19.987 "superblock": true, 00:28:19.987 "num_base_bdevs": 2, 00:28:19.987 "num_base_bdevs_discovered": 2, 00:28:19.987 "num_base_bdevs_operational": 2, 00:28:19.987 "process": { 00:28:19.987 "type": "rebuild", 00:28:19.987 "target": "spare", 00:28:19.987 "progress": { 00:28:19.987 "blocks": 3072, 00:28:19.987 "percent": 38 00:28:19.987 } 00:28:19.987 }, 00:28:19.987 "base_bdevs_list": [ 00:28:19.987 { 00:28:19.987 "name": "spare", 00:28:19.987 "uuid": "3e9dcecd-56ec-51f7-83f9-3bafcff2d71d", 00:28:19.987 "is_configured": true, 00:28:19.987 "data_offset": 256, 00:28:19.987 "data_size": 7936 00:28:19.987 }, 00:28:19.987 { 00:28:19.987 "name": "BaseBdev2", 00:28:19.987 "uuid": "4aa63f0d-7c8f-5b03-a468-69e9fc1b36d5", 00:28:19.987 "is_configured": true, 00:28:19.987 "data_offset": 256, 00:28:19.987 "data_size": 7936 00:28:19.987 } 00:28:19.987 ] 00:28:19.987 }' 00:28:19.987 20:42:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:20.246 20:42:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:20.246 20:42:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:20.246 20:42:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:20.246 20:42:12 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:28:20.504 [2024-07-15 20:42:12.658980] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:20.504 [2024-07-15 20:42:12.672764] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:28:20.504 [2024-07-15 20:42:12.672807] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:20.504 [2024-07-15 20:42:12.672823] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:20.504 [2024-07-15 20:42:12.672831] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:28:20.504 20:42:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:20.504 20:42:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:20.504 20:42:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:20.504 20:42:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:20.504 20:42:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:20.504 20:42:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:20.504 20:42:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:20.504 20:42:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:20.504 20:42:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:20.504 20:42:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:20.504 20:42:12 bdev_raid.raid_rebuild_test_sb_4k 
-- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:20.504 20:42:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:20.763 20:42:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:20.763 "name": "raid_bdev1", 00:28:20.763 "uuid": "662801c5-523b-41fe-aaa7-2c5dd04ee37f", 00:28:20.763 "strip_size_kb": 0, 00:28:20.763 "state": "online", 00:28:20.763 "raid_level": "raid1", 00:28:20.763 "superblock": true, 00:28:20.763 "num_base_bdevs": 2, 00:28:20.763 "num_base_bdevs_discovered": 1, 00:28:20.763 "num_base_bdevs_operational": 1, 00:28:20.763 "base_bdevs_list": [ 00:28:20.763 { 00:28:20.763 "name": null, 00:28:20.763 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:20.763 "is_configured": false, 00:28:20.763 "data_offset": 256, 00:28:20.763 "data_size": 7936 00:28:20.763 }, 00:28:20.763 { 00:28:20.763 "name": "BaseBdev2", 00:28:20.763 "uuid": "4aa63f0d-7c8f-5b03-a468-69e9fc1b36d5", 00:28:20.763 "is_configured": true, 00:28:20.763 "data_offset": 256, 00:28:20.763 "data_size": 7936 00:28:20.763 } 00:28:20.763 ] 00:28:20.763 }' 00:28:20.763 20:42:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:20.763 20:42:12 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:28:21.329 20:42:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:28:21.588 [2024-07-15 20:42:13.804258] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:28:21.588 [2024-07-15 20:42:13.804306] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:21.588 [2024-07-15 20:42:13.804327] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: 
io_device created at: 0x0x1463f00 00:28:21.588 [2024-07-15 20:42:13.804339] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:21.588 [2024-07-15 20:42:13.804707] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:21.588 [2024-07-15 20:42:13.804725] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:28:21.588 [2024-07-15 20:42:13.804803] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:28:21.588 [2024-07-15 20:42:13.804815] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:28:21.588 [2024-07-15 20:42:13.804826] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:28:21.588 [2024-07-15 20:42:13.804845] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:21.588 [2024-07-15 20:42:13.809652] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1464190 00:28:21.588 spare 00:28:21.588 [2024-07-15 20:42:13.811125] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:28:21.588 20:42:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@762 -- # sleep 1 00:28:22.559 20:42:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:22.559 20:42:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:22.560 20:42:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:22.560 20:42:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:22.560 20:42:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:22.560 20:42:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq 
-r '.[] | select(.name == "raid_bdev1")' 00:28:22.560 20:42:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:22.817 20:42:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:22.817 "name": "raid_bdev1", 00:28:22.817 "uuid": "662801c5-523b-41fe-aaa7-2c5dd04ee37f", 00:28:22.817 "strip_size_kb": 0, 00:28:22.817 "state": "online", 00:28:22.817 "raid_level": "raid1", 00:28:22.817 "superblock": true, 00:28:22.817 "num_base_bdevs": 2, 00:28:22.817 "num_base_bdevs_discovered": 2, 00:28:22.817 "num_base_bdevs_operational": 2, 00:28:22.817 "process": { 00:28:22.817 "type": "rebuild", 00:28:22.817 "target": "spare", 00:28:22.817 "progress": { 00:28:22.817 "blocks": 3072, 00:28:22.817 "percent": 38 00:28:22.817 } 00:28:22.817 }, 00:28:22.817 "base_bdevs_list": [ 00:28:22.817 { 00:28:22.817 "name": "spare", 00:28:22.817 "uuid": "3e9dcecd-56ec-51f7-83f9-3bafcff2d71d", 00:28:22.817 "is_configured": true, 00:28:22.817 "data_offset": 256, 00:28:22.817 "data_size": 7936 00:28:22.817 }, 00:28:22.817 { 00:28:22.817 "name": "BaseBdev2", 00:28:22.817 "uuid": "4aa63f0d-7c8f-5b03-a468-69e9fc1b36d5", 00:28:22.817 "is_configured": true, 00:28:22.817 "data_offset": 256, 00:28:22.817 "data_size": 7936 00:28:22.817 } 00:28:22.817 ] 00:28:22.817 }' 00:28:22.817 20:42:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:22.817 20:42:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:22.817 20:42:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:22.817 20:42:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:22.817 20:42:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@766 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:28:23.075 [2024-07-15 20:42:15.410579] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:23.075 [2024-07-15 20:42:15.423453] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:28:23.075 [2024-07-15 20:42:15.423498] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:23.075 [2024-07-15 20:42:15.423513] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:23.075 [2024-07-15 20:42:15.423522] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:28:23.075 20:42:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:23.075 20:42:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:23.075 20:42:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:23.075 20:42:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:23.075 20:42:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:23.075 20:42:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:23.075 20:42:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:23.075 20:42:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:23.075 20:42:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:23.075 20:42:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:23.333 20:42:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:23.333 20:42:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:23.333 20:42:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:23.333 "name": "raid_bdev1", 00:28:23.333 "uuid": "662801c5-523b-41fe-aaa7-2c5dd04ee37f", 00:28:23.333 "strip_size_kb": 0, 00:28:23.333 "state": "online", 00:28:23.333 "raid_level": "raid1", 00:28:23.333 "superblock": true, 00:28:23.333 "num_base_bdevs": 2, 00:28:23.333 "num_base_bdevs_discovered": 1, 00:28:23.333 "num_base_bdevs_operational": 1, 00:28:23.333 "base_bdevs_list": [ 00:28:23.333 { 00:28:23.333 "name": null, 00:28:23.333 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:23.333 "is_configured": false, 00:28:23.333 "data_offset": 256, 00:28:23.333 "data_size": 7936 00:28:23.333 }, 00:28:23.333 { 00:28:23.333 "name": "BaseBdev2", 00:28:23.333 "uuid": "4aa63f0d-7c8f-5b03-a468-69e9fc1b36d5", 00:28:23.333 "is_configured": true, 00:28:23.333 "data_offset": 256, 00:28:23.333 "data_size": 7936 00:28:23.333 } 00:28:23.333 ] 00:28:23.333 }' 00:28:23.333 20:42:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:23.333 20:42:15 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:28:24.265 20:42:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:24.265 20:42:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:24.265 20:42:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:24.265 20:42:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:24.265 20:42:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:24.265 
20:42:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:24.265 20:42:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:24.265 20:42:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:24.265 "name": "raid_bdev1", 00:28:24.265 "uuid": "662801c5-523b-41fe-aaa7-2c5dd04ee37f", 00:28:24.265 "strip_size_kb": 0, 00:28:24.265 "state": "online", 00:28:24.265 "raid_level": "raid1", 00:28:24.265 "superblock": true, 00:28:24.265 "num_base_bdevs": 2, 00:28:24.265 "num_base_bdevs_discovered": 1, 00:28:24.265 "num_base_bdevs_operational": 1, 00:28:24.265 "base_bdevs_list": [ 00:28:24.265 { 00:28:24.265 "name": null, 00:28:24.265 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:24.265 "is_configured": false, 00:28:24.265 "data_offset": 256, 00:28:24.265 "data_size": 7936 00:28:24.265 }, 00:28:24.265 { 00:28:24.265 "name": "BaseBdev2", 00:28:24.265 "uuid": "4aa63f0d-7c8f-5b03-a468-69e9fc1b36d5", 00:28:24.265 "is_configured": true, 00:28:24.265 "data_offset": 256, 00:28:24.265 "data_size": 7936 00:28:24.265 } 00:28:24.265 ] 00:28:24.265 }' 00:28:24.265 20:42:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:24.265 20:42:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:24.265 20:42:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:24.523 20:42:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:24.523 20:42:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:28:24.523 20:42:16 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:28:24.780 [2024-07-15 20:42:17.104551] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:28:24.780 [2024-07-15 20:42:17.104597] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:24.780 [2024-07-15 20:42:17.104617] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x161d2f0 00:28:24.780 [2024-07-15 20:42:17.104630] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:24.780 [2024-07-15 20:42:17.104970] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:24.780 [2024-07-15 20:42:17.104993] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:28:24.780 [2024-07-15 20:42:17.105057] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:28:24.780 [2024-07-15 20:42:17.105070] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:28:24.780 [2024-07-15 20:42:17.105080] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:28:24.780 BaseBdev1 00:28:24.780 20:42:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@773 -- # sleep 1 00:28:26.153 20:42:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:26.153 20:42:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:26.153 20:42:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:26.153 20:42:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 
00:28:26.153 20:42:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:26.153 20:42:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:26.153 20:42:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:26.153 20:42:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:26.153 20:42:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:26.153 20:42:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:26.153 20:42:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:26.153 20:42:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:26.153 20:42:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:26.153 "name": "raid_bdev1", 00:28:26.153 "uuid": "662801c5-523b-41fe-aaa7-2c5dd04ee37f", 00:28:26.153 "strip_size_kb": 0, 00:28:26.153 "state": "online", 00:28:26.153 "raid_level": "raid1", 00:28:26.153 "superblock": true, 00:28:26.153 "num_base_bdevs": 2, 00:28:26.153 "num_base_bdevs_discovered": 1, 00:28:26.153 "num_base_bdevs_operational": 1, 00:28:26.153 "base_bdevs_list": [ 00:28:26.153 { 00:28:26.153 "name": null, 00:28:26.153 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:26.153 "is_configured": false, 00:28:26.153 "data_offset": 256, 00:28:26.153 "data_size": 7936 00:28:26.153 }, 00:28:26.153 { 00:28:26.153 "name": "BaseBdev2", 00:28:26.153 "uuid": "4aa63f0d-7c8f-5b03-a468-69e9fc1b36d5", 00:28:26.153 "is_configured": true, 00:28:26.153 "data_offset": 256, 00:28:26.153 "data_size": 7936 00:28:26.153 } 00:28:26.153 ] 00:28:26.153 }' 00:28:26.153 20:42:18 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:26.153 20:42:18 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:28:26.719 20:42:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:26.719 20:42:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:26.719 20:42:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:26.719 20:42:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:26.719 20:42:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:26.719 20:42:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:26.719 20:42:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:26.977 20:42:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:26.977 "name": "raid_bdev1", 00:28:26.977 "uuid": "662801c5-523b-41fe-aaa7-2c5dd04ee37f", 00:28:26.977 "strip_size_kb": 0, 00:28:26.977 "state": "online", 00:28:26.977 "raid_level": "raid1", 00:28:26.977 "superblock": true, 00:28:26.977 "num_base_bdevs": 2, 00:28:26.977 "num_base_bdevs_discovered": 1, 00:28:26.977 "num_base_bdevs_operational": 1, 00:28:26.977 "base_bdevs_list": [ 00:28:26.977 { 00:28:26.977 "name": null, 00:28:26.977 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:26.977 "is_configured": false, 00:28:26.977 "data_offset": 256, 00:28:26.977 "data_size": 7936 00:28:26.978 }, 00:28:26.978 { 00:28:26.978 "name": "BaseBdev2", 00:28:26.978 "uuid": "4aa63f0d-7c8f-5b03-a468-69e9fc1b36d5", 00:28:26.978 "is_configured": true, 00:28:26.978 "data_offset": 256, 00:28:26.978 "data_size": 
7936 00:28:26.978 } 00:28:26.978 ] 00:28:26.978 }' 00:28:26.978 20:42:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:26.978 20:42:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:26.978 20:42:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:26.978 20:42:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:26.978 20:42:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:28:26.978 20:42:19 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@648 -- # local es=0 00:28:26.978 20:42:19 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:28:26.978 20:42:19 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:26.978 20:42:19 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:28:26.978 20:42:19 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:26.978 20:42:19 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:28:26.978 20:42:19 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:26.978 20:42:19 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:28:26.978 20:42:19 bdev_raid.raid_rebuild_test_sb_4k 
-- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:26.978 20:42:19 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:28:26.978 20:42:19 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:28:27.236 [2024-07-15 20:42:19.510971] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:28:27.236 [2024-07-15 20:42:19.511095] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:28:27.236 [2024-07-15 20:42:19.511110] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:28:27.236 request: 00:28:27.236 { 00:28:27.236 "base_bdev": "BaseBdev1", 00:28:27.236 "raid_bdev": "raid_bdev1", 00:28:27.236 "method": "bdev_raid_add_base_bdev", 00:28:27.236 "req_id": 1 00:28:27.236 } 00:28:27.236 Got JSON-RPC error response 00:28:27.236 response: 00:28:27.236 { 00:28:27.236 "code": -22, 00:28:27.236 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:28:27.236 } 00:28:27.236 20:42:19 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@651 -- # es=1 00:28:27.236 20:42:19 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:28:27.236 20:42:19 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:28:27.236 20:42:19 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:28:27.236 20:42:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@777 -- # sleep 1 00:28:28.612 20:42:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 
online raid1 0 1 00:28:28.612 20:42:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:28.612 20:42:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:28.612 20:42:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:28.612 20:42:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:28.612 20:42:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:28.612 20:42:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:28.612 20:42:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:28.612 20:42:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:28.612 20:42:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:28.612 20:42:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:28.612 20:42:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:28.612 20:42:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:28.612 "name": "raid_bdev1", 00:28:28.612 "uuid": "662801c5-523b-41fe-aaa7-2c5dd04ee37f", 00:28:28.612 "strip_size_kb": 0, 00:28:28.612 "state": "online", 00:28:28.612 "raid_level": "raid1", 00:28:28.612 "superblock": true, 00:28:28.612 "num_base_bdevs": 2, 00:28:28.612 "num_base_bdevs_discovered": 1, 00:28:28.612 "num_base_bdevs_operational": 1, 00:28:28.612 "base_bdevs_list": [ 00:28:28.612 { 00:28:28.612 "name": null, 00:28:28.612 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:28.612 "is_configured": false, 00:28:28.612 
"data_offset": 256, 00:28:28.612 "data_size": 7936 00:28:28.612 }, 00:28:28.612 { 00:28:28.612 "name": "BaseBdev2", 00:28:28.612 "uuid": "4aa63f0d-7c8f-5b03-a468-69e9fc1b36d5", 00:28:28.612 "is_configured": true, 00:28:28.612 "data_offset": 256, 00:28:28.612 "data_size": 7936 00:28:28.612 } 00:28:28.612 ] 00:28:28.612 }' 00:28:28.612 20:42:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:28.612 20:42:20 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:28:29.178 20:42:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:29.178 20:42:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:29.178 20:42:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:29.178 20:42:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:29.178 20:42:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:29.178 20:42:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:29.178 20:42:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:29.438 20:42:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:29.438 "name": "raid_bdev1", 00:28:29.438 "uuid": "662801c5-523b-41fe-aaa7-2c5dd04ee37f", 00:28:29.438 "strip_size_kb": 0, 00:28:29.438 "state": "online", 00:28:29.438 "raid_level": "raid1", 00:28:29.438 "superblock": true, 00:28:29.438 "num_base_bdevs": 2, 00:28:29.438 "num_base_bdevs_discovered": 1, 00:28:29.438 "num_base_bdevs_operational": 1, 00:28:29.438 "base_bdevs_list": [ 00:28:29.438 { 00:28:29.438 "name": null, 00:28:29.438 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:28:29.438 "is_configured": false, 00:28:29.438 "data_offset": 256, 00:28:29.438 "data_size": 7936 00:28:29.438 }, 00:28:29.438 { 00:28:29.438 "name": "BaseBdev2", 00:28:29.438 "uuid": "4aa63f0d-7c8f-5b03-a468-69e9fc1b36d5", 00:28:29.438 "is_configured": true, 00:28:29.438 "data_offset": 256, 00:28:29.438 "data_size": 7936 00:28:29.438 } 00:28:29.438 ] 00:28:29.438 }' 00:28:29.438 20:42:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:29.438 20:42:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:29.438 20:42:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:29.438 20:42:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:29.438 20:42:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@782 -- # killprocess 1496641 00:28:29.438 20:42:21 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@948 -- # '[' -z 1496641 ']' 00:28:29.438 20:42:21 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@952 -- # kill -0 1496641 00:28:29.438 20:42:21 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@953 -- # uname 00:28:29.438 20:42:21 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:28:29.438 20:42:21 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1496641 00:28:29.438 20:42:21 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:28:29.438 20:42:21 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:28:29.438 20:42:21 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1496641' 00:28:29.438 killing process with pid 1496641 00:28:29.438 20:42:21 
bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@967 -- # kill 1496641 00:28:29.438 Received shutdown signal, test time was about 60.000000 seconds 00:28:29.438 00:28:29.438 Latency(us) 00:28:29.438 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:29.438 =================================================================================================================== 00:28:29.438 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:28:29.438 [2024-07-15 20:42:21.812452] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:28:29.438 [2024-07-15 20:42:21.812546] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:29.438 [2024-07-15 20:42:21.812588] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:29.438 [2024-07-15 20:42:21.812601] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x161b260 name raid_bdev1, state offline 00:28:29.438 20:42:21 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@972 -- # wait 1496641 00:28:29.697 [2024-07-15 20:42:21.839786] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:28:29.697 20:42:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@784 -- # return 0 00:28:29.697 00:28:29.697 real 0m30.329s 00:28:29.697 user 0m47.888s 00:28:29.697 sys 0m5.064s 00:28:29.697 20:42:22 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@1124 -- # xtrace_disable 00:28:29.697 20:42:22 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:28:29.697 ************************************ 00:28:29.697 END TEST raid_rebuild_test_sb_4k 00:28:29.697 ************************************ 00:28:29.955 20:42:22 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:28:29.955 20:42:22 bdev_raid -- bdev/bdev_raid.sh@904 -- # base_malloc_params='-m 32' 00:28:29.955 20:42:22 bdev_raid -- bdev/bdev_raid.sh@905 -- # 
run_test raid_state_function_test_sb_md_separate raid_state_function_test raid1 2 true 00:28:29.955 20:42:22 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:28:29.955 20:42:22 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:28:29.955 20:42:22 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:28:29.955 ************************************ 00:28:29.955 START TEST raid_state_function_test_sb_md_separate 00:28:29.955 ************************************ 00:28:29.955 20:42:22 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 true 00:28:29.955 20:42:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:28:29.955 20:42:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:28:29.955 20:42:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:28:29.955 20:42:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:28:29.955 20:42:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:28:29.955 20:42:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:28:29.955 20:42:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:28:29.955 20:42:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:28:29.955 20:42:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:28:29.955 20:42:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:28:29.955 20:42:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:28:29.955 20:42:22 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:28:29.955 20:42:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:28:29.955 20:42:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:28:29.955 20:42:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:28:29.955 20:42:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # local strip_size 00:28:29.955 20:42:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:28:29.955 20:42:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:28:29.955 20:42:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:28:29.955 20:42:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:28:29.955 20:42:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:28:29.955 20:42:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:28:29.955 20:42:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@244 -- # raid_pid=1501460 00:28:29.955 20:42:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1501460' 00:28:29.955 Process raid pid: 1501460 00:28:29.955 20:42:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:28:29.955 20:42:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@246 -- # 
waitforlisten 1501460 /var/tmp/spdk-raid.sock 00:28:29.955 20:42:22 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@829 -- # '[' -z 1501460 ']' 00:28:29.955 20:42:22 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:28:29.955 20:42:22 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@834 -- # local max_retries=100 00:28:29.955 20:42:22 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:28:29.955 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:28:29.955 20:42:22 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@838 -- # xtrace_disable 00:28:29.955 20:42:22 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:29.955 [2024-07-15 20:42:22.222422] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
00:28:29.956 [2024-07-15 20:42:22.222494] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:28:30.214 [2024-07-15 20:42:22.352937] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:30.214 [2024-07-15 20:42:22.458787] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:30.214 [2024-07-15 20:42:22.525618] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:30.214 [2024-07-15 20:42:22.525647] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:30.781 20:42:23 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:28:30.781 20:42:23 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@862 -- # return 0 00:28:30.781 20:42:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:28:31.042 [2024-07-15 20:42:23.316334] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:28:31.042 [2024-07-15 20:42:23.316379] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:28:31.042 [2024-07-15 20:42:23.316390] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:28:31.042 [2024-07-15 20:42:23.316402] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:28:31.042 20:42:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:28:31.042 20:42:23 bdev_raid.raid_state_function_test_sb_md_separate -- 
bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:31.042 20:42:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:28:31.042 20:42:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:31.042 20:42:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:31.042 20:42:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:31.042 20:42:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:31.042 20:42:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:31.042 20:42:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:31.042 20:42:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:31.042 20:42:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:31.042 20:42:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:31.302 20:42:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:31.302 "name": "Existed_Raid", 00:28:31.302 "uuid": "586faf57-9193-44d1-b54e-076eb6627a11", 00:28:31.302 "strip_size_kb": 0, 00:28:31.302 "state": "configuring", 00:28:31.302 "raid_level": "raid1", 00:28:31.302 "superblock": true, 00:28:31.302 "num_base_bdevs": 2, 00:28:31.302 "num_base_bdevs_discovered": 0, 00:28:31.302 "num_base_bdevs_operational": 2, 00:28:31.302 "base_bdevs_list": [ 00:28:31.302 { 00:28:31.302 "name": "BaseBdev1", 
00:28:31.302 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:31.302 "is_configured": false, 00:28:31.302 "data_offset": 0, 00:28:31.302 "data_size": 0 00:28:31.302 }, 00:28:31.302 { 00:28:31.302 "name": "BaseBdev2", 00:28:31.302 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:31.302 "is_configured": false, 00:28:31.302 "data_offset": 0, 00:28:31.302 "data_size": 0 00:28:31.302 } 00:28:31.302 ] 00:28:31.302 }' 00:28:31.302 20:42:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:31.302 20:42:23 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:31.868 20:42:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:28:32.127 [2024-07-15 20:42:24.443185] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:28:32.127 [2024-07-15 20:42:24.443215] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x13efa80 name Existed_Raid, state configuring 00:28:32.127 20:42:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:28:32.385 [2024-07-15 20:42:24.699880] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:28:32.385 [2024-07-15 20:42:24.699911] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:28:32.385 [2024-07-15 20:42:24.699920] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:28:32.385 [2024-07-15 20:42:24.699942] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:28:32.385 20:42:24 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev1 00:28:32.644 [2024-07-15 20:42:24.894833] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:28:32.644 BaseBdev1 00:28:32.644 20:42:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:28:32.644 20:42:24 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:28:32.644 20:42:24 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:28:32.644 20:42:24 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@899 -- # local i 00:28:32.644 20:42:24 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:28:32.644 20:42:24 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:28:32.644 20:42:24 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:28:32.902 20:42:25 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:28:33.160 [ 00:28:33.160 { 00:28:33.160 "name": "BaseBdev1", 00:28:33.160 "aliases": [ 00:28:33.160 "955d5764-c52d-40d2-b58a-287b550d9255" 00:28:33.160 ], 00:28:33.160 "product_name": "Malloc disk", 00:28:33.160 "block_size": 4096, 00:28:33.160 "num_blocks": 8192, 00:28:33.160 "uuid": "955d5764-c52d-40d2-b58a-287b550d9255", 00:28:33.160 "md_size": 32, 00:28:33.160 "md_interleave": false, 00:28:33.160 "dif_type": 
0, 00:28:33.160 "assigned_rate_limits": { 00:28:33.161 "rw_ios_per_sec": 0, 00:28:33.161 "rw_mbytes_per_sec": 0, 00:28:33.161 "r_mbytes_per_sec": 0, 00:28:33.161 "w_mbytes_per_sec": 0 00:28:33.161 }, 00:28:33.161 "claimed": true, 00:28:33.161 "claim_type": "exclusive_write", 00:28:33.161 "zoned": false, 00:28:33.161 "supported_io_types": { 00:28:33.161 "read": true, 00:28:33.161 "write": true, 00:28:33.161 "unmap": true, 00:28:33.161 "flush": true, 00:28:33.161 "reset": true, 00:28:33.161 "nvme_admin": false, 00:28:33.161 "nvme_io": false, 00:28:33.161 "nvme_io_md": false, 00:28:33.161 "write_zeroes": true, 00:28:33.161 "zcopy": true, 00:28:33.161 "get_zone_info": false, 00:28:33.161 "zone_management": false, 00:28:33.161 "zone_append": false, 00:28:33.161 "compare": false, 00:28:33.161 "compare_and_write": false, 00:28:33.161 "abort": true, 00:28:33.161 "seek_hole": false, 00:28:33.161 "seek_data": false, 00:28:33.161 "copy": true, 00:28:33.161 "nvme_iov_md": false 00:28:33.161 }, 00:28:33.161 "memory_domains": [ 00:28:33.161 { 00:28:33.161 "dma_device_id": "system", 00:28:33.161 "dma_device_type": 1 00:28:33.161 }, 00:28:33.161 { 00:28:33.161 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:33.161 "dma_device_type": 2 00:28:33.161 } 00:28:33.161 ], 00:28:33.161 "driver_specific": {} 00:28:33.161 } 00:28:33.161 ] 00:28:33.161 20:42:25 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@905 -- # return 0 00:28:33.161 20:42:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:28:33.161 20:42:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:33.161 20:42:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:28:33.161 20:42:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local 
raid_level=raid1 00:28:33.161 20:42:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:33.161 20:42:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:33.161 20:42:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:33.161 20:42:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:33.161 20:42:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:33.161 20:42:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:33.161 20:42:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:33.161 20:42:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:33.419 20:42:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:33.419 "name": "Existed_Raid", 00:28:33.419 "uuid": "0f6ace41-8954-4023-9563-22eca768c4c1", 00:28:33.419 "strip_size_kb": 0, 00:28:33.419 "state": "configuring", 00:28:33.419 "raid_level": "raid1", 00:28:33.419 "superblock": true, 00:28:33.419 "num_base_bdevs": 2, 00:28:33.419 "num_base_bdevs_discovered": 1, 00:28:33.419 "num_base_bdevs_operational": 2, 00:28:33.419 "base_bdevs_list": [ 00:28:33.419 { 00:28:33.419 "name": "BaseBdev1", 00:28:33.419 "uuid": "955d5764-c52d-40d2-b58a-287b550d9255", 00:28:33.419 "is_configured": true, 00:28:33.419 "data_offset": 256, 00:28:33.419 "data_size": 7936 00:28:33.419 }, 00:28:33.419 { 00:28:33.419 "name": "BaseBdev2", 00:28:33.419 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:33.419 
"is_configured": false, 00:28:33.419 "data_offset": 0, 00:28:33.419 "data_size": 0 00:28:33.419 } 00:28:33.419 ] 00:28:33.419 }' 00:28:33.419 20:42:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:33.419 20:42:25 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:33.984 20:42:26 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:28:34.242 [2024-07-15 20:42:26.495105] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:28:34.242 [2024-07-15 20:42:26.495145] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x13ef350 name Existed_Raid, state configuring 00:28:34.242 20:42:26 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:28:34.500 [2024-07-15 20:42:26.755825] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:28:34.500 [2024-07-15 20:42:26.757265] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:28:34.500 [2024-07-15 20:42:26.757299] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:28:34.501 20:42:26 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:28:34.501 20:42:26 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:28:34.501 20:42:26 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:28:34.501 20:42:26 bdev_raid.raid_state_function_test_sb_md_separate -- 
bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:34.501 20:42:26 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:28:34.501 20:42:26 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:34.501 20:42:26 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:34.501 20:42:26 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:34.501 20:42:26 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:34.501 20:42:26 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:34.501 20:42:26 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:34.501 20:42:26 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:34.501 20:42:26 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:34.501 20:42:26 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:34.759 20:42:27 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:34.759 "name": "Existed_Raid", 00:28:34.759 "uuid": "19913cd7-bb14-4a7f-afc6-1a4afd9c69ab", 00:28:34.759 "strip_size_kb": 0, 00:28:34.759 "state": "configuring", 00:28:34.759 "raid_level": "raid1", 00:28:34.759 "superblock": true, 00:28:34.759 "num_base_bdevs": 2, 00:28:34.759 "num_base_bdevs_discovered": 1, 00:28:34.759 "num_base_bdevs_operational": 2, 00:28:34.759 "base_bdevs_list": [ 00:28:34.759 { 00:28:34.759 "name": "BaseBdev1", 
00:28:34.759 "uuid": "955d5764-c52d-40d2-b58a-287b550d9255", 00:28:34.759 "is_configured": true, 00:28:34.759 "data_offset": 256, 00:28:34.759 "data_size": 7936 00:28:34.759 }, 00:28:34.759 { 00:28:34.759 "name": "BaseBdev2", 00:28:34.759 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:34.759 "is_configured": false, 00:28:34.759 "data_offset": 0, 00:28:34.759 "data_size": 0 00:28:34.759 } 00:28:34.759 ] 00:28:34.759 }' 00:28:34.759 20:42:27 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:34.759 20:42:27 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:35.325 20:42:27 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev2 00:28:35.584 [2024-07-15 20:42:27.915072] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:28:35.584 [2024-07-15 20:42:27.915222] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x13f1210 00:28:35.584 [2024-07-15 20:42:27.915236] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:28:35.584 [2024-07-15 20:42:27.915301] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x13f0c50 00:28:35.584 [2024-07-15 20:42:27.915396] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x13f1210 00:28:35.584 [2024-07-15 20:42:27.915406] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x13f1210 00:28:35.584 [2024-07-15 20:42:27.915472] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:35.584 BaseBdev2 00:28:35.584 20:42:27 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:28:35.584 20:42:27 
bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:28:35.584 20:42:27 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:28:35.584 20:42:27 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@899 -- # local i 00:28:35.584 20:42:27 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:28:35.584 20:42:27 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:28:35.584 20:42:27 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:28:35.842 20:42:28 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:28:36.101 [ 00:28:36.101 { 00:28:36.101 "name": "BaseBdev2", 00:28:36.101 "aliases": [ 00:28:36.101 "a17a143a-5c5d-4533-81e8-01f1cd846497" 00:28:36.101 ], 00:28:36.101 "product_name": "Malloc disk", 00:28:36.101 "block_size": 4096, 00:28:36.101 "num_blocks": 8192, 00:28:36.101 "uuid": "a17a143a-5c5d-4533-81e8-01f1cd846497", 00:28:36.101 "md_size": 32, 00:28:36.101 "md_interleave": false, 00:28:36.101 "dif_type": 0, 00:28:36.101 "assigned_rate_limits": { 00:28:36.101 "rw_ios_per_sec": 0, 00:28:36.101 "rw_mbytes_per_sec": 0, 00:28:36.101 "r_mbytes_per_sec": 0, 00:28:36.101 "w_mbytes_per_sec": 0 00:28:36.101 }, 00:28:36.101 "claimed": true, 00:28:36.101 "claim_type": "exclusive_write", 00:28:36.101 "zoned": false, 00:28:36.101 "supported_io_types": { 00:28:36.101 "read": true, 00:28:36.101 "write": true, 00:28:36.101 "unmap": true, 00:28:36.101 "flush": true, 00:28:36.101 "reset": true, 00:28:36.101 "nvme_admin": 
false, 00:28:36.101 "nvme_io": false, 00:28:36.101 "nvme_io_md": false, 00:28:36.101 "write_zeroes": true, 00:28:36.101 "zcopy": true, 00:28:36.101 "get_zone_info": false, 00:28:36.101 "zone_management": false, 00:28:36.101 "zone_append": false, 00:28:36.101 "compare": false, 00:28:36.101 "compare_and_write": false, 00:28:36.101 "abort": true, 00:28:36.101 "seek_hole": false, 00:28:36.101 "seek_data": false, 00:28:36.101 "copy": true, 00:28:36.101 "nvme_iov_md": false 00:28:36.101 }, 00:28:36.101 "memory_domains": [ 00:28:36.101 { 00:28:36.101 "dma_device_id": "system", 00:28:36.101 "dma_device_type": 1 00:28:36.101 }, 00:28:36.101 { 00:28:36.101 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:36.101 "dma_device_type": 2 00:28:36.101 } 00:28:36.101 ], 00:28:36.101 "driver_specific": {} 00:28:36.101 } 00:28:36.101 ] 00:28:36.101 20:42:28 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@905 -- # return 0 00:28:36.101 20:42:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:28:36.101 20:42:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:28:36.101 20:42:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:28:36.101 20:42:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:36.101 20:42:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:36.101 20:42:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:36.101 20:42:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:36.101 20:42:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 
00:28:36.101 20:42:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:36.101 20:42:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:36.101 20:42:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:36.101 20:42:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:36.101 20:42:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:36.101 20:42:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:36.360 20:42:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:36.360 "name": "Existed_Raid", 00:28:36.360 "uuid": "19913cd7-bb14-4a7f-afc6-1a4afd9c69ab", 00:28:36.360 "strip_size_kb": 0, 00:28:36.360 "state": "online", 00:28:36.360 "raid_level": "raid1", 00:28:36.360 "superblock": true, 00:28:36.360 "num_base_bdevs": 2, 00:28:36.360 "num_base_bdevs_discovered": 2, 00:28:36.360 "num_base_bdevs_operational": 2, 00:28:36.360 "base_bdevs_list": [ 00:28:36.360 { 00:28:36.360 "name": "BaseBdev1", 00:28:36.360 "uuid": "955d5764-c52d-40d2-b58a-287b550d9255", 00:28:36.360 "is_configured": true, 00:28:36.360 "data_offset": 256, 00:28:36.360 "data_size": 7936 00:28:36.360 }, 00:28:36.360 { 00:28:36.360 "name": "BaseBdev2", 00:28:36.360 "uuid": "a17a143a-5c5d-4533-81e8-01f1cd846497", 00:28:36.360 "is_configured": true, 00:28:36.360 "data_offset": 256, 00:28:36.360 "data_size": 7936 00:28:36.360 } 00:28:36.360 ] 00:28:36.360 }' 00:28:36.360 20:42:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:36.360 20:42:28 
bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:37.296 20:42:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:28:37.296 20:42:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:28:37.296 20:42:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:28:37.296 20:42:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:28:37.296 20:42:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:28:37.296 20:42:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@198 -- # local name 00:28:37.296 20:42:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:28:37.296 20:42:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:28:37.296 [2024-07-15 20:42:29.539693] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:37.296 20:42:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:28:37.296 "name": "Existed_Raid", 00:28:37.296 "aliases": [ 00:28:37.296 "19913cd7-bb14-4a7f-afc6-1a4afd9c69ab" 00:28:37.296 ], 00:28:37.296 "product_name": "Raid Volume", 00:28:37.296 "block_size": 4096, 00:28:37.296 "num_blocks": 7936, 00:28:37.296 "uuid": "19913cd7-bb14-4a7f-afc6-1a4afd9c69ab", 00:28:37.296 "md_size": 32, 00:28:37.296 "md_interleave": false, 00:28:37.296 "dif_type": 0, 00:28:37.296 "assigned_rate_limits": { 00:28:37.296 "rw_ios_per_sec": 0, 00:28:37.296 "rw_mbytes_per_sec": 0, 00:28:37.296 "r_mbytes_per_sec": 0, 00:28:37.296 
"w_mbytes_per_sec": 0 00:28:37.296 }, 00:28:37.296 "claimed": false, 00:28:37.296 "zoned": false, 00:28:37.296 "supported_io_types": { 00:28:37.296 "read": true, 00:28:37.296 "write": true, 00:28:37.296 "unmap": false, 00:28:37.296 "flush": false, 00:28:37.296 "reset": true, 00:28:37.296 "nvme_admin": false, 00:28:37.296 "nvme_io": false, 00:28:37.296 "nvme_io_md": false, 00:28:37.296 "write_zeroes": true, 00:28:37.296 "zcopy": false, 00:28:37.296 "get_zone_info": false, 00:28:37.296 "zone_management": false, 00:28:37.296 "zone_append": false, 00:28:37.296 "compare": false, 00:28:37.296 "compare_and_write": false, 00:28:37.296 "abort": false, 00:28:37.296 "seek_hole": false, 00:28:37.296 "seek_data": false, 00:28:37.296 "copy": false, 00:28:37.296 "nvme_iov_md": false 00:28:37.296 }, 00:28:37.296 "memory_domains": [ 00:28:37.296 { 00:28:37.296 "dma_device_id": "system", 00:28:37.296 "dma_device_type": 1 00:28:37.296 }, 00:28:37.296 { 00:28:37.296 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:37.296 "dma_device_type": 2 00:28:37.296 }, 00:28:37.296 { 00:28:37.296 "dma_device_id": "system", 00:28:37.296 "dma_device_type": 1 00:28:37.296 }, 00:28:37.296 { 00:28:37.296 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:37.296 "dma_device_type": 2 00:28:37.296 } 00:28:37.296 ], 00:28:37.296 "driver_specific": { 00:28:37.296 "raid": { 00:28:37.296 "uuid": "19913cd7-bb14-4a7f-afc6-1a4afd9c69ab", 00:28:37.296 "strip_size_kb": 0, 00:28:37.296 "state": "online", 00:28:37.296 "raid_level": "raid1", 00:28:37.296 "superblock": true, 00:28:37.296 "num_base_bdevs": 2, 00:28:37.296 "num_base_bdevs_discovered": 2, 00:28:37.296 "num_base_bdevs_operational": 2, 00:28:37.296 "base_bdevs_list": [ 00:28:37.296 { 00:28:37.296 "name": "BaseBdev1", 00:28:37.296 "uuid": "955d5764-c52d-40d2-b58a-287b550d9255", 00:28:37.296 "is_configured": true, 00:28:37.296 "data_offset": 256, 00:28:37.296 "data_size": 7936 00:28:37.296 }, 00:28:37.296 { 00:28:37.296 "name": "BaseBdev2", 00:28:37.296 "uuid": 
"a17a143a-5c5d-4533-81e8-01f1cd846497", 00:28:37.296 "is_configured": true, 00:28:37.296 "data_offset": 256, 00:28:37.296 "data_size": 7936 00:28:37.296 } 00:28:37.296 ] 00:28:37.296 } 00:28:37.296 } 00:28:37.296 }' 00:28:37.296 20:42:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:28:37.296 20:42:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:28:37.296 BaseBdev2' 00:28:37.296 20:42:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:37.296 20:42:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:28:37.296 20:42:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:37.555 20:42:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:37.555 "name": "BaseBdev1", 00:28:37.555 "aliases": [ 00:28:37.555 "955d5764-c52d-40d2-b58a-287b550d9255" 00:28:37.555 ], 00:28:37.555 "product_name": "Malloc disk", 00:28:37.555 "block_size": 4096, 00:28:37.555 "num_blocks": 8192, 00:28:37.555 "uuid": "955d5764-c52d-40d2-b58a-287b550d9255", 00:28:37.555 "md_size": 32, 00:28:37.555 "md_interleave": false, 00:28:37.555 "dif_type": 0, 00:28:37.555 "assigned_rate_limits": { 00:28:37.555 "rw_ios_per_sec": 0, 00:28:37.555 "rw_mbytes_per_sec": 0, 00:28:37.555 "r_mbytes_per_sec": 0, 00:28:37.555 "w_mbytes_per_sec": 0 00:28:37.555 }, 00:28:37.555 "claimed": true, 00:28:37.555 "claim_type": "exclusive_write", 00:28:37.555 "zoned": false, 00:28:37.555 "supported_io_types": { 00:28:37.555 "read": true, 00:28:37.555 "write": true, 00:28:37.555 "unmap": true, 00:28:37.555 "flush": true, 00:28:37.555 
"reset": true, 00:28:37.555 "nvme_admin": false, 00:28:37.555 "nvme_io": false, 00:28:37.555 "nvme_io_md": false, 00:28:37.555 "write_zeroes": true, 00:28:37.555 "zcopy": true, 00:28:37.555 "get_zone_info": false, 00:28:37.555 "zone_management": false, 00:28:37.555 "zone_append": false, 00:28:37.555 "compare": false, 00:28:37.555 "compare_and_write": false, 00:28:37.555 "abort": true, 00:28:37.555 "seek_hole": false, 00:28:37.555 "seek_data": false, 00:28:37.555 "copy": true, 00:28:37.555 "nvme_iov_md": false 00:28:37.555 }, 00:28:37.555 "memory_domains": [ 00:28:37.555 { 00:28:37.555 "dma_device_id": "system", 00:28:37.555 "dma_device_type": 1 00:28:37.555 }, 00:28:37.555 { 00:28:37.555 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:37.555 "dma_device_type": 2 00:28:37.555 } 00:28:37.555 ], 00:28:37.555 "driver_specific": {} 00:28:37.555 }' 00:28:37.555 20:42:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:37.813 20:42:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:37.813 20:42:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:28:37.813 20:42:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:37.813 20:42:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:37.813 20:42:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:28:37.813 20:42:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:37.813 20:42:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:37.813 20:42:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:28:37.813 20:42:30 bdev_raid.raid_state_function_test_sb_md_separate -- 
bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:38.071 20:42:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:38.071 20:42:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:28:38.071 20:42:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:38.071 20:42:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:28:38.071 20:42:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:38.328 20:42:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:38.328 "name": "BaseBdev2", 00:28:38.328 "aliases": [ 00:28:38.328 "a17a143a-5c5d-4533-81e8-01f1cd846497" 00:28:38.328 ], 00:28:38.328 "product_name": "Malloc disk", 00:28:38.328 "block_size": 4096, 00:28:38.328 "num_blocks": 8192, 00:28:38.328 "uuid": "a17a143a-5c5d-4533-81e8-01f1cd846497", 00:28:38.328 "md_size": 32, 00:28:38.328 "md_interleave": false, 00:28:38.328 "dif_type": 0, 00:28:38.328 "assigned_rate_limits": { 00:28:38.328 "rw_ios_per_sec": 0, 00:28:38.328 "rw_mbytes_per_sec": 0, 00:28:38.328 "r_mbytes_per_sec": 0, 00:28:38.328 "w_mbytes_per_sec": 0 00:28:38.328 }, 00:28:38.328 "claimed": true, 00:28:38.328 "claim_type": "exclusive_write", 00:28:38.328 "zoned": false, 00:28:38.328 "supported_io_types": { 00:28:38.328 "read": true, 00:28:38.328 "write": true, 00:28:38.328 "unmap": true, 00:28:38.328 "flush": true, 00:28:38.328 "reset": true, 00:28:38.328 "nvme_admin": false, 00:28:38.328 "nvme_io": false, 00:28:38.328 "nvme_io_md": false, 00:28:38.329 "write_zeroes": true, 00:28:38.329 "zcopy": true, 00:28:38.329 "get_zone_info": false, 00:28:38.329 "zone_management": false, 00:28:38.329 "zone_append": false, 
00:28:38.329 "compare": false, 00:28:38.329 "compare_and_write": false, 00:28:38.329 "abort": true, 00:28:38.329 "seek_hole": false, 00:28:38.329 "seek_data": false, 00:28:38.329 "copy": true, 00:28:38.329 "nvme_iov_md": false 00:28:38.329 }, 00:28:38.329 "memory_domains": [ 00:28:38.329 { 00:28:38.329 "dma_device_id": "system", 00:28:38.329 "dma_device_type": 1 00:28:38.329 }, 00:28:38.329 { 00:28:38.329 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:38.329 "dma_device_type": 2 00:28:38.329 } 00:28:38.329 ], 00:28:38.329 "driver_specific": {} 00:28:38.329 }' 00:28:38.329 20:42:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:38.329 20:42:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:38.329 20:42:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:28:38.329 20:42:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:38.329 20:42:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:38.329 20:42:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:28:38.329 20:42:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:38.587 20:42:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:38.587 20:42:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:28:38.587 20:42:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:38.587 20:42:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:38.587 20:42:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:28:38.587 20:42:30 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:28:38.845 [2024-07-15 20:42:31.079581] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:28:38.845 20:42:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@275 -- # local expected_state 00:28:38.845 20:42:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:28:38.845 20:42:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@213 -- # case $1 in 00:28:38.845 20:42:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@214 -- # return 0 00:28:38.845 20:42:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:28:38.845 20:42:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:28:38.845 20:42:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:38.846 20:42:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:38.846 20:42:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:38.846 20:42:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:38.846 20:42:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:38.846 20:42:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:38.846 20:42:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:38.846 20:42:31 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:38.846 20:42:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:38.846 20:42:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:38.846 20:42:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:39.104 20:42:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:39.104 "name": "Existed_Raid", 00:28:39.104 "uuid": "19913cd7-bb14-4a7f-afc6-1a4afd9c69ab", 00:28:39.104 "strip_size_kb": 0, 00:28:39.104 "state": "online", 00:28:39.104 "raid_level": "raid1", 00:28:39.104 "superblock": true, 00:28:39.104 "num_base_bdevs": 2, 00:28:39.104 "num_base_bdevs_discovered": 1, 00:28:39.104 "num_base_bdevs_operational": 1, 00:28:39.104 "base_bdevs_list": [ 00:28:39.104 { 00:28:39.104 "name": null, 00:28:39.104 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:39.104 "is_configured": false, 00:28:39.104 "data_offset": 256, 00:28:39.104 "data_size": 7936 00:28:39.104 }, 00:28:39.104 { 00:28:39.104 "name": "BaseBdev2", 00:28:39.104 "uuid": "a17a143a-5c5d-4533-81e8-01f1cd846497", 00:28:39.104 "is_configured": true, 00:28:39.104 "data_offset": 256, 00:28:39.104 "data_size": 7936 00:28:39.104 } 00:28:39.104 ] 00:28:39.104 }' 00:28:39.104 20:42:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:39.104 20:42:31 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:39.706 20:42:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:28:39.706 20:42:32 bdev_raid.raid_state_function_test_sb_md_separate -- 
bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:28:39.706 20:42:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:39.706 20:42:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:28:39.963 20:42:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:28:39.963 20:42:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:28:39.963 20:42:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:28:40.221 [2024-07-15 20:42:32.482115] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:28:40.221 [2024-07-15 20:42:32.482214] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:28:40.221 [2024-07-15 20:42:32.495689] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:40.221 [2024-07-15 20:42:32.495729] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:40.221 [2024-07-15 20:42:32.495741] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x13f1210 name Existed_Raid, state offline 00:28:40.221 20:42:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:28:40.221 20:42:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:28:40.221 20:42:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:28:40.221 20:42:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:28:40.478 20:42:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:28:40.478 20:42:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:28:40.478 20:42:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:28:40.478 20:42:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@341 -- # killprocess 1501460 00:28:40.478 20:42:32 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@948 -- # '[' -z 1501460 ']' 00:28:40.478 20:42:32 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@952 -- # kill -0 1501460 00:28:40.478 20:42:32 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@953 -- # uname 00:28:40.478 20:42:32 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:28:40.478 20:42:32 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1501460 00:28:40.478 20:42:32 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:28:40.478 20:42:32 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:28:40.478 20:42:32 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1501460' 00:28:40.478 killing process with pid 1501460 00:28:40.478 20:42:32 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@967 -- # kill 1501460 00:28:40.478 [2024-07-15 20:42:32.815098] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:28:40.478 20:42:32 
bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@972 -- # wait 1501460 00:28:40.478 [2024-07-15 20:42:32.816087] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:28:40.734 20:42:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@343 -- # return 0 00:28:40.734 00:28:40.734 real 0m10.892s 00:28:40.734 user 0m19.351s 00:28:40.734 sys 0m2.043s 00:28:40.734 20:42:33 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@1124 -- # xtrace_disable 00:28:40.734 20:42:33 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:40.734 ************************************ 00:28:40.734 END TEST raid_state_function_test_sb_md_separate 00:28:40.734 ************************************ 00:28:40.734 20:42:33 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:28:40.734 20:42:33 bdev_raid -- bdev/bdev_raid.sh@906 -- # run_test raid_superblock_test_md_separate raid_superblock_test raid1 2 00:28:40.734 20:42:33 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:28:40.734 20:42:33 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:28:40.734 20:42:33 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:28:40.991 ************************************ 00:28:40.991 START TEST raid_superblock_test_md_separate 00:28:40.991 ************************************ 00:28:40.991 20:42:33 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 2 00:28:40.991 20:42:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:28:40.991 20:42:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:28:40.991 20:42:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:28:40.991 20:42:33 bdev_raid.raid_superblock_test_md_separate -- 
bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:28:40.991 20:42:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:28:40.991 20:42:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:28:40.991 20:42:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:28:40.991 20:42:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:28:40.991 20:42:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:28:40.991 20:42:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@398 -- # local strip_size 00:28:40.991 20:42:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:28:40.991 20:42:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:28:40.991 20:42:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:28:40.991 20:42:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:28:40.991 20:42:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:28:40.991 20:42:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@411 -- # raid_pid=1503082 00:28:40.991 20:42:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@412 -- # waitforlisten 1503082 /var/tmp/spdk-raid.sock 00:28:40.991 20:42:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:28:40.991 20:42:33 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@829 -- # '[' -z 1503082 ']' 00:28:40.991 20:42:33 bdev_raid.raid_superblock_test_md_separate -- 
common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:28:40.991 20:42:33 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@834 -- # local max_retries=100 00:28:40.991 20:42:33 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:28:40.991 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:28:40.991 20:42:33 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@838 -- # xtrace_disable 00:28:40.991 20:42:33 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:40.991 [2024-07-15 20:42:33.191972] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 00:28:40.991 [2024-07-15 20:42:33.192040] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1503082 ] 00:28:40.991 [2024-07-15 20:42:33.320572] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:41.263 [2024-07-15 20:42:33.425911] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:41.263 [2024-07-15 20:42:33.485294] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:41.263 [2024-07-15 20:42:33.485331] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:41.827 20:42:34 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:28:41.827 20:42:34 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@862 -- # return 0 00:28:41.827 20:42:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:28:41.827 20:42:34 bdev_raid.raid_superblock_test_md_separate -- 
bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:28:41.827 20:42:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:28:41.827 20:42:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:28:41.827 20:42:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:28:41.827 20:42:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:28:41.827 20:42:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:28:41.827 20:42:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:28:41.828 20:42:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b malloc1 00:28:42.085 malloc1 00:28:42.085 20:42:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:28:42.344 [2024-07-15 20:42:34.597029] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:28:42.344 [2024-07-15 20:42:34.597076] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:42.344 [2024-07-15 20:42:34.597099] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26d0830 00:28:42.344 [2024-07-15 20:42:34.597111] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:42.344 [2024-07-15 20:42:34.598691] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:42.344 [2024-07-15 20:42:34.598718] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:28:42.344 pt1 00:28:42.344 20:42:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:28:42.344 20:42:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:28:42.344 20:42:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:28:42.344 20:42:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:28:42.344 20:42:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:28:42.344 20:42:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:28:42.344 20:42:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:28:42.344 20:42:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:28:42.344 20:42:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b malloc2 00:28:42.602 malloc2 00:28:42.602 20:42:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:28:42.860 [2024-07-15 20:42:35.093253] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:28:42.860 [2024-07-15 20:42:35.093296] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:42.860 [2024-07-15 20:42:35.093317] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26c2250 00:28:42.860 [2024-07-15 20:42:35.093336] 
vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:42.860 [2024-07-15 20:42:35.094772] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:42.860 [2024-07-15 20:42:35.094800] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:28:42.860 pt2 00:28:42.860 20:42:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:28:42.860 20:42:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:28:42.860 20:42:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:28:43.119 [2024-07-15 20:42:35.333914] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:28:43.119 [2024-07-15 20:42:35.335358] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:28:43.119 [2024-07-15 20:42:35.335513] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x26c2d20 00:28:43.119 [2024-07-15 20:42:35.335527] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:28:43.119 [2024-07-15 20:42:35.335604] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x26b6a60 00:28:43.119 [2024-07-15 20:42:35.335721] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x26c2d20 00:28:43.119 [2024-07-15 20:42:35.335731] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x26c2d20 00:28:43.119 [2024-07-15 20:42:35.335803] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:43.119 20:42:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:43.119 20:42:35 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:43.119 20:42:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:43.119 20:42:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:43.119 20:42:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:43.119 20:42:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:43.119 20:42:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:43.119 20:42:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:43.119 20:42:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:43.119 20:42:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:43.119 20:42:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:43.119 20:42:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:43.377 20:42:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:43.377 "name": "raid_bdev1", 00:28:43.377 "uuid": "0dc6ffdc-5b03-4ba9-881a-35f42641e99d", 00:28:43.377 "strip_size_kb": 0, 00:28:43.377 "state": "online", 00:28:43.377 "raid_level": "raid1", 00:28:43.377 "superblock": true, 00:28:43.377 "num_base_bdevs": 2, 00:28:43.377 "num_base_bdevs_discovered": 2, 00:28:43.377 "num_base_bdevs_operational": 2, 00:28:43.377 "base_bdevs_list": [ 00:28:43.377 { 00:28:43.377 "name": "pt1", 00:28:43.377 "uuid": 
"00000000-0000-0000-0000-000000000001", 00:28:43.377 "is_configured": true, 00:28:43.377 "data_offset": 256, 00:28:43.377 "data_size": 7936 00:28:43.377 }, 00:28:43.377 { 00:28:43.377 "name": "pt2", 00:28:43.377 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:43.377 "is_configured": true, 00:28:43.377 "data_offset": 256, 00:28:43.377 "data_size": 7936 00:28:43.377 } 00:28:43.377 ] 00:28:43.377 }' 00:28:43.377 20:42:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:43.377 20:42:35 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:43.944 20:42:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:28:43.944 20:42:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:28:43.944 20:42:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:28:43.944 20:42:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:28:43.944 20:42:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:28:43.944 20:42:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@198 -- # local name 00:28:43.944 20:42:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:43.944 20:42:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:28:44.202 [2024-07-15 20:42:36.425073] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:44.203 20:42:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:28:44.203 "name": "raid_bdev1", 00:28:44.203 "aliases": [ 00:28:44.203 
"0dc6ffdc-5b03-4ba9-881a-35f42641e99d" 00:28:44.203 ], 00:28:44.203 "product_name": "Raid Volume", 00:28:44.203 "block_size": 4096, 00:28:44.203 "num_blocks": 7936, 00:28:44.203 "uuid": "0dc6ffdc-5b03-4ba9-881a-35f42641e99d", 00:28:44.203 "md_size": 32, 00:28:44.203 "md_interleave": false, 00:28:44.203 "dif_type": 0, 00:28:44.203 "assigned_rate_limits": { 00:28:44.203 "rw_ios_per_sec": 0, 00:28:44.203 "rw_mbytes_per_sec": 0, 00:28:44.203 "r_mbytes_per_sec": 0, 00:28:44.203 "w_mbytes_per_sec": 0 00:28:44.203 }, 00:28:44.203 "claimed": false, 00:28:44.203 "zoned": false, 00:28:44.203 "supported_io_types": { 00:28:44.203 "read": true, 00:28:44.203 "write": true, 00:28:44.203 "unmap": false, 00:28:44.203 "flush": false, 00:28:44.203 "reset": true, 00:28:44.203 "nvme_admin": false, 00:28:44.203 "nvme_io": false, 00:28:44.203 "nvme_io_md": false, 00:28:44.203 "write_zeroes": true, 00:28:44.203 "zcopy": false, 00:28:44.203 "get_zone_info": false, 00:28:44.203 "zone_management": false, 00:28:44.203 "zone_append": false, 00:28:44.203 "compare": false, 00:28:44.203 "compare_and_write": false, 00:28:44.203 "abort": false, 00:28:44.203 "seek_hole": false, 00:28:44.203 "seek_data": false, 00:28:44.203 "copy": false, 00:28:44.203 "nvme_iov_md": false 00:28:44.203 }, 00:28:44.203 "memory_domains": [ 00:28:44.203 { 00:28:44.203 "dma_device_id": "system", 00:28:44.203 "dma_device_type": 1 00:28:44.203 }, 00:28:44.203 { 00:28:44.203 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:44.203 "dma_device_type": 2 00:28:44.203 }, 00:28:44.203 { 00:28:44.203 "dma_device_id": "system", 00:28:44.203 "dma_device_type": 1 00:28:44.203 }, 00:28:44.203 { 00:28:44.203 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:44.203 "dma_device_type": 2 00:28:44.203 } 00:28:44.203 ], 00:28:44.203 "driver_specific": { 00:28:44.203 "raid": { 00:28:44.203 "uuid": "0dc6ffdc-5b03-4ba9-881a-35f42641e99d", 00:28:44.203 "strip_size_kb": 0, 00:28:44.203 "state": "online", 00:28:44.203 "raid_level": "raid1", 
00:28:44.203 "superblock": true, 00:28:44.203 "num_base_bdevs": 2, 00:28:44.203 "num_base_bdevs_discovered": 2, 00:28:44.203 "num_base_bdevs_operational": 2, 00:28:44.203 "base_bdevs_list": [ 00:28:44.203 { 00:28:44.203 "name": "pt1", 00:28:44.203 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:44.203 "is_configured": true, 00:28:44.203 "data_offset": 256, 00:28:44.203 "data_size": 7936 00:28:44.203 }, 00:28:44.203 { 00:28:44.203 "name": "pt2", 00:28:44.203 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:44.203 "is_configured": true, 00:28:44.203 "data_offset": 256, 00:28:44.203 "data_size": 7936 00:28:44.203 } 00:28:44.203 ] 00:28:44.203 } 00:28:44.203 } 00:28:44.203 }' 00:28:44.203 20:42:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:28:44.203 20:42:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:28:44.203 pt2' 00:28:44.203 20:42:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:44.203 20:42:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:28:44.203 20:42:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:44.461 20:42:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:44.462 "name": "pt1", 00:28:44.462 "aliases": [ 00:28:44.462 "00000000-0000-0000-0000-000000000001" 00:28:44.462 ], 00:28:44.462 "product_name": "passthru", 00:28:44.462 "block_size": 4096, 00:28:44.462 "num_blocks": 8192, 00:28:44.462 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:44.462 "md_size": 32, 00:28:44.462 "md_interleave": false, 00:28:44.462 "dif_type": 0, 00:28:44.462 "assigned_rate_limits": { 00:28:44.462 
"rw_ios_per_sec": 0, 00:28:44.462 "rw_mbytes_per_sec": 0, 00:28:44.462 "r_mbytes_per_sec": 0, 00:28:44.462 "w_mbytes_per_sec": 0 00:28:44.462 }, 00:28:44.462 "claimed": true, 00:28:44.462 "claim_type": "exclusive_write", 00:28:44.462 "zoned": false, 00:28:44.462 "supported_io_types": { 00:28:44.462 "read": true, 00:28:44.462 "write": true, 00:28:44.462 "unmap": true, 00:28:44.462 "flush": true, 00:28:44.462 "reset": true, 00:28:44.462 "nvme_admin": false, 00:28:44.462 "nvme_io": false, 00:28:44.462 "nvme_io_md": false, 00:28:44.462 "write_zeroes": true, 00:28:44.462 "zcopy": true, 00:28:44.462 "get_zone_info": false, 00:28:44.462 "zone_management": false, 00:28:44.462 "zone_append": false, 00:28:44.462 "compare": false, 00:28:44.462 "compare_and_write": false, 00:28:44.462 "abort": true, 00:28:44.462 "seek_hole": false, 00:28:44.462 "seek_data": false, 00:28:44.462 "copy": true, 00:28:44.462 "nvme_iov_md": false 00:28:44.462 }, 00:28:44.462 "memory_domains": [ 00:28:44.462 { 00:28:44.462 "dma_device_id": "system", 00:28:44.462 "dma_device_type": 1 00:28:44.462 }, 00:28:44.462 { 00:28:44.462 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:44.462 "dma_device_type": 2 00:28:44.462 } 00:28:44.462 ], 00:28:44.462 "driver_specific": { 00:28:44.462 "passthru": { 00:28:44.462 "name": "pt1", 00:28:44.462 "base_bdev_name": "malloc1" 00:28:44.462 } 00:28:44.462 } 00:28:44.462 }' 00:28:44.462 20:42:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:44.462 20:42:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:44.462 20:42:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:28:44.462 20:42:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:44.719 20:42:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:44.719 20:42:36 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:28:44.719 20:42:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:44.719 20:42:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:44.719 20:42:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:28:44.719 20:42:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:44.719 20:42:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:44.719 20:42:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:28:44.719 20:42:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:44.719 20:42:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:28:44.719 20:42:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:44.977 20:42:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:44.977 "name": "pt2", 00:28:44.977 "aliases": [ 00:28:44.977 "00000000-0000-0000-0000-000000000002" 00:28:44.977 ], 00:28:44.977 "product_name": "passthru", 00:28:44.977 "block_size": 4096, 00:28:44.977 "num_blocks": 8192, 00:28:44.977 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:44.977 "md_size": 32, 00:28:44.977 "md_interleave": false, 00:28:44.977 "dif_type": 0, 00:28:44.977 "assigned_rate_limits": { 00:28:44.977 "rw_ios_per_sec": 0, 00:28:44.977 "rw_mbytes_per_sec": 0, 00:28:44.977 "r_mbytes_per_sec": 0, 00:28:44.977 "w_mbytes_per_sec": 0 00:28:44.977 }, 00:28:44.977 "claimed": true, 00:28:44.977 "claim_type": "exclusive_write", 00:28:44.977 "zoned": false, 00:28:44.977 
"supported_io_types": { 00:28:44.977 "read": true, 00:28:44.977 "write": true, 00:28:44.977 "unmap": true, 00:28:44.977 "flush": true, 00:28:44.977 "reset": true, 00:28:44.977 "nvme_admin": false, 00:28:44.977 "nvme_io": false, 00:28:44.977 "nvme_io_md": false, 00:28:44.977 "write_zeroes": true, 00:28:44.977 "zcopy": true, 00:28:44.977 "get_zone_info": false, 00:28:44.977 "zone_management": false, 00:28:44.977 "zone_append": false, 00:28:44.977 "compare": false, 00:28:44.977 "compare_and_write": false, 00:28:44.977 "abort": true, 00:28:44.977 "seek_hole": false, 00:28:44.977 "seek_data": false, 00:28:44.977 "copy": true, 00:28:44.977 "nvme_iov_md": false 00:28:44.977 }, 00:28:44.977 "memory_domains": [ 00:28:44.977 { 00:28:44.977 "dma_device_id": "system", 00:28:44.977 "dma_device_type": 1 00:28:44.977 }, 00:28:44.977 { 00:28:44.977 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:44.977 "dma_device_type": 2 00:28:44.977 } 00:28:44.977 ], 00:28:44.977 "driver_specific": { 00:28:44.977 "passthru": { 00:28:44.977 "name": "pt2", 00:28:44.977 "base_bdev_name": "malloc2" 00:28:44.977 } 00:28:44.977 } 00:28:44.977 }' 00:28:44.977 20:42:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:45.235 20:42:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:45.235 20:42:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:28:45.235 20:42:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:45.235 20:42:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:45.235 20:42:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:28:45.235 20:42:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:45.235 20:42:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq 
.md_interleave 00:28:45.235 20:42:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:28:45.235 20:42:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:45.493 20:42:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:45.493 20:42:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:28:45.493 20:42:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:45.493 20:42:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:28:45.751 [2024-07-15 20:42:37.912997] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:45.751 20:42:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=0dc6ffdc-5b03-4ba9-881a-35f42641e99d 00:28:45.751 20:42:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@435 -- # '[' -z 0dc6ffdc-5b03-4ba9-881a-35f42641e99d ']' 00:28:45.751 20:42:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:28:46.010 [2024-07-15 20:42:38.165404] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:46.010 [2024-07-15 20:42:38.165420] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:28:46.010 [2024-07-15 20:42:38.165471] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:46.010 [2024-07-15 20:42:38.165524] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:46.010 [2024-07-15 20:42:38.165535] bdev_raid.c: 
366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x26c2d20 name raid_bdev1, state offline 00:28:46.010 20:42:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:46.010 20:42:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:28:46.268 20:42:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:28:46.268 20:42:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:28:46.268 20:42:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:28:46.268 20:42:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:28:46.527 20:42:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:28:46.527 20:42:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:28:46.785 20:42:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:28:46.785 20:42:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:28:47.044 20:42:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:28:47.044 20:42:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 
malloc2' -n raid_bdev1 00:28:47.044 20:42:39 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@648 -- # local es=0 00:28:47.044 20:42:39 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:28:47.044 20:42:39 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:47.044 20:42:39 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:28:47.044 20:42:39 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:47.044 20:42:39 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:28:47.044 20:42:39 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:47.044 20:42:39 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:28:47.044 20:42:39 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:47.044 20:42:39 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:28:47.044 20:42:39 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:28:47.044 [2024-07-15 20:42:39.404639] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:28:47.044 [2024-07-15 20:42:39.406042] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:28:47.044 [2024-07-15 20:42:39.406097] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:28:47.044 [2024-07-15 20:42:39.406138] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:28:47.044 [2024-07-15 20:42:39.406157] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:47.044 [2024-07-15 20:42:39.406167] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2532ed0 name raid_bdev1, state configuring 00:28:47.044 request: 00:28:47.044 { 00:28:47.044 "name": "raid_bdev1", 00:28:47.044 "raid_level": "raid1", 00:28:47.044 "base_bdevs": [ 00:28:47.044 "malloc1", 00:28:47.044 "malloc2" 00:28:47.044 ], 00:28:47.044 "superblock": false, 00:28:47.044 "method": "bdev_raid_create", 00:28:47.044 "req_id": 1 00:28:47.044 } 00:28:47.044 Got JSON-RPC error response 00:28:47.044 response: 00:28:47.044 { 00:28:47.044 "code": -17, 00:28:47.044 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:28:47.044 } 00:28:47.303 20:42:39 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@651 -- # es=1 00:28:47.303 20:42:39 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:28:47.303 20:42:39 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:28:47.303 20:42:39 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:28:47.303 20:42:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:28:47.303 20:42:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:28:47.303 20:42:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:28:47.303 20:42:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:28:47.303 20:42:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:28:47.562 [2024-07-15 20:42:39.885844] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:28:47.562 [2024-07-15 20:42:39.885885] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:47.562 [2024-07-15 20:42:39.885903] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26d0ee0 00:28:47.562 [2024-07-15 20:42:39.885915] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:47.562 [2024-07-15 20:42:39.887337] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:47.562 [2024-07-15 20:42:39.887364] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:28:47.562 [2024-07-15 20:42:39.887410] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:28:47.562 [2024-07-15 20:42:39.887434] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:28:47.562 pt1 00:28:47.562 20:42:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:28:47.562 20:42:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:47.562 20:42:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:28:47.562 20:42:39 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:47.562 20:42:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:47.562 20:42:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:47.562 20:42:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:47.562 20:42:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:47.562 20:42:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:47.562 20:42:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:47.562 20:42:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:47.562 20:42:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:47.819 20:42:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:47.819 "name": "raid_bdev1", 00:28:47.819 "uuid": "0dc6ffdc-5b03-4ba9-881a-35f42641e99d", 00:28:47.819 "strip_size_kb": 0, 00:28:47.819 "state": "configuring", 00:28:47.819 "raid_level": "raid1", 00:28:47.819 "superblock": true, 00:28:47.819 "num_base_bdevs": 2, 00:28:47.819 "num_base_bdevs_discovered": 1, 00:28:47.819 "num_base_bdevs_operational": 2, 00:28:47.819 "base_bdevs_list": [ 00:28:47.819 { 00:28:47.819 "name": "pt1", 00:28:47.819 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:47.819 "is_configured": true, 00:28:47.819 "data_offset": 256, 00:28:47.819 "data_size": 7936 00:28:47.819 }, 00:28:47.819 { 00:28:47.819 "name": null, 00:28:47.819 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:47.819 
"is_configured": false, 00:28:47.819 "data_offset": 256, 00:28:47.819 "data_size": 7936 00:28:47.819 } 00:28:47.819 ] 00:28:47.819 }' 00:28:47.819 20:42:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:47.819 20:42:40 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:48.386 20:42:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:28:48.386 20:42:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:28:48.386 20:42:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:28:48.386 20:42:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:28:48.644 [2024-07-15 20:42:40.968716] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:28:48.644 [2024-07-15 20:42:40.968767] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:48.644 [2024-07-15 20:42:40.968793] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2533490 00:28:48.645 [2024-07-15 20:42:40.968806] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:48.645 [2024-07-15 20:42:40.969004] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:48.645 [2024-07-15 20:42:40.969021] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:28:48.645 [2024-07-15 20:42:40.969066] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:28:48.645 [2024-07-15 20:42:40.969084] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:28:48.645 [2024-07-15 20:42:40.969173] 
bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x26b75d0 00:28:48.645 [2024-07-15 20:42:40.969183] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:28:48.645 [2024-07-15 20:42:40.969239] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x26b8800 00:28:48.645 [2024-07-15 20:42:40.969338] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x26b75d0 00:28:48.645 [2024-07-15 20:42:40.969347] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x26b75d0 00:28:48.645 [2024-07-15 20:42:40.969415] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:48.645 pt2 00:28:48.645 20:42:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:28:48.645 20:42:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:28:48.645 20:42:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:48.645 20:42:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:48.645 20:42:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:48.645 20:42:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:48.645 20:42:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:48.645 20:42:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:48.645 20:42:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:48.645 20:42:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:48.645 20:42:40 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:48.645 20:42:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:48.645 20:42:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:48.645 20:42:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:48.903 20:42:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:48.903 "name": "raid_bdev1", 00:28:48.903 "uuid": "0dc6ffdc-5b03-4ba9-881a-35f42641e99d", 00:28:48.903 "strip_size_kb": 0, 00:28:48.903 "state": "online", 00:28:48.903 "raid_level": "raid1", 00:28:48.903 "superblock": true, 00:28:48.903 "num_base_bdevs": 2, 00:28:48.903 "num_base_bdevs_discovered": 2, 00:28:48.903 "num_base_bdevs_operational": 2, 00:28:48.903 "base_bdevs_list": [ 00:28:48.903 { 00:28:48.903 "name": "pt1", 00:28:48.903 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:48.903 "is_configured": true, 00:28:48.903 "data_offset": 256, 00:28:48.903 "data_size": 7936 00:28:48.903 }, 00:28:48.903 { 00:28:48.903 "name": "pt2", 00:28:48.903 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:48.903 "is_configured": true, 00:28:48.903 "data_offset": 256, 00:28:48.903 "data_size": 7936 00:28:48.903 } 00:28:48.903 ] 00:28:48.903 }' 00:28:48.903 20:42:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:48.903 20:42:41 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:49.472 20:42:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:28:49.472 20:42:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@194 -- # local 
raid_bdev_name=raid_bdev1 00:28:49.472 20:42:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:28:49.472 20:42:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:28:49.472 20:42:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:28:49.472 20:42:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@198 -- # local name 00:28:49.740 20:42:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:28:49.740 20:42:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:49.740 [2024-07-15 20:42:42.075911] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:49.740 20:42:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:28:49.740 "name": "raid_bdev1", 00:28:49.740 "aliases": [ 00:28:49.740 "0dc6ffdc-5b03-4ba9-881a-35f42641e99d" 00:28:49.740 ], 00:28:49.740 "product_name": "Raid Volume", 00:28:49.740 "block_size": 4096, 00:28:49.740 "num_blocks": 7936, 00:28:49.740 "uuid": "0dc6ffdc-5b03-4ba9-881a-35f42641e99d", 00:28:49.740 "md_size": 32, 00:28:49.740 "md_interleave": false, 00:28:49.740 "dif_type": 0, 00:28:49.740 "assigned_rate_limits": { 00:28:49.740 "rw_ios_per_sec": 0, 00:28:49.740 "rw_mbytes_per_sec": 0, 00:28:49.740 "r_mbytes_per_sec": 0, 00:28:49.741 "w_mbytes_per_sec": 0 00:28:49.741 }, 00:28:49.741 "claimed": false, 00:28:49.741 "zoned": false, 00:28:49.741 "supported_io_types": { 00:28:49.741 "read": true, 00:28:49.741 "write": true, 00:28:49.741 "unmap": false, 00:28:49.741 "flush": false, 00:28:49.741 "reset": true, 00:28:49.741 "nvme_admin": false, 00:28:49.741 "nvme_io": false, 00:28:49.741 "nvme_io_md": false, 00:28:49.741 "write_zeroes": true, 
00:28:49.741 "zcopy": false, 00:28:49.741 "get_zone_info": false, 00:28:49.741 "zone_management": false, 00:28:49.741 "zone_append": false, 00:28:49.741 "compare": false, 00:28:49.741 "compare_and_write": false, 00:28:49.741 "abort": false, 00:28:49.741 "seek_hole": false, 00:28:49.741 "seek_data": false, 00:28:49.741 "copy": false, 00:28:49.741 "nvme_iov_md": false 00:28:49.741 }, 00:28:49.741 "memory_domains": [ 00:28:49.741 { 00:28:49.741 "dma_device_id": "system", 00:28:49.741 "dma_device_type": 1 00:28:49.741 }, 00:28:49.741 { 00:28:49.741 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:49.741 "dma_device_type": 2 00:28:49.741 }, 00:28:49.741 { 00:28:49.741 "dma_device_id": "system", 00:28:49.741 "dma_device_type": 1 00:28:49.741 }, 00:28:49.741 { 00:28:49.741 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:49.741 "dma_device_type": 2 00:28:49.741 } 00:28:49.741 ], 00:28:49.741 "driver_specific": { 00:28:49.741 "raid": { 00:28:49.741 "uuid": "0dc6ffdc-5b03-4ba9-881a-35f42641e99d", 00:28:49.741 "strip_size_kb": 0, 00:28:49.741 "state": "online", 00:28:49.741 "raid_level": "raid1", 00:28:49.741 "superblock": true, 00:28:49.741 "num_base_bdevs": 2, 00:28:49.741 "num_base_bdevs_discovered": 2, 00:28:49.741 "num_base_bdevs_operational": 2, 00:28:49.741 "base_bdevs_list": [ 00:28:49.741 { 00:28:49.741 "name": "pt1", 00:28:49.741 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:49.741 "is_configured": true, 00:28:49.741 "data_offset": 256, 00:28:49.741 "data_size": 7936 00:28:49.741 }, 00:28:49.741 { 00:28:49.741 "name": "pt2", 00:28:49.741 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:49.741 "is_configured": true, 00:28:49.741 "data_offset": 256, 00:28:49.741 "data_size": 7936 00:28:49.741 } 00:28:49.741 ] 00:28:49.741 } 00:28:49.741 } 00:28:49.741 }' 00:28:49.741 20:42:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:28:50.000 20:42:42 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:28:50.000 pt2' 00:28:50.000 20:42:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:50.000 20:42:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:28:50.000 20:42:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:50.568 20:42:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:50.568 "name": "pt1", 00:28:50.568 "aliases": [ 00:28:50.568 "00000000-0000-0000-0000-000000000001" 00:28:50.568 ], 00:28:50.568 "product_name": "passthru", 00:28:50.568 "block_size": 4096, 00:28:50.568 "num_blocks": 8192, 00:28:50.568 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:50.568 "md_size": 32, 00:28:50.568 "md_interleave": false, 00:28:50.568 "dif_type": 0, 00:28:50.568 "assigned_rate_limits": { 00:28:50.568 "rw_ios_per_sec": 0, 00:28:50.568 "rw_mbytes_per_sec": 0, 00:28:50.568 "r_mbytes_per_sec": 0, 00:28:50.568 "w_mbytes_per_sec": 0 00:28:50.568 }, 00:28:50.568 "claimed": true, 00:28:50.568 "claim_type": "exclusive_write", 00:28:50.568 "zoned": false, 00:28:50.568 "supported_io_types": { 00:28:50.568 "read": true, 00:28:50.568 "write": true, 00:28:50.568 "unmap": true, 00:28:50.568 "flush": true, 00:28:50.568 "reset": true, 00:28:50.568 "nvme_admin": false, 00:28:50.568 "nvme_io": false, 00:28:50.568 "nvme_io_md": false, 00:28:50.568 "write_zeroes": true, 00:28:50.568 "zcopy": true, 00:28:50.568 "get_zone_info": false, 00:28:50.568 "zone_management": false, 00:28:50.568 "zone_append": false, 00:28:50.568 "compare": false, 00:28:50.568 "compare_and_write": false, 00:28:50.568 "abort": true, 00:28:50.568 "seek_hole": false, 00:28:50.568 "seek_data": false, 00:28:50.568 "copy": true, 00:28:50.568 
"nvme_iov_md": false 00:28:50.568 }, 00:28:50.568 "memory_domains": [ 00:28:50.568 { 00:28:50.568 "dma_device_id": "system", 00:28:50.568 "dma_device_type": 1 00:28:50.568 }, 00:28:50.568 { 00:28:50.568 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:50.568 "dma_device_type": 2 00:28:50.568 } 00:28:50.568 ], 00:28:50.568 "driver_specific": { 00:28:50.568 "passthru": { 00:28:50.568 "name": "pt1", 00:28:50.568 "base_bdev_name": "malloc1" 00:28:50.568 } 00:28:50.568 } 00:28:50.568 }' 00:28:50.568 20:42:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:50.568 20:42:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:50.568 20:42:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:28:50.568 20:42:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:50.568 20:42:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:50.568 20:42:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:28:50.568 20:42:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:50.568 20:42:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:50.568 20:42:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:28:50.568 20:42:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:50.826 20:42:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:50.826 20:42:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:28:50.826 20:42:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:50.826 20:42:43 bdev_raid.raid_superblock_test_md_separate 
-- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:28:50.826 20:42:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:51.087 20:42:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:51.087 "name": "pt2", 00:28:51.087 "aliases": [ 00:28:51.087 "00000000-0000-0000-0000-000000000002" 00:28:51.087 ], 00:28:51.087 "product_name": "passthru", 00:28:51.087 "block_size": 4096, 00:28:51.087 "num_blocks": 8192, 00:28:51.087 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:51.087 "md_size": 32, 00:28:51.087 "md_interleave": false, 00:28:51.087 "dif_type": 0, 00:28:51.087 "assigned_rate_limits": { 00:28:51.087 "rw_ios_per_sec": 0, 00:28:51.087 "rw_mbytes_per_sec": 0, 00:28:51.087 "r_mbytes_per_sec": 0, 00:28:51.087 "w_mbytes_per_sec": 0 00:28:51.087 }, 00:28:51.087 "claimed": true, 00:28:51.087 "claim_type": "exclusive_write", 00:28:51.087 "zoned": false, 00:28:51.087 "supported_io_types": { 00:28:51.087 "read": true, 00:28:51.087 "write": true, 00:28:51.087 "unmap": true, 00:28:51.087 "flush": true, 00:28:51.087 "reset": true, 00:28:51.087 "nvme_admin": false, 00:28:51.087 "nvme_io": false, 00:28:51.087 "nvme_io_md": false, 00:28:51.087 "write_zeroes": true, 00:28:51.087 "zcopy": true, 00:28:51.087 "get_zone_info": false, 00:28:51.087 "zone_management": false, 00:28:51.087 "zone_append": false, 00:28:51.087 "compare": false, 00:28:51.087 "compare_and_write": false, 00:28:51.087 "abort": true, 00:28:51.087 "seek_hole": false, 00:28:51.087 "seek_data": false, 00:28:51.087 "copy": true, 00:28:51.087 "nvme_iov_md": false 00:28:51.087 }, 00:28:51.087 "memory_domains": [ 00:28:51.087 { 00:28:51.087 "dma_device_id": "system", 00:28:51.087 "dma_device_type": 1 00:28:51.087 }, 00:28:51.087 { 00:28:51.087 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:51.087 "dma_device_type": 2 00:28:51.087 } 
00:28:51.087 ], 00:28:51.087 "driver_specific": { 00:28:51.087 "passthru": { 00:28:51.087 "name": "pt2", 00:28:51.087 "base_bdev_name": "malloc2" 00:28:51.087 } 00:28:51.087 } 00:28:51.087 }' 00:28:51.087 20:42:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:51.087 20:42:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:51.087 20:42:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:28:51.087 20:42:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:51.087 20:42:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:51.347 20:42:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:28:51.347 20:42:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:51.347 20:42:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:51.347 20:42:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:28:51.347 20:42:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:51.347 20:42:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:51.347 20:42:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:28:51.347 20:42:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:51.347 20:42:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:28:51.605 [2024-07-15 20:42:43.892717] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:51.605 20:42:43 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@486 -- # '[' 0dc6ffdc-5b03-4ba9-881a-35f42641e99d '!=' 0dc6ffdc-5b03-4ba9-881a-35f42641e99d ']' 00:28:51.605 20:42:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:28:51.605 20:42:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@213 -- # case $1 in 00:28:51.605 20:42:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@214 -- # return 0 00:28:51.605 20:42:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:28:51.863 [2024-07-15 20:42:44.137148] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:28:51.863 20:42:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:51.863 20:42:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:51.863 20:42:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:51.863 20:42:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:51.863 20:42:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:51.863 20:42:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:51.863 20:42:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:51.863 20:42:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:51.863 20:42:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:51.863 20:42:44 bdev_raid.raid_superblock_test_md_separate -- 
bdev/bdev_raid.sh@124 -- # local tmp 00:28:51.863 20:42:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:51.863 20:42:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:52.122 20:42:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:52.122 "name": "raid_bdev1", 00:28:52.122 "uuid": "0dc6ffdc-5b03-4ba9-881a-35f42641e99d", 00:28:52.122 "strip_size_kb": 0, 00:28:52.122 "state": "online", 00:28:52.122 "raid_level": "raid1", 00:28:52.122 "superblock": true, 00:28:52.122 "num_base_bdevs": 2, 00:28:52.122 "num_base_bdevs_discovered": 1, 00:28:52.122 "num_base_bdevs_operational": 1, 00:28:52.122 "base_bdevs_list": [ 00:28:52.122 { 00:28:52.122 "name": null, 00:28:52.122 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:52.122 "is_configured": false, 00:28:52.122 "data_offset": 256, 00:28:52.122 "data_size": 7936 00:28:52.122 }, 00:28:52.122 { 00:28:52.122 "name": "pt2", 00:28:52.122 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:52.122 "is_configured": true, 00:28:52.122 "data_offset": 256, 00:28:52.122 "data_size": 7936 00:28:52.122 } 00:28:52.122 ] 00:28:52.122 }' 00:28:52.122 20:42:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:52.122 20:42:44 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:52.689 20:42:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:28:52.947 [2024-07-15 20:42:45.203949] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:52.947 [2024-07-15 20:42:45.203975] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid 
bdev state changing from online to offline 00:28:52.947 [2024-07-15 20:42:45.204026] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:52.947 [2024-07-15 20:42:45.204068] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:52.947 [2024-07-15 20:42:45.204080] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x26b75d0 name raid_bdev1, state offline 00:28:52.947 20:42:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:52.947 20:42:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:28:53.205 20:42:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:28:53.205 20:42:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:28:53.205 20:42:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:28:53.205 20:42:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:28:53.205 20:42:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:28:53.464 20:42:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:28:53.464 20:42:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:28:53.464 20:42:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:28:53.464 20:42:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:28:53.464 20:42:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@518 -- # i=1 00:28:53.464 
20:42:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:28:53.723 [2024-07-15 20:42:45.949895] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:28:53.723 [2024-07-15 20:42:45.949950] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:53.723 [2024-07-15 20:42:45.949970] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26b5660 00:28:53.723 [2024-07-15 20:42:45.949983] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:53.723 [2024-07-15 20:42:45.951421] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:53.723 [2024-07-15 20:42:45.951447] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:28:53.723 [2024-07-15 20:42:45.951492] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:28:53.723 [2024-07-15 20:42:45.951517] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:28:53.723 [2024-07-15 20:42:45.951597] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x26b7d10 00:28:53.723 [2024-07-15 20:42:45.951608] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:28:53.723 [2024-07-15 20:42:45.951665] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x26b8560 00:28:53.723 [2024-07-15 20:42:45.951761] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x26b7d10 00:28:53.723 [2024-07-15 20:42:45.951777] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x26b7d10 00:28:53.723 [2024-07-15 20:42:45.951847] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:53.723 pt2 
00:28:53.723 20:42:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:53.723 20:42:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:53.723 20:42:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:53.723 20:42:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:53.723 20:42:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:53.723 20:42:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:53.723 20:42:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:53.723 20:42:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:53.723 20:42:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:53.723 20:42:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:53.723 20:42:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:53.723 20:42:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:53.981 20:42:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:53.981 "name": "raid_bdev1", 00:28:53.981 "uuid": "0dc6ffdc-5b03-4ba9-881a-35f42641e99d", 00:28:53.981 "strip_size_kb": 0, 00:28:53.981 "state": "online", 00:28:53.981 "raid_level": "raid1", 00:28:53.981 "superblock": true, 00:28:53.982 "num_base_bdevs": 2, 00:28:53.982 "num_base_bdevs_discovered": 1, 
00:28:53.982 "num_base_bdevs_operational": 1, 00:28:53.982 "base_bdevs_list": [ 00:28:53.982 { 00:28:53.982 "name": null, 00:28:53.982 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:53.982 "is_configured": false, 00:28:53.982 "data_offset": 256, 00:28:53.982 "data_size": 7936 00:28:53.982 }, 00:28:53.982 { 00:28:53.982 "name": "pt2", 00:28:53.982 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:53.982 "is_configured": true, 00:28:53.982 "data_offset": 256, 00:28:53.982 "data_size": 7936 00:28:53.982 } 00:28:53.982 ] 00:28:53.982 }' 00:28:53.982 20:42:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:53.982 20:42:46 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:54.549 20:42:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:28:54.549 [2024-07-15 20:42:46.916453] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:54.549 [2024-07-15 20:42:46.916481] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:28:54.549 [2024-07-15 20:42:46.916535] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:54.549 [2024-07-15 20:42:46.916578] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:54.549 [2024-07-15 20:42:46.916590] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x26b7d10 name raid_bdev1, state offline 00:28:54.807 20:42:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:28:54.807 20:42:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:55.066 20:42:47 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:28:55.066 20:42:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:28:55.066 20:42:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:28:55.066 20:42:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:28:55.066 [2024-07-15 20:42:47.421772] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:28:55.066 [2024-07-15 20:42:47.421817] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:55.066 [2024-07-15 20:42:47.421836] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26b6760 00:28:55.066 [2024-07-15 20:42:47.421848] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:55.066 [2024-07-15 20:42:47.423261] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:55.066 [2024-07-15 20:42:47.423288] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:28:55.066 [2024-07-15 20:42:47.423334] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:28:55.066 [2024-07-15 20:42:47.423358] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:28:55.066 [2024-07-15 20:42:47.423446] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:28:55.066 [2024-07-15 20:42:47.423459] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:55.066 [2024-07-15 20:42:47.423473] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x26b8850 name raid_bdev1, state configuring 00:28:55.066 
[2024-07-15 20:42:47.423496] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:28:55.066 [2024-07-15 20:42:47.423548] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x26b7850 00:28:55.066 [2024-07-15 20:42:47.423557] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:28:55.066 [2024-07-15 20:42:47.423611] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x26b83b0 00:28:55.066 [2024-07-15 20:42:47.423706] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x26b7850 00:28:55.066 [2024-07-15 20:42:47.423716] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x26b7850 00:28:55.066 [2024-07-15 20:42:47.423787] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:55.066 pt1 00:28:55.325 20:42:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:28:55.325 20:42:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:55.325 20:42:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:55.325 20:42:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:55.325 20:42:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:55.325 20:42:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:55.325 20:42:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:55.325 20:42:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:55.325 20:42:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:55.325 
20:42:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:55.325 20:42:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:55.325 20:42:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:55.325 20:42:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:55.325 20:42:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:55.325 "name": "raid_bdev1", 00:28:55.325 "uuid": "0dc6ffdc-5b03-4ba9-881a-35f42641e99d", 00:28:55.325 "strip_size_kb": 0, 00:28:55.325 "state": "online", 00:28:55.325 "raid_level": "raid1", 00:28:55.325 "superblock": true, 00:28:55.325 "num_base_bdevs": 2, 00:28:55.325 "num_base_bdevs_discovered": 1, 00:28:55.325 "num_base_bdevs_operational": 1, 00:28:55.325 "base_bdevs_list": [ 00:28:55.325 { 00:28:55.325 "name": null, 00:28:55.325 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:55.325 "is_configured": false, 00:28:55.325 "data_offset": 256, 00:28:55.325 "data_size": 7936 00:28:55.325 }, 00:28:55.325 { 00:28:55.325 "name": "pt2", 00:28:55.325 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:55.325 "is_configured": true, 00:28:55.325 "data_offset": 256, 00:28:55.325 "data_size": 7936 00:28:55.325 } 00:28:55.325 ] 00:28:55.325 }' 00:28:55.325 20:42:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:55.584 20:42:47 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:55.892 20:42:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:28:55.892 20:42:48 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:28:56.176 20:42:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:28:56.176 20:42:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:56.176 20:42:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:28:56.745 [2024-07-15 20:42:48.978138] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:56.745 20:42:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@557 -- # '[' 0dc6ffdc-5b03-4ba9-881a-35f42641e99d '!=' 0dc6ffdc-5b03-4ba9-881a-35f42641e99d ']' 00:28:56.745 20:42:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@562 -- # killprocess 1503082 00:28:56.745 20:42:49 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@948 -- # '[' -z 1503082 ']' 00:28:56.745 20:42:49 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@952 -- # kill -0 1503082 00:28:56.745 20:42:49 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@953 -- # uname 00:28:56.745 20:42:49 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:28:56.745 20:42:49 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1503082 00:28:56.745 20:42:49 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:28:56.745 20:42:49 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:28:56.745 20:42:49 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1503082' 
00:28:56.745 killing process with pid 1503082 00:28:56.745 20:42:49 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@967 -- # kill 1503082 00:28:56.745 [2024-07-15 20:42:49.061405] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:28:56.745 [2024-07-15 20:42:49.061456] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:56.745 [2024-07-15 20:42:49.061501] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:56.745 [2024-07-15 20:42:49.061513] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x26b7850 name raid_bdev1, state offline 00:28:56.745 20:42:49 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@972 -- # wait 1503082 00:28:56.745 [2024-07-15 20:42:49.084008] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:28:57.003 20:42:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@564 -- # return 0 00:28:57.003 00:28:57.003 real 0m16.162s 00:28:57.003 user 0m29.453s 00:28:57.003 sys 0m2.922s 00:28:57.003 20:42:49 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@1124 -- # xtrace_disable 00:28:57.004 20:42:49 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:57.004 ************************************ 00:28:57.004 END TEST raid_superblock_test_md_separate 00:28:57.004 ************************************ 00:28:57.004 20:42:49 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:28:57.004 20:42:49 bdev_raid -- bdev/bdev_raid.sh@907 -- # '[' true = true ']' 00:28:57.004 20:42:49 bdev_raid -- bdev/bdev_raid.sh@908 -- # run_test raid_rebuild_test_sb_md_separate raid_rebuild_test raid1 2 true false true 00:28:57.004 20:42:49 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:28:57.004 20:42:49 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:28:57.004 20:42:49 bdev_raid 
-- common/autotest_common.sh@10 -- # set +x 00:28:57.004 ************************************ 00:28:57.004 START TEST raid_rebuild_test_sb_md_separate 00:28:57.004 ************************************ 00:28:57.004 20:42:49 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true false true 00:28:57.004 20:42:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:28:57.004 20:42:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:28:57.004 20:42:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:28:57.004 20:42:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:28:57.004 20:42:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@572 -- # local verify=true 00:28:57.004 20:42:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:28:57.004 20:42:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:28:57.004 20:42:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:28:57.004 20:42:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:28:57.004 20:42:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:28:57.004 20:42:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:28:57.004 20:42:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:28:57.004 20:42:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:28:57.004 20:42:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:28:57.004 20:42:49 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:28:57.004 20:42:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:28:57.004 20:42:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@575 -- # local strip_size 00:28:57.004 20:42:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@576 -- # local create_arg 00:28:57.004 20:42:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:28:57.004 20:42:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@578 -- # local data_offset 00:28:57.004 20:42:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:28:57.004 20:42:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:28:57.004 20:42:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:28:57.004 20:42:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:28:57.004 20:42:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@596 -- # raid_pid=1505496 00:28:57.004 20:42:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@597 -- # waitforlisten 1505496 /var/tmp/spdk-raid.sock 00:28:57.004 20:42:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:28:57.004 20:42:49 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@829 -- # '[' -z 1505496 ']' 00:28:57.262 20:42:49 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:28:57.262 20:42:49 bdev_raid.raid_rebuild_test_sb_md_separate -- 
common/autotest_common.sh@834 -- # local max_retries=100 00:28:57.262 20:42:49 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:28:57.262 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:28:57.262 20:42:49 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@838 -- # xtrace_disable 00:28:57.262 20:42:49 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:57.262 [2024-07-15 20:42:49.433503] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 00:28:57.262 [2024-07-15 20:42:49.433556] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1505496 ] 00:28:57.262 I/O size of 3145728 is greater than zero copy threshold (65536). 00:28:57.262 Zero copy mechanism will not be used. 
00:28:57.262 [2024-07-15 20:42:49.545959] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:57.520 [2024-07-15 20:42:49.654218] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:57.520 [2024-07-15 20:42:49.719313] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:57.520 [2024-07-15 20:42:49.719354] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:58.087 20:42:50 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:28:58.087 20:42:50 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@862 -- # return 0 00:28:58.087 20:42:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:28:58.087 20:42:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev1_malloc 00:28:58.346 BaseBdev1_malloc 00:28:58.346 20:42:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:28:58.913 [2024-07-15 20:42:51.158248] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:28:58.913 [2024-07-15 20:42:51.158299] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:58.913 [2024-07-15 20:42:51.158327] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b826d0 00:28:58.913 [2024-07-15 20:42:51.158340] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:58.913 [2024-07-15 20:42:51.159831] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:58.913 [2024-07-15 20:42:51.159861] vbdev_passthru.c: 709:vbdev_passthru_register: 
*NOTICE*: created pt_bdev for: BaseBdev1 00:28:58.913 BaseBdev1 00:28:58.913 20:42:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:28:58.913 20:42:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev2_malloc 00:28:59.170 BaseBdev2_malloc 00:28:59.170 20:42:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:28:59.736 [2024-07-15 20:42:51.977886] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:28:59.736 [2024-07-15 20:42:51.977945] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:59.736 [2024-07-15 20:42:51.977974] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1cda1f0 00:28:59.736 [2024-07-15 20:42:51.977987] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:59.736 [2024-07-15 20:42:51.979444] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:59.736 [2024-07-15 20:42:51.979472] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:28:59.736 BaseBdev2 00:28:59.736 20:42:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b spare_malloc 00:28:59.994 spare_malloc 00:28:59.994 20:42:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:29:00.253 
spare_delay 00:29:00.253 20:42:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:29:00.511 [2024-07-15 20:42:52.878936] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:29:00.511 [2024-07-15 20:42:52.878983] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:00.511 [2024-07-15 20:42:52.879010] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1cd67a0 00:29:00.511 [2024-07-15 20:42:52.879023] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:00.511 [2024-07-15 20:42:52.880446] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:00.511 [2024-07-15 20:42:52.880476] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:29:00.511 spare 00:29:00.769 20:42:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:29:00.769 [2024-07-15 20:42:53.127603] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:29:00.769 [2024-07-15 20:42:53.128939] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:29:00.769 [2024-07-15 20:42:53.129109] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1cd71c0 00:29:00.769 [2024-07-15 20:42:53.129122] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:29:00.769 [2024-07-15 20:42:53.129201] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1be8360 00:29:00.769 [2024-07-15 20:42:53.129317] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1cd71c0 00:29:00.769 
[2024-07-15 20:42:53.129326] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1cd71c0 00:29:00.769 [2024-07-15 20:42:53.129396] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:00.769 20:42:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:29:00.769 20:42:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:01.028 20:42:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:01.028 20:42:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:01.028 20:42:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:01.028 20:42:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:01.028 20:42:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:01.028 20:42:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:01.028 20:42:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:01.028 20:42:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:01.028 20:42:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:01.028 20:42:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:01.287 20:42:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:01.287 "name": "raid_bdev1", 00:29:01.287 "uuid": 
"9b7fbf89-575a-4a3b-8065-70eff8ec4bca", 00:29:01.287 "strip_size_kb": 0, 00:29:01.287 "state": "online", 00:29:01.287 "raid_level": "raid1", 00:29:01.287 "superblock": true, 00:29:01.287 "num_base_bdevs": 2, 00:29:01.287 "num_base_bdevs_discovered": 2, 00:29:01.287 "num_base_bdevs_operational": 2, 00:29:01.287 "base_bdevs_list": [ 00:29:01.287 { 00:29:01.287 "name": "BaseBdev1", 00:29:01.287 "uuid": "07490910-1248-5fe0-b2f8-3a4f562f12d5", 00:29:01.287 "is_configured": true, 00:29:01.287 "data_offset": 256, 00:29:01.287 "data_size": 7936 00:29:01.287 }, 00:29:01.287 { 00:29:01.287 "name": "BaseBdev2", 00:29:01.287 "uuid": "c2928c0b-595a-547f-b50a-70ea7f778fe7", 00:29:01.287 "is_configured": true, 00:29:01.287 "data_offset": 256, 00:29:01.287 "data_size": 7936 00:29:01.287 } 00:29:01.287 ] 00:29:01.287 }' 00:29:01.287 20:42:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:01.287 20:42:53 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:02.223 20:42:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:29:02.223 20:42:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:29:02.223 [2024-07-15 20:42:54.491616] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:29:02.223 20:42:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=7936 00:29:02.223 20:42:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:02.223 20:42:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:29:02.482 20:42:54 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@618 -- # data_offset=256 00:29:02.482 20:42:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:29:02.482 20:42:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:29:02.482 20:42:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:29:02.482 20:42:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:29:02.482 20:42:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:29:02.482 20:42:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:29:02.482 20:42:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # local bdev_list 00:29:02.482 20:42:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:29:02.482 20:42:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # local nbd_list 00:29:02.482 20:42:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@12 -- # local i 00:29:02.482 20:42:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:29:02.483 20:42:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:29:02.483 20:42:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:29:02.742 [2024-07-15 20:42:54.928549] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1be8360 00:29:02.742 /dev/nbd0 00:29:02.742 20:42:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:29:02.742 
20:42:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:29:02.742 20:42:54 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:29:02.742 20:42:54 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # local i 00:29:02.742 20:42:54 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:29:02.742 20:42:54 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:29:02.742 20:42:54 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:29:02.742 20:42:54 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # break 00:29:02.742 20:42:54 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:29:02.742 20:42:54 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:29:02.742 20:42:54 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:29:02.742 1+0 records in 00:29:02.742 1+0 records out 00:29:02.742 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000271534 s, 15.1 MB/s 00:29:02.742 20:42:54 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:02.742 20:42:54 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # size=4096 00:29:02.742 20:42:54 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:02.742 20:42:54 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:29:02.742 
20:42:54 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # return 0 00:29:02.742 20:42:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:29:02.742 20:42:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:29:02.742 20:42:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:29:02.742 20:42:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:29:02.742 20:42:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=4096 count=7936 oflag=direct 00:29:03.678 7936+0 records in 00:29:03.678 7936+0 records out 00:29:03.678 32505856 bytes (33 MB, 31 MiB) copied, 0.755021 s, 43.1 MB/s 00:29:03.678 20:42:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:29:03.678 20:42:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:29:03.678 20:42:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:29:03.678 20:42:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # local nbd_list 00:29:03.678 20:42:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@51 -- # local i 00:29:03.678 20:42:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:29:03.678 20:42:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:29:03.678 [2024-07-15 20:42:56.028344] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:03.678 20:42:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename 
/dev/nbd0 00:29:03.678 20:42:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:29:03.678 20:42:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:29:03.678 20:42:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:29:03.678 20:42:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:29:03.678 20:42:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:29:03.678 20:42:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:29:03.678 20:42:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:29:03.678 20:42:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:29:03.936 [2024-07-15 20:42:56.269019] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:29:03.936 20:42:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:03.936 20:42:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:03.936 20:42:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:03.936 20:42:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:03.936 20:42:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:03.936 20:42:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:03.936 20:42:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 
00:29:03.936 20:42:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:03.936 20:42:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:03.936 20:42:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:03.936 20:42:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:03.936 20:42:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:04.194 20:42:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:04.194 "name": "raid_bdev1", 00:29:04.194 "uuid": "9b7fbf89-575a-4a3b-8065-70eff8ec4bca", 00:29:04.194 "strip_size_kb": 0, 00:29:04.194 "state": "online", 00:29:04.194 "raid_level": "raid1", 00:29:04.194 "superblock": true, 00:29:04.194 "num_base_bdevs": 2, 00:29:04.194 "num_base_bdevs_discovered": 1, 00:29:04.194 "num_base_bdevs_operational": 1, 00:29:04.194 "base_bdevs_list": [ 00:29:04.194 { 00:29:04.194 "name": null, 00:29:04.194 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:04.194 "is_configured": false, 00:29:04.194 "data_offset": 256, 00:29:04.194 "data_size": 7936 00:29:04.194 }, 00:29:04.194 { 00:29:04.194 "name": "BaseBdev2", 00:29:04.194 "uuid": "c2928c0b-595a-547f-b50a-70ea7f778fe7", 00:29:04.194 "is_configured": true, 00:29:04.194 "data_offset": 256, 00:29:04.194 "data_size": 7936 00:29:04.194 } 00:29:04.194 ] 00:29:04.194 }' 00:29:04.194 20:42:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:04.194 20:42:56 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:05.130 20:42:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@645 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:29:05.130 [2024-07-15 20:42:57.371960] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:29:05.130 [2024-07-15 20:42:57.374311] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b81350 00:29:05.130 [2024-07-15 20:42:57.376609] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:29:05.130 20:42:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@646 -- # sleep 1 00:29:06.066 20:42:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:06.066 20:42:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:06.066 20:42:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:06.066 20:42:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:06.066 20:42:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:06.066 20:42:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:06.066 20:42:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:06.325 20:42:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:06.325 "name": "raid_bdev1", 00:29:06.325 "uuid": "9b7fbf89-575a-4a3b-8065-70eff8ec4bca", 00:29:06.325 "strip_size_kb": 0, 00:29:06.325 "state": "online", 00:29:06.325 "raid_level": "raid1", 00:29:06.325 "superblock": true, 00:29:06.325 "num_base_bdevs": 2, 00:29:06.325 
"num_base_bdevs_discovered": 2, 00:29:06.325 "num_base_bdevs_operational": 2, 00:29:06.325 "process": { 00:29:06.325 "type": "rebuild", 00:29:06.325 "target": "spare", 00:29:06.325 "progress": { 00:29:06.325 "blocks": 3072, 00:29:06.325 "percent": 38 00:29:06.325 } 00:29:06.325 }, 00:29:06.325 "base_bdevs_list": [ 00:29:06.325 { 00:29:06.325 "name": "spare", 00:29:06.325 "uuid": "cdbd1a19-0aa7-5a09-becd-696a86f81e3f", 00:29:06.325 "is_configured": true, 00:29:06.325 "data_offset": 256, 00:29:06.325 "data_size": 7936 00:29:06.325 }, 00:29:06.325 { 00:29:06.325 "name": "BaseBdev2", 00:29:06.325 "uuid": "c2928c0b-595a-547f-b50a-70ea7f778fe7", 00:29:06.325 "is_configured": true, 00:29:06.325 "data_offset": 256, 00:29:06.325 "data_size": 7936 00:29:06.325 } 00:29:06.325 ] 00:29:06.325 }' 00:29:06.325 20:42:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:06.325 20:42:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:29:06.325 20:42:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:06.583 20:42:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:29:06.583 20:42:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:29:06.841 [2024-07-15 20:42:58.965782] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:06.841 [2024-07-15 20:42:58.988976] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:29:06.841 [2024-07-15 20:42:58.989024] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:06.841 [2024-07-15 20:42:58.989039] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 
00:29:06.841 [2024-07-15 20:42:58.989048] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:29:06.841 20:42:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:06.841 20:42:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:06.841 20:42:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:06.841 20:42:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:06.841 20:42:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:06.841 20:42:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:06.841 20:42:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:06.841 20:42:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:06.841 20:42:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:06.841 20:42:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:06.841 20:42:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:06.841 20:42:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:07.099 20:42:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:07.099 "name": "raid_bdev1", 00:29:07.099 "uuid": "9b7fbf89-575a-4a3b-8065-70eff8ec4bca", 00:29:07.099 "strip_size_kb": 0, 00:29:07.099 "state": "online", 
00:29:07.099 "raid_level": "raid1", 00:29:07.099 "superblock": true, 00:29:07.099 "num_base_bdevs": 2, 00:29:07.099 "num_base_bdevs_discovered": 1, 00:29:07.099 "num_base_bdevs_operational": 1, 00:29:07.099 "base_bdevs_list": [ 00:29:07.099 { 00:29:07.099 "name": null, 00:29:07.099 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:07.099 "is_configured": false, 00:29:07.099 "data_offset": 256, 00:29:07.099 "data_size": 7936 00:29:07.099 }, 00:29:07.099 { 00:29:07.099 "name": "BaseBdev2", 00:29:07.099 "uuid": "c2928c0b-595a-547f-b50a-70ea7f778fe7", 00:29:07.099 "is_configured": true, 00:29:07.099 "data_offset": 256, 00:29:07.099 "data_size": 7936 00:29:07.099 } 00:29:07.099 ] 00:29:07.099 }' 00:29:07.099 20:42:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:07.099 20:42:59 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:07.666 20:42:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:29:07.666 20:42:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:07.666 20:42:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:29:07.666 20:42:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:29:07.666 20:42:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:07.666 20:42:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:07.666 20:42:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:07.925 20:43:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 
00:29:07.925 "name": "raid_bdev1", 00:29:07.925 "uuid": "9b7fbf89-575a-4a3b-8065-70eff8ec4bca", 00:29:07.925 "strip_size_kb": 0, 00:29:07.925 "state": "online", 00:29:07.925 "raid_level": "raid1", 00:29:07.925 "superblock": true, 00:29:07.925 "num_base_bdevs": 2, 00:29:07.925 "num_base_bdevs_discovered": 1, 00:29:07.925 "num_base_bdevs_operational": 1, 00:29:07.925 "base_bdevs_list": [ 00:29:07.925 { 00:29:07.925 "name": null, 00:29:07.925 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:07.925 "is_configured": false, 00:29:07.925 "data_offset": 256, 00:29:07.925 "data_size": 7936 00:29:07.925 }, 00:29:07.925 { 00:29:07.925 "name": "BaseBdev2", 00:29:07.925 "uuid": "c2928c0b-595a-547f-b50a-70ea7f778fe7", 00:29:07.925 "is_configured": true, 00:29:07.925 "data_offset": 256, 00:29:07.925 "data_size": 7936 00:29:07.925 } 00:29:07.925 ] 00:29:07.925 }' 00:29:07.925 20:43:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:07.925 20:43:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:29:07.925 20:43:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:07.925 20:43:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:29:07.925 20:43:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:29:08.184 [2024-07-15 20:43:00.459951] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:29:08.184 [2024-07-15 20:43:00.462265] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b82280 00:29:08.184 [2024-07-15 20:43:00.463815] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:29:08.184 20:43:00 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@662 -- # sleep 1 00:29:09.118 20:43:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:09.118 20:43:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:09.118 20:43:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:09.118 20:43:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:09.118 20:43:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:09.118 20:43:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:09.118 20:43:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:09.375 20:43:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:09.375 "name": "raid_bdev1", 00:29:09.375 "uuid": "9b7fbf89-575a-4a3b-8065-70eff8ec4bca", 00:29:09.375 "strip_size_kb": 0, 00:29:09.375 "state": "online", 00:29:09.375 "raid_level": "raid1", 00:29:09.375 "superblock": true, 00:29:09.375 "num_base_bdevs": 2, 00:29:09.375 "num_base_bdevs_discovered": 2, 00:29:09.375 "num_base_bdevs_operational": 2, 00:29:09.375 "process": { 00:29:09.375 "type": "rebuild", 00:29:09.375 "target": "spare", 00:29:09.375 "progress": { 00:29:09.375 "blocks": 3072, 00:29:09.375 "percent": 38 00:29:09.375 } 00:29:09.375 }, 00:29:09.375 "base_bdevs_list": [ 00:29:09.375 { 00:29:09.375 "name": "spare", 00:29:09.375 "uuid": "cdbd1a19-0aa7-5a09-becd-696a86f81e3f", 00:29:09.375 "is_configured": true, 00:29:09.375 "data_offset": 256, 00:29:09.375 "data_size": 7936 00:29:09.375 }, 00:29:09.375 { 
00:29:09.375 "name": "BaseBdev2", 00:29:09.375 "uuid": "c2928c0b-595a-547f-b50a-70ea7f778fe7", 00:29:09.375 "is_configured": true, 00:29:09.375 "data_offset": 256, 00:29:09.375 "data_size": 7936 00:29:09.375 } 00:29:09.375 ] 00:29:09.375 }' 00:29:09.375 20:43:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:09.632 20:43:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:29:09.632 20:43:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:09.632 20:43:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:29:09.632 20:43:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:29:09.632 20:43:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:29:09.632 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:29:09.632 20:43:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:29:09.632 20:43:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:29:09.632 20:43:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:29:09.632 20:43:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@705 -- # local timeout=1119 00:29:09.632 20:43:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:29:09.632 20:43:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:09.632 20:43:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:09.632 20:43:01 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:09.632 20:43:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:09.632 20:43:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:09.632 20:43:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:09.632 20:43:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:09.889 20:43:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:09.889 "name": "raid_bdev1", 00:29:09.889 "uuid": "9b7fbf89-575a-4a3b-8065-70eff8ec4bca", 00:29:09.889 "strip_size_kb": 0, 00:29:09.889 "state": "online", 00:29:09.889 "raid_level": "raid1", 00:29:09.889 "superblock": true, 00:29:09.889 "num_base_bdevs": 2, 00:29:09.889 "num_base_bdevs_discovered": 2, 00:29:09.889 "num_base_bdevs_operational": 2, 00:29:09.889 "process": { 00:29:09.889 "type": "rebuild", 00:29:09.889 "target": "spare", 00:29:09.889 "progress": { 00:29:09.889 "blocks": 3840, 00:29:09.889 "percent": 48 00:29:09.889 } 00:29:09.889 }, 00:29:09.889 "base_bdevs_list": [ 00:29:09.889 { 00:29:09.889 "name": "spare", 00:29:09.889 "uuid": "cdbd1a19-0aa7-5a09-becd-696a86f81e3f", 00:29:09.889 "is_configured": true, 00:29:09.889 "data_offset": 256, 00:29:09.889 "data_size": 7936 00:29:09.889 }, 00:29:09.889 { 00:29:09.889 "name": "BaseBdev2", 00:29:09.889 "uuid": "c2928c0b-595a-547f-b50a-70ea7f778fe7", 00:29:09.889 "is_configured": true, 00:29:09.889 "data_offset": 256, 00:29:09.889 "data_size": 7936 00:29:09.889 } 00:29:09.890 ] 00:29:09.890 }' 00:29:09.890 20:43:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 
00:29:09.890 20:43:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:29:09.890 20:43:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:09.890 20:43:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:29:09.890 20:43:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@710 -- # sleep 1 00:29:10.825 20:43:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:29:10.825 20:43:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:10.825 20:43:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:10.825 20:43:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:10.825 20:43:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:10.825 20:43:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:10.825 20:43:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:10.825 20:43:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:11.100 20:43:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:11.101 "name": "raid_bdev1", 00:29:11.101 "uuid": "9b7fbf89-575a-4a3b-8065-70eff8ec4bca", 00:29:11.101 "strip_size_kb": 0, 00:29:11.101 "state": "online", 00:29:11.101 "raid_level": "raid1", 00:29:11.101 "superblock": true, 00:29:11.101 "num_base_bdevs": 2, 00:29:11.101 "num_base_bdevs_discovered": 2, 
00:29:11.101 "num_base_bdevs_operational": 2, 00:29:11.101 "process": { 00:29:11.101 "type": "rebuild", 00:29:11.101 "target": "spare", 00:29:11.101 "progress": { 00:29:11.101 "blocks": 7424, 00:29:11.101 "percent": 93 00:29:11.101 } 00:29:11.101 }, 00:29:11.101 "base_bdevs_list": [ 00:29:11.101 { 00:29:11.101 "name": "spare", 00:29:11.101 "uuid": "cdbd1a19-0aa7-5a09-becd-696a86f81e3f", 00:29:11.101 "is_configured": true, 00:29:11.101 "data_offset": 256, 00:29:11.101 "data_size": 7936 00:29:11.101 }, 00:29:11.101 { 00:29:11.101 "name": "BaseBdev2", 00:29:11.101 "uuid": "c2928c0b-595a-547f-b50a-70ea7f778fe7", 00:29:11.101 "is_configured": true, 00:29:11.101 "data_offset": 256, 00:29:11.101 "data_size": 7936 00:29:11.101 } 00:29:11.101 ] 00:29:11.101 }' 00:29:11.101 20:43:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:11.101 20:43:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:29:11.101 20:43:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:11.359 20:43:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:29:11.359 20:43:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@710 -- # sleep 1 00:29:11.359 [2024-07-15 20:43:03.588283] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:29:11.359 [2024-07-15 20:43:03.588344] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:29:11.359 [2024-07-15 20:43:03.588429] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:12.293 20:43:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:29:12.293 20:43:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process 
raid_bdev1 rebuild spare 00:29:12.293 20:43:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:12.293 20:43:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:12.293 20:43:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:12.293 20:43:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:12.293 20:43:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:12.293 20:43:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:12.551 20:43:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:12.551 "name": "raid_bdev1", 00:29:12.551 "uuid": "9b7fbf89-575a-4a3b-8065-70eff8ec4bca", 00:29:12.551 "strip_size_kb": 0, 00:29:12.551 "state": "online", 00:29:12.551 "raid_level": "raid1", 00:29:12.551 "superblock": true, 00:29:12.551 "num_base_bdevs": 2, 00:29:12.551 "num_base_bdevs_discovered": 2, 00:29:12.551 "num_base_bdevs_operational": 2, 00:29:12.551 "base_bdevs_list": [ 00:29:12.551 { 00:29:12.551 "name": "spare", 00:29:12.551 "uuid": "cdbd1a19-0aa7-5a09-becd-696a86f81e3f", 00:29:12.551 "is_configured": true, 00:29:12.551 "data_offset": 256, 00:29:12.551 "data_size": 7936 00:29:12.551 }, 00:29:12.551 { 00:29:12.551 "name": "BaseBdev2", 00:29:12.551 "uuid": "c2928c0b-595a-547f-b50a-70ea7f778fe7", 00:29:12.551 "is_configured": true, 00:29:12.551 "data_offset": 256, 00:29:12.551 "data_size": 7936 00:29:12.551 } 00:29:12.551 ] 00:29:12.551 }' 00:29:12.551 20:43:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:12.551 20:43:04 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:29:12.552 20:43:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:12.552 20:43:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:29:12.552 20:43:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@708 -- # break 00:29:12.552 20:43:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:29:12.552 20:43:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:12.552 20:43:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:29:12.552 20:43:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:29:12.552 20:43:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:12.552 20:43:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:12.552 20:43:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:12.810 20:43:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:12.810 "name": "raid_bdev1", 00:29:12.810 "uuid": "9b7fbf89-575a-4a3b-8065-70eff8ec4bca", 00:29:12.810 "strip_size_kb": 0, 00:29:12.810 "state": "online", 00:29:12.810 "raid_level": "raid1", 00:29:12.810 "superblock": true, 00:29:12.810 "num_base_bdevs": 2, 00:29:12.810 "num_base_bdevs_discovered": 2, 00:29:12.810 "num_base_bdevs_operational": 2, 00:29:12.810 "base_bdevs_list": [ 00:29:12.810 { 00:29:12.810 "name": "spare", 00:29:12.810 "uuid": 
"cdbd1a19-0aa7-5a09-becd-696a86f81e3f", 00:29:12.810 "is_configured": true, 00:29:12.810 "data_offset": 256, 00:29:12.810 "data_size": 7936 00:29:12.810 }, 00:29:12.810 { 00:29:12.810 "name": "BaseBdev2", 00:29:12.810 "uuid": "c2928c0b-595a-547f-b50a-70ea7f778fe7", 00:29:12.810 "is_configured": true, 00:29:12.810 "data_offset": 256, 00:29:12.810 "data_size": 7936 00:29:12.810 } 00:29:12.810 ] 00:29:12.810 }' 00:29:12.810 20:43:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:12.810 20:43:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:29:12.810 20:43:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:13.069 20:43:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:29:13.069 20:43:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:29:13.069 20:43:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:13.069 20:43:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:13.069 20:43:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:13.069 20:43:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:13.069 20:43:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:13.069 20:43:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:13.069 20:43:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:13.069 20:43:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:29:13.069 20:43:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:13.069 20:43:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:13.069 20:43:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:13.069 20:43:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:13.069 "name": "raid_bdev1", 00:29:13.069 "uuid": "9b7fbf89-575a-4a3b-8065-70eff8ec4bca", 00:29:13.069 "strip_size_kb": 0, 00:29:13.069 "state": "online", 00:29:13.069 "raid_level": "raid1", 00:29:13.069 "superblock": true, 00:29:13.069 "num_base_bdevs": 2, 00:29:13.069 "num_base_bdevs_discovered": 2, 00:29:13.069 "num_base_bdevs_operational": 2, 00:29:13.069 "base_bdevs_list": [ 00:29:13.069 { 00:29:13.069 "name": "spare", 00:29:13.069 "uuid": "cdbd1a19-0aa7-5a09-becd-696a86f81e3f", 00:29:13.069 "is_configured": true, 00:29:13.069 "data_offset": 256, 00:29:13.069 "data_size": 7936 00:29:13.069 }, 00:29:13.069 { 00:29:13.069 "name": "BaseBdev2", 00:29:13.069 "uuid": "c2928c0b-595a-547f-b50a-70ea7f778fe7", 00:29:13.069 "is_configured": true, 00:29:13.069 "data_offset": 256, 00:29:13.069 "data_size": 7936 00:29:13.069 } 00:29:13.069 ] 00:29:13.069 }' 00:29:13.069 20:43:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:13.069 20:43:05 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:13.688 20:43:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:29:13.972 [2024-07-15 20:43:06.271312] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: 
raid_bdev1 00:29:13.972 [2024-07-15 20:43:06.271341] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:29:13.972 [2024-07-15 20:43:06.271399] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:13.972 [2024-07-15 20:43:06.271456] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:29:13.972 [2024-07-15 20:43:06.271468] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1cd71c0 name raid_bdev1, state offline 00:29:13.972 20:43:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:13.972 20:43:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@719 -- # jq length 00:29:14.229 20:43:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:29:14.229 20:43:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:29:14.229 20:43:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:29:14.229 20:43:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:29:14.229 20:43:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:29:14.229 20:43:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:29:14.229 20:43:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # local bdev_list 00:29:14.229 20:43:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:29:14.229 20:43:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # local 
nbd_list 00:29:14.229 20:43:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@12 -- # local i 00:29:14.229 20:43:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:29:14.229 20:43:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:29:14.229 20:43:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:29:14.487 /dev/nbd0 00:29:14.487 20:43:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:29:14.487 20:43:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:29:14.487 20:43:06 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:29:14.487 20:43:06 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # local i 00:29:14.487 20:43:06 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:29:14.487 20:43:06 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:29:14.487 20:43:06 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:29:14.487 20:43:06 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # break 00:29:14.487 20:43:06 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:29:14.487 20:43:06 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:29:14.487 20:43:06 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:29:14.487 1+0 records in 00:29:14.487 1+0 
records out 00:29:14.487 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000224288 s, 18.3 MB/s 00:29:14.487 20:43:06 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:14.487 20:43:06 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # size=4096 00:29:14.487 20:43:06 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:14.487 20:43:06 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:29:14.487 20:43:06 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # return 0 00:29:14.487 20:43:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:29:14.487 20:43:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:29:14.487 20:43:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:29:14.745 /dev/nbd1 00:29:14.745 20:43:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:29:14.745 20:43:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:29:14.745 20:43:06 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:29:14.745 20:43:06 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # local i 00:29:14.745 20:43:06 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:29:14.745 20:43:06 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:29:14.745 20:43:06 bdev_raid.raid_rebuild_test_sb_md_separate -- 
common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:29:14.745 20:43:06 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # break 00:29:14.745 20:43:06 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:29:14.745 20:43:06 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:29:14.745 20:43:06 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:29:14.745 1+0 records in 00:29:14.745 1+0 records out 00:29:14.745 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000306391 s, 13.4 MB/s 00:29:14.745 20:43:06 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:14.745 20:43:06 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # size=4096 00:29:14.745 20:43:06 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:14.745 20:43:06 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:29:14.745 20:43:06 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # return 0 00:29:14.745 20:43:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:29:14.745 20:43:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:29:14.745 20:43:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:29:14.745 20:43:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:29:14.745 20:43:07 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:29:14.745 20:43:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:29:14.745 20:43:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # local nbd_list 00:29:14.745 20:43:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@51 -- # local i 00:29:14.745 20:43:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:29:14.745 20:43:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:29:15.004 20:43:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:29:15.004 20:43:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:29:15.004 20:43:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:29:15.004 20:43:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:29:15.004 20:43:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:29:15.004 20:43:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:29:15.004 20:43:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:29:15.004 20:43:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:29:15.004 20:43:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:29:15.004 20:43:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
nbd_stop_disk /dev/nbd1 00:29:15.261 20:43:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:29:15.261 20:43:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:29:15.261 20:43:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:29:15.261 20:43:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:29:15.261 20:43:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:29:15.261 20:43:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:29:15.261 20:43:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:29:15.261 20:43:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:29:15.261 20:43:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:29:15.261 20:43:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:29:15.520 20:43:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:29:15.520 [2024-07-15 20:43:07.824086] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:29:15.520 [2024-07-15 20:43:07.824137] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:15.520 [2024-07-15 20:43:07.824162] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d1c110 00:29:15.520 [2024-07-15 20:43:07.824176] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:15.520 [2024-07-15 20:43:07.825661] 
vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:15.520 [2024-07-15 20:43:07.825691] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:29:15.520 [2024-07-15 20:43:07.825748] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:29:15.520 [2024-07-15 20:43:07.825777] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:29:15.520 [2024-07-15 20:43:07.825876] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:29:15.520 spare 00:29:15.520 20:43:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:29:15.520 20:43:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:15.520 20:43:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:15.520 20:43:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:15.520 20:43:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:15.520 20:43:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:15.520 20:43:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:15.520 20:43:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:15.520 20:43:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:15.520 20:43:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:15.520 20:43:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:15.520 20:43:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:15.777 [2024-07-15 20:43:07.926199] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1be8620 00:29:15.777 [2024-07-15 20:43:07.926216] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:29:15.777 [2024-07-15 20:43:07.926292] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1cd5660 00:29:15.777 [2024-07-15 20:43:07.926418] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1be8620 00:29:15.777 [2024-07-15 20:43:07.926428] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1be8620 00:29:15.777 [2024-07-15 20:43:07.926502] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:15.777 20:43:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:15.777 "name": "raid_bdev1", 00:29:15.777 "uuid": "9b7fbf89-575a-4a3b-8065-70eff8ec4bca", 00:29:15.777 "strip_size_kb": 0, 00:29:15.777 "state": "online", 00:29:15.777 "raid_level": "raid1", 00:29:15.777 "superblock": true, 00:29:15.777 "num_base_bdevs": 2, 00:29:15.777 "num_base_bdevs_discovered": 2, 00:29:15.777 "num_base_bdevs_operational": 2, 00:29:15.777 "base_bdevs_list": [ 00:29:15.777 { 00:29:15.777 "name": "spare", 00:29:15.777 "uuid": "cdbd1a19-0aa7-5a09-becd-696a86f81e3f", 00:29:15.777 "is_configured": true, 00:29:15.777 "data_offset": 256, 00:29:15.777 "data_size": 7936 00:29:15.777 }, 00:29:15.777 { 00:29:15.777 "name": "BaseBdev2", 00:29:15.777 "uuid": "c2928c0b-595a-547f-b50a-70ea7f778fe7", 00:29:15.777 "is_configured": true, 00:29:15.777 "data_offset": 256, 00:29:15.777 "data_size": 7936 00:29:15.777 } 00:29:15.777 ] 00:29:15.777 }' 00:29:15.777 20:43:08 bdev_raid.raid_rebuild_test_sb_md_separate -- 
bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:15.777 20:43:08 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:16.341 20:43:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:29:16.341 20:43:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:16.341 20:43:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:29:16.341 20:43:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:29:16.341 20:43:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:16.341 20:43:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:16.341 20:43:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:16.599 20:43:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:16.599 "name": "raid_bdev1", 00:29:16.599 "uuid": "9b7fbf89-575a-4a3b-8065-70eff8ec4bca", 00:29:16.599 "strip_size_kb": 0, 00:29:16.599 "state": "online", 00:29:16.599 "raid_level": "raid1", 00:29:16.599 "superblock": true, 00:29:16.599 "num_base_bdevs": 2, 00:29:16.599 "num_base_bdevs_discovered": 2, 00:29:16.599 "num_base_bdevs_operational": 2, 00:29:16.599 "base_bdevs_list": [ 00:29:16.599 { 00:29:16.599 "name": "spare", 00:29:16.599 "uuid": "cdbd1a19-0aa7-5a09-becd-696a86f81e3f", 00:29:16.599 "is_configured": true, 00:29:16.599 "data_offset": 256, 00:29:16.599 "data_size": 7936 00:29:16.599 }, 00:29:16.599 { 00:29:16.599 "name": "BaseBdev2", 00:29:16.599 "uuid": "c2928c0b-595a-547f-b50a-70ea7f778fe7", 00:29:16.599 "is_configured": true, 00:29:16.599 
"data_offset": 256, 00:29:16.599 "data_size": 7936 00:29:16.599 } 00:29:16.599 ] 00:29:16.599 }' 00:29:16.599 20:43:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:16.857 20:43:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:29:16.857 20:43:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:16.857 20:43:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:29:16.857 20:43:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:16.857 20:43:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:29:17.115 20:43:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:29:17.115 20:43:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:29:17.373 [2024-07-15 20:43:09.524727] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:17.373 20:43:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:17.373 20:43:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:17.373 20:43:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:17.373 20:43:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:17.373 20:43:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 
00:29:17.373 20:43:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:17.373 20:43:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:17.373 20:43:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:17.373 20:43:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:17.374 20:43:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:17.374 20:43:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:17.374 20:43:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:17.632 20:43:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:17.632 "name": "raid_bdev1", 00:29:17.632 "uuid": "9b7fbf89-575a-4a3b-8065-70eff8ec4bca", 00:29:17.632 "strip_size_kb": 0, 00:29:17.632 "state": "online", 00:29:17.632 "raid_level": "raid1", 00:29:17.632 "superblock": true, 00:29:17.632 "num_base_bdevs": 2, 00:29:17.632 "num_base_bdevs_discovered": 1, 00:29:17.632 "num_base_bdevs_operational": 1, 00:29:17.632 "base_bdevs_list": [ 00:29:17.632 { 00:29:17.632 "name": null, 00:29:17.632 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:17.632 "is_configured": false, 00:29:17.632 "data_offset": 256, 00:29:17.632 "data_size": 7936 00:29:17.632 }, 00:29:17.632 { 00:29:17.632 "name": "BaseBdev2", 00:29:17.632 "uuid": "c2928c0b-595a-547f-b50a-70ea7f778fe7", 00:29:17.632 "is_configured": true, 00:29:17.632 "data_offset": 256, 00:29:17.632 "data_size": 7936 00:29:17.632 } 00:29:17.632 ] 00:29:17.632 }' 00:29:17.632 20:43:09 bdev_raid.raid_rebuild_test_sb_md_separate -- 
bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:17.632 20:43:09 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:18.199 20:43:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:29:18.457 [2024-07-15 20:43:10.707882] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:29:18.457 [2024-07-15 20:43:10.708060] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:29:18.457 [2024-07-15 20:43:10.708078] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:29:18.457 [2024-07-15 20:43:10.708107] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:29:18.457 [2024-07-15 20:43:10.710324] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d1cab0 00:29:18.457 [2024-07-15 20:43:10.711661] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:29:18.457 20:43:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@755 -- # sleep 1 00:29:19.391 20:43:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:19.391 20:43:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:19.391 20:43:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:19.391 20:43:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:19.391 20:43:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:19.391 20:43:11 bdev_raid.raid_rebuild_test_sb_md_separate -- 
bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:19.391 20:43:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:19.650 20:43:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:19.650 "name": "raid_bdev1", 00:29:19.650 "uuid": "9b7fbf89-575a-4a3b-8065-70eff8ec4bca", 00:29:19.650 "strip_size_kb": 0, 00:29:19.650 "state": "online", 00:29:19.650 "raid_level": "raid1", 00:29:19.650 "superblock": true, 00:29:19.650 "num_base_bdevs": 2, 00:29:19.650 "num_base_bdevs_discovered": 2, 00:29:19.650 "num_base_bdevs_operational": 2, 00:29:19.650 "process": { 00:29:19.650 "type": "rebuild", 00:29:19.650 "target": "spare", 00:29:19.650 "progress": { 00:29:19.650 "blocks": 2816, 00:29:19.650 "percent": 35 00:29:19.650 } 00:29:19.650 }, 00:29:19.650 "base_bdevs_list": [ 00:29:19.650 { 00:29:19.650 "name": "spare", 00:29:19.650 "uuid": "cdbd1a19-0aa7-5a09-becd-696a86f81e3f", 00:29:19.650 "is_configured": true, 00:29:19.650 "data_offset": 256, 00:29:19.650 "data_size": 7936 00:29:19.650 }, 00:29:19.650 { 00:29:19.650 "name": "BaseBdev2", 00:29:19.650 "uuid": "c2928c0b-595a-547f-b50a-70ea7f778fe7", 00:29:19.650 "is_configured": true, 00:29:19.650 "data_offset": 256, 00:29:19.650 "data_size": 7936 00:29:19.650 } 00:29:19.650 ] 00:29:19.650 }' 00:29:19.650 20:43:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:19.650 20:43:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:29:19.650 20:43:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:19.650 20:43:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:29:19.650 20:43:12 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:29:19.908 [2024-07-15 20:43:12.237371] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:20.166 [2024-07-15 20:43:12.324357] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:29:20.166 [2024-07-15 20:43:12.324412] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:20.166 [2024-07-15 20:43:12.324428] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:20.166 [2024-07-15 20:43:12.324437] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:29:20.167 20:43:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:20.167 20:43:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:20.167 20:43:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:20.167 20:43:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:20.167 20:43:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:20.167 20:43:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:20.167 20:43:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:20.167 20:43:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:20.167 20:43:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:20.167 20:43:12 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:20.167 20:43:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:20.167 20:43:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:20.425 20:43:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:20.425 "name": "raid_bdev1", 00:29:20.425 "uuid": "9b7fbf89-575a-4a3b-8065-70eff8ec4bca", 00:29:20.425 "strip_size_kb": 0, 00:29:20.425 "state": "online", 00:29:20.425 "raid_level": "raid1", 00:29:20.425 "superblock": true, 00:29:20.425 "num_base_bdevs": 2, 00:29:20.425 "num_base_bdevs_discovered": 1, 00:29:20.425 "num_base_bdevs_operational": 1, 00:29:20.425 "base_bdevs_list": [ 00:29:20.425 { 00:29:20.425 "name": null, 00:29:20.425 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:20.425 "is_configured": false, 00:29:20.425 "data_offset": 256, 00:29:20.425 "data_size": 7936 00:29:20.425 }, 00:29:20.425 { 00:29:20.425 "name": "BaseBdev2", 00:29:20.425 "uuid": "c2928c0b-595a-547f-b50a-70ea7f778fe7", 00:29:20.425 "is_configured": true, 00:29:20.425 "data_offset": 256, 00:29:20.425 "data_size": 7936 00:29:20.425 } 00:29:20.425 ] 00:29:20.425 }' 00:29:20.425 20:43:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:20.425 20:43:12 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:20.991 20:43:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:29:21.250 [2024-07-15 20:43:13.462902] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:29:21.250 
[2024-07-15 20:43:13.462967] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:21.250 [2024-07-15 20:43:13.462999] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1bea570 00:29:21.250 [2024-07-15 20:43:13.463013] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:21.250 [2024-07-15 20:43:13.463248] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:21.250 [2024-07-15 20:43:13.463265] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:29:21.250 [2024-07-15 20:43:13.463325] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:29:21.250 [2024-07-15 20:43:13.463338] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:29:21.250 [2024-07-15 20:43:13.463349] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:29:21.250 [2024-07-15 20:43:13.463367] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:29:21.250 [2024-07-15 20:43:13.465607] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1bea800 00:29:21.250 [2024-07-15 20:43:13.466959] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:29:21.250 spare 00:29:21.250 20:43:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@762 -- # sleep 1 00:29:22.183 20:43:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:22.183 20:43:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:22.183 20:43:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:22.183 20:43:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:22.183 20:43:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:22.183 20:43:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:22.183 20:43:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:22.440 20:43:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:22.440 "name": "raid_bdev1", 00:29:22.441 "uuid": "9b7fbf89-575a-4a3b-8065-70eff8ec4bca", 00:29:22.441 "strip_size_kb": 0, 00:29:22.441 "state": "online", 00:29:22.441 "raid_level": "raid1", 00:29:22.441 "superblock": true, 00:29:22.441 "num_base_bdevs": 2, 00:29:22.441 "num_base_bdevs_discovered": 2, 00:29:22.441 "num_base_bdevs_operational": 2, 00:29:22.441 "process": { 00:29:22.441 "type": 
"rebuild", 00:29:22.441 "target": "spare", 00:29:22.441 "progress": { 00:29:22.441 "blocks": 3072, 00:29:22.441 "percent": 38 00:29:22.441 } 00:29:22.441 }, 00:29:22.441 "base_bdevs_list": [ 00:29:22.441 { 00:29:22.441 "name": "spare", 00:29:22.441 "uuid": "cdbd1a19-0aa7-5a09-becd-696a86f81e3f", 00:29:22.441 "is_configured": true, 00:29:22.441 "data_offset": 256, 00:29:22.441 "data_size": 7936 00:29:22.441 }, 00:29:22.441 { 00:29:22.441 "name": "BaseBdev2", 00:29:22.441 "uuid": "c2928c0b-595a-547f-b50a-70ea7f778fe7", 00:29:22.441 "is_configured": true, 00:29:22.441 "data_offset": 256, 00:29:22.441 "data_size": 7936 00:29:22.441 } 00:29:22.441 ] 00:29:22.441 }' 00:29:22.441 20:43:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:22.441 20:43:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:29:22.441 20:43:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:22.698 20:43:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:29:22.698 20:43:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:29:22.698 [2024-07-15 20:43:15.069422] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:22.957 [2024-07-15 20:43:15.079838] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:29:22.957 [2024-07-15 20:43:15.079893] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:22.957 [2024-07-15 20:43:15.079910] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:22.957 [2024-07-15 20:43:15.079919] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: 
No such device 00:29:22.957 20:43:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:22.957 20:43:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:22.957 20:43:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:22.957 20:43:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:22.957 20:43:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:22.957 20:43:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:22.957 20:43:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:22.957 20:43:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:22.957 20:43:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:22.957 20:43:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:22.957 20:43:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:22.957 20:43:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:23.215 20:43:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:23.215 "name": "raid_bdev1", 00:29:23.215 "uuid": "9b7fbf89-575a-4a3b-8065-70eff8ec4bca", 00:29:23.215 "strip_size_kb": 0, 00:29:23.215 "state": "online", 00:29:23.215 "raid_level": "raid1", 00:29:23.215 "superblock": true, 00:29:23.215 "num_base_bdevs": 2, 00:29:23.215 
"num_base_bdevs_discovered": 1, 00:29:23.215 "num_base_bdevs_operational": 1, 00:29:23.215 "base_bdevs_list": [ 00:29:23.215 { 00:29:23.215 "name": null, 00:29:23.215 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:23.215 "is_configured": false, 00:29:23.215 "data_offset": 256, 00:29:23.215 "data_size": 7936 00:29:23.215 }, 00:29:23.215 { 00:29:23.215 "name": "BaseBdev2", 00:29:23.215 "uuid": "c2928c0b-595a-547f-b50a-70ea7f778fe7", 00:29:23.215 "is_configured": true, 00:29:23.215 "data_offset": 256, 00:29:23.215 "data_size": 7936 00:29:23.215 } 00:29:23.215 ] 00:29:23.215 }' 00:29:23.215 20:43:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:23.215 20:43:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:23.781 20:43:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:29:23.781 20:43:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:23.781 20:43:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:29:23.781 20:43:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:29:23.781 20:43:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:23.781 20:43:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:23.781 20:43:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:24.039 20:43:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:24.039 "name": "raid_bdev1", 00:29:24.039 "uuid": "9b7fbf89-575a-4a3b-8065-70eff8ec4bca", 00:29:24.039 
"strip_size_kb": 0, 00:29:24.039 "state": "online", 00:29:24.039 "raid_level": "raid1", 00:29:24.039 "superblock": true, 00:29:24.039 "num_base_bdevs": 2, 00:29:24.039 "num_base_bdevs_discovered": 1, 00:29:24.039 "num_base_bdevs_operational": 1, 00:29:24.039 "base_bdevs_list": [ 00:29:24.039 { 00:29:24.039 "name": null, 00:29:24.039 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:24.039 "is_configured": false, 00:29:24.039 "data_offset": 256, 00:29:24.039 "data_size": 7936 00:29:24.039 }, 00:29:24.039 { 00:29:24.039 "name": "BaseBdev2", 00:29:24.039 "uuid": "c2928c0b-595a-547f-b50a-70ea7f778fe7", 00:29:24.039 "is_configured": true, 00:29:24.039 "data_offset": 256, 00:29:24.039 "data_size": 7936 00:29:24.039 } 00:29:24.039 ] 00:29:24.039 }' 00:29:24.039 20:43:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:24.039 20:43:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:29:24.039 20:43:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:24.039 20:43:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:29:24.039 20:43:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:29:24.297 20:43:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:29:24.555 [2024-07-15 20:43:16.755615] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:29:24.555 [2024-07-15 20:43:16.755666] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:24.555 [2024-07-15 20:43:16.755692] 
vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b82900 00:29:24.555 [2024-07-15 20:43:16.755704] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:24.555 [2024-07-15 20:43:16.755906] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:24.555 [2024-07-15 20:43:16.756163] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:29:24.555 [2024-07-15 20:43:16.756215] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:29:24.555 [2024-07-15 20:43:16.756228] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:29:24.555 [2024-07-15 20:43:16.756239] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:29:24.555 BaseBdev1 00:29:24.555 20:43:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@773 -- # sleep 1 00:29:25.488 20:43:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:25.488 20:43:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:25.488 20:43:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:25.488 20:43:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:25.488 20:43:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:25.488 20:43:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:25.488 20:43:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:25.488 20:43:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 
-- # local num_base_bdevs 00:29:25.488 20:43:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:25.488 20:43:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:25.488 20:43:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:25.488 20:43:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:25.746 20:43:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:25.746 "name": "raid_bdev1", 00:29:25.746 "uuid": "9b7fbf89-575a-4a3b-8065-70eff8ec4bca", 00:29:25.746 "strip_size_kb": 0, 00:29:25.746 "state": "online", 00:29:25.746 "raid_level": "raid1", 00:29:25.746 "superblock": true, 00:29:25.746 "num_base_bdevs": 2, 00:29:25.746 "num_base_bdevs_discovered": 1, 00:29:25.746 "num_base_bdevs_operational": 1, 00:29:25.746 "base_bdevs_list": [ 00:29:25.746 { 00:29:25.746 "name": null, 00:29:25.746 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:25.746 "is_configured": false, 00:29:25.746 "data_offset": 256, 00:29:25.746 "data_size": 7936 00:29:25.746 }, 00:29:25.746 { 00:29:25.746 "name": "BaseBdev2", 00:29:25.746 "uuid": "c2928c0b-595a-547f-b50a-70ea7f778fe7", 00:29:25.746 "is_configured": true, 00:29:25.746 "data_offset": 256, 00:29:25.746 "data_size": 7936 00:29:25.746 } 00:29:25.746 ] 00:29:25.746 }' 00:29:25.746 20:43:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:25.746 20:43:18 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:26.312 20:43:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:29:26.312 20:43:18 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:26.312 20:43:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:29:26.312 20:43:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:29:26.312 20:43:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:26.312 20:43:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:26.312 20:43:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:26.569 20:43:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:26.569 "name": "raid_bdev1", 00:29:26.569 "uuid": "9b7fbf89-575a-4a3b-8065-70eff8ec4bca", 00:29:26.569 "strip_size_kb": 0, 00:29:26.569 "state": "online", 00:29:26.569 "raid_level": "raid1", 00:29:26.569 "superblock": true, 00:29:26.569 "num_base_bdevs": 2, 00:29:26.569 "num_base_bdevs_discovered": 1, 00:29:26.569 "num_base_bdevs_operational": 1, 00:29:26.569 "base_bdevs_list": [ 00:29:26.569 { 00:29:26.569 "name": null, 00:29:26.569 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:26.569 "is_configured": false, 00:29:26.569 "data_offset": 256, 00:29:26.569 "data_size": 7936 00:29:26.569 }, 00:29:26.569 { 00:29:26.569 "name": "BaseBdev2", 00:29:26.569 "uuid": "c2928c0b-595a-547f-b50a-70ea7f778fe7", 00:29:26.569 "is_configured": true, 00:29:26.569 "data_offset": 256, 00:29:26.569 "data_size": 7936 00:29:26.569 } 00:29:26.569 ] 00:29:26.569 }' 00:29:26.569 20:43:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:26.569 20:43:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- 
# [[ none == \n\o\n\e ]] 00:29:26.569 20:43:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:26.826 20:43:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:29:26.826 20:43:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:29:26.826 20:43:18 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@648 -- # local es=0 00:29:26.826 20:43:18 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:29:26.826 20:43:18 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:26.826 20:43:18 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:29:26.826 20:43:18 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:26.826 20:43:18 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:29:26.826 20:43:18 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:26.826 20:43:18 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:29:26.826 20:43:18 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:26.826 20:43:18 
bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:29:26.826 20:43:18 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:29:27.082 [2024-07-15 20:43:19.458812] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:29:27.082 [2024-07-15 20:43:19.458956] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:29:27.082 [2024-07-15 20:43:19.458973] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:29:27.339 request: 00:29:27.339 { 00:29:27.339 "base_bdev": "BaseBdev1", 00:29:27.339 "raid_bdev": "raid_bdev1", 00:29:27.339 "method": "bdev_raid_add_base_bdev", 00:29:27.339 "req_id": 1 00:29:27.339 } 00:29:27.339 Got JSON-RPC error response 00:29:27.339 response: 00:29:27.339 { 00:29:27.339 "code": -22, 00:29:27.339 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:29:27.339 } 00:29:27.339 20:43:19 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@651 -- # es=1 00:29:27.339 20:43:19 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:29:27.339 20:43:19 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:29:27.339 20:43:19 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:29:27.339 20:43:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@777 -- # sleep 1 00:29:28.269 20:43:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:28.269 20:43:20 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:28.269 20:43:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:28.269 20:43:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:28.269 20:43:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:28.269 20:43:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:28.269 20:43:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:28.269 20:43:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:28.269 20:43:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:28.269 20:43:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:28.269 20:43:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:28.269 20:43:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:28.527 20:43:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:28.527 "name": "raid_bdev1", 00:29:28.527 "uuid": "9b7fbf89-575a-4a3b-8065-70eff8ec4bca", 00:29:28.527 "strip_size_kb": 0, 00:29:28.527 "state": "online", 00:29:28.527 "raid_level": "raid1", 00:29:28.527 "superblock": true, 00:29:28.527 "num_base_bdevs": 2, 00:29:28.527 "num_base_bdevs_discovered": 1, 00:29:28.527 "num_base_bdevs_operational": 1, 00:29:28.527 "base_bdevs_list": [ 00:29:28.527 { 00:29:28.527 "name": null, 00:29:28.527 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:29:28.527 "is_configured": false, 00:29:28.527 "data_offset": 256, 00:29:28.527 "data_size": 7936 00:29:28.527 }, 00:29:28.527 { 00:29:28.527 "name": "BaseBdev2", 00:29:28.527 "uuid": "c2928c0b-595a-547f-b50a-70ea7f778fe7", 00:29:28.527 "is_configured": true, 00:29:28.527 "data_offset": 256, 00:29:28.527 "data_size": 7936 00:29:28.527 } 00:29:28.527 ] 00:29:28.527 }' 00:29:28.527 20:43:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:28.527 20:43:20 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:29.110 20:43:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:29:29.110 20:43:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:29.110 20:43:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:29:29.110 20:43:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:29:29.110 20:43:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:29.110 20:43:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:29.110 20:43:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:29.368 20:43:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:29.368 "name": "raid_bdev1", 00:29:29.368 "uuid": "9b7fbf89-575a-4a3b-8065-70eff8ec4bca", 00:29:29.368 "strip_size_kb": 0, 00:29:29.368 "state": "online", 00:29:29.368 "raid_level": "raid1", 00:29:29.368 "superblock": true, 00:29:29.368 "num_base_bdevs": 2, 00:29:29.368 
"num_base_bdevs_discovered": 1, 00:29:29.368 "num_base_bdevs_operational": 1, 00:29:29.368 "base_bdevs_list": [ 00:29:29.368 { 00:29:29.368 "name": null, 00:29:29.368 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:29.368 "is_configured": false, 00:29:29.368 "data_offset": 256, 00:29:29.368 "data_size": 7936 00:29:29.368 }, 00:29:29.368 { 00:29:29.368 "name": "BaseBdev2", 00:29:29.368 "uuid": "c2928c0b-595a-547f-b50a-70ea7f778fe7", 00:29:29.368 "is_configured": true, 00:29:29.368 "data_offset": 256, 00:29:29.368 "data_size": 7936 00:29:29.369 } 00:29:29.369 ] 00:29:29.369 }' 00:29:29.369 20:43:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:29.369 20:43:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:29:29.369 20:43:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:29.369 20:43:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:29:29.369 20:43:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@782 -- # killprocess 1505496 00:29:29.369 20:43:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@948 -- # '[' -z 1505496 ']' 00:29:29.369 20:43:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@952 -- # kill -0 1505496 00:29:29.369 20:43:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@953 -- # uname 00:29:29.369 20:43:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:29:29.369 20:43:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1505496 00:29:29.369 20:43:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:29:29.369 20:43:21 bdev_raid.raid_rebuild_test_sb_md_separate -- 
common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:29:29.369 20:43:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1505496' 00:29:29.369 killing process with pid 1505496 00:29:29.369 20:43:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@967 -- # kill 1505496 00:29:29.369 Received shutdown signal, test time was about 60.000000 seconds 00:29:29.369 00:29:29.369 Latency(us) 00:29:29.369 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:29.369 =================================================================================================================== 00:29:29.369 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:29:29.369 [2024-07-15 20:43:21.698840] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:29:29.369 [2024-07-15 20:43:21.698943] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:29.369 [2024-07-15 20:43:21.698990] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:29:29.369 [2024-07-15 20:43:21.699003] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1be8620 name raid_bdev1, state offline 00:29:29.369 20:43:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@972 -- # wait 1505496 00:29:29.369 [2024-07-15 20:43:21.732267] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:29:29.627 20:43:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@784 -- # return 0 00:29:29.627 00:29:29.627 real 0m32.564s 00:29:29.627 user 0m51.211s 00:29:29.627 sys 0m5.184s 00:29:29.627 20:43:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@1124 -- # xtrace_disable 00:29:29.627 20:43:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:29.627 ************************************ 00:29:29.627 
END TEST raid_rebuild_test_sb_md_separate 00:29:29.627 ************************************ 00:29:29.627 20:43:21 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:29:29.627 20:43:21 bdev_raid -- bdev/bdev_raid.sh@911 -- # base_malloc_params='-m 32 -i' 00:29:29.627 20:43:21 bdev_raid -- bdev/bdev_raid.sh@912 -- # run_test raid_state_function_test_sb_md_interleaved raid_state_function_test raid1 2 true 00:29:29.627 20:43:21 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:29:29.627 20:43:21 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:29:29.627 20:43:21 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:29:29.886 ************************************ 00:29:29.886 START TEST raid_state_function_test_sb_md_interleaved 00:29:29.886 ************************************ 00:29:29.886 20:43:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 true 00:29:29.886 20:43:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:29:29.886 20:43:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:29:29.886 20:43:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:29:29.886 20:43:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:29:29.886 20:43:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:29:29.886 20:43:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:29:29.886 20:43:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:29:29.886 20:43:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:29:29.886 20:43:22 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:29:29.886 20:43:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:29:29.886 20:43:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:29:29.886 20:43:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:29:29.886 20:43:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:29:29.886 20:43:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:29:29.886 20:43:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:29:29.886 20:43:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@226 -- # local strip_size 00:29:29.886 20:43:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:29:29.886 20:43:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:29:29.886 20:43:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:29:29.887 20:43:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:29:29.887 20:43:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:29:29.887 20:43:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:29:29.887 20:43:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@244 -- # raid_pid=1510151 00:29:29.887 20:43:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@245 -- # 
echo 'Process raid pid: 1510151' 00:29:29.887 Process raid pid: 1510151 00:29:29.887 20:43:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:29:29.887 20:43:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@246 -- # waitforlisten 1510151 /var/tmp/spdk-raid.sock 00:29:29.887 20:43:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@829 -- # '[' -z 1510151 ']' 00:29:29.887 20:43:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:29:29.887 20:43:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@834 -- # local max_retries=100 00:29:29.887 20:43:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:29:29.887 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:29:29.887 20:43:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@838 -- # xtrace_disable 00:29:29.887 20:43:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:29.887 [2024-07-15 20:43:22.092143] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
00:29:29.887 [2024-07-15 20:43:22.092210] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:29:29.887 [2024-07-15 20:43:22.235758] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:30.145 [2024-07-15 20:43:22.374378] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:30.145 [2024-07-15 20:43:22.440512] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:29:30.145 [2024-07-15 20:43:22.440552] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:29:30.747 20:43:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:29:30.747 20:43:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@862 -- # return 0 00:29:30.747 20:43:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:29:31.313 [2024-07-15 20:43:23.604786] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:29:31.313 [2024-07-15 20:43:23.604831] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:29:31.313 [2024-07-15 20:43:23.604842] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:29:31.313 [2024-07-15 20:43:23.604854] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:29:31.313 20:43:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:29:31.313 20:43:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:29:31.313 20:43:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:29:31.313 20:43:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:31.313 20:43:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:31.313 20:43:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:31.313 20:43:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:31.314 20:43:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:31.314 20:43:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:31.314 20:43:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:31.314 20:43:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:31.314 20:43:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:29:31.880 20:43:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:31.880 "name": "Existed_Raid", 00:29:31.880 "uuid": "5936348f-cd5c-4b34-8708-31493b78b2b0", 00:29:31.880 "strip_size_kb": 0, 00:29:31.880 "state": "configuring", 00:29:31.880 "raid_level": "raid1", 00:29:31.880 "superblock": true, 00:29:31.880 "num_base_bdevs": 2, 00:29:31.880 "num_base_bdevs_discovered": 0, 00:29:31.880 "num_base_bdevs_operational": 2, 00:29:31.880 "base_bdevs_list": [ 00:29:31.880 { 
00:29:31.880 "name": "BaseBdev1", 00:29:31.880 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:31.880 "is_configured": false, 00:29:31.880 "data_offset": 0, 00:29:31.880 "data_size": 0 00:29:31.880 }, 00:29:31.880 { 00:29:31.880 "name": "BaseBdev2", 00:29:31.880 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:31.880 "is_configured": false, 00:29:31.880 "data_offset": 0, 00:29:31.880 "data_size": 0 00:29:31.880 } 00:29:31.880 ] 00:29:31.880 }' 00:29:31.880 20:43:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:31.880 20:43:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:32.446 20:43:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:29:32.705 [2024-07-15 20:43:24.984271] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:29:32.705 [2024-07-15 20:43:24.984304] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x17e9a80 name Existed_Raid, state configuring 00:29:32.705 20:43:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:29:32.965 [2024-07-15 20:43:25.232965] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:29:32.965 [2024-07-15 20:43:25.232995] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:29:32.965 [2024-07-15 20:43:25.233005] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:29:32.965 [2024-07-15 20:43:25.233016] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:29:32.965 
20:43:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev1 00:29:33.236 [2024-07-15 20:43:25.491832] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:29:33.236 BaseBdev1 00:29:33.236 20:43:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:29:33.236 20:43:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:29:33.237 20:43:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:29:33.237 20:43:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@899 -- # local i 00:29:33.237 20:43:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:29:33.237 20:43:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:29:33.237 20:43:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:29:33.495 20:43:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:29:33.753 [ 00:29:33.753 { 00:29:33.753 "name": "BaseBdev1", 00:29:33.753 "aliases": [ 00:29:33.753 "ede359e8-7f1a-4f2f-8140-2a6e9014d60c" 00:29:33.753 ], 00:29:33.753 "product_name": "Malloc disk", 00:29:33.753 "block_size": 4128, 00:29:33.753 "num_blocks": 8192, 00:29:33.753 "uuid": "ede359e8-7f1a-4f2f-8140-2a6e9014d60c", 00:29:33.753 "md_size": 32, 00:29:33.753 
"md_interleave": true, 00:29:33.753 "dif_type": 0, 00:29:33.753 "assigned_rate_limits": { 00:29:33.753 "rw_ios_per_sec": 0, 00:29:33.753 "rw_mbytes_per_sec": 0, 00:29:33.753 "r_mbytes_per_sec": 0, 00:29:33.753 "w_mbytes_per_sec": 0 00:29:33.753 }, 00:29:33.753 "claimed": true, 00:29:33.753 "claim_type": "exclusive_write", 00:29:33.753 "zoned": false, 00:29:33.753 "supported_io_types": { 00:29:33.753 "read": true, 00:29:33.753 "write": true, 00:29:33.753 "unmap": true, 00:29:33.753 "flush": true, 00:29:33.753 "reset": true, 00:29:33.753 "nvme_admin": false, 00:29:33.753 "nvme_io": false, 00:29:33.753 "nvme_io_md": false, 00:29:33.753 "write_zeroes": true, 00:29:33.753 "zcopy": true, 00:29:33.753 "get_zone_info": false, 00:29:33.753 "zone_management": false, 00:29:33.753 "zone_append": false, 00:29:33.753 "compare": false, 00:29:33.753 "compare_and_write": false, 00:29:33.753 "abort": true, 00:29:33.753 "seek_hole": false, 00:29:33.753 "seek_data": false, 00:29:33.753 "copy": true, 00:29:33.753 "nvme_iov_md": false 00:29:33.753 }, 00:29:33.753 "memory_domains": [ 00:29:33.753 { 00:29:33.753 "dma_device_id": "system", 00:29:33.753 "dma_device_type": 1 00:29:33.753 }, 00:29:33.753 { 00:29:33.753 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:33.753 "dma_device_type": 2 00:29:33.753 } 00:29:33.753 ], 00:29:33.753 "driver_specific": {} 00:29:33.753 } 00:29:33.753 ] 00:29:33.753 20:43:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@905 -- # return 0 00:29:33.753 20:43:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:29:33.753 20:43:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:29:33.753 20:43:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:29:33.753 20:43:25 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:33.753 20:43:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:33.753 20:43:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:33.753 20:43:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:33.753 20:43:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:33.753 20:43:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:33.753 20:43:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:33.753 20:43:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:33.753 20:43:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:29:34.011 20:43:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:34.011 "name": "Existed_Raid", 00:29:34.011 "uuid": "1c74d620-6d99-4c58-a1d8-800f180fb693", 00:29:34.011 "strip_size_kb": 0, 00:29:34.011 "state": "configuring", 00:29:34.011 "raid_level": "raid1", 00:29:34.011 "superblock": true, 00:29:34.011 "num_base_bdevs": 2, 00:29:34.011 "num_base_bdevs_discovered": 1, 00:29:34.011 "num_base_bdevs_operational": 2, 00:29:34.011 "base_bdevs_list": [ 00:29:34.011 { 00:29:34.011 "name": "BaseBdev1", 00:29:34.011 "uuid": "ede359e8-7f1a-4f2f-8140-2a6e9014d60c", 00:29:34.011 "is_configured": true, 00:29:34.011 "data_offset": 256, 00:29:34.011 "data_size": 7936 00:29:34.011 }, 
00:29:34.011 { 00:29:34.011 "name": "BaseBdev2", 00:29:34.011 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:34.011 "is_configured": false, 00:29:34.011 "data_offset": 0, 00:29:34.011 "data_size": 0 00:29:34.011 } 00:29:34.011 ] 00:29:34.011 }' 00:29:34.011 20:43:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:34.011 20:43:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:34.576 20:43:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:29:34.832 [2024-07-15 20:43:27.019945] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:29:34.832 [2024-07-15 20:43:27.019994] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x17e9350 name Existed_Raid, state configuring 00:29:34.832 20:43:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:29:35.089 [2024-07-15 20:43:27.264632] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:29:35.089 [2024-07-15 20:43:27.266202] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:29:35.089 [2024-07-15 20:43:27.266238] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:29:35.089 20:43:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:29:35.089 20:43:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:29:35.089 20:43:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@266 -- # 
verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:29:35.089 20:43:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:29:35.089 20:43:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:29:35.089 20:43:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:35.089 20:43:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:35.089 20:43:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:35.089 20:43:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:35.089 20:43:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:35.089 20:43:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:35.089 20:43:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:35.089 20:43:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:35.089 20:43:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:29:35.347 20:43:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:35.347 "name": "Existed_Raid", 00:29:35.347 "uuid": "756492dd-1648-479e-93bb-85fde97f4d90", 00:29:35.347 "strip_size_kb": 0, 00:29:35.347 "state": "configuring", 00:29:35.347 "raid_level": "raid1", 00:29:35.347 "superblock": true, 00:29:35.347 "num_base_bdevs": 2, 
00:29:35.347 "num_base_bdevs_discovered": 1, 00:29:35.347 "num_base_bdevs_operational": 2, 00:29:35.347 "base_bdevs_list": [ 00:29:35.347 { 00:29:35.347 "name": "BaseBdev1", 00:29:35.347 "uuid": "ede359e8-7f1a-4f2f-8140-2a6e9014d60c", 00:29:35.347 "is_configured": true, 00:29:35.347 "data_offset": 256, 00:29:35.347 "data_size": 7936 00:29:35.347 }, 00:29:35.347 { 00:29:35.347 "name": "BaseBdev2", 00:29:35.347 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:35.347 "is_configured": false, 00:29:35.347 "data_offset": 0, 00:29:35.347 "data_size": 0 00:29:35.347 } 00:29:35.347 ] 00:29:35.347 }' 00:29:35.347 20:43:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:35.347 20:43:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:35.910 20:43:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev2 00:29:36.167 [2024-07-15 20:43:28.359076] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:29:36.167 [2024-07-15 20:43:28.359217] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x17eb180 00:29:36.167 [2024-07-15 20:43:28.359231] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:29:36.167 [2024-07-15 20:43:28.359293] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17eb150 00:29:36.167 [2024-07-15 20:43:28.359369] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x17eb180 00:29:36.167 [2024-07-15 20:43:28.359379] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x17eb180 00:29:36.167 [2024-07-15 20:43:28.359435] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:36.167 BaseBdev2 
00:29:36.167 20:43:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:29:36.167 20:43:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:29:36.167 20:43:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:29:36.167 20:43:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@899 -- # local i 00:29:36.167 20:43:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:29:36.167 20:43:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:29:36.167 20:43:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:29:36.424 20:43:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:29:36.682 [ 00:29:36.682 { 00:29:36.682 "name": "BaseBdev2", 00:29:36.682 "aliases": [ 00:29:36.682 "d992b637-e94e-48a9-bc8b-fa787a0407aa" 00:29:36.682 ], 00:29:36.682 "product_name": "Malloc disk", 00:29:36.682 "block_size": 4128, 00:29:36.682 "num_blocks": 8192, 00:29:36.682 "uuid": "d992b637-e94e-48a9-bc8b-fa787a0407aa", 00:29:36.682 "md_size": 32, 00:29:36.682 "md_interleave": true, 00:29:36.682 "dif_type": 0, 00:29:36.682 "assigned_rate_limits": { 00:29:36.682 "rw_ios_per_sec": 0, 00:29:36.682 "rw_mbytes_per_sec": 0, 00:29:36.682 "r_mbytes_per_sec": 0, 00:29:36.682 "w_mbytes_per_sec": 0 00:29:36.682 }, 00:29:36.682 "claimed": true, 00:29:36.682 "claim_type": "exclusive_write", 00:29:36.682 "zoned": false, 00:29:36.682 "supported_io_types": { 
00:29:36.682 "read": true, 00:29:36.682 "write": true, 00:29:36.682 "unmap": true, 00:29:36.682 "flush": true, 00:29:36.682 "reset": true, 00:29:36.682 "nvme_admin": false, 00:29:36.682 "nvme_io": false, 00:29:36.682 "nvme_io_md": false, 00:29:36.682 "write_zeroes": true, 00:29:36.682 "zcopy": true, 00:29:36.682 "get_zone_info": false, 00:29:36.682 "zone_management": false, 00:29:36.682 "zone_append": false, 00:29:36.682 "compare": false, 00:29:36.682 "compare_and_write": false, 00:29:36.682 "abort": true, 00:29:36.682 "seek_hole": false, 00:29:36.682 "seek_data": false, 00:29:36.683 "copy": true, 00:29:36.683 "nvme_iov_md": false 00:29:36.683 }, 00:29:36.683 "memory_domains": [ 00:29:36.683 { 00:29:36.683 "dma_device_id": "system", 00:29:36.683 "dma_device_type": 1 00:29:36.683 }, 00:29:36.683 { 00:29:36.683 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:36.683 "dma_device_type": 2 00:29:36.683 } 00:29:36.683 ], 00:29:36.683 "driver_specific": {} 00:29:36.683 } 00:29:36.683 ] 00:29:36.683 20:43:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@905 -- # return 0 00:29:36.683 20:43:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:29:36.683 20:43:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:29:36.683 20:43:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:29:36.683 20:43:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:29:36.683 20:43:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:36.683 20:43:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:36.683 20:43:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:36.683 20:43:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:36.683 20:43:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:36.683 20:43:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:36.683 20:43:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:36.683 20:43:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:36.683 20:43:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:36.683 20:43:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:29:36.940 20:43:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:36.940 "name": "Existed_Raid", 00:29:36.940 "uuid": "756492dd-1648-479e-93bb-85fde97f4d90", 00:29:36.940 "strip_size_kb": 0, 00:29:36.940 "state": "online", 00:29:36.940 "raid_level": "raid1", 00:29:36.940 "superblock": true, 00:29:36.940 "num_base_bdevs": 2, 00:29:36.940 "num_base_bdevs_discovered": 2, 00:29:36.940 "num_base_bdevs_operational": 2, 00:29:36.940 "base_bdevs_list": [ 00:29:36.940 { 00:29:36.940 "name": "BaseBdev1", 00:29:36.940 "uuid": "ede359e8-7f1a-4f2f-8140-2a6e9014d60c", 00:29:36.940 "is_configured": true, 00:29:36.940 "data_offset": 256, 00:29:36.940 "data_size": 7936 00:29:36.940 }, 00:29:36.940 { 00:29:36.940 "name": "BaseBdev2", 00:29:36.940 "uuid": "d992b637-e94e-48a9-bc8b-fa787a0407aa", 00:29:36.940 "is_configured": true, 00:29:36.940 "data_offset": 256, 00:29:36.940 
"data_size": 7936 00:29:36.940 } 00:29:36.940 ] 00:29:36.940 }' 00:29:36.940 20:43:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:36.940 20:43:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:37.504 20:43:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:29:37.504 20:43:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:29:37.504 20:43:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:29:37.504 20:43:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:29:37.504 20:43:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:29:37.504 20:43:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:29:37.504 20:43:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:29:37.504 20:43:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:29:37.762 [2024-07-15 20:43:29.963658] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:29:37.762 20:43:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:29:37.762 "name": "Existed_Raid", 00:29:37.762 "aliases": [ 00:29:37.762 "756492dd-1648-479e-93bb-85fde97f4d90" 00:29:37.762 ], 00:29:37.762 "product_name": "Raid Volume", 00:29:37.762 "block_size": 4128, 00:29:37.762 "num_blocks": 7936, 00:29:37.762 "uuid": "756492dd-1648-479e-93bb-85fde97f4d90", 00:29:37.762 "md_size": 32, 
00:29:37.762 "md_interleave": true, 00:29:37.762 "dif_type": 0, 00:29:37.762 "assigned_rate_limits": { 00:29:37.762 "rw_ios_per_sec": 0, 00:29:37.762 "rw_mbytes_per_sec": 0, 00:29:37.762 "r_mbytes_per_sec": 0, 00:29:37.762 "w_mbytes_per_sec": 0 00:29:37.762 }, 00:29:37.762 "claimed": false, 00:29:37.762 "zoned": false, 00:29:37.762 "supported_io_types": { 00:29:37.762 "read": true, 00:29:37.762 "write": true, 00:29:37.762 "unmap": false, 00:29:37.762 "flush": false, 00:29:37.762 "reset": true, 00:29:37.762 "nvme_admin": false, 00:29:37.762 "nvme_io": false, 00:29:37.762 "nvme_io_md": false, 00:29:37.762 "write_zeroes": true, 00:29:37.762 "zcopy": false, 00:29:37.762 "get_zone_info": false, 00:29:37.762 "zone_management": false, 00:29:37.762 "zone_append": false, 00:29:37.762 "compare": false, 00:29:37.762 "compare_and_write": false, 00:29:37.762 "abort": false, 00:29:37.762 "seek_hole": false, 00:29:37.762 "seek_data": false, 00:29:37.762 "copy": false, 00:29:37.762 "nvme_iov_md": false 00:29:37.762 }, 00:29:37.762 "memory_domains": [ 00:29:37.762 { 00:29:37.762 "dma_device_id": "system", 00:29:37.762 "dma_device_type": 1 00:29:37.762 }, 00:29:37.762 { 00:29:37.762 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:37.762 "dma_device_type": 2 00:29:37.762 }, 00:29:37.762 { 00:29:37.762 "dma_device_id": "system", 00:29:37.762 "dma_device_type": 1 00:29:37.762 }, 00:29:37.762 { 00:29:37.762 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:37.762 "dma_device_type": 2 00:29:37.762 } 00:29:37.762 ], 00:29:37.762 "driver_specific": { 00:29:37.762 "raid": { 00:29:37.762 "uuid": "756492dd-1648-479e-93bb-85fde97f4d90", 00:29:37.762 "strip_size_kb": 0, 00:29:37.762 "state": "online", 00:29:37.762 "raid_level": "raid1", 00:29:37.762 "superblock": true, 00:29:37.762 "num_base_bdevs": 2, 00:29:37.762 "num_base_bdevs_discovered": 2, 00:29:37.762 "num_base_bdevs_operational": 2, 00:29:37.762 "base_bdevs_list": [ 00:29:37.762 { 00:29:37.762 "name": "BaseBdev1", 00:29:37.762 "uuid": 
"ede359e8-7f1a-4f2f-8140-2a6e9014d60c", 00:29:37.762 "is_configured": true, 00:29:37.762 "data_offset": 256, 00:29:37.762 "data_size": 7936 00:29:37.762 }, 00:29:37.762 { 00:29:37.762 "name": "BaseBdev2", 00:29:37.762 "uuid": "d992b637-e94e-48a9-bc8b-fa787a0407aa", 00:29:37.762 "is_configured": true, 00:29:37.762 "data_offset": 256, 00:29:37.762 "data_size": 7936 00:29:37.762 } 00:29:37.762 ] 00:29:37.762 } 00:29:37.762 } 00:29:37.762 }' 00:29:37.763 20:43:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:29:37.763 20:43:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:29:37.763 BaseBdev2' 00:29:37.763 20:43:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:29:37.763 20:43:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:29:37.763 20:43:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:29:38.020 20:43:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:29:38.020 "name": "BaseBdev1", 00:29:38.020 "aliases": [ 00:29:38.020 "ede359e8-7f1a-4f2f-8140-2a6e9014d60c" 00:29:38.020 ], 00:29:38.020 "product_name": "Malloc disk", 00:29:38.020 "block_size": 4128, 00:29:38.020 "num_blocks": 8192, 00:29:38.020 "uuid": "ede359e8-7f1a-4f2f-8140-2a6e9014d60c", 00:29:38.020 "md_size": 32, 00:29:38.020 "md_interleave": true, 00:29:38.020 "dif_type": 0, 00:29:38.020 "assigned_rate_limits": { 00:29:38.020 "rw_ios_per_sec": 0, 00:29:38.020 "rw_mbytes_per_sec": 0, 00:29:38.020 "r_mbytes_per_sec": 0, 00:29:38.020 "w_mbytes_per_sec": 0 00:29:38.020 }, 00:29:38.020 "claimed": 
true, 00:29:38.020 "claim_type": "exclusive_write", 00:29:38.020 "zoned": false, 00:29:38.020 "supported_io_types": { 00:29:38.020 "read": true, 00:29:38.020 "write": true, 00:29:38.020 "unmap": true, 00:29:38.020 "flush": true, 00:29:38.020 "reset": true, 00:29:38.020 "nvme_admin": false, 00:29:38.020 "nvme_io": false, 00:29:38.020 "nvme_io_md": false, 00:29:38.020 "write_zeroes": true, 00:29:38.020 "zcopy": true, 00:29:38.020 "get_zone_info": false, 00:29:38.020 "zone_management": false, 00:29:38.020 "zone_append": false, 00:29:38.020 "compare": false, 00:29:38.020 "compare_and_write": false, 00:29:38.020 "abort": true, 00:29:38.020 "seek_hole": false, 00:29:38.020 "seek_data": false, 00:29:38.020 "copy": true, 00:29:38.020 "nvme_iov_md": false 00:29:38.020 }, 00:29:38.020 "memory_domains": [ 00:29:38.020 { 00:29:38.020 "dma_device_id": "system", 00:29:38.020 "dma_device_type": 1 00:29:38.020 }, 00:29:38.020 { 00:29:38.020 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:38.020 "dma_device_type": 2 00:29:38.020 } 00:29:38.020 ], 00:29:38.020 "driver_specific": {} 00:29:38.020 }' 00:29:38.020 20:43:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:38.020 20:43:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:38.020 20:43:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:29:38.020 20:43:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:38.020 20:43:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:38.278 20:43:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:29:38.278 20:43:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:38.278 20:43:30 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:38.278 20:43:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:29:38.278 20:43:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:38.278 20:43:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:38.278 20:43:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:29:38.278 20:43:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:29:38.278 20:43:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:29:38.278 20:43:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:29:38.536 20:43:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:29:38.536 "name": "BaseBdev2", 00:29:38.536 "aliases": [ 00:29:38.536 "d992b637-e94e-48a9-bc8b-fa787a0407aa" 00:29:38.536 ], 00:29:38.536 "product_name": "Malloc disk", 00:29:38.536 "block_size": 4128, 00:29:38.536 "num_blocks": 8192, 00:29:38.536 "uuid": "d992b637-e94e-48a9-bc8b-fa787a0407aa", 00:29:38.536 "md_size": 32, 00:29:38.536 "md_interleave": true, 00:29:38.536 "dif_type": 0, 00:29:38.536 "assigned_rate_limits": { 00:29:38.536 "rw_ios_per_sec": 0, 00:29:38.536 "rw_mbytes_per_sec": 0, 00:29:38.536 "r_mbytes_per_sec": 0, 00:29:38.536 "w_mbytes_per_sec": 0 00:29:38.536 }, 00:29:38.536 "claimed": true, 00:29:38.536 "claim_type": "exclusive_write", 00:29:38.536 "zoned": false, 00:29:38.536 "supported_io_types": { 00:29:38.536 "read": true, 00:29:38.536 "write": true, 00:29:38.536 "unmap": true, 00:29:38.536 
"flush": true, 00:29:38.536 "reset": true, 00:29:38.537 "nvme_admin": false, 00:29:38.537 "nvme_io": false, 00:29:38.537 "nvme_io_md": false, 00:29:38.537 "write_zeroes": true, 00:29:38.537 "zcopy": true, 00:29:38.537 "get_zone_info": false, 00:29:38.537 "zone_management": false, 00:29:38.537 "zone_append": false, 00:29:38.537 "compare": false, 00:29:38.537 "compare_and_write": false, 00:29:38.537 "abort": true, 00:29:38.537 "seek_hole": false, 00:29:38.537 "seek_data": false, 00:29:38.537 "copy": true, 00:29:38.537 "nvme_iov_md": false 00:29:38.537 }, 00:29:38.537 "memory_domains": [ 00:29:38.537 { 00:29:38.537 "dma_device_id": "system", 00:29:38.537 "dma_device_type": 1 00:29:38.537 }, 00:29:38.537 { 00:29:38.537 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:38.537 "dma_device_type": 2 00:29:38.537 } 00:29:38.537 ], 00:29:38.537 "driver_specific": {} 00:29:38.537 }' 00:29:38.537 20:43:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:38.537 20:43:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:38.794 20:43:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:29:38.794 20:43:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:38.794 20:43:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:38.794 20:43:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:29:38.794 20:43:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:38.794 20:43:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:38.794 20:43:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:29:38.794 20:43:31 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:38.794 20:43:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:38.794 20:43:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:29:38.794 20:43:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:29:39.052 [2024-07-15 20:43:31.387230] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:29:39.052 20:43:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@275 -- # local expected_state 00:29:39.052 20:43:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:29:39.052 20:43:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@213 -- # case $1 in 00:29:39.052 20:43:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@214 -- # return 0 00:29:39.052 20:43:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:29:39.052 20:43:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:29:39.052 20:43:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:29:39.052 20:43:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:39.052 20:43:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:39.052 20:43:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:39.052 20:43:31 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:39.052 20:43:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:39.052 20:43:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:39.052 20:43:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:39.052 20:43:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:39.052 20:43:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:39.052 20:43:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:29:39.311 20:43:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:39.311 "name": "Existed_Raid", 00:29:39.311 "uuid": "756492dd-1648-479e-93bb-85fde97f4d90", 00:29:39.311 "strip_size_kb": 0, 00:29:39.311 "state": "online", 00:29:39.311 "raid_level": "raid1", 00:29:39.311 "superblock": true, 00:29:39.311 "num_base_bdevs": 2, 00:29:39.311 "num_base_bdevs_discovered": 1, 00:29:39.311 "num_base_bdevs_operational": 1, 00:29:39.311 "base_bdevs_list": [ 00:29:39.311 { 00:29:39.311 "name": null, 00:29:39.311 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:39.311 "is_configured": false, 00:29:39.311 "data_offset": 256, 00:29:39.311 "data_size": 7936 00:29:39.311 }, 00:29:39.311 { 00:29:39.311 "name": "BaseBdev2", 00:29:39.311 "uuid": "d992b637-e94e-48a9-bc8b-fa787a0407aa", 00:29:39.311 "is_configured": true, 00:29:39.311 "data_offset": 256, 00:29:39.311 "data_size": 7936 00:29:39.311 } 00:29:39.311 ] 00:29:39.311 }' 00:29:39.311 
20:43:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:39.311 20:43:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:39.876 20:43:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:29:39.876 20:43:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:29:39.876 20:43:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:39.876 20:43:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:29:40.135 20:43:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:29:40.135 20:43:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:29:40.135 20:43:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:29:40.393 [2024-07-15 20:43:32.716868] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:29:40.393 [2024-07-15 20:43:32.716966] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:29:40.393 [2024-07-15 20:43:32.728319] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:40.393 [2024-07-15 20:43:32.728359] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:29:40.393 [2024-07-15 20:43:32.728371] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x17eb180 name Existed_Raid, state offline 00:29:40.393 20:43:32 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:29:40.393 20:43:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:29:40.393 20:43:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:40.393 20:43:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:29:40.651 20:43:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:29:40.651 20:43:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:29:40.651 20:43:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:29:40.651 20:43:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@341 -- # killprocess 1510151 00:29:40.651 20:43:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@948 -- # '[' -z 1510151 ']' 00:29:40.651 20:43:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@952 -- # kill -0 1510151 00:29:40.651 20:43:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # uname 00:29:40.651 20:43:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:29:40.651 20:43:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1510151 00:29:40.910 20:43:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:29:40.910 20:43:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 
00:29:40.910 20:43:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1510151' 00:29:40.910 killing process with pid 1510151 00:29:40.910 20:43:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@967 -- # kill 1510151 00:29:40.910 [2024-07-15 20:43:33.046240] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:29:40.910 20:43:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@972 -- # wait 1510151 00:29:40.911 [2024-07-15 20:43:33.047171] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:29:40.911 20:43:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@343 -- # return 0 00:29:40.911 00:29:40.911 real 0m11.254s 00:29:40.911 user 0m20.021s 00:29:40.911 sys 0m2.137s 00:29:40.911 20:43:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@1124 -- # xtrace_disable 00:29:40.911 20:43:33 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:40.911 ************************************ 00:29:40.911 END TEST raid_state_function_test_sb_md_interleaved 00:29:40.911 ************************************ 00:29:41.168 20:43:33 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:29:41.168 20:43:33 bdev_raid -- bdev/bdev_raid.sh@913 -- # run_test raid_superblock_test_md_interleaved raid_superblock_test raid1 2 00:29:41.169 20:43:33 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:29:41.169 20:43:33 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:29:41.169 20:43:33 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:29:41.169 ************************************ 00:29:41.169 START TEST raid_superblock_test_md_interleaved 00:29:41.169 ************************************ 00:29:41.169 20:43:33 bdev_raid.raid_superblock_test_md_interleaved -- 
common/autotest_common.sh@1123 -- # raid_superblock_test raid1 2 00:29:41.169 20:43:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:29:41.169 20:43:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:29:41.169 20:43:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:29:41.169 20:43:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:29:41.169 20:43:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:29:41.169 20:43:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:29:41.169 20:43:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:29:41.169 20:43:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:29:41.169 20:43:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:29:41.169 20:43:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@398 -- # local strip_size 00:29:41.169 20:43:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:29:41.169 20:43:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:29:41.169 20:43:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:29:41.169 20:43:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:29:41.169 20:43:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:29:41.169 20:43:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@411 -- # raid_pid=1511783 00:29:41.169 20:43:33 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:29:41.169 20:43:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@412 -- # waitforlisten 1511783 /var/tmp/spdk-raid.sock 00:29:41.169 20:43:33 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@829 -- # '[' -z 1511783 ']' 00:29:41.169 20:43:33 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:29:41.169 20:43:33 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@834 -- # local max_retries=100 00:29:41.169 20:43:33 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:29:41.169 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:29:41.169 20:43:33 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@838 -- # xtrace_disable 00:29:41.169 20:43:33 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:41.169 [2024-07-15 20:43:33.416538] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
00:29:41.169 [2024-07-15 20:43:33.416601] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1511783 ] 00:29:41.169 [2024-07-15 20:43:33.544123] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:41.426 [2024-07-15 20:43:33.653437] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:41.426 [2024-07-15 20:43:33.717868] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:29:41.426 [2024-07-15 20:43:33.717898] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:29:41.684 20:43:33 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:29:41.684 20:43:33 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@862 -- # return 0 00:29:41.684 20:43:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:29:41.684 20:43:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:29:41.684 20:43:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:29:41.684 20:43:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:29:41.684 20:43:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:29:41.684 20:43:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:29:41.684 20:43:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:29:41.684 20:43:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 
00:29:41.684 20:43:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b malloc1 00:29:41.941 malloc1 00:29:41.941 20:43:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:29:42.198 [2024-07-15 20:43:34.366063] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:29:42.198 [2024-07-15 20:43:34.366113] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:42.198 [2024-07-15 20:43:34.366134] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22cd4e0 00:29:42.198 [2024-07-15 20:43:34.366147] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:42.198 [2024-07-15 20:43:34.367672] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:42.198 [2024-07-15 20:43:34.367699] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:29:42.198 pt1 00:29:42.198 20:43:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:29:42.198 20:43:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:29:42.198 20:43:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:29:42.198 20:43:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:29:42.198 20:43:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:29:42.198 20:43:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@420 -- # 
base_bdevs_malloc+=($bdev_malloc) 00:29:42.198 20:43:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:29:42.198 20:43:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:29:42.198 20:43:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b malloc2 00:29:42.455 malloc2 00:29:42.455 20:43:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:29:42.713 [2024-07-15 20:43:34.852363] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:29:42.713 [2024-07-15 20:43:34.852409] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:42.713 [2024-07-15 20:43:34.852429] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22b2570 00:29:42.713 [2024-07-15 20:43:34.852442] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:42.713 [2024-07-15 20:43:34.853881] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:42.713 [2024-07-15 20:43:34.853907] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:29:42.713 pt2 00:29:42.713 20:43:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:29:42.713 20:43:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:29:42.713 20:43:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 
'pt1 pt2' -n raid_bdev1 -s 00:29:42.971 [2024-07-15 20:43:35.097030] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:29:42.971 [2024-07-15 20:43:35.098541] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:29:42.971 [2024-07-15 20:43:35.098693] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x22b3f20 00:29:42.971 [2024-07-15 20:43:35.098708] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:29:42.971 [2024-07-15 20:43:35.098778] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2130050 00:29:42.971 [2024-07-15 20:43:35.098861] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x22b3f20 00:29:42.971 [2024-07-15 20:43:35.098871] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x22b3f20 00:29:42.971 [2024-07-15 20:43:35.098943] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:42.971 20:43:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:29:42.971 20:43:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:42.971 20:43:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:42.972 20:43:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:42.972 20:43:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:42.972 20:43:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:42.972 20:43:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:42.972 20:43:35 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:42.972 20:43:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:42.972 20:43:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:42.972 20:43:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:42.972 20:43:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:43.229 20:43:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:43.229 "name": "raid_bdev1", 00:29:43.229 "uuid": "83bc3ad9-8f2c-4fe7-a6e3-d455229c9718", 00:29:43.229 "strip_size_kb": 0, 00:29:43.229 "state": "online", 00:29:43.229 "raid_level": "raid1", 00:29:43.229 "superblock": true, 00:29:43.229 "num_base_bdevs": 2, 00:29:43.229 "num_base_bdevs_discovered": 2, 00:29:43.229 "num_base_bdevs_operational": 2, 00:29:43.229 "base_bdevs_list": [ 00:29:43.229 { 00:29:43.229 "name": "pt1", 00:29:43.229 "uuid": "00000000-0000-0000-0000-000000000001", 00:29:43.229 "is_configured": true, 00:29:43.229 "data_offset": 256, 00:29:43.229 "data_size": 7936 00:29:43.229 }, 00:29:43.229 { 00:29:43.229 "name": "pt2", 00:29:43.229 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:43.229 "is_configured": true, 00:29:43.229 "data_offset": 256, 00:29:43.229 "data_size": 7936 00:29:43.229 } 00:29:43.229 ] 00:29:43.229 }' 00:29:43.229 20:43:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:43.229 20:43:35 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:43.836 20:43:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:29:43.836 20:43:35 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:29:43.837 20:43:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:29:43.837 20:43:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:29:43.837 20:43:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:29:43.837 20:43:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:29:43.837 20:43:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:29:43.837 20:43:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:29:44.128 [2024-07-15 20:43:36.180156] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:29:44.128 20:43:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:29:44.128 "name": "raid_bdev1", 00:29:44.128 "aliases": [ 00:29:44.128 "83bc3ad9-8f2c-4fe7-a6e3-d455229c9718" 00:29:44.128 ], 00:29:44.128 "product_name": "Raid Volume", 00:29:44.128 "block_size": 4128, 00:29:44.128 "num_blocks": 7936, 00:29:44.128 "uuid": "83bc3ad9-8f2c-4fe7-a6e3-d455229c9718", 00:29:44.128 "md_size": 32, 00:29:44.128 "md_interleave": true, 00:29:44.128 "dif_type": 0, 00:29:44.128 "assigned_rate_limits": { 00:29:44.128 "rw_ios_per_sec": 0, 00:29:44.128 "rw_mbytes_per_sec": 0, 00:29:44.128 "r_mbytes_per_sec": 0, 00:29:44.128 "w_mbytes_per_sec": 0 00:29:44.128 }, 00:29:44.128 "claimed": false, 00:29:44.128 "zoned": false, 00:29:44.128 "supported_io_types": { 00:29:44.128 "read": true, 00:29:44.128 "write": true, 00:29:44.128 "unmap": false, 00:29:44.128 "flush": false, 00:29:44.128 "reset": true, 00:29:44.128 "nvme_admin": false, 
00:29:44.128 "nvme_io": false, 00:29:44.128 "nvme_io_md": false, 00:29:44.128 "write_zeroes": true, 00:29:44.128 "zcopy": false, 00:29:44.128 "get_zone_info": false, 00:29:44.128 "zone_management": false, 00:29:44.128 "zone_append": false, 00:29:44.128 "compare": false, 00:29:44.128 "compare_and_write": false, 00:29:44.128 "abort": false, 00:29:44.128 "seek_hole": false, 00:29:44.128 "seek_data": false, 00:29:44.128 "copy": false, 00:29:44.128 "nvme_iov_md": false 00:29:44.128 }, 00:29:44.128 "memory_domains": [ 00:29:44.128 { 00:29:44.128 "dma_device_id": "system", 00:29:44.128 "dma_device_type": 1 00:29:44.128 }, 00:29:44.128 { 00:29:44.128 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:44.128 "dma_device_type": 2 00:29:44.128 }, 00:29:44.128 { 00:29:44.128 "dma_device_id": "system", 00:29:44.128 "dma_device_type": 1 00:29:44.128 }, 00:29:44.128 { 00:29:44.128 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:44.128 "dma_device_type": 2 00:29:44.128 } 00:29:44.128 ], 00:29:44.128 "driver_specific": { 00:29:44.128 "raid": { 00:29:44.128 "uuid": "83bc3ad9-8f2c-4fe7-a6e3-d455229c9718", 00:29:44.128 "strip_size_kb": 0, 00:29:44.128 "state": "online", 00:29:44.128 "raid_level": "raid1", 00:29:44.128 "superblock": true, 00:29:44.128 "num_base_bdevs": 2, 00:29:44.128 "num_base_bdevs_discovered": 2, 00:29:44.128 "num_base_bdevs_operational": 2, 00:29:44.128 "base_bdevs_list": [ 00:29:44.128 { 00:29:44.128 "name": "pt1", 00:29:44.128 "uuid": "00000000-0000-0000-0000-000000000001", 00:29:44.128 "is_configured": true, 00:29:44.128 "data_offset": 256, 00:29:44.128 "data_size": 7936 00:29:44.128 }, 00:29:44.128 { 00:29:44.128 "name": "pt2", 00:29:44.128 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:44.128 "is_configured": true, 00:29:44.128 "data_offset": 256, 00:29:44.128 "data_size": 7936 00:29:44.128 } 00:29:44.128 ] 00:29:44.128 } 00:29:44.128 } 00:29:44.128 }' 00:29:44.128 20:43:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r 
'.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:29:44.128 20:43:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:29:44.128 pt2' 00:29:44.128 20:43:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:29:44.128 20:43:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:29:44.128 20:43:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:29:44.128 20:43:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:29:44.128 "name": "pt1", 00:29:44.128 "aliases": [ 00:29:44.128 "00000000-0000-0000-0000-000000000001" 00:29:44.128 ], 00:29:44.128 "product_name": "passthru", 00:29:44.128 "block_size": 4128, 00:29:44.128 "num_blocks": 8192, 00:29:44.128 "uuid": "00000000-0000-0000-0000-000000000001", 00:29:44.128 "md_size": 32, 00:29:44.128 "md_interleave": true, 00:29:44.128 "dif_type": 0, 00:29:44.128 "assigned_rate_limits": { 00:29:44.128 "rw_ios_per_sec": 0, 00:29:44.128 "rw_mbytes_per_sec": 0, 00:29:44.128 "r_mbytes_per_sec": 0, 00:29:44.128 "w_mbytes_per_sec": 0 00:29:44.128 }, 00:29:44.128 "claimed": true, 00:29:44.128 "claim_type": "exclusive_write", 00:29:44.128 "zoned": false, 00:29:44.128 "supported_io_types": { 00:29:44.128 "read": true, 00:29:44.128 "write": true, 00:29:44.128 "unmap": true, 00:29:44.128 "flush": true, 00:29:44.128 "reset": true, 00:29:44.128 "nvme_admin": false, 00:29:44.128 "nvme_io": false, 00:29:44.128 "nvme_io_md": false, 00:29:44.128 "write_zeroes": true, 00:29:44.128 "zcopy": true, 00:29:44.128 "get_zone_info": false, 00:29:44.128 "zone_management": false, 00:29:44.128 "zone_append": false, 00:29:44.128 "compare": false, 00:29:44.128 "compare_and_write": false, 00:29:44.128 
"abort": true, 00:29:44.128 "seek_hole": false, 00:29:44.128 "seek_data": false, 00:29:44.128 "copy": true, 00:29:44.128 "nvme_iov_md": false 00:29:44.128 }, 00:29:44.128 "memory_domains": [ 00:29:44.128 { 00:29:44.128 "dma_device_id": "system", 00:29:44.128 "dma_device_type": 1 00:29:44.128 }, 00:29:44.128 { 00:29:44.128 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:44.128 "dma_device_type": 2 00:29:44.128 } 00:29:44.128 ], 00:29:44.128 "driver_specific": { 00:29:44.128 "passthru": { 00:29:44.128 "name": "pt1", 00:29:44.128 "base_bdev_name": "malloc1" 00:29:44.128 } 00:29:44.128 } 00:29:44.128 }' 00:29:44.128 20:43:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:44.385 20:43:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:44.385 20:43:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:29:44.385 20:43:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:44.385 20:43:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:44.385 20:43:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:29:44.385 20:43:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:44.385 20:43:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:44.385 20:43:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:29:44.385 20:43:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:44.644 20:43:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:44.644 20:43:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:29:44.644 20:43:36 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:29:44.644 20:43:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:29:44.644 20:43:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:29:44.902 20:43:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:29:44.902 "name": "pt2", 00:29:44.902 "aliases": [ 00:29:44.902 "00000000-0000-0000-0000-000000000002" 00:29:44.902 ], 00:29:44.902 "product_name": "passthru", 00:29:44.902 "block_size": 4128, 00:29:44.902 "num_blocks": 8192, 00:29:44.902 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:44.902 "md_size": 32, 00:29:44.902 "md_interleave": true, 00:29:44.902 "dif_type": 0, 00:29:44.902 "assigned_rate_limits": { 00:29:44.902 "rw_ios_per_sec": 0, 00:29:44.902 "rw_mbytes_per_sec": 0, 00:29:44.902 "r_mbytes_per_sec": 0, 00:29:44.902 "w_mbytes_per_sec": 0 00:29:44.902 }, 00:29:44.902 "claimed": true, 00:29:44.902 "claim_type": "exclusive_write", 00:29:44.902 "zoned": false, 00:29:44.902 "supported_io_types": { 00:29:44.902 "read": true, 00:29:44.902 "write": true, 00:29:44.902 "unmap": true, 00:29:44.902 "flush": true, 00:29:44.902 "reset": true, 00:29:44.902 "nvme_admin": false, 00:29:44.902 "nvme_io": false, 00:29:44.902 "nvme_io_md": false, 00:29:44.902 "write_zeroes": true, 00:29:44.902 "zcopy": true, 00:29:44.902 "get_zone_info": false, 00:29:44.902 "zone_management": false, 00:29:44.902 "zone_append": false, 00:29:44.902 "compare": false, 00:29:44.902 "compare_and_write": false, 00:29:44.902 "abort": true, 00:29:44.902 "seek_hole": false, 00:29:44.902 "seek_data": false, 00:29:44.902 "copy": true, 00:29:44.902 "nvme_iov_md": false 00:29:44.902 }, 00:29:44.902 "memory_domains": [ 00:29:44.902 { 00:29:44.902 "dma_device_id": 
"system", 00:29:44.902 "dma_device_type": 1 00:29:44.902 }, 00:29:44.902 { 00:29:44.902 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:44.902 "dma_device_type": 2 00:29:44.902 } 00:29:44.902 ], 00:29:44.902 "driver_specific": { 00:29:44.902 "passthru": { 00:29:44.902 "name": "pt2", 00:29:44.902 "base_bdev_name": "malloc2" 00:29:44.902 } 00:29:44.902 } 00:29:44.902 }' 00:29:44.903 20:43:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:44.903 20:43:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:44.903 20:43:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:29:44.903 20:43:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:44.903 20:43:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:44.903 20:43:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:29:44.903 20:43:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:45.160 20:43:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:45.160 20:43:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:29:45.160 20:43:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:45.160 20:43:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:45.160 20:43:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:29:45.160 20:43:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:29:45.160 20:43:37 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:29:45.418 [2024-07-15 20:43:37.688170] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:29:45.418 20:43:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=83bc3ad9-8f2c-4fe7-a6e3-d455229c9718 00:29:45.418 20:43:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@435 -- # '[' -z 83bc3ad9-8f2c-4fe7-a6e3-d455229c9718 ']' 00:29:45.418 20:43:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:29:45.676 [2024-07-15 20:43:37.932541] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:29:45.676 [2024-07-15 20:43:37.932567] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:29:45.676 [2024-07-15 20:43:37.932624] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:45.676 [2024-07-15 20:43:37.932680] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:29:45.676 [2024-07-15 20:43:37.932696] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x22b3f20 name raid_bdev1, state offline 00:29:45.676 20:43:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:29:45.676 20:43:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:45.934 20:43:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:29:45.934 20:43:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:29:45.934 20:43:38 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:29:45.934 20:43:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:29:46.192 20:43:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:29:46.192 20:43:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:29:46.450 20:43:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:29:46.450 20:43:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:29:46.709 20:43:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:29:46.709 20:43:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:29:46.709 20:43:38 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@648 -- # local es=0 00:29:46.709 20:43:38 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:29:46.709 20:43:38 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:46.709 20:43:38 
bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:29:46.709 20:43:38 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:46.709 20:43:38 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:29:46.709 20:43:38 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:46.709 20:43:38 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:29:46.709 20:43:38 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:46.709 20:43:38 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:29:46.709 20:43:38 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:29:46.967 [2024-07-15 20:43:39.147704] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:29:46.967 [2024-07-15 20:43:39.149102] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:29:46.967 [2024-07-15 20:43:39.149158] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:29:46.967 [2024-07-15 20:43:39.149206] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:29:46.967 [2024-07-15 20:43:39.149225] 
bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:29:46.967 [2024-07-15 20:43:39.149234] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x22be260 name raid_bdev1, state configuring 00:29:46.967 request: 00:29:46.967 { 00:29:46.967 "name": "raid_bdev1", 00:29:46.967 "raid_level": "raid1", 00:29:46.967 "base_bdevs": [ 00:29:46.967 "malloc1", 00:29:46.967 "malloc2" 00:29:46.967 ], 00:29:46.967 "superblock": false, 00:29:46.967 "method": "bdev_raid_create", 00:29:46.967 "req_id": 1 00:29:46.967 } 00:29:46.967 Got JSON-RPC error response 00:29:46.967 response: 00:29:46.967 { 00:29:46.967 "code": -17, 00:29:46.967 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:29:46.967 } 00:29:46.967 20:43:39 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@651 -- # es=1 00:29:46.967 20:43:39 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:29:46.967 20:43:39 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:29:46.967 20:43:39 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:29:46.967 20:43:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:46.967 20:43:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:29:47.225 20:43:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:29:47.225 20:43:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:29:47.225 20:43:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 
00000000-0000-0000-0000-000000000001 00:29:47.493 [2024-07-15 20:43:39.640967] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:29:47.493 [2024-07-15 20:43:39.641020] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:47.493 [2024-07-15 20:43:39.641039] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22b5000 00:29:47.493 [2024-07-15 20:43:39.641052] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:47.493 [2024-07-15 20:43:39.642516] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:47.493 [2024-07-15 20:43:39.642543] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:29:47.493 [2024-07-15 20:43:39.642593] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:29:47.493 [2024-07-15 20:43:39.642620] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:29:47.493 pt1 00:29:47.493 20:43:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:29:47.493 20:43:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:47.493 20:43:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:29:47.493 20:43:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:47.493 20:43:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:47.493 20:43:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:47.493 20:43:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:47.493 20:43:39 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:47.493 20:43:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:47.493 20:43:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:47.493 20:43:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:47.493 20:43:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:47.751 20:43:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:47.751 "name": "raid_bdev1", 00:29:47.751 "uuid": "83bc3ad9-8f2c-4fe7-a6e3-d455229c9718", 00:29:47.751 "strip_size_kb": 0, 00:29:47.751 "state": "configuring", 00:29:47.751 "raid_level": "raid1", 00:29:47.751 "superblock": true, 00:29:47.751 "num_base_bdevs": 2, 00:29:47.751 "num_base_bdevs_discovered": 1, 00:29:47.751 "num_base_bdevs_operational": 2, 00:29:47.751 "base_bdevs_list": [ 00:29:47.751 { 00:29:47.751 "name": "pt1", 00:29:47.751 "uuid": "00000000-0000-0000-0000-000000000001", 00:29:47.751 "is_configured": true, 00:29:47.751 "data_offset": 256, 00:29:47.751 "data_size": 7936 00:29:47.751 }, 00:29:47.751 { 00:29:47.751 "name": null, 00:29:47.751 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:47.751 "is_configured": false, 00:29:47.751 "data_offset": 256, 00:29:47.751 "data_size": 7936 00:29:47.751 } 00:29:47.751 ] 00:29:47.751 }' 00:29:47.751 20:43:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:47.751 20:43:39 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:48.318 20:43:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:29:48.318 20:43:40 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:29:48.318 20:43:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:29:48.318 20:43:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:29:48.576 [2024-07-15 20:43:40.727850] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:29:48.576 [2024-07-15 20:43:40.727904] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:48.576 [2024-07-15 20:43:40.727939] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22b7270 00:29:48.576 [2024-07-15 20:43:40.727953] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:48.576 [2024-07-15 20:43:40.728130] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:48.576 [2024-07-15 20:43:40.728146] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:29:48.576 [2024-07-15 20:43:40.728195] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:29:48.576 [2024-07-15 20:43:40.728214] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:29:48.576 [2024-07-15 20:43:40.728295] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2130c10 00:29:48.576 [2024-07-15 20:43:40.728306] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:29:48.576 [2024-07-15 20:43:40.728359] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x22b2d40 00:29:48.576 [2024-07-15 20:43:40.728433] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2130c10 00:29:48.576 [2024-07-15 20:43:40.728443] 
bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2130c10 00:29:48.576 [2024-07-15 20:43:40.728501] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:48.576 pt2 00:29:48.576 20:43:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:29:48.576 20:43:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:29:48.576 20:43:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:29:48.576 20:43:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:48.576 20:43:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:48.576 20:43:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:48.576 20:43:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:48.576 20:43:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:48.576 20:43:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:48.576 20:43:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:48.576 20:43:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:48.576 20:43:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:48.576 20:43:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:48.576 20:43:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:48.834 20:43:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:48.834 "name": "raid_bdev1", 00:29:48.834 "uuid": "83bc3ad9-8f2c-4fe7-a6e3-d455229c9718", 00:29:48.834 "strip_size_kb": 0, 00:29:48.834 "state": "online", 00:29:48.834 "raid_level": "raid1", 00:29:48.834 "superblock": true, 00:29:48.834 "num_base_bdevs": 2, 00:29:48.834 "num_base_bdevs_discovered": 2, 00:29:48.834 "num_base_bdevs_operational": 2, 00:29:48.834 "base_bdevs_list": [ 00:29:48.834 { 00:29:48.834 "name": "pt1", 00:29:48.834 "uuid": "00000000-0000-0000-0000-000000000001", 00:29:48.834 "is_configured": true, 00:29:48.834 "data_offset": 256, 00:29:48.834 "data_size": 7936 00:29:48.834 }, 00:29:48.834 { 00:29:48.834 "name": "pt2", 00:29:48.834 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:48.834 "is_configured": true, 00:29:48.834 "data_offset": 256, 00:29:48.834 "data_size": 7936 00:29:48.834 } 00:29:48.834 ] 00:29:48.834 }' 00:29:48.834 20:43:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:48.834 20:43:40 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:49.400 20:43:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:29:49.400 20:43:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:29:49.400 20:43:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:29:49.400 20:43:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:29:49.400 20:43:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:29:49.400 20:43:41 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:29:49.400 20:43:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:29:49.400 20:43:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:29:49.658 [2024-07-15 20:43:41.847076] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:29:49.658 20:43:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:29:49.658 "name": "raid_bdev1", 00:29:49.658 "aliases": [ 00:29:49.658 "83bc3ad9-8f2c-4fe7-a6e3-d455229c9718" 00:29:49.658 ], 00:29:49.658 "product_name": "Raid Volume", 00:29:49.658 "block_size": 4128, 00:29:49.658 "num_blocks": 7936, 00:29:49.658 "uuid": "83bc3ad9-8f2c-4fe7-a6e3-d455229c9718", 00:29:49.658 "md_size": 32, 00:29:49.658 "md_interleave": true, 00:29:49.658 "dif_type": 0, 00:29:49.658 "assigned_rate_limits": { 00:29:49.658 "rw_ios_per_sec": 0, 00:29:49.658 "rw_mbytes_per_sec": 0, 00:29:49.658 "r_mbytes_per_sec": 0, 00:29:49.658 "w_mbytes_per_sec": 0 00:29:49.658 }, 00:29:49.658 "claimed": false, 00:29:49.658 "zoned": false, 00:29:49.658 "supported_io_types": { 00:29:49.658 "read": true, 00:29:49.658 "write": true, 00:29:49.658 "unmap": false, 00:29:49.658 "flush": false, 00:29:49.658 "reset": true, 00:29:49.658 "nvme_admin": false, 00:29:49.658 "nvme_io": false, 00:29:49.658 "nvme_io_md": false, 00:29:49.658 "write_zeroes": true, 00:29:49.658 "zcopy": false, 00:29:49.658 "get_zone_info": false, 00:29:49.658 "zone_management": false, 00:29:49.658 "zone_append": false, 00:29:49.658 "compare": false, 00:29:49.658 "compare_and_write": false, 00:29:49.658 "abort": false, 00:29:49.658 "seek_hole": false, 00:29:49.658 "seek_data": false, 00:29:49.658 "copy": false, 00:29:49.658 "nvme_iov_md": false 00:29:49.658 }, 
00:29:49.658 "memory_domains": [ 00:29:49.658 { 00:29:49.658 "dma_device_id": "system", 00:29:49.658 "dma_device_type": 1 00:29:49.658 }, 00:29:49.658 { 00:29:49.658 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:49.658 "dma_device_type": 2 00:29:49.658 }, 00:29:49.658 { 00:29:49.658 "dma_device_id": "system", 00:29:49.658 "dma_device_type": 1 00:29:49.658 }, 00:29:49.658 { 00:29:49.658 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:49.658 "dma_device_type": 2 00:29:49.658 } 00:29:49.658 ], 00:29:49.658 "driver_specific": { 00:29:49.658 "raid": { 00:29:49.658 "uuid": "83bc3ad9-8f2c-4fe7-a6e3-d455229c9718", 00:29:49.658 "strip_size_kb": 0, 00:29:49.658 "state": "online", 00:29:49.658 "raid_level": "raid1", 00:29:49.658 "superblock": true, 00:29:49.658 "num_base_bdevs": 2, 00:29:49.658 "num_base_bdevs_discovered": 2, 00:29:49.658 "num_base_bdevs_operational": 2, 00:29:49.658 "base_bdevs_list": [ 00:29:49.658 { 00:29:49.658 "name": "pt1", 00:29:49.658 "uuid": "00000000-0000-0000-0000-000000000001", 00:29:49.658 "is_configured": true, 00:29:49.658 "data_offset": 256, 00:29:49.658 "data_size": 7936 00:29:49.658 }, 00:29:49.658 { 00:29:49.658 "name": "pt2", 00:29:49.658 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:49.658 "is_configured": true, 00:29:49.658 "data_offset": 256, 00:29:49.658 "data_size": 7936 00:29:49.658 } 00:29:49.658 ] 00:29:49.658 } 00:29:49.658 } 00:29:49.658 }' 00:29:49.658 20:43:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:29:49.658 20:43:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:29:49.658 pt2' 00:29:49.658 20:43:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:29:49.658 20:43:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:29:49.658 20:43:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:29:49.916 20:43:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:29:49.916 "name": "pt1", 00:29:49.916 "aliases": [ 00:29:49.916 "00000000-0000-0000-0000-000000000001" 00:29:49.916 ], 00:29:49.916 "product_name": "passthru", 00:29:49.916 "block_size": 4128, 00:29:49.916 "num_blocks": 8192, 00:29:49.916 "uuid": "00000000-0000-0000-0000-000000000001", 00:29:49.916 "md_size": 32, 00:29:49.916 "md_interleave": true, 00:29:49.916 "dif_type": 0, 00:29:49.916 "assigned_rate_limits": { 00:29:49.916 "rw_ios_per_sec": 0, 00:29:49.916 "rw_mbytes_per_sec": 0, 00:29:49.916 "r_mbytes_per_sec": 0, 00:29:49.916 "w_mbytes_per_sec": 0 00:29:49.916 }, 00:29:49.916 "claimed": true, 00:29:49.916 "claim_type": "exclusive_write", 00:29:49.916 "zoned": false, 00:29:49.916 "supported_io_types": { 00:29:49.916 "read": true, 00:29:49.916 "write": true, 00:29:49.916 "unmap": true, 00:29:49.916 "flush": true, 00:29:49.916 "reset": true, 00:29:49.916 "nvme_admin": false, 00:29:49.916 "nvme_io": false, 00:29:49.916 "nvme_io_md": false, 00:29:49.916 "write_zeroes": true, 00:29:49.916 "zcopy": true, 00:29:49.916 "get_zone_info": false, 00:29:49.916 "zone_management": false, 00:29:49.916 "zone_append": false, 00:29:49.916 "compare": false, 00:29:49.916 "compare_and_write": false, 00:29:49.916 "abort": true, 00:29:49.916 "seek_hole": false, 00:29:49.916 "seek_data": false, 00:29:49.916 "copy": true, 00:29:49.916 "nvme_iov_md": false 00:29:49.916 }, 00:29:49.916 "memory_domains": [ 00:29:49.916 { 00:29:49.916 "dma_device_id": "system", 00:29:49.916 "dma_device_type": 1 00:29:49.916 }, 00:29:49.916 { 00:29:49.916 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:49.916 "dma_device_type": 2 00:29:49.916 } 00:29:49.916 ], 00:29:49.916 
"driver_specific": { 00:29:49.916 "passthru": { 00:29:49.916 "name": "pt1", 00:29:49.916 "base_bdev_name": "malloc1" 00:29:49.916 } 00:29:49.916 } 00:29:49.916 }' 00:29:49.916 20:43:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:49.916 20:43:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:49.916 20:43:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:29:49.916 20:43:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:50.174 20:43:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:50.174 20:43:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:29:50.174 20:43:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:50.174 20:43:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:50.174 20:43:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:29:50.174 20:43:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:50.174 20:43:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:50.174 20:43:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:29:50.174 20:43:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:29:50.174 20:43:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:29:50.174 20:43:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:29:50.432 20:43:42 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:29:50.432 "name": "pt2", 00:29:50.432 "aliases": [ 00:29:50.432 "00000000-0000-0000-0000-000000000002" 00:29:50.432 ], 00:29:50.432 "product_name": "passthru", 00:29:50.432 "block_size": 4128, 00:29:50.432 "num_blocks": 8192, 00:29:50.432 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:50.432 "md_size": 32, 00:29:50.432 "md_interleave": true, 00:29:50.432 "dif_type": 0, 00:29:50.432 "assigned_rate_limits": { 00:29:50.432 "rw_ios_per_sec": 0, 00:29:50.432 "rw_mbytes_per_sec": 0, 00:29:50.432 "r_mbytes_per_sec": 0, 00:29:50.432 "w_mbytes_per_sec": 0 00:29:50.432 }, 00:29:50.432 "claimed": true, 00:29:50.432 "claim_type": "exclusive_write", 00:29:50.432 "zoned": false, 00:29:50.432 "supported_io_types": { 00:29:50.432 "read": true, 00:29:50.432 "write": true, 00:29:50.432 "unmap": true, 00:29:50.432 "flush": true, 00:29:50.432 "reset": true, 00:29:50.432 "nvme_admin": false, 00:29:50.432 "nvme_io": false, 00:29:50.432 "nvme_io_md": false, 00:29:50.432 "write_zeroes": true, 00:29:50.432 "zcopy": true, 00:29:50.432 "get_zone_info": false, 00:29:50.432 "zone_management": false, 00:29:50.432 "zone_append": false, 00:29:50.432 "compare": false, 00:29:50.432 "compare_and_write": false, 00:29:50.432 "abort": true, 00:29:50.432 "seek_hole": false, 00:29:50.432 "seek_data": false, 00:29:50.432 "copy": true, 00:29:50.432 "nvme_iov_md": false 00:29:50.432 }, 00:29:50.432 "memory_domains": [ 00:29:50.432 { 00:29:50.432 "dma_device_id": "system", 00:29:50.432 "dma_device_type": 1 00:29:50.432 }, 00:29:50.432 { 00:29:50.432 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:50.432 "dma_device_type": 2 00:29:50.432 } 00:29:50.432 ], 00:29:50.432 "driver_specific": { 00:29:50.432 "passthru": { 00:29:50.432 "name": "pt2", 00:29:50.432 "base_bdev_name": "malloc2" 00:29:50.432 } 00:29:50.432 } 00:29:50.432 }' 00:29:50.432 20:43:42 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:50.690 20:43:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:50.690 20:43:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:29:50.690 20:43:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:50.690 20:43:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:50.690 20:43:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:29:50.690 20:43:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:50.690 20:43:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:50.690 20:43:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:29:50.690 20:43:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:50.948 20:43:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:50.948 20:43:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:29:50.948 20:43:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:29:50.948 20:43:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:29:51.205 [2024-07-15 20:43:43.359076] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:29:51.205 20:43:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@486 -- # '[' 83bc3ad9-8f2c-4fe7-a6e3-d455229c9718 '!=' 83bc3ad9-8f2c-4fe7-a6e3-d455229c9718 ']' 00:29:51.205 20:43:43 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:29:51.205 20:43:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@213 -- # case $1 in 00:29:51.205 20:43:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@214 -- # return 0 00:29:51.205 20:43:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:29:51.463 [2024-07-15 20:43:43.607483] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:29:51.463 20:43:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:51.463 20:43:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:51.463 20:43:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:51.463 20:43:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:51.463 20:43:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:51.463 20:43:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:51.463 20:43:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:51.463 20:43:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:51.463 20:43:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:51.463 20:43:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:51.463 20:43:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:51.463 20:43:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:51.721 20:43:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:51.721 "name": "raid_bdev1", 00:29:51.721 "uuid": "83bc3ad9-8f2c-4fe7-a6e3-d455229c9718", 00:29:51.721 "strip_size_kb": 0, 00:29:51.721 "state": "online", 00:29:51.721 "raid_level": "raid1", 00:29:51.721 "superblock": true, 00:29:51.721 "num_base_bdevs": 2, 00:29:51.721 "num_base_bdevs_discovered": 1, 00:29:51.721 "num_base_bdevs_operational": 1, 00:29:51.721 "base_bdevs_list": [ 00:29:51.721 { 00:29:51.721 "name": null, 00:29:51.721 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:51.721 "is_configured": false, 00:29:51.721 "data_offset": 256, 00:29:51.721 "data_size": 7936 00:29:51.721 }, 00:29:51.721 { 00:29:51.721 "name": "pt2", 00:29:51.721 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:51.721 "is_configured": true, 00:29:51.721 "data_offset": 256, 00:29:51.721 "data_size": 7936 00:29:51.721 } 00:29:51.721 ] 00:29:51.721 }' 00:29:51.721 20:43:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:51.721 20:43:43 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:52.294 20:43:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:29:52.552 [2024-07-15 20:43:44.698345] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:29:52.552 [2024-07-15 20:43:44.698380] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:29:52.552 [2024-07-15 20:43:44.698435] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:52.552 [2024-07-15 
20:43:44.698480] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:29:52.552 [2024-07-15 20:43:44.698492] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2130c10 name raid_bdev1, state offline 00:29:52.552 20:43:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:52.552 20:43:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:29:52.809 20:43:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:29:52.809 20:43:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:29:52.809 20:43:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:29:52.809 20:43:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:29:52.809 20:43:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:29:53.066 20:43:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:29:53.067 20:43:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:29:53.067 20:43:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:29:53.067 20:43:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:29:53.067 20:43:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@518 -- # i=1 00:29:53.067 20:43:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@519 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:29:53.324 [2024-07-15 20:43:45.456310] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:29:53.324 [2024-07-15 20:43:45.456358] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:53.324 [2024-07-15 20:43:45.456378] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22b59f0 00:29:53.324 [2024-07-15 20:43:45.456390] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:53.324 [2024-07-15 20:43:45.457803] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:53.324 [2024-07-15 20:43:45.457829] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:29:53.324 [2024-07-15 20:43:45.457876] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:29:53.324 [2024-07-15 20:43:45.457902] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:29:53.324 [2024-07-15 20:43:45.457977] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x22b6ea0 00:29:53.324 [2024-07-15 20:43:45.457989] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:29:53.324 [2024-07-15 20:43:45.458047] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x22b4bc0 00:29:53.324 [2024-07-15 20:43:45.458119] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x22b6ea0 00:29:53.324 [2024-07-15 20:43:45.458128] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x22b6ea0 00:29:53.324 [2024-07-15 20:43:45.458184] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:53.324 pt2 00:29:53.324 20:43:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@522 
-- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:53.324 20:43:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:53.324 20:43:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:53.324 20:43:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:53.324 20:43:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:53.324 20:43:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:53.324 20:43:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:53.324 20:43:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:53.324 20:43:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:53.324 20:43:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:53.324 20:43:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:53.324 20:43:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:53.324 20:43:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:53.324 "name": "raid_bdev1", 00:29:53.324 "uuid": "83bc3ad9-8f2c-4fe7-a6e3-d455229c9718", 00:29:53.324 "strip_size_kb": 0, 00:29:53.324 "state": "online", 00:29:53.324 "raid_level": "raid1", 00:29:53.324 "superblock": true, 00:29:53.324 "num_base_bdevs": 2, 00:29:53.324 "num_base_bdevs_discovered": 1, 00:29:53.324 "num_base_bdevs_operational": 1, 00:29:53.324 
"base_bdevs_list": [ 00:29:53.324 { 00:29:53.324 "name": null, 00:29:53.324 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:53.324 "is_configured": false, 00:29:53.324 "data_offset": 256, 00:29:53.324 "data_size": 7936 00:29:53.324 }, 00:29:53.324 { 00:29:53.324 "name": "pt2", 00:29:53.324 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:53.324 "is_configured": true, 00:29:53.324 "data_offset": 256, 00:29:53.324 "data_size": 7936 00:29:53.324 } 00:29:53.324 ] 00:29:53.325 }' 00:29:53.325 20:43:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:53.325 20:43:45 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:53.902 20:43:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:29:54.159 [2024-07-15 20:43:46.418846] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:29:54.159 [2024-07-15 20:43:46.418874] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:29:54.159 [2024-07-15 20:43:46.418932] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:54.159 [2024-07-15 20:43:46.418977] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:29:54.159 [2024-07-15 20:43:46.418990] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x22b6ea0 name raid_bdev1, state offline 00:29:54.159 20:43:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:54.159 20:43:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:29:54.416 20:43:46 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@526 -- # raid_bdev= 00:29:54.416 20:43:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:29:54.416 20:43:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:29:54.416 20:43:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:29:54.674 [2024-07-15 20:43:46.916147] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:29:54.674 [2024-07-15 20:43:46.916200] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:54.674 [2024-07-15 20:43:46.916219] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22b5620 00:29:54.674 [2024-07-15 20:43:46.916231] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:54.674 [2024-07-15 20:43:46.917661] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:54.674 [2024-07-15 20:43:46.917686] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:29:54.674 [2024-07-15 20:43:46.917733] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:29:54.674 [2024-07-15 20:43:46.917759] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:29:54.674 [2024-07-15 20:43:46.917838] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:29:54.674 [2024-07-15 20:43:46.917851] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:29:54.674 [2024-07-15 20:43:46.917867] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x22b7640 name raid_bdev1, state configuring 00:29:54.674 [2024-07-15 20:43:46.917890] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:29:54.674 [2024-07-15 20:43:46.917954] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x22b7640 00:29:54.674 [2024-07-15 20:43:46.917965] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:29:54.674 [2024-07-15 20:43:46.918018] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x22b6810 00:29:54.674 [2024-07-15 20:43:46.918097] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x22b7640 00:29:54.674 [2024-07-15 20:43:46.918107] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x22b7640 00:29:54.674 [2024-07-15 20:43:46.918168] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:54.674 pt1 00:29:54.674 20:43:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:29:54.674 20:43:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:54.674 20:43:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:54.674 20:43:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:54.674 20:43:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:54.674 20:43:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:54.674 20:43:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:54.674 20:43:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:54.674 20:43:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:54.674 20:43:46 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:54.674 20:43:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:54.674 20:43:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:54.674 20:43:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:54.932 20:43:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:54.932 "name": "raid_bdev1", 00:29:54.932 "uuid": "83bc3ad9-8f2c-4fe7-a6e3-d455229c9718", 00:29:54.932 "strip_size_kb": 0, 00:29:54.932 "state": "online", 00:29:54.932 "raid_level": "raid1", 00:29:54.932 "superblock": true, 00:29:54.932 "num_base_bdevs": 2, 00:29:54.932 "num_base_bdevs_discovered": 1, 00:29:54.932 "num_base_bdevs_operational": 1, 00:29:54.932 "base_bdevs_list": [ 00:29:54.932 { 00:29:54.932 "name": null, 00:29:54.932 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:54.932 "is_configured": false, 00:29:54.932 "data_offset": 256, 00:29:54.932 "data_size": 7936 00:29:54.932 }, 00:29:54.932 { 00:29:54.932 "name": "pt2", 00:29:54.932 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:54.932 "is_configured": true, 00:29:54.932 "data_offset": 256, 00:29:54.932 "data_size": 7936 00:29:54.932 } 00:29:54.932 ] 00:29:54.932 }' 00:29:54.932 20:43:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:54.932 20:43:47 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:55.497 20:43:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:29:55.497 
20:43:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:29:55.755 20:43:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:29:55.755 20:43:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:29:55.755 20:43:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:29:56.014 [2024-07-15 20:43:48.187760] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:29:56.014 20:43:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@557 -- # '[' 83bc3ad9-8f2c-4fe7-a6e3-d455229c9718 '!=' 83bc3ad9-8f2c-4fe7-a6e3-d455229c9718 ']' 00:29:56.014 20:43:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@562 -- # killprocess 1511783 00:29:56.014 20:43:48 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@948 -- # '[' -z 1511783 ']' 00:29:56.014 20:43:48 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@952 -- # kill -0 1511783 00:29:56.014 20:43:48 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@953 -- # uname 00:29:56.014 20:43:48 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:29:56.014 20:43:48 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1511783 00:29:56.014 20:43:48 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:29:56.014 20:43:48 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:29:56.014 20:43:48 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@966 
-- # echo 'killing process with pid 1511783' 00:29:56.014 killing process with pid 1511783 00:29:56.014 20:43:48 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@967 -- # kill 1511783 00:29:56.014 [2024-07-15 20:43:48.262511] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:29:56.014 [2024-07-15 20:43:48.262566] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:56.014 [2024-07-15 20:43:48.262612] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:29:56.014 [2024-07-15 20:43:48.262624] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x22b7640 name raid_bdev1, state offline 00:29:56.014 20:43:48 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@972 -- # wait 1511783 00:29:56.014 [2024-07-15 20:43:48.281044] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:29:56.273 20:43:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@564 -- # return 0 00:29:56.273 00:29:56.273 real 0m15.148s 00:29:56.273 user 0m27.877s 00:29:56.273 sys 0m2.895s 00:29:56.273 20:43:48 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@1124 -- # xtrace_disable 00:29:56.273 20:43:48 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:56.273 ************************************ 00:29:56.273 END TEST raid_superblock_test_md_interleaved 00:29:56.273 ************************************ 00:29:56.273 20:43:48 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:29:56.273 20:43:48 bdev_raid -- bdev/bdev_raid.sh@914 -- # run_test raid_rebuild_test_sb_md_interleaved raid_rebuild_test raid1 2 true false false 00:29:56.273 20:43:48 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:29:56.273 20:43:48 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:29:56.273 20:43:48 bdev_raid -- 
common/autotest_common.sh@10 -- # set +x 00:29:56.273 ************************************ 00:29:56.273 START TEST raid_rebuild_test_sb_md_interleaved 00:29:56.273 ************************************ 00:29:56.273 20:43:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true false false 00:29:56.273 20:43:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:29:56.273 20:43:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:29:56.273 20:43:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:29:56.273 20:43:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:29:56.273 20:43:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@572 -- # local verify=false 00:29:56.273 20:43:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:29:56.273 20:43:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:29:56.273 20:43:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:29:56.273 20:43:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:29:56.273 20:43:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:29:56.273 20:43:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:29:56.273 20:43:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:29:56.273 20:43:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:29:56.273 20:43:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 
'BaseBdev2') 00:29:56.273 20:43:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:29:56.273 20:43:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:29:56.273 20:43:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@575 -- # local strip_size 00:29:56.273 20:43:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@576 -- # local create_arg 00:29:56.273 20:43:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:29:56.273 20:43:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@578 -- # local data_offset 00:29:56.273 20:43:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:29:56.273 20:43:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:29:56.273 20:43:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:29:56.273 20:43:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:29:56.273 20:43:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@596 -- # raid_pid=1514032 00:29:56.273 20:43:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@597 -- # waitforlisten 1514032 /var/tmp/spdk-raid.sock 00:29:56.273 20:43:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:29:56.273 20:43:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@829 -- # '[' -z 1514032 ']' 00:29:56.273 20:43:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:29:56.273 
20:43:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@834 -- # local max_retries=100 00:29:56.273 20:43:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:29:56.273 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:29:56.273 20:43:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@838 -- # xtrace_disable 00:29:56.273 20:43:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:56.531 [2024-07-15 20:43:48.669499] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 00:29:56.531 [2024-07-15 20:43:48.669571] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1514032 ] 00:29:56.531 I/O size of 3145728 is greater than zero copy threshold (65536). 00:29:56.531 Zero copy mechanism will not be used. 
00:29:56.531 [2024-07-15 20:43:48.799434] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:56.531 [2024-07-15 20:43:48.896466] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:56.789 [2024-07-15 20:43:48.954096] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:29:56.789 [2024-07-15 20:43:48.954129] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:29:57.354 20:43:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:29:57.354 20:43:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@862 -- # return 0 00:29:57.354 20:43:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:29:57.354 20:43:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev1_malloc 00:29:57.612 BaseBdev1_malloc 00:29:57.612 20:43:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:29:57.870 [2024-07-15 20:43:50.090333] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:29:57.870 [2024-07-15 20:43:50.090388] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:57.870 [2024-07-15 20:43:50.090413] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x27f3ce0 00:29:57.870 [2024-07-15 20:43:50.090426] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:57.870 [2024-07-15 20:43:50.091998] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:57.870 [2024-07-15 20:43:50.092027] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:29:57.870 BaseBdev1 00:29:57.870 20:43:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:29:57.870 20:43:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev2_malloc 00:29:58.129 BaseBdev2_malloc 00:29:58.129 20:43:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:29:58.443 [2024-07-15 20:43:50.524632] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:29:58.443 [2024-07-15 20:43:50.524685] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:58.443 [2024-07-15 20:43:50.524708] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x27eb2d0 00:29:58.443 [2024-07-15 20:43:50.524721] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:58.443 [2024-07-15 20:43:50.526441] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:58.443 [2024-07-15 20:43:50.526469] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:29:58.443 BaseBdev2 00:29:58.443 20:43:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b spare_malloc 00:29:58.443 spare_malloc 00:29:58.443 20:43:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay 
-r 0 -t 0 -w 100000 -n 100000 00:29:58.709 spare_delay 00:29:58.709 20:43:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:29:58.972 [2024-07-15 20:43:51.275558] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:29:58.972 [2024-07-15 20:43:51.275603] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:58.972 [2024-07-15 20:43:51.275625] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x27ee070 00:29:58.972 [2024-07-15 20:43:51.275637] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:58.972 [2024-07-15 20:43:51.276935] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:58.972 [2024-07-15 20:43:51.276961] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:29:58.972 spare 00:29:58.972 20:43:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:29:59.229 [2024-07-15 20:43:51.524340] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:29:59.229 [2024-07-15 20:43:51.525522] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:29:59.229 [2024-07-15 20:43:51.525683] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x27f0370 00:29:59.229 [2024-07-15 20:43:51.525700] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:29:59.229 [2024-07-15 20:43:51.525772] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x26569c0 00:29:59.229 [2024-07-15 20:43:51.525854] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: 
raid bdev generic 0x27f0370 00:29:59.229 [2024-07-15 20:43:51.525864] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x27f0370 00:29:59.229 [2024-07-15 20:43:51.525917] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:59.229 20:43:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:29:59.229 20:43:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:59.229 20:43:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:59.229 20:43:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:59.229 20:43:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:59.229 20:43:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:59.229 20:43:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:59.229 20:43:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:59.229 20:43:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:59.229 20:43:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:59.229 20:43:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:59.229 20:43:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:59.487 20:43:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # 
raid_bdev_info='{ 00:29:59.487 "name": "raid_bdev1", 00:29:59.487 "uuid": "966d3817-3977-484b-a1e3-d44243e85940", 00:29:59.487 "strip_size_kb": 0, 00:29:59.487 "state": "online", 00:29:59.487 "raid_level": "raid1", 00:29:59.487 "superblock": true, 00:29:59.487 "num_base_bdevs": 2, 00:29:59.487 "num_base_bdevs_discovered": 2, 00:29:59.487 "num_base_bdevs_operational": 2, 00:29:59.487 "base_bdevs_list": [ 00:29:59.487 { 00:29:59.487 "name": "BaseBdev1", 00:29:59.487 "uuid": "87d74c92-741c-5a62-8782-1276364101fc", 00:29:59.487 "is_configured": true, 00:29:59.487 "data_offset": 256, 00:29:59.487 "data_size": 7936 00:29:59.487 }, 00:29:59.487 { 00:29:59.487 "name": "BaseBdev2", 00:29:59.487 "uuid": "f5ef8273-efb1-5ca5-95e3-2d6a0ac5ec56", 00:29:59.487 "is_configured": true, 00:29:59.487 "data_offset": 256, 00:29:59.487 "data_size": 7936 00:29:59.487 } 00:29:59.487 ] 00:29:59.487 }' 00:29:59.487 20:43:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:59.487 20:43:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:00.052 20:43:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:30:00.052 20:43:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:30:00.310 [2024-07-15 20:43:52.619501] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:30:00.310 20:43:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=7936 00:30:00.310 20:43:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:00.310 20:43:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@618 -- 
# jq -r '.[].base_bdevs_list[0].data_offset' 00:30:00.567 20:43:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@618 -- # data_offset=256 00:30:00.567 20:43:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:30:00.567 20:43:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@623 -- # '[' false = true ']' 00:30:00.567 20:43:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:30:00.824 [2024-07-15 20:43:53.124571] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:30:00.824 20:43:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:30:00.824 20:43:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:00.824 20:43:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:00.824 20:43:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:00.825 20:43:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:00.825 20:43:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:30:00.825 20:43:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:00.825 20:43:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:00.825 20:43:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:00.825 20:43:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:00.825 
20:43:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:00.825 20:43:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:01.082 20:43:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:01.082 "name": "raid_bdev1", 00:30:01.082 "uuid": "966d3817-3977-484b-a1e3-d44243e85940", 00:30:01.082 "strip_size_kb": 0, 00:30:01.082 "state": "online", 00:30:01.082 "raid_level": "raid1", 00:30:01.082 "superblock": true, 00:30:01.082 "num_base_bdevs": 2, 00:30:01.082 "num_base_bdevs_discovered": 1, 00:30:01.082 "num_base_bdevs_operational": 1, 00:30:01.082 "base_bdevs_list": [ 00:30:01.082 { 00:30:01.082 "name": null, 00:30:01.082 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:01.082 "is_configured": false, 00:30:01.082 "data_offset": 256, 00:30:01.082 "data_size": 7936 00:30:01.082 }, 00:30:01.082 { 00:30:01.082 "name": "BaseBdev2", 00:30:01.082 "uuid": "f5ef8273-efb1-5ca5-95e3-2d6a0ac5ec56", 00:30:01.082 "is_configured": true, 00:30:01.082 "data_offset": 256, 00:30:01.082 "data_size": 7936 00:30:01.082 } 00:30:01.082 ] 00:30:01.082 }' 00:30:01.082 20:43:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:01.082 20:43:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:01.646 20:43:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:30:01.903 [2024-07-15 20:43:54.231542] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:30:01.903 [2024-07-15 20:43:54.235205] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: 
raid_bdev_create_cb, 0x27f0250 00:30:01.903 [2024-07-15 20:43:54.237211] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:30:01.903 20:43:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@646 -- # sleep 1 00:30:03.272 20:43:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:30:03.272 20:43:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:03.272 20:43:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:30:03.272 20:43:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:30:03.272 20:43:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:03.272 20:43:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:03.272 20:43:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:03.272 20:43:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:03.272 "name": "raid_bdev1", 00:30:03.272 "uuid": "966d3817-3977-484b-a1e3-d44243e85940", 00:30:03.272 "strip_size_kb": 0, 00:30:03.272 "state": "online", 00:30:03.272 "raid_level": "raid1", 00:30:03.272 "superblock": true, 00:30:03.272 "num_base_bdevs": 2, 00:30:03.272 "num_base_bdevs_discovered": 2, 00:30:03.272 "num_base_bdevs_operational": 2, 00:30:03.272 "process": { 00:30:03.272 "type": "rebuild", 00:30:03.272 "target": "spare", 00:30:03.272 "progress": { 00:30:03.272 "blocks": 2816, 00:30:03.272 "percent": 35 00:30:03.272 } 00:30:03.272 }, 00:30:03.272 "base_bdevs_list": [ 00:30:03.272 { 
00:30:03.272 "name": "spare", 00:30:03.272 "uuid": "d13d496c-3dba-5ea3-a490-6e36fe5d472a", 00:30:03.272 "is_configured": true, 00:30:03.272 "data_offset": 256, 00:30:03.272 "data_size": 7936 00:30:03.272 }, 00:30:03.272 { 00:30:03.272 "name": "BaseBdev2", 00:30:03.272 "uuid": "f5ef8273-efb1-5ca5-95e3-2d6a0ac5ec56", 00:30:03.272 "is_configured": true, 00:30:03.272 "data_offset": 256, 00:30:03.272 "data_size": 7936 00:30:03.272 } 00:30:03.272 ] 00:30:03.272 }' 00:30:03.272 20:43:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:03.272 20:43:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:30:03.272 20:43:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:03.272 20:43:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:30:03.272 20:43:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:30:03.530 [2024-07-15 20:43:55.761938] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:30:03.530 [2024-07-15 20:43:55.849966] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:30:03.530 [2024-07-15 20:43:55.850013] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:30:03.530 [2024-07-15 20:43:55.850029] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:30:03.530 [2024-07-15 20:43:55.850037] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:30:03.530 20:43:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:30:03.530 20:43:55 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:03.530 20:43:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:03.530 20:43:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:03.530 20:43:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:03.530 20:43:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:30:03.530 20:43:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:03.530 20:43:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:03.530 20:43:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:03.530 20:43:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:03.530 20:43:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:03.530 20:43:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:03.788 20:43:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:03.788 "name": "raid_bdev1", 00:30:03.788 "uuid": "966d3817-3977-484b-a1e3-d44243e85940", 00:30:03.788 "strip_size_kb": 0, 00:30:03.788 "state": "online", 00:30:03.788 "raid_level": "raid1", 00:30:03.788 "superblock": true, 00:30:03.788 "num_base_bdevs": 2, 00:30:03.788 "num_base_bdevs_discovered": 1, 00:30:03.788 "num_base_bdevs_operational": 1, 00:30:03.788 "base_bdevs_list": [ 00:30:03.788 { 00:30:03.788 "name": null, 00:30:03.788 
"uuid": "00000000-0000-0000-0000-000000000000", 00:30:03.788 "is_configured": false, 00:30:03.788 "data_offset": 256, 00:30:03.788 "data_size": 7936 00:30:03.788 }, 00:30:03.788 { 00:30:03.788 "name": "BaseBdev2", 00:30:03.788 "uuid": "f5ef8273-efb1-5ca5-95e3-2d6a0ac5ec56", 00:30:03.788 "is_configured": true, 00:30:03.788 "data_offset": 256, 00:30:03.788 "data_size": 7936 00:30:03.788 } 00:30:03.788 ] 00:30:03.788 }' 00:30:03.788 20:43:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:03.788 20:43:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:04.353 20:43:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:30:04.353 20:43:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:04.353 20:43:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:30:04.353 20:43:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:30:04.353 20:43:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:04.353 20:43:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:04.353 20:43:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:04.610 20:43:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:04.610 "name": "raid_bdev1", 00:30:04.610 "uuid": "966d3817-3977-484b-a1e3-d44243e85940", 00:30:04.610 "strip_size_kb": 0, 00:30:04.610 "state": "online", 00:30:04.611 "raid_level": "raid1", 00:30:04.611 "superblock": true, 00:30:04.611 
"num_base_bdevs": 2, 00:30:04.611 "num_base_bdevs_discovered": 1, 00:30:04.611 "num_base_bdevs_operational": 1, 00:30:04.611 "base_bdevs_list": [ 00:30:04.611 { 00:30:04.611 "name": null, 00:30:04.611 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:04.611 "is_configured": false, 00:30:04.611 "data_offset": 256, 00:30:04.611 "data_size": 7936 00:30:04.611 }, 00:30:04.611 { 00:30:04.611 "name": "BaseBdev2", 00:30:04.611 "uuid": "f5ef8273-efb1-5ca5-95e3-2d6a0ac5ec56", 00:30:04.611 "is_configured": true, 00:30:04.611 "data_offset": 256, 00:30:04.611 "data_size": 7936 00:30:04.611 } 00:30:04.611 ] 00:30:04.611 }' 00:30:04.611 20:43:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:04.868 20:43:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:30:04.868 20:43:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:04.868 20:43:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:30:04.868 20:43:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:30:05.126 [2024-07-15 20:43:57.318209] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:30:05.126 [2024-07-15 20:43:57.321825] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x27ec270 00:30:05.126 [2024-07-15 20:43:57.323269] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:30:05.126 20:43:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@662 -- # sleep 1 00:30:06.066 20:43:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 
00:30:06.066 20:43:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:06.066 20:43:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:30:06.066 20:43:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:30:06.066 20:43:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:06.066 20:43:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:06.066 20:43:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:06.323 20:43:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:06.323 "name": "raid_bdev1", 00:30:06.323 "uuid": "966d3817-3977-484b-a1e3-d44243e85940", 00:30:06.323 "strip_size_kb": 0, 00:30:06.323 "state": "online", 00:30:06.323 "raid_level": "raid1", 00:30:06.323 "superblock": true, 00:30:06.323 "num_base_bdevs": 2, 00:30:06.323 "num_base_bdevs_discovered": 2, 00:30:06.323 "num_base_bdevs_operational": 2, 00:30:06.323 "process": { 00:30:06.323 "type": "rebuild", 00:30:06.323 "target": "spare", 00:30:06.323 "progress": { 00:30:06.323 "blocks": 3072, 00:30:06.323 "percent": 38 00:30:06.323 } 00:30:06.323 }, 00:30:06.323 "base_bdevs_list": [ 00:30:06.323 { 00:30:06.323 "name": "spare", 00:30:06.323 "uuid": "d13d496c-3dba-5ea3-a490-6e36fe5d472a", 00:30:06.323 "is_configured": true, 00:30:06.323 "data_offset": 256, 00:30:06.323 "data_size": 7936 00:30:06.323 }, 00:30:06.323 { 00:30:06.323 "name": "BaseBdev2", 00:30:06.323 "uuid": "f5ef8273-efb1-5ca5-95e3-2d6a0ac5ec56", 00:30:06.323 "is_configured": true, 00:30:06.323 "data_offset": 256, 00:30:06.323 "data_size": 7936 00:30:06.323 
} 00:30:06.323 ] 00:30:06.323 }' 00:30:06.323 20:43:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:06.324 20:43:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:30:06.324 20:43:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:06.324 20:43:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:30:06.324 20:43:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:30:06.324 20:43:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:30:06.324 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:30:06.324 20:43:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:30:06.324 20:43:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:30:06.324 20:43:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:30:06.324 20:43:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@705 -- # local timeout=1176 00:30:06.324 20:43:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:30:06.324 20:43:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:30:06.324 20:43:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:06.324 20:43:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:30:06.324 20:43:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@184 -- # local target=spare 00:30:06.324 20:43:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:06.324 20:43:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:06.324 20:43:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:06.581 20:43:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:06.581 "name": "raid_bdev1", 00:30:06.581 "uuid": "966d3817-3977-484b-a1e3-d44243e85940", 00:30:06.581 "strip_size_kb": 0, 00:30:06.581 "state": "online", 00:30:06.581 "raid_level": "raid1", 00:30:06.581 "superblock": true, 00:30:06.581 "num_base_bdevs": 2, 00:30:06.581 "num_base_bdevs_discovered": 2, 00:30:06.581 "num_base_bdevs_operational": 2, 00:30:06.581 "process": { 00:30:06.581 "type": "rebuild", 00:30:06.581 "target": "spare", 00:30:06.581 "progress": { 00:30:06.581 "blocks": 3840, 00:30:06.581 "percent": 48 00:30:06.581 } 00:30:06.581 }, 00:30:06.581 "base_bdevs_list": [ 00:30:06.581 { 00:30:06.581 "name": "spare", 00:30:06.581 "uuid": "d13d496c-3dba-5ea3-a490-6e36fe5d472a", 00:30:06.581 "is_configured": true, 00:30:06.581 "data_offset": 256, 00:30:06.581 "data_size": 7936 00:30:06.581 }, 00:30:06.581 { 00:30:06.581 "name": "BaseBdev2", 00:30:06.581 "uuid": "f5ef8273-efb1-5ca5-95e3-2d6a0ac5ec56", 00:30:06.581 "is_configured": true, 00:30:06.581 "data_offset": 256, 00:30:06.581 "data_size": 7936 00:30:06.581 } 00:30:06.581 ] 00:30:06.581 }' 00:30:06.581 20:43:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:06.839 20:43:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:30:06.839 20:43:58 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:06.839 20:43:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:30:06.839 20:43:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@710 -- # sleep 1 00:30:07.772 20:44:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:30:07.772 20:44:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:30:07.772 20:44:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:07.772 20:44:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:30:07.773 20:44:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:30:07.773 20:44:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:07.773 20:44:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:07.773 20:44:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:08.031 20:44:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:08.031 "name": "raid_bdev1", 00:30:08.031 "uuid": "966d3817-3977-484b-a1e3-d44243e85940", 00:30:08.031 "strip_size_kb": 0, 00:30:08.031 "state": "online", 00:30:08.031 "raid_level": "raid1", 00:30:08.031 "superblock": true, 00:30:08.031 "num_base_bdevs": 2, 00:30:08.031 "num_base_bdevs_discovered": 2, 00:30:08.031 "num_base_bdevs_operational": 2, 00:30:08.031 "process": { 00:30:08.031 "type": "rebuild", 00:30:08.031 
"target": "spare", 00:30:08.031 "progress": { 00:30:08.031 "blocks": 7424, 00:30:08.031 "percent": 93 00:30:08.031 } 00:30:08.031 }, 00:30:08.031 "base_bdevs_list": [ 00:30:08.031 { 00:30:08.031 "name": "spare", 00:30:08.031 "uuid": "d13d496c-3dba-5ea3-a490-6e36fe5d472a", 00:30:08.031 "is_configured": true, 00:30:08.031 "data_offset": 256, 00:30:08.031 "data_size": 7936 00:30:08.031 }, 00:30:08.031 { 00:30:08.031 "name": "BaseBdev2", 00:30:08.031 "uuid": "f5ef8273-efb1-5ca5-95e3-2d6a0ac5ec56", 00:30:08.031 "is_configured": true, 00:30:08.031 "data_offset": 256, 00:30:08.031 "data_size": 7936 00:30:08.031 } 00:30:08.031 ] 00:30:08.031 }' 00:30:08.031 20:44:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:08.031 20:44:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:30:08.031 20:44:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:08.031 20:44:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:30:08.031 20:44:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@710 -- # sleep 1 00:30:08.289 [2024-07-15 20:44:00.447165] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:30:08.289 [2024-07-15 20:44:00.447228] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:30:08.289 [2024-07-15 20:44:00.447322] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:30:09.223 20:44:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:30:09.223 20:44:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:30:09.223 20:44:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:09.223 20:44:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:30:09.223 20:44:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:30:09.223 20:44:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:09.223 20:44:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:09.223 20:44:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:09.481 20:44:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:09.481 "name": "raid_bdev1", 00:30:09.481 "uuid": "966d3817-3977-484b-a1e3-d44243e85940", 00:30:09.481 "strip_size_kb": 0, 00:30:09.481 "state": "online", 00:30:09.481 "raid_level": "raid1", 00:30:09.481 "superblock": true, 00:30:09.481 "num_base_bdevs": 2, 00:30:09.481 "num_base_bdevs_discovered": 2, 00:30:09.481 "num_base_bdevs_operational": 2, 00:30:09.481 "base_bdevs_list": [ 00:30:09.481 { 00:30:09.481 "name": "spare", 00:30:09.481 "uuid": "d13d496c-3dba-5ea3-a490-6e36fe5d472a", 00:30:09.481 "is_configured": true, 00:30:09.481 "data_offset": 256, 00:30:09.481 "data_size": 7936 00:30:09.481 }, 00:30:09.481 { 00:30:09.481 "name": "BaseBdev2", 00:30:09.481 "uuid": "f5ef8273-efb1-5ca5-95e3-2d6a0ac5ec56", 00:30:09.481 "is_configured": true, 00:30:09.481 "data_offset": 256, 00:30:09.481 "data_size": 7936 00:30:09.481 } 00:30:09.481 ] 00:30:09.481 }' 00:30:09.481 20:44:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:09.481 20:44:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == 
\r\e\b\u\i\l\d ]] 00:30:09.481 20:44:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:09.481 20:44:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:30:09.481 20:44:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@708 -- # break 00:30:09.481 20:44:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:30:09.481 20:44:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:09.481 20:44:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:30:09.481 20:44:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:30:09.481 20:44:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:09.481 20:44:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:09.481 20:44:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:09.739 20:44:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:09.739 "name": "raid_bdev1", 00:30:09.739 "uuid": "966d3817-3977-484b-a1e3-d44243e85940", 00:30:09.739 "strip_size_kb": 0, 00:30:09.739 "state": "online", 00:30:09.739 "raid_level": "raid1", 00:30:09.739 "superblock": true, 00:30:09.739 "num_base_bdevs": 2, 00:30:09.739 "num_base_bdevs_discovered": 2, 00:30:09.739 "num_base_bdevs_operational": 2, 00:30:09.739 "base_bdevs_list": [ 00:30:09.739 { 00:30:09.739 "name": "spare", 00:30:09.739 "uuid": "d13d496c-3dba-5ea3-a490-6e36fe5d472a", 00:30:09.739 
"is_configured": true, 00:30:09.739 "data_offset": 256, 00:30:09.739 "data_size": 7936 00:30:09.739 }, 00:30:09.739 { 00:30:09.739 "name": "BaseBdev2", 00:30:09.739 "uuid": "f5ef8273-efb1-5ca5-95e3-2d6a0ac5ec56", 00:30:09.739 "is_configured": true, 00:30:09.739 "data_offset": 256, 00:30:09.739 "data_size": 7936 00:30:09.739 } 00:30:09.739 ] 00:30:09.739 }' 00:30:09.739 20:44:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:09.739 20:44:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:30:09.739 20:44:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:09.739 20:44:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:30:09.740 20:44:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:30:09.740 20:44:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:09.740 20:44:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:09.740 20:44:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:09.740 20:44:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:09.740 20:44:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:30:09.740 20:44:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:09.740 20:44:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:09.740 20:44:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:30:09.740 20:44:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:09.740 20:44:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:09.740 20:44:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:09.998 20:44:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:09.998 "name": "raid_bdev1", 00:30:09.998 "uuid": "966d3817-3977-484b-a1e3-d44243e85940", 00:30:09.998 "strip_size_kb": 0, 00:30:09.998 "state": "online", 00:30:09.998 "raid_level": "raid1", 00:30:09.998 "superblock": true, 00:30:09.998 "num_base_bdevs": 2, 00:30:09.998 "num_base_bdevs_discovered": 2, 00:30:09.998 "num_base_bdevs_operational": 2, 00:30:09.998 "base_bdevs_list": [ 00:30:09.998 { 00:30:09.998 "name": "spare", 00:30:09.998 "uuid": "d13d496c-3dba-5ea3-a490-6e36fe5d472a", 00:30:09.998 "is_configured": true, 00:30:09.998 "data_offset": 256, 00:30:09.998 "data_size": 7936 00:30:09.998 }, 00:30:09.998 { 00:30:09.998 "name": "BaseBdev2", 00:30:09.998 "uuid": "f5ef8273-efb1-5ca5-95e3-2d6a0ac5ec56", 00:30:09.998 "is_configured": true, 00:30:09.998 "data_offset": 256, 00:30:09.998 "data_size": 7936 00:30:09.998 } 00:30:09.998 ] 00:30:09.998 }' 00:30:09.998 20:44:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:09.998 20:44:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:10.563 20:44:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:30:10.820 [2024-07-15 20:44:03.138951] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: 
delete raid bdev: raid_bdev1 00:30:10.820 [2024-07-15 20:44:03.138982] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:30:10.820 [2024-07-15 20:44:03.139042] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:30:10.820 [2024-07-15 20:44:03.139102] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:30:10.820 [2024-07-15 20:44:03.139114] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x27f0370 name raid_bdev1, state offline 00:30:10.820 20:44:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:10.820 20:44:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@719 -- # jq length 00:30:11.084 20:44:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:30:11.084 20:44:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@721 -- # '[' false = true ']' 00:30:11.084 20:44:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:30:11.084 20:44:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:30:11.342 20:44:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:30:11.598 [2024-07-15 20:44:03.884886] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:30:11.598 [2024-07-15 20:44:03.884939] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:11.598 [2024-07-15 20:44:03.884960] vbdev_passthru.c: 
680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x27f0040 00:30:11.598 [2024-07-15 20:44:03.884978] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:11.598 [2024-07-15 20:44:03.886447] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:11.598 [2024-07-15 20:44:03.886474] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:30:11.598 [2024-07-15 20:44:03.886532] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:30:11.598 [2024-07-15 20:44:03.886559] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:30:11.598 [2024-07-15 20:44:03.886649] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:30:11.598 spare 00:30:11.598 20:44:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:30:11.598 20:44:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:11.598 20:44:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:11.598 20:44:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:11.598 20:44:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:11.598 20:44:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:30:11.598 20:44:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:11.599 20:44:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:11.599 20:44:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:11.599 20:44:03 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:11.599 20:44:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:11.599 20:44:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:11.855 [2024-07-15 20:44:03.986960] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x27f20d0 00:30:11.855 [2024-07-15 20:44:03.986977] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:30:11.855 [2024-07-15 20:44:03.987058] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x27e5500 00:30:11.855 [2024-07-15 20:44:03.987155] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x27f20d0 00:30:11.855 [2024-07-15 20:44:03.987165] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x27f20d0 00:30:11.855 [2024-07-15 20:44:03.987231] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:30:11.855 20:44:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:11.855 "name": "raid_bdev1", 00:30:11.855 "uuid": "966d3817-3977-484b-a1e3-d44243e85940", 00:30:11.855 "strip_size_kb": 0, 00:30:11.855 "state": "online", 00:30:11.855 "raid_level": "raid1", 00:30:11.855 "superblock": true, 00:30:11.855 "num_base_bdevs": 2, 00:30:11.855 "num_base_bdevs_discovered": 2, 00:30:11.855 "num_base_bdevs_operational": 2, 00:30:11.855 "base_bdevs_list": [ 00:30:11.855 { 00:30:11.855 "name": "spare", 00:30:11.855 "uuid": "d13d496c-3dba-5ea3-a490-6e36fe5d472a", 00:30:11.855 "is_configured": true, 00:30:11.855 "data_offset": 256, 00:30:11.855 "data_size": 7936 00:30:11.855 }, 00:30:11.855 { 00:30:11.855 "name": "BaseBdev2", 00:30:11.855 "uuid": 
"f5ef8273-efb1-5ca5-95e3-2d6a0ac5ec56", 00:30:11.855 "is_configured": true, 00:30:11.855 "data_offset": 256, 00:30:11.855 "data_size": 7936 00:30:11.856 } 00:30:11.856 ] 00:30:11.856 }' 00:30:11.856 20:44:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:11.856 20:44:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:12.420 20:44:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:30:12.420 20:44:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:12.420 20:44:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:30:12.420 20:44:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:30:12.420 20:44:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:12.420 20:44:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:12.420 20:44:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:12.678 20:44:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:12.678 "name": "raid_bdev1", 00:30:12.678 "uuid": "966d3817-3977-484b-a1e3-d44243e85940", 00:30:12.678 "strip_size_kb": 0, 00:30:12.678 "state": "online", 00:30:12.678 "raid_level": "raid1", 00:30:12.678 "superblock": true, 00:30:12.678 "num_base_bdevs": 2, 00:30:12.678 "num_base_bdevs_discovered": 2, 00:30:12.678 "num_base_bdevs_operational": 2, 00:30:12.678 "base_bdevs_list": [ 00:30:12.678 { 00:30:12.678 "name": "spare", 00:30:12.678 "uuid": 
"d13d496c-3dba-5ea3-a490-6e36fe5d472a", 00:30:12.678 "is_configured": true, 00:30:12.678 "data_offset": 256, 00:30:12.678 "data_size": 7936 00:30:12.678 }, 00:30:12.678 { 00:30:12.678 "name": "BaseBdev2", 00:30:12.678 "uuid": "f5ef8273-efb1-5ca5-95e3-2d6a0ac5ec56", 00:30:12.678 "is_configured": true, 00:30:12.678 "data_offset": 256, 00:30:12.678 "data_size": 7936 00:30:12.678 } 00:30:12.678 ] 00:30:12.678 }' 00:30:12.678 20:44:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:12.678 20:44:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:30:12.678 20:44:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:12.936 20:44:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:30:12.936 20:44:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:12.936 20:44:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:30:13.232 20:44:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:30:13.232 20:44:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:30:13.232 [2024-07-15 20:44:05.561444] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:30:13.232 20:44:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:30:13.232 20:44:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 
00:30:13.232 20:44:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:13.232 20:44:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:13.232 20:44:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:13.232 20:44:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:30:13.232 20:44:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:13.232 20:44:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:13.232 20:44:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:13.232 20:44:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:13.232 20:44:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:13.232 20:44:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:13.490 20:44:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:13.490 "name": "raid_bdev1", 00:30:13.490 "uuid": "966d3817-3977-484b-a1e3-d44243e85940", 00:30:13.490 "strip_size_kb": 0, 00:30:13.490 "state": "online", 00:30:13.490 "raid_level": "raid1", 00:30:13.490 "superblock": true, 00:30:13.490 "num_base_bdevs": 2, 00:30:13.490 "num_base_bdevs_discovered": 1, 00:30:13.490 "num_base_bdevs_operational": 1, 00:30:13.490 "base_bdevs_list": [ 00:30:13.490 { 00:30:13.490 "name": null, 00:30:13.490 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:13.490 "is_configured": false, 00:30:13.490 "data_offset": 
256, 00:30:13.490 "data_size": 7936 00:30:13.490 }, 00:30:13.490 { 00:30:13.490 "name": "BaseBdev2", 00:30:13.490 "uuid": "f5ef8273-efb1-5ca5-95e3-2d6a0ac5ec56", 00:30:13.490 "is_configured": true, 00:30:13.490 "data_offset": 256, 00:30:13.490 "data_size": 7936 00:30:13.490 } 00:30:13.490 ] 00:30:13.490 }' 00:30:13.490 20:44:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:13.490 20:44:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:14.425 20:44:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:30:14.425 [2024-07-15 20:44:06.684445] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:30:14.425 [2024-07-15 20:44:06.684601] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:30:14.425 [2024-07-15 20:44:06.684617] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
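The trace above repeatedly extracts one RAID bdev's state from the array returned by `bdev_raid_get_bdevs all` using a `jq` filter. A minimal standalone sketch of that filtering step, with the input JSON abbreviated from the log output (the field names match the log; the two-element array is an assumption for illustration):

```shell
# Sketch of the jq selection used by verify_raid_bdev_state: pick one bdev
# by name out of the array that "bdev_raid_get_bdevs all" returns.
# The sample JSON is abbreviated from the log above.
bdevs='[{"name":"raid_bdev1","state":"online","num_base_bdevs_discovered":1},
        {"name":"other_bdev","state":"offline"}]'

# Same filter as bdev_raid.sh@126: keep only the object whose name matches.
raid_bdev_info=$(echo "$bdevs" | jq -r '.[] | select(.name == "raid_bdev1")')

# Individual fields are then pulled out of the captured object the same way:
state=$(echo "$raid_bdev_info" | jq -r '.state')
echo "$state"
```

The same pattern with the `//` alternative operator (`'.process.type // "none"'`, seen later in the trace) yields a literal `none` when the `process` key is absent, which is how the script distinguishes an idle array from one mid-rebuild.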
00:30:14.425 [2024-07-15 20:44:06.684647] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:30:14.425 [2024-07-15 20:44:06.688160] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2657640 00:30:14.425 [2024-07-15 20:44:06.689573] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:30:14.425 20:44:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@755 -- # sleep 1 00:30:15.382 20:44:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:30:15.382 20:44:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:15.382 20:44:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:30:15.382 20:44:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:30:15.382 20:44:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:15.382 20:44:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:15.382 20:44:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:15.640 20:44:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:15.640 "name": "raid_bdev1", 00:30:15.640 "uuid": "966d3817-3977-484b-a1e3-d44243e85940", 00:30:15.640 "strip_size_kb": 0, 00:30:15.640 "state": "online", 00:30:15.640 "raid_level": "raid1", 00:30:15.640 "superblock": true, 00:30:15.640 "num_base_bdevs": 2, 00:30:15.640 "num_base_bdevs_discovered": 2, 00:30:15.640 "num_base_bdevs_operational": 2, 00:30:15.640 "process": { 00:30:15.640 "type": 
"rebuild", 00:30:15.640 "target": "spare", 00:30:15.640 "progress": { 00:30:15.640 "blocks": 3072, 00:30:15.640 "percent": 38 00:30:15.640 } 00:30:15.640 }, 00:30:15.640 "base_bdevs_list": [ 00:30:15.640 { 00:30:15.640 "name": "spare", 00:30:15.640 "uuid": "d13d496c-3dba-5ea3-a490-6e36fe5d472a", 00:30:15.640 "is_configured": true, 00:30:15.640 "data_offset": 256, 00:30:15.640 "data_size": 7936 00:30:15.640 }, 00:30:15.640 { 00:30:15.640 "name": "BaseBdev2", 00:30:15.640 "uuid": "f5ef8273-efb1-5ca5-95e3-2d6a0ac5ec56", 00:30:15.640 "is_configured": true, 00:30:15.640 "data_offset": 256, 00:30:15.640 "data_size": 7936 00:30:15.640 } 00:30:15.640 ] 00:30:15.640 }' 00:30:15.640 20:44:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:15.640 20:44:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:30:15.640 20:44:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:15.898 20:44:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:30:15.898 20:44:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:30:16.465 [2024-07-15 20:44:08.539820] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:30:16.465 [2024-07-15 20:44:08.604445] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:30:16.465 [2024-07-15 20:44:08.604491] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:30:16.465 [2024-07-15 20:44:08.604506] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:30:16.465 [2024-07-15 20:44:08.604515] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to 
remove target bdev: No such device 00:30:16.465 20:44:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:30:16.465 20:44:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:16.465 20:44:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:16.465 20:44:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:16.465 20:44:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:16.465 20:44:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:30:16.465 20:44:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:16.465 20:44:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:16.465 20:44:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:16.465 20:44:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:16.465 20:44:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:16.465 20:44:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:16.724 20:44:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:16.724 "name": "raid_bdev1", 00:30:16.724 "uuid": "966d3817-3977-484b-a1e3-d44243e85940", 00:30:16.724 "strip_size_kb": 0, 00:30:16.724 "state": "online", 00:30:16.724 "raid_level": "raid1", 00:30:16.724 "superblock": true, 00:30:16.724 
"num_base_bdevs": 2, 00:30:16.724 "num_base_bdevs_discovered": 1, 00:30:16.724 "num_base_bdevs_operational": 1, 00:30:16.724 "base_bdevs_list": [ 00:30:16.724 { 00:30:16.724 "name": null, 00:30:16.724 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:16.724 "is_configured": false, 00:30:16.724 "data_offset": 256, 00:30:16.724 "data_size": 7936 00:30:16.724 }, 00:30:16.724 { 00:30:16.724 "name": "BaseBdev2", 00:30:16.724 "uuid": "f5ef8273-efb1-5ca5-95e3-2d6a0ac5ec56", 00:30:16.724 "is_configured": true, 00:30:16.724 "data_offset": 256, 00:30:16.724 "data_size": 7936 00:30:16.724 } 00:30:16.724 ] 00:30:16.724 }' 00:30:16.724 20:44:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:16.724 20:44:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:17.657 20:44:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:30:17.915 [2024-07-15 20:44:10.240854] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:30:17.915 [2024-07-15 20:44:10.240916] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:17.915 [2024-07-15 20:44:10.240948] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x27efc80 00:30:17.915 [2024-07-15 20:44:10.240962] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:17.915 [2024-07-15 20:44:10.241183] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:17.915 [2024-07-15 20:44:10.241199] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:30:17.915 [2024-07-15 20:44:10.241270] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:30:17.915 [2024-07-15 20:44:10.241284] 
bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:30:17.915 [2024-07-15 20:44:10.241295] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:30:17.915 [2024-07-15 20:44:10.241315] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:30:17.915 [2024-07-15 20:44:10.245371] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x27f02d0 00:30:17.915 [2024-07-15 20:44:10.246763] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:30:17.915 spare 00:30:17.915 20:44:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@762 -- # sleep 1 00:30:19.288 20:44:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:30:19.288 20:44:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:19.288 20:44:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:30:19.288 20:44:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:30:19.288 20:44:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:19.288 20:44:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:19.288 20:44:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:19.288 20:44:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:19.288 "name": "raid_bdev1", 00:30:19.288 "uuid": "966d3817-3977-484b-a1e3-d44243e85940", 
00:30:19.288 "strip_size_kb": 0, 00:30:19.288 "state": "online", 00:30:19.288 "raid_level": "raid1", 00:30:19.288 "superblock": true, 00:30:19.288 "num_base_bdevs": 2, 00:30:19.288 "num_base_bdevs_discovered": 2, 00:30:19.288 "num_base_bdevs_operational": 2, 00:30:19.288 "process": { 00:30:19.288 "type": "rebuild", 00:30:19.288 "target": "spare", 00:30:19.288 "progress": { 00:30:19.288 "blocks": 3072, 00:30:19.288 "percent": 38 00:30:19.288 } 00:30:19.288 }, 00:30:19.288 "base_bdevs_list": [ 00:30:19.288 { 00:30:19.288 "name": "spare", 00:30:19.288 "uuid": "d13d496c-3dba-5ea3-a490-6e36fe5d472a", 00:30:19.288 "is_configured": true, 00:30:19.288 "data_offset": 256, 00:30:19.288 "data_size": 7936 00:30:19.288 }, 00:30:19.288 { 00:30:19.288 "name": "BaseBdev2", 00:30:19.288 "uuid": "f5ef8273-efb1-5ca5-95e3-2d6a0ac5ec56", 00:30:19.288 "is_configured": true, 00:30:19.288 "data_offset": 256, 00:30:19.288 "data_size": 7936 00:30:19.288 } 00:30:19.288 ] 00:30:19.288 }' 00:30:19.288 20:44:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:19.288 20:44:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:30:19.288 20:44:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:19.546 20:44:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:30:19.546 20:44:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:30:19.805 [2024-07-15 20:44:12.159864] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:30:19.805 [2024-07-15 20:44:12.161695] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:30:19.805 [2024-07-15 
20:44:12.161740] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:30:19.805 [2024-07-15 20:44:12.161755] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:30:19.805 [2024-07-15 20:44:12.161763] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:30:20.063 20:44:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:30:20.063 20:44:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:20.063 20:44:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:20.063 20:44:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:20.063 20:44:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:20.063 20:44:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:30:20.063 20:44:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:20.063 20:44:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:20.063 20:44:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:20.063 20:44:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:20.063 20:44:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:20.063 20:44:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:20.629 20:44:12 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:20.629 "name": "raid_bdev1", 00:30:20.629 "uuid": "966d3817-3977-484b-a1e3-d44243e85940", 00:30:20.629 "strip_size_kb": 0, 00:30:20.629 "state": "online", 00:30:20.629 "raid_level": "raid1", 00:30:20.629 "superblock": true, 00:30:20.629 "num_base_bdevs": 2, 00:30:20.629 "num_base_bdevs_discovered": 1, 00:30:20.629 "num_base_bdevs_operational": 1, 00:30:20.629 "base_bdevs_list": [ 00:30:20.629 { 00:30:20.629 "name": null, 00:30:20.629 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:20.629 "is_configured": false, 00:30:20.629 "data_offset": 256, 00:30:20.629 "data_size": 7936 00:30:20.629 }, 00:30:20.629 { 00:30:20.629 "name": "BaseBdev2", 00:30:20.629 "uuid": "f5ef8273-efb1-5ca5-95e3-2d6a0ac5ec56", 00:30:20.629 "is_configured": true, 00:30:20.629 "data_offset": 256, 00:30:20.629 "data_size": 7936 00:30:20.629 } 00:30:20.629 ] 00:30:20.629 }' 00:30:20.629 20:44:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:20.629 20:44:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:21.195 20:44:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:30:21.195 20:44:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:21.195 20:44:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:30:21.195 20:44:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:30:21.195 20:44:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:21.195 20:44:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:21.195 20:44:13 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:21.452 20:44:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:21.452 "name": "raid_bdev1", 00:30:21.452 "uuid": "966d3817-3977-484b-a1e3-d44243e85940", 00:30:21.452 "strip_size_kb": 0, 00:30:21.452 "state": "online", 00:30:21.452 "raid_level": "raid1", 00:30:21.452 "superblock": true, 00:30:21.452 "num_base_bdevs": 2, 00:30:21.452 "num_base_bdevs_discovered": 1, 00:30:21.452 "num_base_bdevs_operational": 1, 00:30:21.452 "base_bdevs_list": [ 00:30:21.452 { 00:30:21.452 "name": null, 00:30:21.452 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:21.452 "is_configured": false, 00:30:21.452 "data_offset": 256, 00:30:21.452 "data_size": 7936 00:30:21.452 }, 00:30:21.452 { 00:30:21.452 "name": "BaseBdev2", 00:30:21.452 "uuid": "f5ef8273-efb1-5ca5-95e3-2d6a0ac5ec56", 00:30:21.452 "is_configured": true, 00:30:21.452 "data_offset": 256, 00:30:21.452 "data_size": 7936 00:30:21.452 } 00:30:21.452 ] 00:30:21.452 }' 00:30:21.452 20:44:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:21.710 20:44:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:30:21.710 20:44:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:21.710 20:44:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:30:21.710 20:44:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:30:22.276 20:44:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@772 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:30:22.842 [2024-07-15 20:44:14.917307] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:30:22.842 [2024-07-15 20:44:14.917363] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:22.842 [2024-07-15 20:44:14.917388] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2657fa0 00:30:22.842 [2024-07-15 20:44:14.917401] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:22.842 [2024-07-15 20:44:14.917579] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:22.842 [2024-07-15 20:44:14.917595] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:30:22.842 [2024-07-15 20:44:14.917646] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:30:22.842 [2024-07-15 20:44:14.917658] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:30:22.842 [2024-07-15 20:44:14.917670] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:30:22.842 BaseBdev1 00:30:22.842 20:44:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@773 -- # sleep 1 00:30:23.775 20:44:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:30:23.775 20:44:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:23.775 20:44:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:23.775 20:44:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 
00:30:23.775 20:44:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:23.775 20:44:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:30:23.775 20:44:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:23.775 20:44:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:23.775 20:44:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:23.775 20:44:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:23.775 20:44:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:23.775 20:44:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:24.340 20:44:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:24.340 "name": "raid_bdev1", 00:30:24.340 "uuid": "966d3817-3977-484b-a1e3-d44243e85940", 00:30:24.340 "strip_size_kb": 0, 00:30:24.340 "state": "online", 00:30:24.340 "raid_level": "raid1", 00:30:24.340 "superblock": true, 00:30:24.340 "num_base_bdevs": 2, 00:30:24.340 "num_base_bdevs_discovered": 1, 00:30:24.340 "num_base_bdevs_operational": 1, 00:30:24.340 "base_bdevs_list": [ 00:30:24.340 { 00:30:24.340 "name": null, 00:30:24.340 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:24.340 "is_configured": false, 00:30:24.340 "data_offset": 256, 00:30:24.340 "data_size": 7936 00:30:24.340 }, 00:30:24.340 { 00:30:24.340 "name": "BaseBdev2", 00:30:24.340 "uuid": "f5ef8273-efb1-5ca5-95e3-2d6a0ac5ec56", 00:30:24.340 "is_configured": true, 00:30:24.340 "data_offset": 256, 00:30:24.340 
"data_size": 7936 00:30:24.340 } 00:30:24.340 ] 00:30:24.340 }' 00:30:24.340 20:44:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:24.340 20:44:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:25.271 20:44:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:30:25.271 20:44:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:25.271 20:44:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:30:25.271 20:44:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:30:25.271 20:44:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:25.271 20:44:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:25.271 20:44:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:25.271 20:44:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:25.271 "name": "raid_bdev1", 00:30:25.271 "uuid": "966d3817-3977-484b-a1e3-d44243e85940", 00:30:25.271 "strip_size_kb": 0, 00:30:25.271 "state": "online", 00:30:25.271 "raid_level": "raid1", 00:30:25.271 "superblock": true, 00:30:25.271 "num_base_bdevs": 2, 00:30:25.271 "num_base_bdevs_discovered": 1, 00:30:25.271 "num_base_bdevs_operational": 1, 00:30:25.271 "base_bdevs_list": [ 00:30:25.271 { 00:30:25.271 "name": null, 00:30:25.271 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:25.271 "is_configured": false, 00:30:25.271 "data_offset": 256, 00:30:25.271 "data_size": 7936 00:30:25.271 }, 
00:30:25.271 { 00:30:25.271 "name": "BaseBdev2", 00:30:25.271 "uuid": "f5ef8273-efb1-5ca5-95e3-2d6a0ac5ec56", 00:30:25.271 "is_configured": true, 00:30:25.271 "data_offset": 256, 00:30:25.271 "data_size": 7936 00:30:25.271 } 00:30:25.271 ] 00:30:25.271 }' 00:30:25.271 20:44:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:25.271 20:44:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:30:25.271 20:44:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:25.529 20:44:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:30:25.529 20:44:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:30:25.529 20:44:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@648 -- # local es=0 00:30:25.529 20:44:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:30:25.529 20:44:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:30:25.529 20:44:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:30:25.529 20:44:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:30:25.529 20:44:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 
00:30:25.529 20:44:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:30:25.529 20:44:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:30:25.529 20:44:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:30:25.529 20:44:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:30:25.529 20:44:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:30:25.529 [2024-07-15 20:44:17.881208] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:30:25.529 [2024-07-15 20:44:17.881356] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:30:25.529 [2024-07-15 20:44:17.881372] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:30:25.529 request: 00:30:25.529 { 00:30:25.529 "base_bdev": "BaseBdev1", 00:30:25.529 "raid_bdev": "raid_bdev1", 00:30:25.529 "method": "bdev_raid_add_base_bdev", 00:30:25.529 "req_id": 1 00:30:25.529 } 00:30:25.529 Got JSON-RPC error response 00:30:25.529 response: 00:30:25.529 { 00:30:25.529 "code": -22, 00:30:25.529 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:30:25.529 } 00:30:25.529 20:44:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@651 -- # es=1 00:30:25.529 20:44:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@659 -- # (( es > 128 )) 
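The `NOT` wrapper traced here (autotest_common.sh@648 onward) runs an RPC that is expected to fail, in this case re-adding BaseBdev1, whose superblock sequence number no longer matches, and converts the -22 error into a test success by inverting the exit status. A hedged sketch of that pattern, with a stand-in command instead of the real rpc.py call:

```shell
# Sketch of the expected-failure idiom behind the NOT helper above:
# run the wrapped command, record its exit status in "es", and succeed
# only if the command actually failed. "false" stands in for rpc.py here.
NOT() {
    local es=0
    "$@" || es=$?
    # Invert: return success (0) only when the wrapped command failed.
    (( es != 0 ))
}

if NOT false; then
    echo "expected failure observed"
fi
```

This is why the trace shows `es=1` followed by the test continuing normally: the JSON-RPC error response is the outcome the test asserts, not an incident.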
00:30:25.529 20:44:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:30:25.529 20:44:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:30:25.529 20:44:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@777 -- # sleep 1 00:30:26.901 20:44:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:30:26.901 20:44:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:26.901 20:44:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:26.901 20:44:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:26.901 20:44:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:26.901 20:44:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:30:26.901 20:44:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:26.901 20:44:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:26.901 20:44:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:26.901 20:44:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:26.901 20:44:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:26.901 20:44:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:26.901 20:44:19 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:26.901 "name": "raid_bdev1", 00:30:26.901 "uuid": "966d3817-3977-484b-a1e3-d44243e85940", 00:30:26.901 "strip_size_kb": 0, 00:30:26.901 "state": "online", 00:30:26.901 "raid_level": "raid1", 00:30:26.901 "superblock": true, 00:30:26.901 "num_base_bdevs": 2, 00:30:26.901 "num_base_bdevs_discovered": 1, 00:30:26.901 "num_base_bdevs_operational": 1, 00:30:26.901 "base_bdevs_list": [ 00:30:26.901 { 00:30:26.901 "name": null, 00:30:26.901 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:26.901 "is_configured": false, 00:30:26.901 "data_offset": 256, 00:30:26.901 "data_size": 7936 00:30:26.901 }, 00:30:26.901 { 00:30:26.901 "name": "BaseBdev2", 00:30:26.902 "uuid": "f5ef8273-efb1-5ca5-95e3-2d6a0ac5ec56", 00:30:26.902 "is_configured": true, 00:30:26.902 "data_offset": 256, 00:30:26.902 "data_size": 7936 00:30:26.902 } 00:30:26.902 ] 00:30:26.902 }' 00:30:26.902 20:44:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:26.902 20:44:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:27.468 20:44:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:30:27.468 20:44:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:27.468 20:44:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:30:27.468 20:44:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:30:27.468 20:44:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:27.468 20:44:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:27.468 20:44:19 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:27.726 20:44:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:27.726 "name": "raid_bdev1", 00:30:27.726 "uuid": "966d3817-3977-484b-a1e3-d44243e85940", 00:30:27.726 "strip_size_kb": 0, 00:30:27.726 "state": "online", 00:30:27.726 "raid_level": "raid1", 00:30:27.726 "superblock": true, 00:30:27.726 "num_base_bdevs": 2, 00:30:27.726 "num_base_bdevs_discovered": 1, 00:30:27.726 "num_base_bdevs_operational": 1, 00:30:27.726 "base_bdevs_list": [ 00:30:27.726 { 00:30:27.726 "name": null, 00:30:27.726 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:27.726 "is_configured": false, 00:30:27.726 "data_offset": 256, 00:30:27.726 "data_size": 7936 00:30:27.726 }, 00:30:27.726 { 00:30:27.726 "name": "BaseBdev2", 00:30:27.726 "uuid": "f5ef8273-efb1-5ca5-95e3-2d6a0ac5ec56", 00:30:27.726 "is_configured": true, 00:30:27.726 "data_offset": 256, 00:30:27.726 "data_size": 7936 00:30:27.726 } 00:30:27.726 ] 00:30:27.726 }' 00:30:27.726 20:44:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:27.726 20:44:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:30:27.726 20:44:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:27.726 20:44:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:30:27.726 20:44:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@782 -- # killprocess 1514032 00:30:27.726 20:44:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@948 -- # '[' -z 1514032 ']' 00:30:27.726 20:44:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- 
common/autotest_common.sh@952 -- # kill -0 1514032 00:30:27.726 20:44:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # uname 00:30:27.985 20:44:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:30:27.985 20:44:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1514032 00:30:27.985 20:44:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:30:27.985 20:44:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:30:27.985 20:44:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1514032' 00:30:27.985 killing process with pid 1514032 00:30:27.985 20:44:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@967 -- # kill 1514032 00:30:27.985 Received shutdown signal, test time was about 60.000000 seconds 00:30:27.985 00:30:27.985 Latency(us) 00:30:27.985 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:27.985 =================================================================================================================== 00:30:27.985 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:30:27.985 [2024-07-15 20:44:20.143817] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:30:27.985 [2024-07-15 20:44:20.143915] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:30:27.985 [2024-07-15 20:44:20.143970] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:30:27.985 20:44:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@972 -- # wait 1514032 00:30:27.985 [2024-07-15 20:44:20.143983] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: 
raid_bdev_cleanup, 0x27f20d0 name raid_bdev1, state offline 00:30:27.985 [2024-07-15 20:44:20.173901] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:30:28.244 20:44:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@784 -- # return 0 00:30:28.244 00:30:28.244 real 0m31.804s 00:30:28.244 user 0m51.417s 00:30:28.244 sys 0m4.228s 00:30:28.244 20:44:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@1124 -- # xtrace_disable 00:30:28.244 20:44:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:28.244 ************************************ 00:30:28.244 END TEST raid_rebuild_test_sb_md_interleaved 00:30:28.244 ************************************ 00:30:28.244 20:44:20 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:30:28.244 20:44:20 bdev_raid -- bdev/bdev_raid.sh@916 -- # trap - EXIT 00:30:28.244 20:44:20 bdev_raid -- bdev/bdev_raid.sh@917 -- # cleanup 00:30:28.244 20:44:20 bdev_raid -- bdev/bdev_raid.sh@58 -- # '[' -n 1514032 ']' 00:30:28.244 20:44:20 bdev_raid -- bdev/bdev_raid.sh@58 -- # ps -p 1514032 00:30:28.244 20:44:20 bdev_raid -- bdev/bdev_raid.sh@62 -- # rm -rf /raidtest 00:30:28.244 00:30:28.244 real 19m27.617s 00:30:28.244 user 33m6.626s 00:30:28.244 sys 3m31.422s 00:30:28.244 20:44:20 bdev_raid -- common/autotest_common.sh@1124 -- # xtrace_disable 00:30:28.244 20:44:20 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:30:28.244 ************************************ 00:30:28.244 END TEST bdev_raid 00:30:28.244 ************************************ 00:30:28.244 20:44:20 -- common/autotest_common.sh@1142 -- # return 0 00:30:28.244 20:44:20 -- spdk/autotest.sh@191 -- # run_test bdevperf_config /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test_config.sh 00:30:28.244 20:44:20 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:30:28.244 20:44:20 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:30:28.244 
20:44:20 -- common/autotest_common.sh@10 -- # set +x 00:30:28.244 ************************************ 00:30:28.244 START TEST bdevperf_config 00:30:28.244 ************************************ 00:30:28.244 20:44:20 bdevperf_config -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test_config.sh 00:30:28.502 * Looking for test storage... 00:30:28.502 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf 00:30:28.502 20:44:20 bdevperf_config -- bdevperf/test_config.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/common.sh 00:30:28.502 20:44:20 bdevperf_config -- bdevperf/common.sh@5 -- # bdevperf=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf 00:30:28.502 20:44:20 bdevperf_config -- bdevperf/test_config.sh@12 -- # jsonconf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json 00:30:28.502 20:44:20 bdevperf_config -- bdevperf/test_config.sh@13 -- # testconf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:30:28.502 20:44:20 bdevperf_config -- bdevperf/test_config.sh@15 -- # trap 'cleanup; exit 1' SIGINT SIGTERM EXIT 00:30:28.502 20:44:20 bdevperf_config -- bdevperf/test_config.sh@17 -- # create_job global read Malloc0 00:30:28.502 20:44:20 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=global 00:30:28.502 20:44:20 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=read 00:30:28.502 20:44:20 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:30:28.502 20:44:20 bdevperf_config -- bdevperf/common.sh@12 -- # [[ global == \g\l\o\b\a\l ]] 00:30:28.502 20:44:20 bdevperf_config -- bdevperf/common.sh@13 -- # cat 00:30:28.502 20:44:20 bdevperf_config -- bdevperf/common.sh@18 -- # job='[global]' 00:30:28.502 20:44:20 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:30:28.502 00:30:28.502 20:44:20 bdevperf_config -- 
bdevperf/common.sh@20 -- # cat 00:30:28.502 20:44:20 bdevperf_config -- bdevperf/test_config.sh@18 -- # create_job job0 00:30:28.502 20:44:20 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:30:28.502 20:44:20 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:30:28.502 20:44:20 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:30:28.502 20:44:20 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:30:28.502 20:44:20 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:30:28.502 20:44:20 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:30:28.502 00:30:28.502 20:44:20 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:30:28.502 20:44:20 bdevperf_config -- bdevperf/test_config.sh@19 -- # create_job job1 00:30:28.502 20:44:20 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:30:28.502 20:44:20 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:30:28.502 20:44:20 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:30:28.502 20:44:20 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:30:28.502 20:44:20 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:30:28.502 20:44:20 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:30:28.502 00:30:28.502 20:44:20 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:30:28.502 20:44:20 bdevperf_config -- bdevperf/test_config.sh@20 -- # create_job job2 00:30:28.502 20:44:20 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:30:28.502 20:44:20 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:30:28.502 20:44:20 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:30:28.502 20:44:20 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:30:28.502 20:44:20 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 00:30:28.502 20:44:20 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:30:28.502 00:30:28.502 
20:44:20 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:30:28.502 20:44:20 bdevperf_config -- bdevperf/test_config.sh@21 -- # create_job job3 00:30:28.502 20:44:20 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job3 00:30:28.502 20:44:20 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:30:28.502 20:44:20 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:30:28.502 20:44:20 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job3 == \g\l\o\b\a\l ]] 00:30:28.502 20:44:20 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job3]' 00:30:28.502 20:44:20 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:30:28.502 00:30:28.502 20:44:20 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:30:28.502 20:44:20 bdevperf_config -- bdevperf/test_config.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:30:31.063 20:44:23 bdevperf_config -- bdevperf/test_config.sh@22 -- # bdevperf_output='[2024-07-15 20:44:20.779587] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
00:30:31.063 [2024-07-15 20:44:20.779658] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1518544 ] 00:30:31.063 Using job config with 4 jobs 00:30:31.063 [2024-07-15 20:44:20.908444] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:31.063 [2024-07-15 20:44:21.013508] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:31.063 cpumask for '\''job0'\'' is too big 00:30:31.063 cpumask for '\''job1'\'' is too big 00:30:31.063 cpumask for '\''job2'\'' is too big 00:30:31.063 cpumask for '\''job3'\'' is too big 00:30:31.063 Running I/O for 2 seconds... 00:30:31.063 00:30:31.063 Latency(us) 00:30:31.063 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:31.063 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:30:31.063 Malloc0 : 2.02 24110.13 23.55 0.00 0.00 10614.02 1866.35 16298.52 00:30:31.063 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:30:31.063 Malloc0 : 2.02 24088.07 23.52 0.00 0.00 10599.35 1837.86 14474.91 00:30:31.063 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:30:31.063 Malloc0 : 2.02 24066.14 23.50 0.00 0.00 10585.67 1852.10 12594.31 00:30:31.063 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:30:31.063 Malloc0 : 2.02 24044.28 23.48 0.00 0.00 10572.08 1837.86 10884.67 00:30:31.063 =================================================================================================================== 00:30:31.063 Total : 96308.62 94.05 0.00 0.00 10592.78 1837.86 16298.52' 00:30:31.063 20:44:23 bdevperf_config -- bdevperf/test_config.sh@23 -- # get_num_jobs '[2024-07-15 20:44:20.779587] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
00:30:31.063 [2024-07-15 20:44:20.779658] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1518544 ] 00:30:31.063 Using job config with 4 jobs 00:30:31.063 [2024-07-15 20:44:20.908444] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:31.063 [2024-07-15 20:44:21.013508] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:31.063 cpumask for '\''job0'\'' is too big 00:30:31.063 cpumask for '\''job1'\'' is too big 00:30:31.063 cpumask for '\''job2'\'' is too big 00:30:31.063 cpumask for '\''job3'\'' is too big 00:30:31.063 Running I/O for 2 seconds... 00:30:31.063 00:30:31.063 Latency(us) 00:30:31.063 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:31.063 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:30:31.063 Malloc0 : 2.02 24110.13 23.55 0.00 0.00 10614.02 1866.35 16298.52 00:30:31.063 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:30:31.063 Malloc0 : 2.02 24088.07 23.52 0.00 0.00 10599.35 1837.86 14474.91 00:30:31.063 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:30:31.063 Malloc0 : 2.02 24066.14 23.50 0.00 0.00 10585.67 1852.10 12594.31 00:30:31.063 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:30:31.063 Malloc0 : 2.02 24044.28 23.48 0.00 0.00 10572.08 1837.86 10884.67 00:30:31.063 =================================================================================================================== 00:30:31.063 Total : 96308.62 94.05 0.00 0.00 10592.78 1837.86 16298.52' 00:30:31.063 20:44:23 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-07-15 20:44:20.779587] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
00:30:31.063 [2024-07-15 20:44:20.779658] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1518544 ] 00:30:31.063 Using job config with 4 jobs 00:30:31.063 [2024-07-15 20:44:20.908444] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:31.063 [2024-07-15 20:44:21.013508] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:31.063 cpumask for '\''job0'\'' is too big 00:30:31.063 cpumask for '\''job1'\'' is too big 00:30:31.063 cpumask for '\''job2'\'' is too big 00:30:31.063 cpumask for '\''job3'\'' is too big 00:30:31.063 Running I/O for 2 seconds... 00:30:31.063 00:30:31.063 Latency(us) 00:30:31.063 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:31.063 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:30:31.063 Malloc0 : 2.02 24110.13 23.55 0.00 0.00 10614.02 1866.35 16298.52 00:30:31.063 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:30:31.063 Malloc0 : 2.02 24088.07 23.52 0.00 0.00 10599.35 1837.86 14474.91 00:30:31.063 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:30:31.063 Malloc0 : 2.02 24066.14 23.50 0.00 0.00 10585.67 1852.10 12594.31 00:30:31.063 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:30:31.063 Malloc0 : 2.02 24044.28 23.48 0.00 0.00 10572.08 1837.86 10884.67 00:30:31.063 =================================================================================================================== 00:30:31.063 Total : 96308.62 94.05 0.00 0.00 10592.78 1837.86 16298.52' 00:30:31.063 20:44:23 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:30:31.063 20:44:23 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:30:31.063 20:44:23 
bdevperf_config -- bdevperf/test_config.sh@23 -- # [[ 4 == \4 ]] 00:30:31.063 20:44:23 bdevperf_config -- bdevperf/test_config.sh@25 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -C -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:30:31.322 [2024-07-15 20:44:23.500331] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 00:30:31.322 [2024-07-15 20:44:23.500407] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1518893 ] 00:30:31.322 [2024-07-15 20:44:23.643014] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:31.580 [2024-07-15 20:44:23.769953] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:31.580 cpumask for 'job0' is too big 00:30:31.580 cpumask for 'job1' is too big 00:30:31.580 cpumask for 'job2' is too big 00:30:31.580 cpumask for 'job3' is too big 00:30:34.124 20:44:26 bdevperf_config -- bdevperf/test_config.sh@25 -- # bdevperf_output='Using job config with 4 jobs 00:30:34.124 Running I/O for 2 seconds... 
00:30:34.124 00:30:34.124 Latency(us) 00:30:34.124 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:34.124 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:30:34.124 Malloc0 : 2.02 23932.84 23.37 0.00 0.00 10686.42 1894.85 16298.52 00:30:34.124 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:30:34.124 Malloc0 : 2.02 23911.01 23.35 0.00 0.00 10673.18 1852.10 14417.92 00:30:34.124 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:30:34.124 Malloc0 : 2.03 23889.33 23.33 0.00 0.00 10658.58 1823.61 12594.31 00:30:34.124 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:30:34.124 Malloc0 : 2.03 23867.70 23.31 0.00 0.00 10644.37 1823.61 10998.65 00:30:34.124 =================================================================================================================== 00:30:34.124 Total : 95600.88 93.36 0.00 0.00 10665.64 1823.61 16298.52' 00:30:34.124 20:44:26 bdevperf_config -- bdevperf/test_config.sh@27 -- # cleanup 00:30:34.124 20:44:26 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:30:34.124 20:44:26 bdevperf_config -- bdevperf/test_config.sh@29 -- # create_job job0 write Malloc0 00:30:34.124 20:44:26 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:30:34.124 20:44:26 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:30:34.124 20:44:26 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:30:34.124 20:44:26 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:30:34.124 20:44:26 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:30:34.124 20:44:26 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:30:34.124 00:30:34.124 20:44:26 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:30:34.124 20:44:26 bdevperf_config -- bdevperf/test_config.sh@30 -- # create_job 
job1 write Malloc0 00:30:34.124 20:44:26 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:30:34.124 20:44:26 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:30:34.124 20:44:26 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:30:34.124 20:44:26 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:30:34.124 20:44:26 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:30:34.124 20:44:26 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:30:34.124 00:30:34.124 20:44:26 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:30:34.124 20:44:26 bdevperf_config -- bdevperf/test_config.sh@31 -- # create_job job2 write Malloc0 00:30:34.124 20:44:26 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:30:34.124 20:44:26 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:30:34.124 20:44:26 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:30:34.124 20:44:26 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:30:34.124 20:44:26 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 00:30:34.124 20:44:26 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:30:34.124 00:30:34.124 20:44:26 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:30:34.125 20:44:26 bdevperf_config -- bdevperf/test_config.sh@32 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:30:36.654 20:44:28 bdevperf_config -- bdevperf/test_config.sh@32 -- # bdevperf_output='[2024-07-15 20:44:26.305602] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
00:30:36.654 [2024-07-15 20:44:26.305669] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1519252 ] 00:30:36.654 Using job config with 3 jobs 00:30:36.654 [2024-07-15 20:44:26.445373] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:36.654 [2024-07-15 20:44:26.561060] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:36.654 cpumask for '\''job0'\'' is too big 00:30:36.654 cpumask for '\''job1'\'' is too big 00:30:36.654 cpumask for '\''job2'\'' is too big 00:30:36.654 Running I/O for 2 seconds... 00:30:36.654 00:30:36.654 Latency(us) 00:30:36.654 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:36.654 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:30:36.654 Malloc0 : 2.02 32642.82 31.88 0.00 0.00 7832.39 1809.36 11511.54 00:30:36.654 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:30:36.654 Malloc0 : 2.02 32612.84 31.85 0.00 0.00 7822.16 1795.12 9744.92 00:30:36.654 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:30:36.654 Malloc0 : 2.02 32583.06 31.82 0.00 0.00 7812.00 1787.99 8092.27 00:30:36.654 =================================================================================================================== 00:30:36.654 Total : 97838.72 95.55 0.00 0.00 7822.18 1787.99 11511.54' 00:30:36.654 20:44:28 bdevperf_config -- bdevperf/test_config.sh@33 -- # get_num_jobs '[2024-07-15 20:44:26.305602] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
00:30:36.655 [2024-07-15 20:44:26.305669] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1519252 ] 00:30:36.655 Using job config with 3 jobs 00:30:36.655 [2024-07-15 20:44:26.445373] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:36.655 [2024-07-15 20:44:26.561060] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:36.655 cpumask for '\''job0'\'' is too big 00:30:36.655 cpumask for '\''job1'\'' is too big 00:30:36.655 cpumask for '\''job2'\'' is too big 00:30:36.655 Running I/O for 2 seconds... 00:30:36.655 00:30:36.655 Latency(us) 00:30:36.655 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:36.655 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:30:36.655 Malloc0 : 2.02 32642.82 31.88 0.00 0.00 7832.39 1809.36 11511.54 00:30:36.655 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:30:36.655 Malloc0 : 2.02 32612.84 31.85 0.00 0.00 7822.16 1795.12 9744.92 00:30:36.655 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:30:36.655 Malloc0 : 2.02 32583.06 31.82 0.00 0.00 7812.00 1787.99 8092.27 00:30:36.655 =================================================================================================================== 00:30:36.655 Total : 97838.72 95.55 0.00 0.00 7822.18 1787.99 11511.54' 00:30:36.655 20:44:28 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-07-15 20:44:26.305602] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
00:30:36.655 [2024-07-15 20:44:26.305669] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1519252 ] 00:30:36.655 Using job config with 3 jobs 00:30:36.655 [2024-07-15 20:44:26.445373] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:36.655 [2024-07-15 20:44:26.561060] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:36.655 cpumask for '\''job0'\'' is too big 00:30:36.655 cpumask for '\''job1'\'' is too big 00:30:36.655 cpumask for '\''job2'\'' is too big 00:30:36.655 Running I/O for 2 seconds... 00:30:36.655 00:30:36.655 Latency(us) 00:30:36.655 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:36.655 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:30:36.655 Malloc0 : 2.02 32642.82 31.88 0.00 0.00 7832.39 1809.36 11511.54 00:30:36.655 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:30:36.655 Malloc0 : 2.02 32612.84 31.85 0.00 0.00 7822.16 1795.12 9744.92 00:30:36.655 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:30:36.655 Malloc0 : 2.02 32583.06 31.82 0.00 0.00 7812.00 1787.99 8092.27 00:30:36.655 =================================================================================================================== 00:30:36.655 Total : 97838.72 95.55 0.00 0.00 7822.18 1787.99 11511.54' 00:30:36.655 20:44:28 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:30:36.655 20:44:28 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:30:36.655 20:44:28 bdevperf_config -- bdevperf/test_config.sh@33 -- # [[ 3 == \3 ]] 00:30:36.655 20:44:28 bdevperf_config -- bdevperf/test_config.sh@35 -- # cleanup 00:30:36.655 20:44:28 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:30:36.655 20:44:28 bdevperf_config -- bdevperf/test_config.sh@37 -- # create_job global rw Malloc0:Malloc1 00:30:36.655 20:44:28 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=global 00:30:36.655 20:44:28 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=rw 00:30:36.655 20:44:28 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0:Malloc1 00:30:36.655 20:44:28 bdevperf_config -- bdevperf/common.sh@12 -- # [[ global == \g\l\o\b\a\l ]] 00:30:36.655 20:44:28 bdevperf_config -- bdevperf/common.sh@13 -- # cat 00:30:36.655 20:44:28 bdevperf_config -- bdevperf/common.sh@18 -- # job='[global]' 00:30:36.655 20:44:28 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:30:36.655 00:30:36.655 20:44:28 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:30:36.655 20:44:28 bdevperf_config -- bdevperf/test_config.sh@38 -- # create_job job0 00:30:36.655 20:44:28 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:30:36.655 20:44:28 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:30:36.655 20:44:28 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:30:36.655 20:44:28 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:30:36.655 20:44:28 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:30:36.655 20:44:28 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:30:36.655 00:30:36.655 20:44:28 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:30:36.655 20:44:28 bdevperf_config -- bdevperf/test_config.sh@39 -- # create_job job1 00:30:36.655 20:44:28 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:30:36.655 20:44:28 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:30:36.655 20:44:28 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:30:36.655 20:44:28 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:30:36.655 20:44:28 
bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:30:36.655 20:44:28 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:30:36.655 00:30:36.655 20:44:28 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:30:36.655 20:44:28 bdevperf_config -- bdevperf/test_config.sh@40 -- # create_job job2 00:30:36.655 20:44:28 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:30:36.655 20:44:28 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:30:36.655 20:44:28 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:30:36.655 20:44:28 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:30:36.655 20:44:28 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 00:30:36.655 20:44:28 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:30:36.655 00:30:36.655 20:44:28 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:30:36.655 20:44:29 bdevperf_config -- bdevperf/test_config.sh@41 -- # create_job job3 00:30:36.655 20:44:29 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job3 00:30:36.655 20:44:29 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:30:36.655 20:44:29 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:30:36.655 20:44:29 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job3 == \g\l\o\b\a\l ]] 00:30:36.655 20:44:29 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job3]' 00:30:36.655 20:44:29 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:30:36.655 00:30:36.655 20:44:29 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:30:36.655 20:44:29 bdevperf_config -- bdevperf/test_config.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:30:39.943 20:44:31 bdevperf_config -- bdevperf/test_config.sh@42 -- # bdevperf_output='[2024-07-15 20:44:29.070492] 
Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 00:30:39.943 [2024-07-15 20:44:29.070563] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1519612 ] 00:30:39.943 Using job config with 4 jobs 00:30:39.943 [2024-07-15 20:44:29.224325] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:39.944 [2024-07-15 20:44:29.340439] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:39.944 cpumask for '\''job0'\'' is too big 00:30:39.944 cpumask for '\''job1'\'' is too big 00:30:39.944 cpumask for '\''job2'\'' is too big 00:30:39.944 cpumask for '\''job3'\'' is too big 00:30:39.944 Running I/O for 2 seconds... 00:30:39.944 00:30:39.944 Latency(us) 00:30:39.944 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:39.944 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:39.944 Malloc0 : 2.03 11962.47 11.68 0.00 0.00 21384.09 3789.69 33052.94 00:30:39.944 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:39.944 Malloc1 : 2.03 11951.22 11.67 0.00 0.00 21384.15 4644.51 33052.94 00:30:39.944 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:39.944 Malloc0 : 2.04 11940.37 11.66 0.00 0.00 21322.93 3732.70 29177.77 00:30:39.944 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:39.944 Malloc1 : 2.04 11929.29 11.65 0.00 0.00 21322.46 4587.52 29177.77 00:30:39.944 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:39.944 Malloc0 : 2.04 11918.20 11.64 0.00 0.00 21268.84 3761.20 25416.57 00:30:39.944 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:39.944 
Malloc1 : 2.04 11907.16 11.63 0.00 0.00 21268.11 4616.01 25416.57 00:30:39.944 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:39.944 Malloc0 : 2.05 11989.97 11.71 0.00 0.00 21047.27 3618.73 22567.18 00:30:39.944 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:39.944 Malloc1 : 2.05 11978.87 11.70 0.00 0.00 21047.02 2849.39 22567.18 00:30:39.944 =================================================================================================================== 00:30:39.944 Total : 95577.56 93.34 0.00 0.00 21255.06 2849.39 33052.94' 00:30:39.944 20:44:31 bdevperf_config -- bdevperf/test_config.sh@43 -- # get_num_jobs '[2024-07-15 20:44:29.070492] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... …'
00:30:39.944 20:44:31 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-07-15 20:44:29.070492] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... …'
00:30:39.944 20:44:31 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:30:39.944 20:44:31 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:30:39.944 20:44:31 bdevperf_config -- bdevperf/test_config.sh@43 -- # [[ 4 == \4 ]] 00:30:39.944 20:44:31 bdevperf_config -- bdevperf/test_config.sh@44 -- # cleanup 00:30:39.944 20:44:31 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:30:39.944 20:44:31 bdevperf_config -- bdevperf/test_config.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:30:39.944 00:30:39.944 real 0m11.259s 00:30:39.944 user 0m9.913s 00:30:39.944 sys 0m1.200s 00:30:39.944 20:44:31 bdevperf_config -- common/autotest_common.sh@1124 -- # xtrace_disable 00:30:39.944 20:44:31 bdevperf_config -- common/autotest_common.sh@10 -- # set +x 00:30:39.944 ************************************ 00:30:39.944 END TEST bdevperf_config 00:30:39.944 ************************************ 00:30:39.944 20:44:31 -- common/autotest_common.sh@1142 -- # return 0 00:30:39.944 20:44:31 -- spdk/autotest.sh@192 -- # uname -s 00:30:39.944 20:44:31 -- spdk/autotest.sh@192 -- # [[ Linux == Linux ]] 00:30:39.944 20:44:31 -- spdk/autotest.sh@193 -- # run_test reactor_set_interrupt /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:30:39.944 20:44:31 -- common/autotest_common.sh@1099
-- # '[' 2 -le 1 ']' 00:30:39.944 20:44:31 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:30:39.944 20:44:31 -- common/autotest_common.sh@10 -- # set +x 00:30:39.944 ************************************ 00:30:39.944 START TEST reactor_set_interrupt 00:30:39.944 ************************************ 00:30:39.944 20:44:31 reactor_set_interrupt -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:30:39.944 * Looking for test storage... 00:30:39.944 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:39.944 20:44:32 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/interrupt_common.sh 00:30:39.944 20:44:32 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:30:39.944 20:44:32 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:39.944 20:44:32 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # testdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:39.944 20:44:32 reactor_set_interrupt -- interrupt/interrupt_common.sh@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/../.. 
00:30:39.944 20:44:32 reactor_set_interrupt -- interrupt/interrupt_common.sh@6 -- # rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:30:39.944 20:44:32 reactor_set_interrupt -- interrupt/interrupt_common.sh@7 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh 00:30:39.944 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:30:39.944 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@34 -- # set -e 00:30:39.944 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:30:39.944 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@36 -- # shopt -s extglob 00:30:39.944 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@37 -- # shopt -s inherit_errexit 00:30:39.944 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@39 -- # '[' -z /var/jenkins/workspace/crypto-phy-autotest/spdk/../output ']' 00:30:39.944 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@44 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh ]] 00:30:39.945 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh 00:30:39.945 20:44:32 reactor_set_interrupt -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:30:39.945 20:44:32 reactor_set_interrupt -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:30:39.945 20:44:32 reactor_set_interrupt -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=y 00:30:39.945 20:44:32 reactor_set_interrupt -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:30:39.945 20:44:32 reactor_set_interrupt -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:30:39.945 20:44:32 reactor_set_interrupt -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:30:39.945 20:44:32 reactor_set_interrupt -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:30:39.945 20:44:32 reactor_set_interrupt -- 
common/build_config.sh@8 -- # CONFIG_RBD=n 00:30:39.945 20:44:32 reactor_set_interrupt -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:30:39.945 20:44:32 reactor_set_interrupt -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:30:39.945 20:44:32 reactor_set_interrupt -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:30:39.945 20:44:32 reactor_set_interrupt -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:30:39.945 20:44:32 reactor_set_interrupt -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:30:39.945 20:44:32 reactor_set_interrupt -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:30:39.945 20:44:32 reactor_set_interrupt -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:30:39.945 20:44:32 reactor_set_interrupt -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:30:39.945 20:44:32 reactor_set_interrupt -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:30:39.945 20:44:32 reactor_set_interrupt -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:30:39.945 20:44:32 reactor_set_interrupt -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:30:39.945 20:44:32 reactor_set_interrupt -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:30:39.945 20:44:32 reactor_set_interrupt -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:30:39.945 20:44:32 reactor_set_interrupt -- common/build_config.sh@22 -- # CONFIG_CET=n 00:30:39.945 20:44:32 reactor_set_interrupt -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=y 00:30:39.945 20:44:32 reactor_set_interrupt -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:30:39.945 20:44:32 reactor_set_interrupt -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:30:39.945 20:44:32 reactor_set_interrupt -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:30:39.945 20:44:32 reactor_set_interrupt -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:30:39.945 20:44:32 
reactor_set_interrupt -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:30:39.945 20:44:32 reactor_set_interrupt -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:30:39.945 20:44:32 reactor_set_interrupt -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:30:39.945 20:44:32 reactor_set_interrupt -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:30:39.945 20:44:32 reactor_set_interrupt -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:30:39.945 20:44:32 reactor_set_interrupt -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:30:39.945 20:44:32 reactor_set_interrupt -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB= 00:30:39.945 20:44:32 reactor_set_interrupt -- common/build_config.sh@35 -- # CONFIG_FUZZER=n 00:30:39.945 20:44:32 reactor_set_interrupt -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:30:39.945 20:44:32 reactor_set_interrupt -- common/build_config.sh@37 -- # CONFIG_CRYPTO=y 00:30:39.945 20:44:32 reactor_set_interrupt -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:30:39.945 20:44:32 reactor_set_interrupt -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:30:39.945 20:44:32 reactor_set_interrupt -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:30:39.945 20:44:32 reactor_set_interrupt -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR= 00:30:39.945 20:44:32 reactor_set_interrupt -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:30:39.945 20:44:32 reactor_set_interrupt -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:30:39.945 20:44:32 reactor_set_interrupt -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:30:39.945 20:44:32 reactor_set_interrupt -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:30:39.945 20:44:32 reactor_set_interrupt -- common/build_config.sh@46 -- # CONFIG_DPDK_UADK=n 00:30:39.945 20:44:32 reactor_set_interrupt -- common/build_config.sh@47 -- # CONFIG_COVERAGE=y 00:30:39.945 20:44:32 reactor_set_interrupt -- 
common/build_config.sh@48 -- # CONFIG_RDMA=y 00:30:39.945 20:44:32 reactor_set_interrupt -- common/build_config.sh@49 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:30:39.945 20:44:32 reactor_set_interrupt -- common/build_config.sh@50 -- # CONFIG_URING_PATH= 00:30:39.945 20:44:32 reactor_set_interrupt -- common/build_config.sh@51 -- # CONFIG_XNVME=n 00:30:39.945 20:44:32 reactor_set_interrupt -- common/build_config.sh@52 -- # CONFIG_VFIO_USER=n 00:30:39.945 20:44:32 reactor_set_interrupt -- common/build_config.sh@53 -- # CONFIG_ARCH=native 00:30:39.945 20:44:32 reactor_set_interrupt -- common/build_config.sh@54 -- # CONFIG_HAVE_EVP_MAC=y 00:30:39.945 20:44:32 reactor_set_interrupt -- common/build_config.sh@55 -- # CONFIG_URING_ZNS=n 00:30:39.945 20:44:32 reactor_set_interrupt -- common/build_config.sh@56 -- # CONFIG_WERROR=y 00:30:39.945 20:44:32 reactor_set_interrupt -- common/build_config.sh@57 -- # CONFIG_HAVE_LIBBSD=n 00:30:39.945 20:44:32 reactor_set_interrupt -- common/build_config.sh@58 -- # CONFIG_UBSAN=y 00:30:39.945 20:44:32 reactor_set_interrupt -- common/build_config.sh@59 -- # CONFIG_IPSEC_MB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:30:39.945 20:44:32 reactor_set_interrupt -- common/build_config.sh@60 -- # CONFIG_GOLANG=n 00:30:39.945 20:44:32 reactor_set_interrupt -- common/build_config.sh@61 -- # CONFIG_ISAL=y 00:30:39.945 20:44:32 reactor_set_interrupt -- common/build_config.sh@62 -- # CONFIG_IDXD_KERNEL=y 00:30:39.945 20:44:32 reactor_set_interrupt -- common/build_config.sh@63 -- # CONFIG_DPDK_LIB_DIR= 00:30:39.945 20:44:32 reactor_set_interrupt -- common/build_config.sh@64 -- # CONFIG_RDMA_PROV=verbs 00:30:39.945 20:44:32 reactor_set_interrupt -- common/build_config.sh@65 -- # CONFIG_APPS=y 00:30:39.945 20:44:32 reactor_set_interrupt -- common/build_config.sh@66 -- # CONFIG_SHARED=y 00:30:39.945 20:44:32 reactor_set_interrupt -- common/build_config.sh@67 -- # CONFIG_HAVE_KEYUTILS=y 00:30:39.945 20:44:32 
reactor_set_interrupt -- common/build_config.sh@68 -- # CONFIG_FC_PATH= 00:30:39.945 20:44:32 reactor_set_interrupt -- common/build_config.sh@69 -- # CONFIG_DPDK_PKG_CONFIG=n 00:30:39.945 20:44:32 reactor_set_interrupt -- common/build_config.sh@70 -- # CONFIG_FC=n 00:30:39.945 20:44:32 reactor_set_interrupt -- common/build_config.sh@71 -- # CONFIG_AVAHI=n 00:30:39.945 20:44:32 reactor_set_interrupt -- common/build_config.sh@72 -- # CONFIG_FIO_PLUGIN=y 00:30:39.945 20:44:32 reactor_set_interrupt -- common/build_config.sh@73 -- # CONFIG_RAID5F=n 00:30:39.945 20:44:32 reactor_set_interrupt -- common/build_config.sh@74 -- # CONFIG_EXAMPLES=y 00:30:39.945 20:44:32 reactor_set_interrupt -- common/build_config.sh@75 -- # CONFIG_TESTS=y 00:30:39.945 20:44:32 reactor_set_interrupt -- common/build_config.sh@76 -- # CONFIG_CRYPTO_MLX5=y 00:30:39.945 20:44:32 reactor_set_interrupt -- common/build_config.sh@77 -- # CONFIG_MAX_LCORES=128 00:30:39.945 20:44:32 reactor_set_interrupt -- common/build_config.sh@78 -- # CONFIG_IPSEC_MB=y 00:30:39.945 20:44:32 reactor_set_interrupt -- common/build_config.sh@79 -- # CONFIG_PGO_DIR= 00:30:39.945 20:44:32 reactor_set_interrupt -- common/build_config.sh@80 -- # CONFIG_DEBUG=y 00:30:39.945 20:44:32 reactor_set_interrupt -- common/build_config.sh@81 -- # CONFIG_DPDK_COMPRESSDEV=y 00:30:39.945 20:44:32 reactor_set_interrupt -- common/build_config.sh@82 -- # CONFIG_CROSS_PREFIX= 00:30:39.945 20:44:32 reactor_set_interrupt -- common/build_config.sh@83 -- # CONFIG_URING=n 00:30:39.945 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:30:39.945 20:44:32 reactor_set_interrupt -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:30:39.945 20:44:32 reactor_set_interrupt -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 
00:30:39.945 20:44:32 reactor_set_interrupt -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:30:39.945 20:44:32 reactor_set_interrupt -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:30:39.945 20:44:32 reactor_set_interrupt -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:30:39.945 20:44:32 reactor_set_interrupt -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:30:39.945 20:44:32 reactor_set_interrupt -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:30:39.945 20:44:32 reactor_set_interrupt -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:30:39.945 20:44:32 reactor_set_interrupt -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:30:39.945 20:44:32 reactor_set_interrupt -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:30:39.945 20:44:32 reactor_set_interrupt -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:30:39.945 20:44:32 reactor_set_interrupt -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:30:39.945 20:44:32 reactor_set_interrupt -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:30:39.945 20:44:32 reactor_set_interrupt -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/config.h ]] 00:30:39.945 20:44:32 reactor_set_interrupt -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:30:39.945 #define SPDK_CONFIG_H 00:30:39.945 #define SPDK_CONFIG_APPS 1 00:30:39.945 #define SPDK_CONFIG_ARCH native 00:30:39.945 #undef SPDK_CONFIG_ASAN 00:30:39.945 #undef SPDK_CONFIG_AVAHI 00:30:39.945 #undef SPDK_CONFIG_CET 00:30:39.945 #define SPDK_CONFIG_COVERAGE 1 00:30:39.945 #define SPDK_CONFIG_CROSS_PREFIX 
00:30:39.945 #define SPDK_CONFIG_CRYPTO 1 00:30:39.945 #define SPDK_CONFIG_CRYPTO_MLX5 1 00:30:39.945 #undef SPDK_CONFIG_CUSTOMOCF 00:30:39.945 #undef SPDK_CONFIG_DAOS 00:30:39.945 #define SPDK_CONFIG_DAOS_DIR 00:30:39.945 #define SPDK_CONFIG_DEBUG 1 00:30:39.945 #define SPDK_CONFIG_DPDK_COMPRESSDEV 1 00:30:39.945 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:30:39.945 #define SPDK_CONFIG_DPDK_INC_DIR 00:30:39.945 #define SPDK_CONFIG_DPDK_LIB_DIR 00:30:39.945 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:30:39.945 #undef SPDK_CONFIG_DPDK_UADK 00:30:39.945 #define SPDK_CONFIG_ENV /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:30:39.945 #define SPDK_CONFIG_EXAMPLES 1 00:30:39.945 #undef SPDK_CONFIG_FC 00:30:39.945 #define SPDK_CONFIG_FC_PATH 00:30:39.945 #define SPDK_CONFIG_FIO_PLUGIN 1 00:30:39.945 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:30:39.945 #undef SPDK_CONFIG_FUSE 00:30:39.945 #undef SPDK_CONFIG_FUZZER 00:30:39.945 #define SPDK_CONFIG_FUZZER_LIB 00:30:39.945 #undef SPDK_CONFIG_GOLANG 00:30:39.945 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:30:39.945 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:30:39.945 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:30:39.945 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:30:39.945 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:30:39.945 #undef SPDK_CONFIG_HAVE_LIBBSD 00:30:39.945 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:30:39.945 #define SPDK_CONFIG_IDXD 1 00:30:39.945 #define SPDK_CONFIG_IDXD_KERNEL 1 00:30:39.945 #define SPDK_CONFIG_IPSEC_MB 1 00:30:39.945 #define SPDK_CONFIG_IPSEC_MB_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:30:39.945 #define SPDK_CONFIG_ISAL 1 00:30:39.945 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:30:39.945 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:30:39.945 #define SPDK_CONFIG_LIBDIR 00:30:39.945 #undef SPDK_CONFIG_LTO 00:30:39.945 #define SPDK_CONFIG_MAX_LCORES 128 00:30:39.945 #define SPDK_CONFIG_NVME_CUSE 1 00:30:39.945 #undef 
SPDK_CONFIG_OCF 00:30:39.946 #define SPDK_CONFIG_OCF_PATH 00:30:39.946 #define SPDK_CONFIG_OPENSSL_PATH 00:30:39.946 #undef SPDK_CONFIG_PGO_CAPTURE 00:30:39.946 #define SPDK_CONFIG_PGO_DIR 00:30:39.946 #undef SPDK_CONFIG_PGO_USE 00:30:39.946 #define SPDK_CONFIG_PREFIX /usr/local 00:30:39.946 #undef SPDK_CONFIG_RAID5F 00:30:39.946 #undef SPDK_CONFIG_RBD 00:30:39.946 #define SPDK_CONFIG_RDMA 1 00:30:39.946 #define SPDK_CONFIG_RDMA_PROV verbs 00:30:39.946 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:30:39.946 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:30:39.946 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:30:39.946 #define SPDK_CONFIG_SHARED 1 00:30:39.946 #undef SPDK_CONFIG_SMA 00:30:39.946 #define SPDK_CONFIG_TESTS 1 00:30:39.946 #undef SPDK_CONFIG_TSAN 00:30:39.946 #define SPDK_CONFIG_UBLK 1 00:30:39.946 #define SPDK_CONFIG_UBSAN 1 00:30:39.946 #undef SPDK_CONFIG_UNIT_TESTS 00:30:39.946 #undef SPDK_CONFIG_URING 00:30:39.946 #define SPDK_CONFIG_URING_PATH 00:30:39.946 #undef SPDK_CONFIG_URING_ZNS 00:30:39.946 #undef SPDK_CONFIG_USDT 00:30:39.946 #define SPDK_CONFIG_VBDEV_COMPRESS 1 00:30:39.946 #define SPDK_CONFIG_VBDEV_COMPRESS_MLX5 1 00:30:39.946 #undef SPDK_CONFIG_VFIO_USER 00:30:39.946 #define SPDK_CONFIG_VFIO_USER_DIR 00:30:39.946 #define SPDK_CONFIG_VHOST 1 00:30:39.946 #define SPDK_CONFIG_VIRTIO 1 00:30:39.946 #undef SPDK_CONFIG_VTUNE 00:30:39.946 #define SPDK_CONFIG_VTUNE_DIR 00:30:39.946 #define SPDK_CONFIG_WERROR 1 00:30:39.946 #define SPDK_CONFIG_WPDK_DIR 00:30:39.946 #undef SPDK_CONFIG_XNVME 00:30:39.946 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:30:39.946 20:44:32 reactor_set_interrupt -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:30:39.946 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:30:39.946 20:44:32 reactor_set_interrupt -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 
00:30:39.946 20:44:32 reactor_set_interrupt -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:30:39.946 20:44:32 reactor_set_interrupt -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:30:39.946 20:44:32 reactor_set_interrupt -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:39.946 20:44:32 reactor_set_interrupt -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:39.946 20:44:32 reactor_set_interrupt -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:39.946 20:44:32 reactor_set_interrupt -- paths/export.sh@5 -- # export PATH 00:30:39.946 20:44:32 reactor_set_interrupt -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:39.946 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@56 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:30:39.946 20:44:32 reactor_set_interrupt -- pm/common@6 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:30:39.946 20:44:32 reactor_set_interrupt -- pm/common@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:30:39.946 20:44:32 reactor_set_interrupt -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:30:39.946 20:44:32 reactor_set_interrupt -- pm/common@7 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/../../../ 00:30:39.946 20:44:32 reactor_set_interrupt -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:30:39.946 20:44:32 reactor_set_interrupt -- pm/common@64 -- # TEST_TAG=N/A 00:30:39.946 20:44:32 reactor_set_interrupt -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/crypto-phy-autotest/spdk/.run_test_name 00:30:39.946 20:44:32 reactor_set_interrupt -- pm/common@67 -- # PM_OUTPUTDIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power 00:30:39.946 20:44:32 reactor_set_interrupt -- pm/common@68 -- # uname -s 00:30:39.946 20:44:32 reactor_set_interrupt -- pm/common@68 -- # PM_OS=Linux 00:30:39.946 20:44:32 reactor_set_interrupt -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:30:39.946 20:44:32 reactor_set_interrupt -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:30:39.946 20:44:32 
reactor_set_interrupt -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:30:39.946 20:44:32 reactor_set_interrupt -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:30:39.946 20:44:32 reactor_set_interrupt -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:30:39.946 20:44:32 reactor_set_interrupt -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:30:39.946 20:44:32 reactor_set_interrupt -- pm/common@76 -- # SUDO[0]= 00:30:39.946 20:44:32 reactor_set_interrupt -- pm/common@76 -- # SUDO[1]='sudo -E' 00:30:39.946 20:44:32 reactor_set_interrupt -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:30:39.946 20:44:32 reactor_set_interrupt -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:30:39.946 20:44:32 reactor_set_interrupt -- pm/common@81 -- # [[ Linux == Linux ]] 00:30:39.946 20:44:32 reactor_set_interrupt -- pm/common@81 -- # [[ ............................... != QEMU ]] 00:30:39.946 20:44:32 reactor_set_interrupt -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:30:39.946 20:44:32 reactor_set_interrupt -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:30:39.946 20:44:32 reactor_set_interrupt -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:30:39.946 20:44:32 reactor_set_interrupt -- pm/common@88 -- # [[ ! 
-d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power ]] 00:30:39.946 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@58 -- # : 0 00:30:39.946 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@59 -- # export RUN_NIGHTLY 00:30:39.946 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@62 -- # : 0 00:30:39.946 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@63 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:30:39.946 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@64 -- # : 0 00:30:39.946 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@65 -- # export SPDK_RUN_VALGRIND 00:30:39.946 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@66 -- # : 1 00:30:39.946 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@67 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:30:39.946 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@68 -- # : 0 00:30:39.946 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@69 -- # export SPDK_TEST_UNITTEST 00:30:39.946 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@70 -- # : 00:30:39.946 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@71 -- # export SPDK_TEST_AUTOBUILD 00:30:39.946 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@72 -- # : 0 00:30:39.946 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@73 -- # export SPDK_TEST_RELEASE_BUILD 00:30:39.946 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@74 -- # : 1 00:30:39.946 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@75 -- # export SPDK_TEST_ISAL 00:30:39.946 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@76 -- # : 0 00:30:39.946 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@77 -- # export SPDK_TEST_ISCSI 00:30:39.946 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@78 -- # : 0 00:30:39.946 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@79 -- # export SPDK_TEST_ISCSI_INITIATOR 
00:30:39.946 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@80 -- # : 0 00:30:39.946 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME 00:30:39.946 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@82 -- # : 0 00:30:39.946 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_PMR 00:30:39.946 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@84 -- # : 0 00:30:39.946 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_BP 00:30:39.946 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@86 -- # : 0 00:30:39.946 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVME_CLI 00:30:39.946 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@88 -- # : 0 00:30:39.946 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@89 -- # export SPDK_TEST_NVME_CUSE 00:30:39.946 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@90 -- # : 0 00:30:39.946 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@91 -- # export SPDK_TEST_NVME_FDP 00:30:39.946 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@92 -- # : 0 00:30:39.946 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@93 -- # export SPDK_TEST_NVMF 00:30:39.946 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@94 -- # : 0 00:30:39.946 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@95 -- # export SPDK_TEST_VFIOUSER 00:30:39.946 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@96 -- # : 0 00:30:39.946 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@97 -- # export SPDK_TEST_VFIOUSER_QEMU 00:30:39.946 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@98 -- # : 0 00:30:39.946 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@99 -- # export SPDK_TEST_FUZZER 00:30:39.946 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@100 -- # : 
0 00:30:39.946 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@101 -- # export SPDK_TEST_FUZZER_SHORT 00:30:39.946 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@102 -- # : rdma 00:30:39.946 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@103 -- # export SPDK_TEST_NVMF_TRANSPORT 00:30:39.946 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@104 -- # : 0 00:30:39.946 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@105 -- # export SPDK_TEST_RBD 00:30:39.946 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@106 -- # : 0 00:30:39.946 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@107 -- # export SPDK_TEST_VHOST 00:30:39.946 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@108 -- # : 1 00:30:39.946 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@109 -- # export SPDK_TEST_BLOCKDEV 00:30:39.946 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@110 -- # : 0 00:30:39.946 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@111 -- # export SPDK_TEST_IOAT 00:30:39.946 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@112 -- # : 0 00:30:39.946 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@113 -- # export SPDK_TEST_BLOBFS 00:30:39.946 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@114 -- # : 0 00:30:39.946 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@115 -- # export SPDK_TEST_VHOST_INIT 00:30:39.946 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@116 -- # : 0 00:30:39.946 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@117 -- # export SPDK_TEST_LVOL 00:30:39.946 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@118 -- # : 1 00:30:39.946 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@119 -- # export SPDK_TEST_VBDEV_COMPRESS 00:30:39.946 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@120 -- # : 0 00:30:39.946 20:44:32 reactor_set_interrupt -- 
common/autotest_common.sh@121 -- # export SPDK_RUN_ASAN 00:30:39.946 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@122 -- # : 1 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@123 -- # export SPDK_RUN_UBSAN 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@124 -- # : 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@125 -- # export SPDK_RUN_EXTERNAL_DPDK 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@126 -- # : 0 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@127 -- # export SPDK_RUN_NON_ROOT 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@128 -- # : 1 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@129 -- # export SPDK_TEST_CRYPTO 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@130 -- # : 0 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@131 -- # export SPDK_TEST_FTL 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@132 -- # : 0 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@133 -- # export SPDK_TEST_OCF 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@134 -- # : 0 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@135 -- # export SPDK_TEST_VMD 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@136 -- # : 0 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@137 -- # export SPDK_TEST_OPAL 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@138 -- # : 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@139 -- # export SPDK_TEST_NATIVE_DPDK 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@140 -- # : true 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@141 -- # export SPDK_AUTOTEST_X 00:30:39.947 20:44:32 
reactor_set_interrupt -- common/autotest_common.sh@142 -- # : 0 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@143 -- # export SPDK_TEST_RAID5 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@144 -- # : 0 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@145 -- # export SPDK_TEST_URING 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@146 -- # : 0 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@147 -- # export SPDK_TEST_USDT 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@148 -- # : 0 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@149 -- # export SPDK_TEST_USE_IGB_UIO 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@150 -- # : 0 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@151 -- # export SPDK_TEST_SCHEDULER 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@152 -- # : 0 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@153 -- # export SPDK_TEST_SCANBUILD 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@154 -- # : 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@155 -- # export SPDK_TEST_NVMF_NICS 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@156 -- # : 0 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@157 -- # export SPDK_TEST_SMA 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@158 -- # : 0 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@159 -- # export SPDK_TEST_DAOS 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@160 -- # : 0 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@161 -- # export SPDK_TEST_XNVME 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@162 -- # : 0 00:30:39.947 
20:44:32 reactor_set_interrupt -- common/autotest_common.sh@163 -- # export SPDK_TEST_ACCEL_DSA 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@164 -- # : 0 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@165 -- # export SPDK_TEST_ACCEL_IAA 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@167 -- # : 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@168 -- # export SPDK_TEST_FUZZER_TARGET 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@169 -- # : 0 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@170 -- # export SPDK_TEST_NVMF_MDNS 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@171 -- # : 0 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@172 -- # export SPDK_JSONRPC_GO_CLIENT 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@175 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@175 -- # SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@176 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@176 -- # DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@177 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@177 -- # VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@178 -- # export 
LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@178 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@181 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@181 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@185 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:30:39.947 20:44:32 reactor_set_interrupt -- 
common/autotest_common.sh@185 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@189 -- # export PYTHONDONTWRITEBYTECODE=1 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@189 -- # PYTHONDONTWRITEBYTECODE=1 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@193 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@193 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@194 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@194 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@198 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@199 -- # rm -rf /var/tmp/asan_suppression_file 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@200 -- # cat 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@236 -- # echo leak:libfuse3.so 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@238 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@238 -- # 
LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@240 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@240 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@242 -- # '[' -z /var/spdk/dependencies ']' 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@245 -- # export DEPENDENCY_DIR 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@249 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@249 -- # SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@250 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@250 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@253 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@253 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@254 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@254 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@256 -- # export AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:30:39.947 20:44:32 reactor_set_interrupt -- 
common/autotest_common.sh@256 -- # AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@259 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@259 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@262 -- # '[' 0 -eq 0 ']' 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@263 -- # export valgrind= 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@263 -- # valgrind= 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@269 -- # uname -s 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@269 -- # '[' Linux = Linux ']' 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@270 -- # HUGEMEM=4096 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@271 -- # export CLEAR_HUGE=yes 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@271 -- # CLEAR_HUGE=yes 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@272 -- # [[ 1 -eq 1 ]] 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@276 -- # export HUGE_EVEN_ALLOC=yes 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@276 -- # HUGE_EVEN_ALLOC=yes 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@279 -- # MAKE=make 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@280 -- # MAKEFLAGS=-j72 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@296 -- # export HUGEMEM=4096 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@296 -- # HUGEMEM=4096 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@298 -- # NO_HUGE=() 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@299 -- # 
TEST_MODE= 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@318 -- # [[ -z 1520006 ]] 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@318 -- # kill -0 1520006 00:30:39.947 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@1680 -- # set_test_storage 2147483648 00:30:39.948 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@328 -- # [[ -v testdir ]] 00:30:39.948 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@330 -- # local requested_size=2147483648 00:30:39.948 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@331 -- # local mount target_dir 00:30:39.948 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@333 -- # local -A mounts fss sizes avails uses 00:30:39.948 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@334 -- # local source fs size avail mount use 00:30:39.948 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@336 -- # local storage_fallback storage_candidates 00:30:39.948 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@338 -- # mktemp -udt spdk.XXXXXX 00:30:39.948 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@338 -- # storage_fallback=/tmp/spdk.yuYZdK 00:30:39.948 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@343 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:30:39.948 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@345 -- # [[ -n '' ]] 00:30:39.948 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@350 -- # [[ -n '' ]] 00:30:39.948 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@355 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt /tmp/spdk.yuYZdK/tests/interrupt /tmp/spdk.yuYZdK 00:30:39.948 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@358 -- # requested_size=2214592512 00:30:39.948 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs 
size use avail _ mount 00:30:39.948 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@327 -- # df -T 00:30:39.948 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@327 -- # grep -v Filesystem 00:30:39.948 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_devtmpfs 00:30:39.948 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=devtmpfs 00:30:39.948 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=67108864 00:30:39.948 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=67108864 00:30:39.948 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=0 00:30:39.948 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:30:39.948 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=/dev/pmem0 00:30:39.948 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=ext2 00:30:39.948 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=946290688 00:30:39.948 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=5284429824 00:30:39.948 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=4338139136 00:30:39.948 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:30:39.948 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_root 00:30:39.948 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=overlay 00:30:39.948 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=88641466368 00:30:39.948 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=94508515328 00:30:39.948 20:44:32 
reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=5867048960 00:30:39.948 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:30:39.948 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:30:39.948 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:30:39.948 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=47249547264 00:30:39.948 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=47254257664 00:30:39.948 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=4710400 00:30:39.948 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:30:39.948 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:30:39.948 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:30:39.948 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=18892201984 00:30:39.948 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=18901704704 00:30:39.948 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=9502720 00:30:39.948 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:30:39.948 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:30:39.948 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:30:39.948 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=47253377024 00:30:39.948 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=47254257664 00:30:39.948 20:44:32 reactor_set_interrupt -- 
common/autotest_common.sh@363 -- # uses["$mount"]=880640 00:30:39.948 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:30:39.948 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:30:39.948 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:30:39.948 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=9450844160 00:30:39.948 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=9450848256 00:30:39.948 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=4096 00:30:39.948 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:30:39.948 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@366 -- # printf '* Looking for test storage...\n' 00:30:39.948 * Looking for test storage... 00:30:39.948 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@368 -- # local target_space new_size 00:30:39.948 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@369 -- # for target_dir in "${storage_candidates[@]}" 00:30:39.948 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@372 -- # df /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:39.948 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@372 -- # awk '$1 !~ /Filesystem/{print $6}' 00:30:39.948 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@372 -- # mount=/ 00:30:39.948 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@374 -- # target_space=88641466368 00:30:39.948 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@375 -- # (( target_space == 0 || target_space < requested_size )) 00:30:39.948 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@378 -- # (( target_space >= requested_size )) 00:30:39.948 20:44:32 reactor_set_interrupt 
-- common/autotest_common.sh@380 -- # [[ overlay == tmpfs ]] 00:30:39.948 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@380 -- # [[ overlay == ramfs ]] 00:30:39.948 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@380 -- # [[ / == / ]] 00:30:39.948 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@381 -- # new_size=8081641472 00:30:39.948 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@382 -- # (( new_size * 100 / sizes[/] > 95 )) 00:30:39.948 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@387 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:39.948 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@387 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:39.948 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@388 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:39.948 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:39.948 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@389 -- # return 0 00:30:39.948 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@1682 -- # set -o errtrace 00:30:39.948 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@1683 -- # shopt -s extdebug 00:30:39.948 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@1684 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:30:39.948 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@1686 -- # PS4=' \t ${test_domain:-} -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:30:39.948 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@1687 -- # true 00:30:39.948 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@1689 -- # xtrace_fd 00:30:39.948 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@25 -- # [[ -n 13 ]] 00:30:39.948 20:44:32 reactor_set_interrupt 
-- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/13 ]] 00:30:39.948 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@27 -- # exec 00:30:39.948 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@29 -- # exec 00:30:39.948 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@31 -- # xtrace_restore 00:30:39.948 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 0 : 0 - 1]' 00:30:39.948 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:30:39.948 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@18 -- # set -x 00:30:39.948 20:44:32 reactor_set_interrupt -- interrupt/interrupt_common.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/common.sh 00:30:39.948 20:44:32 reactor_set_interrupt -- interrupt/interrupt_common.sh@10 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:30:39.948 20:44:32 reactor_set_interrupt -- interrupt/interrupt_common.sh@12 -- # r0_mask=0x1 00:30:39.948 20:44:32 reactor_set_interrupt -- interrupt/interrupt_common.sh@13 -- # r1_mask=0x2 00:30:39.948 20:44:32 reactor_set_interrupt -- interrupt/interrupt_common.sh@14 -- # r2_mask=0x4 00:30:39.948 20:44:32 reactor_set_interrupt -- interrupt/interrupt_common.sh@16 -- # cpu_server_mask=0x07 00:30:39.948 20:44:32 reactor_set_interrupt -- interrupt/interrupt_common.sh@17 -- # rpc_server_addr=/var/tmp/spdk.sock 00:30:39.949 20:44:32 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@11 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:30:39.949 20:44:32 reactor_set_interrupt -- 
interrupt/reactor_set_interrupt.sh@11 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:30:39.949 20:44:32 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@86 -- # start_intr_tgt 00:30:39.949 20:44:32 reactor_set_interrupt -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:39.949 20:44:32 reactor_set_interrupt -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:30:39.949 20:44:32 reactor_set_interrupt -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=1520049 00:30:39.949 20:44:32 reactor_set_interrupt -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:30:39.949 20:44:32 reactor_set_interrupt -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:30:39.949 20:44:32 reactor_set_interrupt -- interrupt/interrupt_common.sh@26 -- # waitforlisten 1520049 /var/tmp/spdk.sock 00:30:39.949 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@829 -- # '[' -z 1520049 ']' 00:30:39.949 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:39.949 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@834 -- # local max_retries=100 00:30:39.949 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:30:39.949 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:30:39.949 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@838 -- # xtrace_disable 00:30:39.949 20:44:32 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:30:39.949 [2024-07-15 20:44:32.250263] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 00:30:39.949 [2024-07-15 20:44:32.250332] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1520049 ] 00:30:40.208 [2024-07-15 20:44:32.379641] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:30:40.208 [2024-07-15 20:44:32.485707] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:30:40.208 [2024-07-15 20:44:32.485797] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:30:40.208 [2024-07-15 20:44:32.485798] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:40.208 [2024-07-15 20:44:32.557572] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 
00:30:41.144 20:44:33 reactor_set_interrupt -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:30:41.144 20:44:33 reactor_set_interrupt -- common/autotest_common.sh@862 -- # return 0 00:30:41.144 20:44:33 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@87 -- # setup_bdev_mem 00:30:41.144 20:44:33 reactor_set_interrupt -- interrupt/common.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:30:41.144 Malloc0 00:30:41.144 Malloc1 00:30:41.144 Malloc2 00:30:41.144 20:44:33 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@88 -- # setup_bdev_aio 00:30:41.144 20:44:33 reactor_set_interrupt -- interrupt/common.sh@75 -- # uname -s 00:30:41.144 20:44:33 reactor_set_interrupt -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:30:41.144 20:44:33 reactor_set_interrupt -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:30:41.403 5000+0 records in 00:30:41.403 5000+0 records out 00:30:41.403 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0243612 s, 420 MB/s 00:30:41.403 20:44:33 reactor_set_interrupt -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:30:41.662 AIO0 00:30:41.662 20:44:33 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@90 -- # reactor_set_mode_without_threads 1520049 00:30:41.662 20:44:33 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@76 -- # reactor_set_intr_mode 1520049 without_thd 00:30:41.662 20:44:33 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@14 -- # local spdk_pid=1520049 00:30:41.662 20:44:33 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@15 -- # local without_thd=without_thd 00:30:41.662 20:44:33 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # thd0_ids=($(reactor_get_thread_ids $r0_mask)) 
00:30:41.662 20:44:33 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # reactor_get_thread_ids 0x1 00:30:41.662 20:44:33 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x1 00:30:41.662 20:44:33 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:30:41.662 20:44:33 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=1 00:30:41.662 20:44:33 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:30:41.662 20:44:33 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:30:41.662 20:44:33 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 1 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:30:41.920 20:44:34 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo 1 00:30:41.920 20:44:34 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # thd2_ids=($(reactor_get_thread_ids $r2_mask)) 00:30:41.920 20:44:34 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # reactor_get_thread_ids 0x4 00:30:41.920 20:44:34 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x4 00:30:41.920 20:44:34 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:30:41.920 20:44:34 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=4 00:30:41.920 20:44:34 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:30:41.920 20:44:34 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:30:41.920 20:44:34 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 4 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:30:42.487 20:44:34 reactor_set_interrupt -- interrupt/common.sh@62 -- # 
echo '' 00:30:42.487 20:44:34 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@21 -- # [[ 1 -eq 0 ]] 00:30:42.487 20:44:34 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@25 -- # echo 'spdk_thread ids are 1 on reactor0.' 00:30:42.487 spdk_thread ids are 1 on reactor0. 00:30:42.487 20:44:34 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:30:42.487 20:44:34 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 1520049 0 00:30:42.487 20:44:34 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1520049 0 idle 00:30:42.487 20:44:34 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1520049 00:30:42.487 20:44:34 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:30:42.487 20:44:34 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:30:42.487 20:44:34 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:30:42.487 20:44:34 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:30:42.487 20:44:34 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:42.487 20:44:34 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:30:42.487 20:44:34 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:42.487 20:44:34 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1520049 -w 256 00:30:42.487 20:44:34 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:30:42.487 20:44:34 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1520049 root 20 0 128.2g 35712 23040 S 0.0 0.0 0:00.40 reactor_0' 00:30:42.487 20:44:34 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1520049 root 20 0 128.2g 35712 23040 S 0.0 0.0 0:00.40 reactor_0 00:30:42.487 20:44:34 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:42.487 20:44:34 reactor_set_interrupt -- 
interrupt/common.sh@25 -- # awk '{print $9}' 00:30:42.487 20:44:34 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:30:42.487 20:44:34 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:30:42.487 20:44:34 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:30:42.487 20:44:34 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:30:42.488 20:44:34 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:30:42.488 20:44:34 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:42.488 20:44:34 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:30:42.488 20:44:34 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 1520049 1 00:30:42.488 20:44:34 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1520049 1 idle 00:30:42.488 20:44:34 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1520049 00:30:42.488 20:44:34 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=1 00:30:42.488 20:44:34 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:30:42.488 20:44:34 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:30:42.488 20:44:34 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:30:42.488 20:44:34 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:42.488 20:44:34 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:30:42.488 20:44:34 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:42.488 20:44:34 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_1 00:30:42.488 20:44:34 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1520049 -w 256 00:30:42.745 20:44:34 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1520083 root 20 0 128.2g 35712 23040 S 0.0 0.0 0:00.00 reactor_1' 
00:30:42.745 20:44:34 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1520083 root 20 0 128.2g 35712 23040 S 0.0 0.0 0:00.00 reactor_1 00:30:42.745 20:44:34 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:42.745 20:44:34 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:30:42.745 20:44:34 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:30:42.745 20:44:34 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:30:42.745 20:44:34 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:30:42.745 20:44:34 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:30:42.745 20:44:34 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:30:42.745 20:44:34 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:42.745 20:44:34 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:30:42.745 20:44:34 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 1520049 2 00:30:42.745 20:44:34 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1520049 2 idle 00:30:42.745 20:44:34 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1520049 00:30:42.745 20:44:34 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:30:42.745 20:44:34 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:30:42.745 20:44:34 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:30:42.745 20:44:34 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:30:42.745 20:44:34 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:42.745 20:44:34 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:30:42.745 20:44:34 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:42.745 20:44:35 reactor_set_interrupt -- 
interrupt/common.sh@24 -- # top -bHn 1 -p 1520049 -w 256 00:30:42.745 20:44:35 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:30:43.004 20:44:35 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1520084 root 20 0 128.2g 35712 23040 S 0.0 0.0 0:00.00 reactor_2' 00:30:43.004 20:44:35 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1520084 root 20 0 128.2g 35712 23040 S 0.0 0.0 0:00.00 reactor_2 00:30:43.004 20:44:35 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:43.004 20:44:35 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:30:43.004 20:44:35 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:30:43.004 20:44:35 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:30:43.004 20:44:35 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:30:43.004 20:44:35 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:30:43.004 20:44:35 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:30:43.004 20:44:35 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:43.004 20:44:35 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@33 -- # '[' without_thdx '!=' x ']' 00:30:43.004 20:44:35 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@35 -- # for i in "${thd0_ids[@]}" 00:30:43.004 20:44:35 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@36 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_set_cpumask -i 1 -m 0x2 00:30:43.004 [2024-07-15 20:44:35.330872] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 
00:30:43.004 20:44:35 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@43 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 -d 00:30:43.262 [2024-07-15 20:44:35.594575] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 0. 00:30:43.262 [2024-07-15 20:44:35.595049] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:30:43.262 20:44:35 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 -d 00:30:43.520 [2024-07-15 20:44:35.862474] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 2. 00:30:43.520 [2024-07-15 20:44:35.862661] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:30:43.520 20:44:35 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:30:43.520 20:44:35 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 1520049 0 00:30:43.520 20:44:35 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 1520049 0 busy 00:30:43.520 20:44:35 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1520049 00:30:43.520 20:44:35 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:30:43.778 20:44:35 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:30:43.778 20:44:35 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:30:43.778 20:44:35 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:43.778 20:44:35 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:30:43.778 20:44:35 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:43.778 20:44:35 reactor_set_interrupt 
-- interrupt/common.sh@24 -- # top -bHn 1 -p 1520049 -w 256 00:30:43.778 20:44:35 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:30:43.778 20:44:36 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1520049 root 20 0 128.2g 35712 23040 R 99.9 0.0 0:00.88 reactor_0' 00:30:43.778 20:44:36 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1520049 root 20 0 128.2g 35712 23040 R 99.9 0.0 0:00.88 reactor_0 00:30:43.778 20:44:36 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:43.778 20:44:36 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:30:43.778 20:44:36 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:30:43.778 20:44:36 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:30:43.778 20:44:36 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:30:43.778 20:44:36 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:30:43.778 20:44:36 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:30:43.778 20:44:36 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:43.778 20:44:36 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:30:43.778 20:44:36 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 1520049 2 00:30:43.778 20:44:36 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 1520049 2 busy 00:30:43.778 20:44:36 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1520049 00:30:43.778 20:44:36 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:30:43.778 20:44:36 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:30:43.778 20:44:36 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:30:43.778 20:44:36 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:43.778 20:44:36 
reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:30:43.778 20:44:36 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:43.778 20:44:36 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1520049 -w 256 00:30:43.778 20:44:36 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:30:44.036 20:44:36 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1520084 root 20 0 128.2g 35712 23040 R 99.9 0.0 0:00.37 reactor_2' 00:30:44.036 20:44:36 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1520084 root 20 0 128.2g 35712 23040 R 99.9 0.0 0:00.37 reactor_2 00:30:44.036 20:44:36 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:44.036 20:44:36 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:30:44.036 20:44:36 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:30:44.036 20:44:36 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:30:44.036 20:44:36 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:30:44.036 20:44:36 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:30:44.036 20:44:36 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:30:44.036 20:44:36 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:44.036 20:44:36 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 00:30:44.294 [2024-07-15 20:44:36.422464] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 2. 
00:30:44.294 [2024-07-15 20:44:36.422605] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:30:44.294 20:44:36 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@52 -- # '[' without_thdx '!=' x ']' 00:30:44.294 20:44:36 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@59 -- # reactor_is_idle 1520049 2 00:30:44.294 20:44:36 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1520049 2 idle 00:30:44.294 20:44:36 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1520049 00:30:44.294 20:44:36 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:30:44.294 20:44:36 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:30:44.294 20:44:36 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:30:44.294 20:44:36 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:30:44.294 20:44:36 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:44.294 20:44:36 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:30:44.294 20:44:36 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:44.294 20:44:36 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1520049 -w 256 00:30:44.294 20:44:36 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:30:44.294 20:44:36 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1520084 root 20 0 128.2g 35712 23040 S 0.0 0.0 0:00.55 reactor_2' 00:30:44.294 20:44:36 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1520084 root 20 0 128.2g 35712 23040 S 0.0 0.0 0:00.55 reactor_2 00:30:44.294 20:44:36 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:30:44.294 20:44:36 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:44.294 20:44:36 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:30:44.294 20:44:36 
reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:30:44.294 20:44:36 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:30:44.294 20:44:36 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:30:44.294 20:44:36 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:30:44.294 20:44:36 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:44.294 20:44:36 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 00:30:44.553 [2024-07-15 20:44:36.806460] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 0. 00:30:44.553 [2024-07-15 20:44:36.806640] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:30:44.553 20:44:36 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@63 -- # '[' without_thdx '!=' x ']' 00:30:44.553 20:44:36 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@65 -- # for i in "${thd0_ids[@]}" 00:30:44.553 20:44:36 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@66 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_set_cpumask -i 1 -m 0x1 00:30:44.811 [2024-07-15 20:44:37.006722] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 
00:30:44.811 20:44:37 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@70 -- # reactor_is_idle 1520049 0 00:30:44.811 20:44:37 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1520049 0 idle 00:30:44.811 20:44:37 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1520049 00:30:44.811 20:44:37 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:30:44.811 20:44:37 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:30:44.811 20:44:37 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:30:44.811 20:44:37 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:30:44.811 20:44:37 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:44.811 20:44:37 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:30:44.811 20:44:37 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:44.811 20:44:37 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1520049 -w 256 00:30:44.811 20:44:37 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:30:45.070 20:44:37 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1520049 root 20 0 128.2g 35712 23040 S 0.0 0.0 0:01.62 reactor_0' 00:30:45.070 20:44:37 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:45.070 20:44:37 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1520049 root 20 0 128.2g 35712 23040 S 0.0 0.0 0:01.62 reactor_0 00:30:45.070 20:44:37 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:30:45.070 20:44:37 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:30:45.070 20:44:37 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:30:45.070 20:44:37 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:30:45.070 20:44:37 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = 
\i\d\l\e ]] 00:30:45.070 20:44:37 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:30:45.070 20:44:37 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:45.070 20:44:37 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@72 -- # return 0 00:30:45.070 20:44:37 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@77 -- # return 0 00:30:45.070 20:44:37 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@92 -- # trap - SIGINT SIGTERM EXIT 00:30:45.070 20:44:37 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@93 -- # killprocess 1520049 00:30:45.070 20:44:37 reactor_set_interrupt -- common/autotest_common.sh@948 -- # '[' -z 1520049 ']' 00:30:45.070 20:44:37 reactor_set_interrupt -- common/autotest_common.sh@952 -- # kill -0 1520049 00:30:45.070 20:44:37 reactor_set_interrupt -- common/autotest_common.sh@953 -- # uname 00:30:45.070 20:44:37 reactor_set_interrupt -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:30:45.070 20:44:37 reactor_set_interrupt -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1520049 00:30:45.070 20:44:37 reactor_set_interrupt -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:30:45.070 20:44:37 reactor_set_interrupt -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:30:45.070 20:44:37 reactor_set_interrupt -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1520049' 00:30:45.070 killing process with pid 1520049 00:30:45.070 20:44:37 reactor_set_interrupt -- common/autotest_common.sh@967 -- # kill 1520049 00:30:45.070 20:44:37 reactor_set_interrupt -- common/autotest_common.sh@972 -- # wait 1520049 00:30:45.328 20:44:37 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@94 -- # cleanup 00:30:45.328 20:44:37 reactor_set_interrupt -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:30:45.328 20:44:37 reactor_set_interrupt -- 
interrupt/reactor_set_interrupt.sh@97 -- # start_intr_tgt 00:30:45.328 20:44:37 reactor_set_interrupt -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:45.328 20:44:37 reactor_set_interrupt -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:30:45.328 20:44:37 reactor_set_interrupt -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=1520813 00:30:45.328 20:44:37 reactor_set_interrupt -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:30:45.328 20:44:37 reactor_set_interrupt -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:30:45.328 20:44:37 reactor_set_interrupt -- interrupt/interrupt_common.sh@26 -- # waitforlisten 1520813 /var/tmp/spdk.sock 00:30:45.328 20:44:37 reactor_set_interrupt -- common/autotest_common.sh@829 -- # '[' -z 1520813 ']' 00:30:45.328 20:44:37 reactor_set_interrupt -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:45.328 20:44:37 reactor_set_interrupt -- common/autotest_common.sh@834 -- # local max_retries=100 00:30:45.328 20:44:37 reactor_set_interrupt -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:30:45.328 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:30:45.328 20:44:37 reactor_set_interrupt -- common/autotest_common.sh@838 -- # xtrace_disable 00:30:45.328 20:44:37 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:30:45.328 [2024-07-15 20:44:37.566848] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
00:30:45.328 [2024-07-15 20:44:37.566919] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1520813 ] 00:30:45.328 [2024-07-15 20:44:37.706558] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:30:45.587 [2024-07-15 20:44:37.842358] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:30:45.587 [2024-07-15 20:44:37.842462] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:30:45.587 [2024-07-15 20:44:37.842466] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:45.587 [2024-07-15 20:44:37.916360] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 00:30:45.587 20:44:37 reactor_set_interrupt -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:30:45.587 20:44:37 reactor_set_interrupt -- common/autotest_common.sh@862 -- # return 0 00:30:45.587 20:44:37 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@98 -- # setup_bdev_mem 00:30:45.587 20:44:37 reactor_set_interrupt -- interrupt/common.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:30:45.846 Malloc0 00:30:45.846 Malloc1 00:30:45.846 Malloc2 00:30:45.846 20:44:38 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@99 -- # setup_bdev_aio 00:30:45.846 20:44:38 reactor_set_interrupt -- interrupt/common.sh@75 -- # uname -s 00:30:45.846 20:44:38 reactor_set_interrupt -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:30:45.846 20:44:38 reactor_set_interrupt -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:30:45.846 5000+0 records in 00:30:45.846 5000+0 records out 00:30:45.846 10240000 bytes (10 MB, 9.8 MiB) copied, 0.027835 s, 368 MB/s 
00:30:45.846 20:44:38 reactor_set_interrupt -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:30:46.104 AIO0 00:30:46.104 20:44:38 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@101 -- # reactor_set_mode_with_threads 1520813 00:30:46.104 20:44:38 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@81 -- # reactor_set_intr_mode 1520813 00:30:46.104 20:44:38 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@14 -- # local spdk_pid=1520813 00:30:46.104 20:44:38 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@15 -- # local without_thd= 00:30:46.104 20:44:38 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # thd0_ids=($(reactor_get_thread_ids $r0_mask)) 00:30:46.104 20:44:38 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # reactor_get_thread_ids 0x1 00:30:46.104 20:44:38 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x1 00:30:46.104 20:44:38 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:30:46.104 20:44:38 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=1 00:30:46.104 20:44:38 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:30:46.363 20:44:38 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:30:46.363 20:44:38 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 1 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:30:46.363 20:44:38 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo 1 00:30:46.363 20:44:38 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # thd2_ids=($(reactor_get_thread_ids $r2_mask)) 00:30:46.621 20:44:38 reactor_set_interrupt -- 
interrupt/reactor_set_interrupt.sh@18 -- # reactor_get_thread_ids 0x4 00:30:46.621 20:44:38 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x4 00:30:46.621 20:44:38 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:30:46.621 20:44:38 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=4 00:30:46.621 20:44:38 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:30:46.621 20:44:38 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:30:46.621 20:44:38 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 4 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:30:46.621 20:44:38 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo '' 00:30:46.621 20:44:38 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@21 -- # [[ 1 -eq 0 ]] 00:30:46.621 20:44:38 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@25 -- # echo 'spdk_thread ids are 1 on reactor0.' 00:30:46.621 spdk_thread ids are 1 on reactor0. 
00:30:46.621 20:44:38 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:30:46.621 20:44:38 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 1520813 0 00:30:46.621 20:44:38 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1520813 0 idle 00:30:46.882 20:44:38 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1520813 00:30:46.882 20:44:38 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:30:46.882 20:44:38 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:30:46.882 20:44:38 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:30:46.882 20:44:38 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:30:46.882 20:44:38 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:46.882 20:44:39 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:30:46.882 20:44:39 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:46.882 20:44:39 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:30:46.882 20:44:39 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1520813 -w 256 00:30:46.882 20:44:39 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1520813 root 20 0 128.2g 36288 23040 S 0.0 0.0 0:00.44 reactor_0' 00:30:46.882 20:44:39 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1520813 root 20 0 128.2g 36288 23040 S 0.0 0.0 0:00.44 reactor_0 00:30:46.882 20:44:39 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:46.882 20:44:39 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:30:46.882 20:44:39 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:30:46.882 20:44:39 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:30:46.882 20:44:39 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle 
= \b\u\s\y ]] 00:30:46.882 20:44:39 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:30:46.882 20:44:39 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:30:46.882 20:44:39 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:46.882 20:44:39 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:30:46.882 20:44:39 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 1520813 1 00:30:46.882 20:44:39 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1520813 1 idle 00:30:46.882 20:44:39 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1520813 00:30:46.882 20:44:39 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=1 00:30:46.882 20:44:39 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:30:46.882 20:44:39 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:30:46.882 20:44:39 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:30:46.882 20:44:39 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:46.882 20:44:39 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:30:46.882 20:44:39 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:46.882 20:44:39 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1520813 -w 256 00:30:46.882 20:44:39 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_1 00:30:47.161 20:44:39 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1520816 root 20 0 128.2g 36288 23040 S 0.0 0.0 0:00.00 reactor_1' 00:30:47.161 20:44:39 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1520816 root 20 0 128.2g 36288 23040 S 0.0 0.0 0:00.00 reactor_1 00:30:47.161 20:44:39 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:47.161 20:44:39 reactor_set_interrupt -- 
interrupt/common.sh@25 -- # awk '{print $9}' 00:30:47.161 20:44:39 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:30:47.161 20:44:39 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:30:47.161 20:44:39 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:30:47.161 20:44:39 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:30:47.161 20:44:39 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:30:47.161 20:44:39 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:47.161 20:44:39 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:30:47.161 20:44:39 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 1520813 2 00:30:47.161 20:44:39 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1520813 2 idle 00:30:47.161 20:44:39 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1520813 00:30:47.161 20:44:39 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:30:47.161 20:44:39 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:30:47.161 20:44:39 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:30:47.161 20:44:39 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:30:47.161 20:44:39 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:47.161 20:44:39 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:30:47.161 20:44:39 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:47.161 20:44:39 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1520813 -w 256 00:30:47.161 20:44:39 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:30:47.161 20:44:39 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1520817 root 20 0 128.2g 36288 23040 S 0.0 0.0 0:00.00 reactor_2' 
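The `top -bHn 1` / `grep` / `sed` / `awk` pipeline repeated above is how `interrupt/common.sh` samples one reactor thread's CPU utilization. A minimal sketch of just the parsing step, run against a hypothetical captured `top` row so it needs no live `top` process (the row contents mirror the log; the decimal-truncation step is an assumption about how the harness turns `0.0` into the integer `0`):

```shell
# Hypothetical captured row from: top -bHn 1 -p <pid> -w 256 | grep reactor_0
top_reactor='1520813 root      20   0  128.2g  36288  23040 S   0.0   0.0   0:00.44 reactor_0'

# Strip leading whitespace, then take column 9 (%CPU), as the log shows
cpu_rate=$(echo "$top_reactor" | sed -e 's/^\s*//g' | awk '{print $9}')

# Truncate the fractional part so the value works with shell integer tests
cpu_rate=${cpu_rate%.*}
echo "$cpu_rate"
```

With the sample row above this prints `0`, matching the `cpu_rate=0.0` then `cpu_rate=0` pair in the log.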
00:30:47.430 20:44:39 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1520817 root 20 0 128.2g 36288 23040 S 0.0 0.0 0:00.00 reactor_2 00:30:47.430 20:44:39 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:47.430 20:44:39 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:30:47.430 20:44:39 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:30:47.430 20:44:39 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:30:47.430 20:44:39 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:30:47.430 20:44:39 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:30:47.430 20:44:39 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:30:47.430 20:44:39 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:47.430 20:44:39 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@33 -- # '[' x '!=' x ']' 00:30:47.430 20:44:39 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@43 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 -d 00:30:47.430 [2024-07-15 20:44:39.767273] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 0. 00:30:47.430 [2024-07-15 20:44:39.767488] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to poll mode from intr mode. 00:30:47.430 [2024-07-15 20:44:39.767700] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:30:47.430 20:44:39 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 -d 00:30:47.688 [2024-07-15 20:44:40.015794] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 2. 
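The `reactor_set_interrupt_mode` RPCs above toggle individual reactors between interrupt and poll mode; per the NOTICE lines, `-d` disables interrupt mode (switching the reactor to poll mode) and omitting it switches back. A hedged sketch of the call sequence the test issues, with an echo-only stand-in for `scripts/rpc.py` so it runs without a live SPDK target (the `rpc` wrapper function is hypothetical; the real tool talks to a running `spdk_tgt`):

```shell
# Echo-only stand-in for scripts/rpc.py --plugin interrupt_plugin
rpc() { echo "rpc.py --plugin interrupt_plugin $*"; }

rpc reactor_set_interrupt_mode 0 -d   # reactor 0: interrupt -> poll mode
rpc reactor_set_interrupt_mode 2 -d   # reactor 2: interrupt -> poll mode
rpc reactor_set_interrupt_mode 2      # reactor 2: back to interrupt mode
rpc reactor_set_interrupt_mode 0      # reactor 0: back to interrupt mode
```

The test then expects the reactors to report as busy while in poll mode and idle again after re-enabling interrupt mode, which is exactly the pattern the surrounding `reactor_is_busy` / `reactor_is_idle` checks verify.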
00:30:47.688 [2024-07-15 20:44:40.015973] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:30:47.688 20:44:40 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:30:47.688 20:44:40 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 1520813 0 00:30:47.688 20:44:40 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 1520813 0 busy 00:30:47.688 20:44:40 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1520813 00:30:47.688 20:44:40 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:30:47.688 20:44:40 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:30:47.688 20:44:40 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:30:47.688 20:44:40 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:47.688 20:44:40 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:30:47.688 20:44:40 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:47.688 20:44:40 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1520813 -w 256 00:30:47.688 20:44:40 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:30:47.945 20:44:40 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1520813 root 20 0 128.2g 36288 23040 R 99.9 0.0 0:00.88 reactor_0' 00:30:47.945 20:44:40 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1520813 root 20 0 128.2g 36288 23040 R 99.9 0.0 0:00.88 reactor_0 00:30:47.945 20:44:40 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:30:47.945 20:44:40 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:47.945 20:44:40 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:30:47.945 20:44:40 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:30:47.945 20:44:40 reactor_set_interrupt -- 
interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:30:47.945 20:44:40 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:30:47.945 20:44:40 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:30:47.945 20:44:40 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:47.945 20:44:40 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:30:47.945 20:44:40 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 1520813 2 00:30:47.945 20:44:40 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 1520813 2 busy 00:30:47.945 20:44:40 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1520813 00:30:47.945 20:44:40 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:30:47.945 20:44:40 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:30:47.945 20:44:40 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:30:47.945 20:44:40 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:47.945 20:44:40 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:30:47.945 20:44:40 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:47.945 20:44:40 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1520813 -w 256 00:30:47.945 20:44:40 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:30:48.202 20:44:40 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1520817 root 20 0 128.2g 36288 23040 R 99.9 0.0 0:00.36 reactor_2' 00:30:48.202 20:44:40 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1520817 root 20 0 128.2g 36288 23040 R 99.9 0.0 0:00.36 reactor_2 00:30:48.202 20:44:40 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:48.202 20:44:40 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:30:48.202 20:44:40 
reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:30:48.202 20:44:40 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:30:48.202 20:44:40 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:30:48.202 20:44:40 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:30:48.202 20:44:40 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:30:48.202 20:44:40 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:48.202 20:44:40 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 00:30:48.460 [2024-07-15 20:44:40.629576] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 2. 00:30:48.460 [2024-07-15 20:44:40.629714] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:30:48.460 20:44:40 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@52 -- # '[' x '!=' x ']' 00:30:48.460 20:44:40 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@59 -- # reactor_is_idle 1520813 2 00:30:48.460 20:44:40 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1520813 2 idle 00:30:48.460 20:44:40 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1520813 00:30:48.460 20:44:40 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:30:48.460 20:44:40 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:30:48.460 20:44:40 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:30:48.460 20:44:40 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:30:48.460 20:44:40 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:48.460 20:44:40 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 
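The busy/idle verdicts in this log come down to two thresholds visible in the `[[ 99 -lt 70 ]]` and `[[ 0 -gt 30 ]]` tests: a "busy" reactor must report at least 70% CPU, and an "idle" one no more than 30%. A minimal restatement of that decision, assuming those are the only two failure conditions in `reactor_is_busy_or_idle`:

```shell
# Return 0 when the observed cpu_rate is consistent with the expected state
check_state() {
  local state=$1 cpu_rate=$2
  if [ "$state" = busy ] && [ "$cpu_rate" -lt 70 ]; then
    return 1   # expected busy, but CPU is below the 70% busy floor
  elif [ "$state" = idle ] && [ "$cpu_rate" -gt 30 ]; then
    return 1   # expected idle, but CPU is above the 30% idle ceiling
  fi
  return 0
}

check_state busy 99 && echo "busy ok"   # matches the 99.9% poll-mode samples
check_state idle 0  && echo "idle ok"   # matches the 0.0% interrupt-mode samples
```

The band between 30% and 70% is deliberately dead space: a reactor sampled there satisfies neither check failing, so transient mid-range readings do not flip the verdict.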
00:30:48.460 20:44:40 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:48.460 20:44:40 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1520813 -w 256 00:30:48.460 20:44:40 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:30:48.460 20:44:40 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1520817 root 20 0 128.2g 36288 23040 S 0.0 0.0 0:00.61 reactor_2' 00:30:48.460 20:44:40 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1520817 root 20 0 128.2g 36288 23040 S 0.0 0.0 0:00.61 reactor_2 00:30:48.460 20:44:40 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:30:48.460 20:44:40 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:48.460 20:44:40 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:30:48.460 20:44:40 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:30:48.460 20:44:40 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:30:48.460 20:44:40 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:30:48.460 20:44:40 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:30:48.461 20:44:40 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:48.461 20:44:40 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 00:30:48.719 [2024-07-15 20:44:41.062676] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 0. 00:30:48.719 [2024-07-15 20:44:41.062893] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from poll mode. 
00:30:48.719 [2024-07-15 20:44:41.062919] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:30:48.719 20:44:41 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@63 -- # '[' x '!=' x ']' 00:30:48.719 20:44:41 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@70 -- # reactor_is_idle 1520813 0 00:30:48.719 20:44:41 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1520813 0 idle 00:30:48.719 20:44:41 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1520813 00:30:48.719 20:44:41 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:30:48.719 20:44:41 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:30:48.719 20:44:41 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:30:48.719 20:44:41 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:30:48.719 20:44:41 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:48.719 20:44:41 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:30:48.719 20:44:41 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:48.719 20:44:41 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1520813 -w 256 00:30:48.719 20:44:41 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:30:48.979 20:44:41 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1520813 root 20 0 128.2g 36288 23040 S 0.0 0.0 0:01.73 reactor_0' 00:30:48.979 20:44:41 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1520813 root 20 0 128.2g 36288 23040 S 0.0 0.0 0:01.73 reactor_0 00:30:48.979 20:44:41 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:48.979 20:44:41 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:30:48.979 20:44:41 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:30:48.979 20:44:41 
reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:30:48.979 20:44:41 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:30:48.979 20:44:41 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:30:48.979 20:44:41 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:30:48.979 20:44:41 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:48.979 20:44:41 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@72 -- # return 0 00:30:48.979 20:44:41 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@82 -- # return 0 00:30:48.979 20:44:41 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@103 -- # trap - SIGINT SIGTERM EXIT 00:30:48.979 20:44:41 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@104 -- # killprocess 1520813 00:30:48.979 20:44:41 reactor_set_interrupt -- common/autotest_common.sh@948 -- # '[' -z 1520813 ']' 00:30:48.979 20:44:41 reactor_set_interrupt -- common/autotest_common.sh@952 -- # kill -0 1520813 00:30:48.979 20:44:41 reactor_set_interrupt -- common/autotest_common.sh@953 -- # uname 00:30:48.979 20:44:41 reactor_set_interrupt -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:30:48.979 20:44:41 reactor_set_interrupt -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1520813 00:30:48.979 20:44:41 reactor_set_interrupt -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:30:48.979 20:44:41 reactor_set_interrupt -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:30:48.979 20:44:41 reactor_set_interrupt -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1520813' 00:30:48.979 killing process with pid 1520813 00:30:48.979 20:44:41 reactor_set_interrupt -- common/autotest_common.sh@967 -- # kill 1520813 00:30:48.979 20:44:41 reactor_set_interrupt -- common/autotest_common.sh@972 -- # wait 1520813 00:30:49.238 20:44:41 reactor_set_interrupt -- 
interrupt/reactor_set_interrupt.sh@105 -- # cleanup 00:30:49.238 20:44:41 reactor_set_interrupt -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:30:49.238 00:30:49.238 real 0m9.662s 00:30:49.238 user 0m9.485s 00:30:49.238 sys 0m2.154s 00:30:49.238 20:44:41 reactor_set_interrupt -- common/autotest_common.sh@1124 -- # xtrace_disable 00:30:49.238 20:44:41 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:30:49.238 ************************************ 00:30:49.238 END TEST reactor_set_interrupt 00:30:49.238 ************************************ 00:30:49.498 20:44:41 -- common/autotest_common.sh@1142 -- # return 0 00:30:49.498 20:44:41 -- spdk/autotest.sh@194 -- # run_test reap_unregistered_poller /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:30:49.498 20:44:41 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:30:49.498 20:44:41 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:30:49.498 20:44:41 -- common/autotest_common.sh@10 -- # set +x 00:30:49.498 ************************************ 00:30:49.498 START TEST reap_unregistered_poller 00:30:49.498 ************************************ 00:30:49.498 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:30:49.498 * Looking for test storage... 
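The `killprocess` teardown above resolves the PID back to a command name (`ps --no-headers -o comm=` yields `reactor_0`) and compares it against `sudo` before killing, so the harness never signals a privileged wrapper instead of the target. A self-contained sketch of that guard, using this shell's own PID for demonstration (the `would kill` message is illustrative; the real helper runs `kill` and then `wait`):

```shell
# Resolve a PID to its command name before deciding whether to kill it
pid=$$   # this shell's own PID, so the demo is self-contained
process_name=$(ps --no-headers -o comm= "$pid")

if [ "$process_name" = sudo ]; then
  echo "refusing to kill sudo wrapper"
else
  echo "would kill $pid ($process_name)"
fi
```

In the log, `process_name=reactor_0` fails the `= sudo` comparison, so the harness proceeds to `kill 1520813` and waits for the target to exit.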
00:30:49.498 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:49.498 20:44:41 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/interrupt_common.sh 00:30:49.498 20:44:41 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:30:49.499 20:44:41 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:49.499 20:44:41 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # testdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:49.499 20:44:41 reap_unregistered_poller -- interrupt/interrupt_common.sh@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/../.. 00:30:49.499 20:44:41 reap_unregistered_poller -- interrupt/interrupt_common.sh@6 -- # rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:30:49.499 20:44:41 reap_unregistered_poller -- interrupt/interrupt_common.sh@7 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh 00:30:49.499 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:30:49.499 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@34 -- # set -e 00:30:49.499 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:30:49.499 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@36 -- # shopt -s extglob 00:30:49.499 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@37 -- # shopt -s inherit_errexit 00:30:49.499 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@39 -- # '[' -z /var/jenkins/workspace/crypto-phy-autotest/spdk/../output ']' 00:30:49.499 20:44:41 reap_unregistered_poller -- 
common/autotest_common.sh@44 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh ]] 00:30:49.499 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh 00:30:49.499 20:44:41 reap_unregistered_poller -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:30:49.499 20:44:41 reap_unregistered_poller -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:30:49.499 20:44:41 reap_unregistered_poller -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=y 00:30:49.499 20:44:41 reap_unregistered_poller -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:30:49.499 20:44:41 reap_unregistered_poller -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:30:49.499 20:44:41 reap_unregistered_poller -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:30:49.499 20:44:41 reap_unregistered_poller -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:30:49.499 20:44:41 reap_unregistered_poller -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:30:49.499 20:44:41 reap_unregistered_poller -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:30:49.499 20:44:41 reap_unregistered_poller -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:30:49.499 20:44:41 reap_unregistered_poller -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:30:49.499 20:44:41 reap_unregistered_poller -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:30:49.499 20:44:41 reap_unregistered_poller -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:30:49.499 20:44:41 reap_unregistered_poller -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:30:49.499 20:44:41 reap_unregistered_poller -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:30:49.499 20:44:41 reap_unregistered_poller -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:30:49.499 20:44:41 reap_unregistered_poller -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:30:49.499 20:44:41 
reap_unregistered_poller -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:30:49.499 20:44:41 reap_unregistered_poller -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:30:49.499 20:44:41 reap_unregistered_poller -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:30:49.499 20:44:41 reap_unregistered_poller -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:30:49.499 20:44:41 reap_unregistered_poller -- common/build_config.sh@22 -- # CONFIG_CET=n 00:30:49.499 20:44:41 reap_unregistered_poller -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=y 00:30:49.499 20:44:41 reap_unregistered_poller -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:30:49.499 20:44:41 reap_unregistered_poller -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:30:49.499 20:44:41 reap_unregistered_poller -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:30:49.499 20:44:41 reap_unregistered_poller -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:30:49.499 20:44:41 reap_unregistered_poller -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:30:49.499 20:44:41 reap_unregistered_poller -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:30:49.499 20:44:41 reap_unregistered_poller -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:30:49.499 20:44:41 reap_unregistered_poller -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:30:49.499 20:44:41 reap_unregistered_poller -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:30:49.499 20:44:41 reap_unregistered_poller -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:30:49.499 20:44:41 reap_unregistered_poller -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB= 00:30:49.499 20:44:41 reap_unregistered_poller -- common/build_config.sh@35 -- # CONFIG_FUZZER=n 00:30:49.499 20:44:41 reap_unregistered_poller -- common/build_config.sh@36 -- # 
CONFIG_DPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:30:49.499 20:44:41 reap_unregistered_poller -- common/build_config.sh@37 -- # CONFIG_CRYPTO=y 00:30:49.499 20:44:41 reap_unregistered_poller -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:30:49.499 20:44:41 reap_unregistered_poller -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:30:49.499 20:44:41 reap_unregistered_poller -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:30:49.499 20:44:41 reap_unregistered_poller -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR= 00:30:49.499 20:44:41 reap_unregistered_poller -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:30:49.499 20:44:41 reap_unregistered_poller -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:30:49.499 20:44:41 reap_unregistered_poller -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:30:49.499 20:44:41 reap_unregistered_poller -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:30:49.499 20:44:41 reap_unregistered_poller -- common/build_config.sh@46 -- # CONFIG_DPDK_UADK=n 00:30:49.499 20:44:41 reap_unregistered_poller -- common/build_config.sh@47 -- # CONFIG_COVERAGE=y 00:30:49.499 20:44:41 reap_unregistered_poller -- common/build_config.sh@48 -- # CONFIG_RDMA=y 00:30:49.499 20:44:41 reap_unregistered_poller -- common/build_config.sh@49 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:30:49.499 20:44:41 reap_unregistered_poller -- common/build_config.sh@50 -- # CONFIG_URING_PATH= 00:30:49.499 20:44:41 reap_unregistered_poller -- common/build_config.sh@51 -- # CONFIG_XNVME=n 00:30:49.499 20:44:41 reap_unregistered_poller -- common/build_config.sh@52 -- # CONFIG_VFIO_USER=n 00:30:49.499 20:44:41 reap_unregistered_poller -- common/build_config.sh@53 -- # CONFIG_ARCH=native 00:30:49.499 20:44:41 reap_unregistered_poller -- common/build_config.sh@54 -- # CONFIG_HAVE_EVP_MAC=y 00:30:49.499 20:44:41 reap_unregistered_poller -- common/build_config.sh@55 -- # CONFIG_URING_ZNS=n 00:30:49.499 
20:44:41 reap_unregistered_poller -- common/build_config.sh@56 -- # CONFIG_WERROR=y 00:30:49.499 20:44:41 reap_unregistered_poller -- common/build_config.sh@57 -- # CONFIG_HAVE_LIBBSD=n 00:30:49.499 20:44:41 reap_unregistered_poller -- common/build_config.sh@58 -- # CONFIG_UBSAN=y 00:30:49.499 20:44:41 reap_unregistered_poller -- common/build_config.sh@59 -- # CONFIG_IPSEC_MB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:30:49.499 20:44:41 reap_unregistered_poller -- common/build_config.sh@60 -- # CONFIG_GOLANG=n 00:30:49.499 20:44:41 reap_unregistered_poller -- common/build_config.sh@61 -- # CONFIG_ISAL=y 00:30:49.499 20:44:41 reap_unregistered_poller -- common/build_config.sh@62 -- # CONFIG_IDXD_KERNEL=y 00:30:49.499 20:44:41 reap_unregistered_poller -- common/build_config.sh@63 -- # CONFIG_DPDK_LIB_DIR= 00:30:49.499 20:44:41 reap_unregistered_poller -- common/build_config.sh@64 -- # CONFIG_RDMA_PROV=verbs 00:30:49.499 20:44:41 reap_unregistered_poller -- common/build_config.sh@65 -- # CONFIG_APPS=y 00:30:49.499 20:44:41 reap_unregistered_poller -- common/build_config.sh@66 -- # CONFIG_SHARED=y 00:30:49.499 20:44:41 reap_unregistered_poller -- common/build_config.sh@67 -- # CONFIG_HAVE_KEYUTILS=y 00:30:49.499 20:44:41 reap_unregistered_poller -- common/build_config.sh@68 -- # CONFIG_FC_PATH= 00:30:49.499 20:44:41 reap_unregistered_poller -- common/build_config.sh@69 -- # CONFIG_DPDK_PKG_CONFIG=n 00:30:49.499 20:44:41 reap_unregistered_poller -- common/build_config.sh@70 -- # CONFIG_FC=n 00:30:49.499 20:44:41 reap_unregistered_poller -- common/build_config.sh@71 -- # CONFIG_AVAHI=n 00:30:49.499 20:44:41 reap_unregistered_poller -- common/build_config.sh@72 -- # CONFIG_FIO_PLUGIN=y 00:30:49.499 20:44:41 reap_unregistered_poller -- common/build_config.sh@73 -- # CONFIG_RAID5F=n 00:30:49.499 20:44:41 reap_unregistered_poller -- common/build_config.sh@74 -- # CONFIG_EXAMPLES=y 00:30:49.499 20:44:41 reap_unregistered_poller -- 
common/build_config.sh@75 -- # CONFIG_TESTS=y 00:30:49.499 20:44:41 reap_unregistered_poller -- common/build_config.sh@76 -- # CONFIG_CRYPTO_MLX5=y 00:30:49.499 20:44:41 reap_unregistered_poller -- common/build_config.sh@77 -- # CONFIG_MAX_LCORES=128 00:30:49.499 20:44:41 reap_unregistered_poller -- common/build_config.sh@78 -- # CONFIG_IPSEC_MB=y 00:30:49.499 20:44:41 reap_unregistered_poller -- common/build_config.sh@79 -- # CONFIG_PGO_DIR= 00:30:49.499 20:44:41 reap_unregistered_poller -- common/build_config.sh@80 -- # CONFIG_DEBUG=y 00:30:49.499 20:44:41 reap_unregistered_poller -- common/build_config.sh@81 -- # CONFIG_DPDK_COMPRESSDEV=y 00:30:49.499 20:44:41 reap_unregistered_poller -- common/build_config.sh@82 -- # CONFIG_CROSS_PREFIX= 00:30:49.499 20:44:41 reap_unregistered_poller -- common/build_config.sh@83 -- # CONFIG_URING=n 00:30:49.499 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:30:49.499 20:44:41 reap_unregistered_poller -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:30:49.499 20:44:41 reap_unregistered_poller -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:30:49.499 20:44:41 reap_unregistered_poller -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:30:49.499 20:44:41 reap_unregistered_poller -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:30:49.499 20:44:41 reap_unregistered_poller -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:30:49.499 20:44:41 reap_unregistered_poller -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:30:49.499 20:44:41 reap_unregistered_poller -- common/applications.sh@12 -- # 
_examples_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:30:49.499 20:44:41 reap_unregistered_poller -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:30:49.499 20:44:41 reap_unregistered_poller -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:30:49.499 20:44:41 reap_unregistered_poller -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:30:49.499 20:44:41 reap_unregistered_poller -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:30:49.499 20:44:41 reap_unregistered_poller -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:30:49.499 20:44:41 reap_unregistered_poller -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:30:49.499 20:44:41 reap_unregistered_poller -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/config.h ]] 00:30:49.500 20:44:41 reap_unregistered_poller -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:30:49.500 #define SPDK_CONFIG_H 00:30:49.500 #define SPDK_CONFIG_APPS 1 00:30:49.500 #define SPDK_CONFIG_ARCH native 00:30:49.500 #undef SPDK_CONFIG_ASAN 00:30:49.500 #undef SPDK_CONFIG_AVAHI 00:30:49.500 #undef SPDK_CONFIG_CET 00:30:49.500 #define SPDK_CONFIG_COVERAGE 1 00:30:49.500 #define SPDK_CONFIG_CROSS_PREFIX 00:30:49.500 #define SPDK_CONFIG_CRYPTO 1 00:30:49.500 #define SPDK_CONFIG_CRYPTO_MLX5 1 00:30:49.500 #undef SPDK_CONFIG_CUSTOMOCF 00:30:49.500 #undef SPDK_CONFIG_DAOS 00:30:49.500 #define SPDK_CONFIG_DAOS_DIR 00:30:49.500 #define SPDK_CONFIG_DEBUG 1 00:30:49.500 #define SPDK_CONFIG_DPDK_COMPRESSDEV 1 00:30:49.500 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:30:49.500 #define SPDK_CONFIG_DPDK_INC_DIR 00:30:49.500 #define SPDK_CONFIG_DPDK_LIB_DIR 00:30:49.500 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:30:49.500 #undef SPDK_CONFIG_DPDK_UADK 00:30:49.500 #define SPDK_CONFIG_ENV 
/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:30:49.500 #define SPDK_CONFIG_EXAMPLES 1 00:30:49.500 #undef SPDK_CONFIG_FC 00:30:49.500 #define SPDK_CONFIG_FC_PATH 00:30:49.500 #define SPDK_CONFIG_FIO_PLUGIN 1 00:30:49.500 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:30:49.500 #undef SPDK_CONFIG_FUSE 00:30:49.500 #undef SPDK_CONFIG_FUZZER 00:30:49.500 #define SPDK_CONFIG_FUZZER_LIB 00:30:49.500 #undef SPDK_CONFIG_GOLANG 00:30:49.500 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:30:49.500 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:30:49.500 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:30:49.500 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:30:49.500 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:30:49.500 #undef SPDK_CONFIG_HAVE_LIBBSD 00:30:49.500 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:30:49.500 #define SPDK_CONFIG_IDXD 1 00:30:49.500 #define SPDK_CONFIG_IDXD_KERNEL 1 00:30:49.500 #define SPDK_CONFIG_IPSEC_MB 1 00:30:49.500 #define SPDK_CONFIG_IPSEC_MB_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:30:49.500 #define SPDK_CONFIG_ISAL 1 00:30:49.500 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:30:49.500 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:30:49.500 #define SPDK_CONFIG_LIBDIR 00:30:49.500 #undef SPDK_CONFIG_LTO 00:30:49.500 #define SPDK_CONFIG_MAX_LCORES 128 00:30:49.500 #define SPDK_CONFIG_NVME_CUSE 1 00:30:49.500 #undef SPDK_CONFIG_OCF 00:30:49.500 #define SPDK_CONFIG_OCF_PATH 00:30:49.500 #define SPDK_CONFIG_OPENSSL_PATH 00:30:49.500 #undef SPDK_CONFIG_PGO_CAPTURE 00:30:49.500 #define SPDK_CONFIG_PGO_DIR 00:30:49.500 #undef SPDK_CONFIG_PGO_USE 00:30:49.500 #define SPDK_CONFIG_PREFIX /usr/local 00:30:49.500 #undef SPDK_CONFIG_RAID5F 00:30:49.500 #undef SPDK_CONFIG_RBD 00:30:49.500 #define SPDK_CONFIG_RDMA 1 00:30:49.500 #define SPDK_CONFIG_RDMA_PROV verbs 00:30:49.500 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:30:49.500 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:30:49.500 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:30:49.500 #define 
SPDK_CONFIG_SHARED 1 00:30:49.500 #undef SPDK_CONFIG_SMA 00:30:49.500 #define SPDK_CONFIG_TESTS 1 00:30:49.500 #undef SPDK_CONFIG_TSAN 00:30:49.500 #define SPDK_CONFIG_UBLK 1 00:30:49.500 #define SPDK_CONFIG_UBSAN 1 00:30:49.500 #undef SPDK_CONFIG_UNIT_TESTS 00:30:49.500 #undef SPDK_CONFIG_URING 00:30:49.500 #define SPDK_CONFIG_URING_PATH 00:30:49.500 #undef SPDK_CONFIG_URING_ZNS 00:30:49.500 #undef SPDK_CONFIG_USDT 00:30:49.500 #define SPDK_CONFIG_VBDEV_COMPRESS 1 00:30:49.500 #define SPDK_CONFIG_VBDEV_COMPRESS_MLX5 1 00:30:49.500 #undef SPDK_CONFIG_VFIO_USER 00:30:49.500 #define SPDK_CONFIG_VFIO_USER_DIR 00:30:49.500 #define SPDK_CONFIG_VHOST 1 00:30:49.500 #define SPDK_CONFIG_VIRTIO 1 00:30:49.500 #undef SPDK_CONFIG_VTUNE 00:30:49.500 #define SPDK_CONFIG_VTUNE_DIR 00:30:49.500 #define SPDK_CONFIG_WERROR 1 00:30:49.500 #define SPDK_CONFIG_WPDK_DIR 00:30:49.500 #undef SPDK_CONFIG_XNVME 00:30:49.500 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:30:49.500 20:44:41 reap_unregistered_poller -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:30:49.500 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:30:49.500 20:44:41 reap_unregistered_poller -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:30:49.500 20:44:41 reap_unregistered_poller -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:30:49.500 20:44:41 reap_unregistered_poller -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:30:49.500 20:44:41 reap_unregistered_poller -- paths/export.sh@2 -- # 
PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:49.500 20:44:41 reap_unregistered_poller -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:49.500 20:44:41 reap_unregistered_poller -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:49.500 20:44:41 reap_unregistered_poller -- paths/export.sh@5 -- # export PATH 00:30:49.500 20:44:41 reap_unregistered_poller -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:49.500 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@56 -- # source 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:30:49.500 20:44:41 reap_unregistered_poller -- pm/common@6 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:30:49.500 20:44:41 reap_unregistered_poller -- pm/common@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:30:49.500 20:44:41 reap_unregistered_poller -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:30:49.500 20:44:41 reap_unregistered_poller -- pm/common@7 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/../../../ 00:30:49.500 20:44:41 reap_unregistered_poller -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:30:49.500 20:44:41 reap_unregistered_poller -- pm/common@64 -- # TEST_TAG=N/A 00:30:49.500 20:44:41 reap_unregistered_poller -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/crypto-phy-autotest/spdk/.run_test_name 00:30:49.500 20:44:41 reap_unregistered_poller -- pm/common@67 -- # PM_OUTPUTDIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power 00:30:49.500 20:44:41 reap_unregistered_poller -- pm/common@68 -- # uname -s 00:30:49.500 20:44:41 reap_unregistered_poller -- pm/common@68 -- # PM_OS=Linux 00:30:49.500 20:44:41 reap_unregistered_poller -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:30:49.500 20:44:41 reap_unregistered_poller -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:30:49.500 20:44:41 reap_unregistered_poller -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:30:49.500 20:44:41 reap_unregistered_poller -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:30:49.500 20:44:41 reap_unregistered_poller -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:30:49.500 20:44:41 reap_unregistered_poller -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:30:49.500 20:44:41 reap_unregistered_poller -- 
pm/common@76 -- # SUDO[0]= 00:30:49.500 20:44:41 reap_unregistered_poller -- pm/common@76 -- # SUDO[1]='sudo -E' 00:30:49.500 20:44:41 reap_unregistered_poller -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:30:49.500 20:44:41 reap_unregistered_poller -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:30:49.500 20:44:41 reap_unregistered_poller -- pm/common@81 -- # [[ Linux == Linux ]] 00:30:49.500 20:44:41 reap_unregistered_poller -- pm/common@81 -- # [[ ............................... != QEMU ]] 00:30:49.500 20:44:41 reap_unregistered_poller -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:30:49.500 20:44:41 reap_unregistered_poller -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:30:49.500 20:44:41 reap_unregistered_poller -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:30:49.500 20:44:41 reap_unregistered_poller -- pm/common@88 -- # [[ ! -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power ]] 00:30:49.500 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@58 -- # : 0 00:30:49.500 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@59 -- # export RUN_NIGHTLY 00:30:49.500 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@62 -- # : 0 00:30:49.500 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@63 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:30:49.500 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@64 -- # : 0 00:30:49.500 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@65 -- # export SPDK_RUN_VALGRIND 00:30:49.500 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@66 -- # : 1 00:30:49.500 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@67 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:30:49.500 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@68 -- # : 0 00:30:49.500 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@69 -- # export SPDK_TEST_UNITTEST 00:30:49.500 20:44:41 
reap_unregistered_poller -- common/autotest_common.sh@70 -- # : 00:30:49.500 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@71 -- # export SPDK_TEST_AUTOBUILD 00:30:49.500 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@72 -- # : 0 00:30:49.500 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@73 -- # export SPDK_TEST_RELEASE_BUILD 00:30:49.500 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@74 -- # : 1 00:30:49.500 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@75 -- # export SPDK_TEST_ISAL 00:30:49.500 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@76 -- # : 0 00:30:49.500 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@77 -- # export SPDK_TEST_ISCSI 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@78 -- # : 0 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@79 -- # export SPDK_TEST_ISCSI_INITIATOR 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@80 -- # : 0 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@82 -- # : 0 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_PMR 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@84 -- # : 0 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_BP 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@86 -- # : 0 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVME_CLI 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@88 -- # : 0 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@89 -- # export SPDK_TEST_NVME_CUSE 00:30:49.501 20:44:41 
reap_unregistered_poller -- common/autotest_common.sh@90 -- # : 0 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@91 -- # export SPDK_TEST_NVME_FDP 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@92 -- # : 0 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@93 -- # export SPDK_TEST_NVMF 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@94 -- # : 0 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@95 -- # export SPDK_TEST_VFIOUSER 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@96 -- # : 0 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@97 -- # export SPDK_TEST_VFIOUSER_QEMU 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@98 -- # : 0 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@99 -- # export SPDK_TEST_FUZZER 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@100 -- # : 0 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@101 -- # export SPDK_TEST_FUZZER_SHORT 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@102 -- # : rdma 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@103 -- # export SPDK_TEST_NVMF_TRANSPORT 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@104 -- # : 0 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@105 -- # export SPDK_TEST_RBD 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@106 -- # : 0 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@107 -- # export SPDK_TEST_VHOST 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@108 -- # : 1 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@109 -- # export SPDK_TEST_BLOCKDEV 00:30:49.501 20:44:41 
reap_unregistered_poller -- common/autotest_common.sh@110 -- # : 0 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@111 -- # export SPDK_TEST_IOAT 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@112 -- # : 0 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@113 -- # export SPDK_TEST_BLOBFS 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@114 -- # : 0 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@115 -- # export SPDK_TEST_VHOST_INIT 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@116 -- # : 0 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@117 -- # export SPDK_TEST_LVOL 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@118 -- # : 1 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@119 -- # export SPDK_TEST_VBDEV_COMPRESS 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@120 -- # : 0 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@121 -- # export SPDK_RUN_ASAN 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@122 -- # : 1 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@123 -- # export SPDK_RUN_UBSAN 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@124 -- # : 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@125 -- # export SPDK_RUN_EXTERNAL_DPDK 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@126 -- # : 0 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@127 -- # export SPDK_RUN_NON_ROOT 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@128 -- # : 1 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@129 -- # export SPDK_TEST_CRYPTO 00:30:49.501 20:44:41 
reap_unregistered_poller -- common/autotest_common.sh@130 -- # : 0 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@131 -- # export SPDK_TEST_FTL 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@132 -- # : 0 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@133 -- # export SPDK_TEST_OCF 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@134 -- # : 0 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@135 -- # export SPDK_TEST_VMD 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@136 -- # : 0 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@137 -- # export SPDK_TEST_OPAL 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@138 -- # : 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@139 -- # export SPDK_TEST_NATIVE_DPDK 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@140 -- # : true 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@141 -- # export SPDK_AUTOTEST_X 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@142 -- # : 0 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@143 -- # export SPDK_TEST_RAID5 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@144 -- # : 0 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@145 -- # export SPDK_TEST_URING 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@146 -- # : 0 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@147 -- # export SPDK_TEST_USDT 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@148 -- # : 0 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@149 -- # export SPDK_TEST_USE_IGB_UIO 00:30:49.501 20:44:41 reap_unregistered_poller 
-- common/autotest_common.sh@150 -- # : 0 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@151 -- # export SPDK_TEST_SCHEDULER 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@152 -- # : 0 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@153 -- # export SPDK_TEST_SCANBUILD 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@154 -- # : 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@155 -- # export SPDK_TEST_NVMF_NICS 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@156 -- # : 0 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@157 -- # export SPDK_TEST_SMA 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@158 -- # : 0 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@159 -- # export SPDK_TEST_DAOS 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@160 -- # : 0 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@161 -- # export SPDK_TEST_XNVME 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@162 -- # : 0 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@163 -- # export SPDK_TEST_ACCEL_DSA 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@164 -- # : 0 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@165 -- # export SPDK_TEST_ACCEL_IAA 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@167 -- # : 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@168 -- # export SPDK_TEST_FUZZER_TARGET 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@169 -- # : 0 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@170 -- # export SPDK_TEST_NVMF_MDNS 00:30:49.501 20:44:41 reap_unregistered_poller -- 
common/autotest_common.sh@171 -- # : 0 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@172 -- # export SPDK_JSONRPC_GO_CLIENT 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@175 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@175 -- # SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@176 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@176 -- # DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@177 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@177 -- # VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@178 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@178 -- # 
LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@181 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@181 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@185 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@185 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@189 -- # export PYTHONDONTWRITEBYTECODE=1 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@189 -- # PYTHONDONTWRITEBYTECODE=1 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@193 -- # export 
ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:30:49.501 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@193 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:30:49.502 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@194 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:30:49.502 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@194 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:30:49.502 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@198 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:30:49.502 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@199 -- # rm -rf /var/tmp/asan_suppression_file 00:30:49.502 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@200 -- # cat 00:30:49.502 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@236 -- # echo leak:libfuse3.so 00:30:49.502 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@238 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:30:49.502 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@238 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:30:49.502 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@240 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:30:49.502 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@240 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:30:49.502 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@242 -- # '[' -z /var/spdk/dependencies ']' 00:30:49.502 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@245 -- # export DEPENDENCY_DIR 00:30:49.502 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@249 -- # export 
SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:30:49.502 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@249 -- # SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:30:49.502 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@250 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:30:49.502 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@250 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:30:49.502 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@253 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:30:49.502 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@253 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:30:49.502 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@254 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:30:49.502 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@254 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:30:49.502 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@256 -- # export AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:30:49.502 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@256 -- # AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:30:49.502 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@259 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:30:49.502 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@259 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:30:49.502 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@262 -- # '[' 0 -eq 0 ']' 00:30:49.502 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@263 -- # export valgrind= 00:30:49.502 20:44:41 
reap_unregistered_poller -- common/autotest_common.sh@263 -- # valgrind= 00:30:49.502 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@269 -- # uname -s 00:30:49.502 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@269 -- # '[' Linux = Linux ']' 00:30:49.502 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@270 -- # HUGEMEM=4096 00:30:49.502 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@271 -- # export CLEAR_HUGE=yes 00:30:49.502 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@271 -- # CLEAR_HUGE=yes 00:30:49.502 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@272 -- # [[ 1 -eq 1 ]] 00:30:49.502 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@276 -- # export HUGE_EVEN_ALLOC=yes 00:30:49.502 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@276 -- # HUGE_EVEN_ALLOC=yes 00:30:49.502 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@279 -- # MAKE=make 00:30:49.502 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@280 -- # MAKEFLAGS=-j72 00:30:49.502 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@296 -- # export HUGEMEM=4096 00:30:49.502 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@296 -- # HUGEMEM=4096 00:30:49.502 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@298 -- # NO_HUGE=() 00:30:49.502 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@299 -- # TEST_MODE= 00:30:49.502 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@318 -- # [[ -z 1521446 ]] 00:30:49.502 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@318 -- # kill -0 1521446 00:30:49.502 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@1680 -- # set_test_storage 2147483648 00:30:49.502 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@328 -- # [[ -v testdir ]] 00:30:49.502 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@330 -- # local 
requested_size=2147483648 00:30:49.502 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@331 -- # local mount target_dir 00:30:49.502 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@333 -- # local -A mounts fss sizes avails uses 00:30:49.502 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@334 -- # local source fs size avail mount use 00:30:49.502 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@336 -- # local storage_fallback storage_candidates 00:30:49.502 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@338 -- # mktemp -udt spdk.XXXXXX 00:30:49.761 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@338 -- # storage_fallback=/tmp/spdk.RH8kLt 00:30:49.761 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@343 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:30:49.761 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@345 -- # [[ -n '' ]] 00:30:49.761 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@350 -- # [[ -n '' ]] 00:30:49.761 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@355 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt /tmp/spdk.RH8kLt/tests/interrupt /tmp/spdk.RH8kLt 00:30:49.761 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@358 -- # requested_size=2214592512 00:30:49.761 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:30:49.761 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@327 -- # df -T 00:30:49.761 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@327 -- # grep -v Filesystem 00:30:49.761 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_devtmpfs 00:30:49.761 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=devtmpfs 00:30:49.761 20:44:41 
reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=67108864 00:30:49.761 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=67108864 00:30:49.761 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=0 00:30:49.761 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:30:49.761 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=/dev/pmem0 00:30:49.761 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=ext2 00:30:49.761 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=946290688 00:30:49.761 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=5284429824 00:30:49.761 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=4338139136 00:30:49.761 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:30:49.761 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_root 00:30:49.761 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=overlay 00:30:49.761 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=88641318912 00:30:49.761 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=94508515328 00:30:49.761 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=5867196416 00:30:49.761 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:30:49.761 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:30:49.761 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@361 -- # 
fss["$mount"]=tmpfs 00:30:49.761 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=47249547264 00:30:49.761 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=47254257664 00:30:49.761 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=4710400 00:30:49.761 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:30:49.761 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:30:49.761 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:30:49.761 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=18892201984 00:30:49.761 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=18901704704 00:30:49.761 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=9502720 00:30:49.761 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:30:49.761 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:30:49.761 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:30:49.761 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=47253377024 00:30:49.761 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=47254257664 00:30:49.761 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=880640 00:30:49.761 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:30:49.761 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:30:49.761 20:44:41 reap_unregistered_poller -- 
common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:30:49.761 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=9450844160 00:30:49.761 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=9450848256 00:30:49.761 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=4096 00:30:49.761 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:30:49.761 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@366 -- # printf '* Looking for test storage...\n' 00:30:49.761 * Looking for test storage... 00:30:49.761 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@368 -- # local target_space new_size 00:30:49.761 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@369 -- # for target_dir in "${storage_candidates[@]}" 00:30:49.761 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@372 -- # df /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:49.761 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@372 -- # awk '$1 !~ /Filesystem/{print $6}' 00:30:49.761 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@372 -- # mount=/ 00:30:49.761 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@374 -- # target_space=88641318912 00:30:49.761 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@375 -- # (( target_space == 0 || target_space < requested_size )) 00:30:49.761 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@378 -- # (( target_space >= requested_size )) 00:30:49.761 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@380 -- # [[ overlay == tmpfs ]] 00:30:49.761 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@380 -- # [[ overlay == ramfs ]] 00:30:49.761 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@380 -- # [[ / == / ]] 00:30:49.761 20:44:41 
reap_unregistered_poller -- common/autotest_common.sh@381 -- # new_size=8081788928 00:30:49.761 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@382 -- # (( new_size * 100 / sizes[/] > 95 )) 00:30:49.761 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@387 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:49.761 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@387 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:49.761 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@388 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:49.761 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:49.761 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@389 -- # return 0 00:30:49.761 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@1682 -- # set -o errtrace 00:30:49.761 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@1683 -- # shopt -s extdebug 00:30:49.761 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@1684 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:30:49.761 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@1686 -- # PS4=' \t ${test_domain:-} -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:30:49.761 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@1687 -- # true 00:30:49.761 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@1689 -- # xtrace_fd 00:30:49.761 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@25 -- # [[ -n 13 ]] 00:30:49.761 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/13 ]] 00:30:49.761 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@27 -- # exec 00:30:49.761 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@29 -- # exec 
00:30:49.761 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@31 -- # xtrace_restore 00:30:49.761 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 0 : 0 - 1]' 00:30:49.762 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:30:49.762 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@18 -- # set -x 00:30:49.762 20:44:41 reap_unregistered_poller -- interrupt/interrupt_common.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/common.sh 00:30:49.762 20:44:41 reap_unregistered_poller -- interrupt/interrupt_common.sh@10 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:30:49.762 20:44:41 reap_unregistered_poller -- interrupt/interrupt_common.sh@12 -- # r0_mask=0x1 00:30:49.762 20:44:41 reap_unregistered_poller -- interrupt/interrupt_common.sh@13 -- # r1_mask=0x2 00:30:49.762 20:44:41 reap_unregistered_poller -- interrupt/interrupt_common.sh@14 -- # r2_mask=0x4 00:30:49.762 20:44:41 reap_unregistered_poller -- interrupt/interrupt_common.sh@16 -- # cpu_server_mask=0x07 00:30:49.762 20:44:41 reap_unregistered_poller -- interrupt/interrupt_common.sh@17 -- # rpc_server_addr=/var/tmp/spdk.sock 00:30:49.762 20:44:41 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@14 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:30:49.762 20:44:41 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@14 -- # 
PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:30:49.762 20:44:41 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@17 -- # start_intr_tgt 00:30:49.762 20:44:41 reap_unregistered_poller -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:49.762 20:44:41 reap_unregistered_poller -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:30:49.762 20:44:41 reap_unregistered_poller -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=1521487 00:30:49.762 20:44:41 reap_unregistered_poller -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:30:49.762 20:44:41 reap_unregistered_poller -- interrupt/interrupt_common.sh@26 -- # waitforlisten 1521487 /var/tmp/spdk.sock 00:30:49.762 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@829 -- # '[' -z 1521487 ']' 00:30:49.762 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:49.762 20:44:41 reap_unregistered_poller -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:30:49.762 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@834 -- # local max_retries=100 00:30:49.762 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:30:49.762 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:30:49.762 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@838 -- # xtrace_disable 00:30:49.762 20:44:41 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:30:49.762 [2024-07-15 20:44:41.954861] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 00:30:49.762 [2024-07-15 20:44:41.954937] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1521487 ] 00:30:49.762 [2024-07-15 20:44:42.083783] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:30:50.020 [2024-07-15 20:44:42.189822] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:30:50.020 [2024-07-15 20:44:42.189931] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:50.020 [2024-07-15 20:44:42.189924] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:30:50.020 [2024-07-15 20:44:42.261240] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 
00:30:50.279 20:44:42 reap_unregistered_poller -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:30:50.279 20:44:42 reap_unregistered_poller -- common/autotest_common.sh@862 -- # return 0 00:30:50.279 20:44:42 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # rpc_cmd thread_get_pollers 00:30:50.279 20:44:42 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # jq -r '.threads[0]' 00:30:50.279 20:44:42 reap_unregistered_poller -- common/autotest_common.sh@559 -- # xtrace_disable 00:30:50.279 20:44:42 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:30:50.279 20:44:42 reap_unregistered_poller -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:30:50.279 20:44:42 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # app_thread='{ 00:30:50.279 "name": "app_thread", 00:30:50.279 "id": 1, 00:30:50.279 "active_pollers": [], 00:30:50.279 "timed_pollers": [ 00:30:50.279 { 00:30:50.279 "name": "rpc_subsystem_poll_servers", 00:30:50.279 "id": 1, 00:30:50.279 "state": "waiting", 00:30:50.279 "run_count": 0, 00:30:50.279 "busy_count": 0, 00:30:50.279 "period_ticks": 9200000 00:30:50.279 } 00:30:50.279 ], 00:30:50.279 "paused_pollers": [] 00:30:50.279 }' 00:30:50.279 20:44:42 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@21 -- # jq -r '.active_pollers[].name' 00:30:50.279 20:44:42 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@21 -- # native_pollers= 00:30:50.279 20:44:42 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@22 -- # native_pollers+=' ' 00:30:50.279 20:44:42 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@23 -- # jq -r '.timed_pollers[].name' 00:30:50.279 20:44:42 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@23 -- # native_pollers+=rpc_subsystem_poll_servers 00:30:50.279 20:44:42 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@28 -- # setup_bdev_aio 00:30:50.279 
20:44:42 reap_unregistered_poller -- interrupt/common.sh@75 -- # uname -s 00:30:50.279 20:44:42 reap_unregistered_poller -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:30:50.279 20:44:42 reap_unregistered_poller -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:30:50.279 5000+0 records in 00:30:50.279 5000+0 records out 00:30:50.279 10240000 bytes (10 MB, 9.8 MiB) copied, 0.027194 s, 377 MB/s 00:30:50.279 20:44:42 reap_unregistered_poller -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:30:50.538 AIO0 00:30:50.538 20:44:42 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:51.106 20:44:43 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@34 -- # sleep 0.1 00:30:51.365 20:44:43 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # rpc_cmd thread_get_pollers 00:30:51.365 20:44:43 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # jq -r '.threads[0]' 00:30:51.365 20:44:43 reap_unregistered_poller -- common/autotest_common.sh@559 -- # xtrace_disable 00:30:51.365 20:44:43 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:30:51.365 20:44:43 reap_unregistered_poller -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:30:51.365 20:44:43 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # app_thread='{ 00:30:51.365 "name": "app_thread", 00:30:51.365 "id": 1, 00:30:51.365 "active_pollers": [], 00:30:51.365 "timed_pollers": [ 00:30:51.365 { 00:30:51.365 "name": "rpc_subsystem_poll_servers", 00:30:51.365 "id": 1, 00:30:51.365 "state": "waiting", 00:30:51.365 "run_count": 0, 00:30:51.365 "busy_count": 0, 
00:30:51.365 "period_ticks": 9200000 00:30:51.365 } 00:30:51.365 ], 00:30:51.365 "paused_pollers": [] 00:30:51.365 }' 00:30:51.365 20:44:43 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@38 -- # jq -r '.active_pollers[].name' 00:30:51.365 20:44:43 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@38 -- # remaining_pollers= 00:30:51.365 20:44:43 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@39 -- # remaining_pollers+=' ' 00:30:51.365 20:44:43 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@40 -- # jq -r '.timed_pollers[].name' 00:30:51.365 20:44:43 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@40 -- # remaining_pollers+=rpc_subsystem_poll_servers 00:30:51.365 20:44:43 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@44 -- # [[ rpc_subsystem_poll_servers == \ \r\p\c\_\s\u\b\s\y\s\t\e\m\_\p\o\l\l\_\s\e\r\v\e\r\s ]] 00:30:51.365 20:44:43 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@46 -- # trap - SIGINT SIGTERM EXIT 00:30:51.365 20:44:43 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@47 -- # killprocess 1521487 00:30:51.365 20:44:43 reap_unregistered_poller -- common/autotest_common.sh@948 -- # '[' -z 1521487 ']' 00:30:51.365 20:44:43 reap_unregistered_poller -- common/autotest_common.sh@952 -- # kill -0 1521487 00:30:51.365 20:44:43 reap_unregistered_poller -- common/autotest_common.sh@953 -- # uname 00:30:51.365 20:44:43 reap_unregistered_poller -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:30:51.365 20:44:43 reap_unregistered_poller -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1521487 00:30:51.365 20:44:43 reap_unregistered_poller -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:30:51.365 20:44:43 reap_unregistered_poller -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:30:51.365 20:44:43 reap_unregistered_poller -- common/autotest_common.sh@966 -- # 
echo 'killing process with pid 1521487' 00:30:51.365 killing process with pid 1521487 00:30:51.365 20:44:43 reap_unregistered_poller -- common/autotest_common.sh@967 -- # kill 1521487 00:30:51.365 20:44:43 reap_unregistered_poller -- common/autotest_common.sh@972 -- # wait 1521487 00:30:51.624 20:44:43 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@48 -- # cleanup 00:30:51.624 20:44:43 reap_unregistered_poller -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:30:51.624 00:30:51.624 real 0m2.246s 00:30:51.624 user 0m1.829s 00:30:51.624 sys 0m0.678s 00:30:51.624 20:44:43 reap_unregistered_poller -- common/autotest_common.sh@1124 -- # xtrace_disable 00:30:51.624 20:44:43 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:30:51.624 ************************************ 00:30:51.624 END TEST reap_unregistered_poller 00:30:51.624 ************************************ 00:30:51.624 20:44:43 -- common/autotest_common.sh@1142 -- # return 0 00:30:51.624 20:44:43 -- spdk/autotest.sh@198 -- # uname -s 00:30:51.624 20:44:43 -- spdk/autotest.sh@198 -- # [[ Linux == Linux ]] 00:30:51.624 20:44:43 -- spdk/autotest.sh@199 -- # [[ 1 -eq 1 ]] 00:30:51.624 20:44:43 -- spdk/autotest.sh@205 -- # [[ 1 -eq 0 ]] 00:30:51.625 20:44:43 -- spdk/autotest.sh@211 -- # '[' 0 -eq 1 ']' 00:30:51.625 20:44:43 -- spdk/autotest.sh@256 -- # '[' 0 -eq 1 ']' 00:30:51.625 20:44:43 -- spdk/autotest.sh@260 -- # timing_exit lib 00:30:51.625 20:44:43 -- common/autotest_common.sh@728 -- # xtrace_disable 00:30:51.625 20:44:43 -- common/autotest_common.sh@10 -- # set +x 00:30:51.883 20:44:44 -- spdk/autotest.sh@262 -- # '[' 0 -eq 1 ']' 00:30:51.883 20:44:44 -- spdk/autotest.sh@270 -- # '[' 0 -eq 1 ']' 00:30:51.883 20:44:44 -- spdk/autotest.sh@279 -- # '[' 0 -eq 1 ']' 00:30:51.883 20:44:44 -- spdk/autotest.sh@308 -- # '[' 0 -eq 1 ']' 00:30:51.883 20:44:44 -- spdk/autotest.sh@312 -- # '[' 0 -eq 1 ']' 00:30:51.883 
20:44:44 -- spdk/autotest.sh@316 -- # '[' 0 -eq 1 ']' 00:30:51.883 20:44:44 -- spdk/autotest.sh@321 -- # '[' 0 -eq 1 ']' 00:30:51.883 20:44:44 -- spdk/autotest.sh@330 -- # '[' 0 -eq 1 ']' 00:30:51.883 20:44:44 -- spdk/autotest.sh@335 -- # '[' 0 -eq 1 ']' 00:30:51.883 20:44:44 -- spdk/autotest.sh@339 -- # '[' 0 -eq 1 ']' 00:30:51.883 20:44:44 -- spdk/autotest.sh@343 -- # '[' 0 -eq 1 ']' 00:30:51.883 20:44:44 -- spdk/autotest.sh@347 -- # '[' 1 -eq 1 ']' 00:30:51.883 20:44:44 -- spdk/autotest.sh@348 -- # run_test compress_compdev /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh compdev 00:30:51.883 20:44:44 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:30:51.883 20:44:44 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:30:51.883 20:44:44 -- common/autotest_common.sh@10 -- # set +x 00:30:51.883 ************************************ 00:30:51.883 START TEST compress_compdev 00:30:51.883 ************************************ 00:30:51.883 20:44:44 compress_compdev -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh compdev 00:30:51.883 * Looking for test storage... 
00:30:51.883 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress 00:30:51.883 20:44:44 compress_compdev -- compress/compress.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:30:51.883 20:44:44 compress_compdev -- nvmf/common.sh@7 -- # uname -s 00:30:51.883 20:44:44 compress_compdev -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:30:51.884 20:44:44 compress_compdev -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:30:51.884 20:44:44 compress_compdev -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:30:51.884 20:44:44 compress_compdev -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:30:51.884 20:44:44 compress_compdev -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:30:51.884 20:44:44 compress_compdev -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:30:51.884 20:44:44 compress_compdev -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:30:51.884 20:44:44 compress_compdev -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:30:51.884 20:44:44 compress_compdev -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:30:51.884 20:44:44 compress_compdev -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:30:51.884 20:44:44 compress_compdev -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562 00:30:51.884 20:44:44 compress_compdev -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562 00:30:51.884 20:44:44 compress_compdev -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:30:51.884 20:44:44 compress_compdev -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:30:51.884 20:44:44 compress_compdev -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:30:51.884 20:44:44 compress_compdev -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:30:51.884 20:44:44 compress_compdev -- nvmf/common.sh@45 -- # source 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:30:51.884 20:44:44 compress_compdev -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:30:51.884 20:44:44 compress_compdev -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:30:51.884 20:44:44 compress_compdev -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:30:51.884 20:44:44 compress_compdev -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:51.884 20:44:44 compress_compdev -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:51.884 20:44:44 compress_compdev -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:51.884 20:44:44 compress_compdev -- 
paths/export.sh@5 -- # export PATH 00:30:51.884 20:44:44 compress_compdev -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:51.884 20:44:44 compress_compdev -- nvmf/common.sh@47 -- # : 0 00:30:51.884 20:44:44 compress_compdev -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:30:51.884 20:44:44 compress_compdev -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:30:51.884 20:44:44 compress_compdev -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:30:51.884 20:44:44 compress_compdev -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:30:51.884 20:44:44 compress_compdev -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:30:51.884 20:44:44 compress_compdev -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:30:51.884 20:44:44 compress_compdev -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:30:51.884 20:44:44 compress_compdev -- nvmf/common.sh@51 -- # have_pci_nics=0 00:30:51.884 20:44:44 compress_compdev -- compress/compress.sh@17 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:30:51.884 20:44:44 compress_compdev -- compress/compress.sh@81 -- # mkdir -p /tmp/pmem 00:30:51.884 20:44:44 compress_compdev -- compress/compress.sh@82 -- # test_type=compdev 00:30:51.884 20:44:44 compress_compdev -- compress/compress.sh@86 -- # run_bdevperf 32 4096 3 00:30:51.884 20:44:44 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:30:51.884 20:44:44 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=1521928 00:30:51.884 20:44:44 compress_compdev -- 
compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:30:51.884 20:44:44 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:30:51.884 20:44:44 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 1521928 00:30:51.884 20:44:44 compress_compdev -- common/autotest_common.sh@829 -- # '[' -z 1521928 ']' 00:30:51.884 20:44:44 compress_compdev -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:51.884 20:44:44 compress_compdev -- common/autotest_common.sh@834 -- # local max_retries=100 00:30:51.884 20:44:44 compress_compdev -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:30:51.884 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:30:51.884 20:44:44 compress_compdev -- common/autotest_common.sh@838 -- # xtrace_disable 00:30:51.884 20:44:44 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:30:51.884 [2024-07-15 20:44:44.255669] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
00:30:51.884 [2024-07-15 20:44:44.255747] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1521928 ] 00:30:52.143 [2024-07-15 20:44:44.396063] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:30:52.143 [2024-07-15 20:44:44.513783] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:30:52.143 [2024-07-15 20:44:44.513789] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:30:53.522 [2024-07-15 20:44:45.468992] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:30:53.522 20:44:45 compress_compdev -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:30:53.522 20:44:45 compress_compdev -- common/autotest_common.sh@862 -- # return 0 00:30:53.522 20:44:45 compress_compdev -- compress/compress.sh@74 -- # create_vols 00:30:53.522 20:44:45 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:30:53.522 20:44:45 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:30:53.781 [2024-07-15 20:44:46.144455] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x13403c0 PMD being used: compress_qat 00:30:54.039 20:44:46 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:30:54.039 20:44:46 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:30:54.039 20:44:46 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:54.039 20:44:46 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:30:54.039 20:44:46 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:54.039 20:44:46 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 
00:30:54.039 20:44:46 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:54.298 20:44:46 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:30:54.298 [ 00:30:54.298 { 00:30:54.298 "name": "Nvme0n1", 00:30:54.298 "aliases": [ 00:30:54.298 "01000000-0000-0000-5cd2-e43197705251" 00:30:54.298 ], 00:30:54.298 "product_name": "NVMe disk", 00:30:54.298 "block_size": 512, 00:30:54.298 "num_blocks": 15002931888, 00:30:54.298 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:30:54.298 "assigned_rate_limits": { 00:30:54.298 "rw_ios_per_sec": 0, 00:30:54.298 "rw_mbytes_per_sec": 0, 00:30:54.298 "r_mbytes_per_sec": 0, 00:30:54.298 "w_mbytes_per_sec": 0 00:30:54.298 }, 00:30:54.298 "claimed": false, 00:30:54.298 "zoned": false, 00:30:54.298 "supported_io_types": { 00:30:54.298 "read": true, 00:30:54.298 "write": true, 00:30:54.298 "unmap": true, 00:30:54.298 "flush": true, 00:30:54.298 "reset": true, 00:30:54.298 "nvme_admin": true, 00:30:54.298 "nvme_io": true, 00:30:54.298 "nvme_io_md": false, 00:30:54.298 "write_zeroes": true, 00:30:54.298 "zcopy": false, 00:30:54.298 "get_zone_info": false, 00:30:54.298 "zone_management": false, 00:30:54.298 "zone_append": false, 00:30:54.298 "compare": false, 00:30:54.298 "compare_and_write": false, 00:30:54.298 "abort": true, 00:30:54.298 "seek_hole": false, 00:30:54.298 "seek_data": false, 00:30:54.298 "copy": false, 00:30:54.298 "nvme_iov_md": false 00:30:54.298 }, 00:30:54.298 "driver_specific": { 00:30:54.298 "nvme": [ 00:30:54.298 { 00:30:54.298 "pci_address": "0000:5e:00.0", 00:30:54.298 "trid": { 00:30:54.298 "trtype": "PCIe", 00:30:54.298 "traddr": "0000:5e:00.0" 00:30:54.298 }, 00:30:54.298 "ctrlr_data": { 00:30:54.298 "cntlid": 0, 00:30:54.298 "vendor_id": "0x8086", 00:30:54.298 "model_number": "INTEL SSDPF2KX076TZO", 00:30:54.298 
"serial_number": "PHAC0301002G7P6CGN", 00:30:54.298 "firmware_revision": "JCV10200", 00:30:54.298 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:30:54.298 "oacs": { 00:30:54.298 "security": 1, 00:30:54.298 "format": 1, 00:30:54.298 "firmware": 1, 00:30:54.298 "ns_manage": 1 00:30:54.298 }, 00:30:54.298 "multi_ctrlr": false, 00:30:54.298 "ana_reporting": false 00:30:54.298 }, 00:30:54.298 "vs": { 00:30:54.298 "nvme_version": "1.3" 00:30:54.298 }, 00:30:54.298 "ns_data": { 00:30:54.298 "id": 1, 00:30:54.298 "can_share": false 00:30:54.298 }, 00:30:54.298 "security": { 00:30:54.298 "opal": true 00:30:54.298 } 00:30:54.298 } 00:30:54.298 ], 00:30:54.298 "mp_policy": "active_passive" 00:30:54.298 } 00:30:54.298 } 00:30:54.298 ] 00:30:54.298 20:44:46 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:30:54.298 20:44:46 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:30:54.557 [2024-07-15 20:44:46.894872] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x11a50d0 PMD being used: compress_qat 00:30:57.086 1e46e3e0-9aa2-4866-b5f1-d796494f2365 00:30:57.086 20:44:49 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:30:57.086 3c0d824c-7544-48cf-a559-34faeaaca763 00:30:57.086 20:44:49 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:30:57.086 20:44:49 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:30:57.086 20:44:49 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:57.086 20:44:49 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:30:57.086 20:44:49 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:57.086 20:44:49 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 
00:30:57.086 20:44:49 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:57.345 20:44:49 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:30:57.603 [ 00:30:57.603 { 00:30:57.603 "name": "3c0d824c-7544-48cf-a559-34faeaaca763", 00:30:57.603 "aliases": [ 00:30:57.603 "lvs0/lv0" 00:30:57.603 ], 00:30:57.603 "product_name": "Logical Volume", 00:30:57.603 "block_size": 512, 00:30:57.603 "num_blocks": 204800, 00:30:57.603 "uuid": "3c0d824c-7544-48cf-a559-34faeaaca763", 00:30:57.603 "assigned_rate_limits": { 00:30:57.603 "rw_ios_per_sec": 0, 00:30:57.603 "rw_mbytes_per_sec": 0, 00:30:57.603 "r_mbytes_per_sec": 0, 00:30:57.603 "w_mbytes_per_sec": 0 00:30:57.603 }, 00:30:57.603 "claimed": false, 00:30:57.603 "zoned": false, 00:30:57.603 "supported_io_types": { 00:30:57.603 "read": true, 00:30:57.603 "write": true, 00:30:57.603 "unmap": true, 00:30:57.603 "flush": false, 00:30:57.603 "reset": true, 00:30:57.603 "nvme_admin": false, 00:30:57.603 "nvme_io": false, 00:30:57.603 "nvme_io_md": false, 00:30:57.603 "write_zeroes": true, 00:30:57.603 "zcopy": false, 00:30:57.603 "get_zone_info": false, 00:30:57.603 "zone_management": false, 00:30:57.603 "zone_append": false, 00:30:57.603 "compare": false, 00:30:57.603 "compare_and_write": false, 00:30:57.603 "abort": false, 00:30:57.603 "seek_hole": true, 00:30:57.603 "seek_data": true, 00:30:57.603 "copy": false, 00:30:57.603 "nvme_iov_md": false 00:30:57.603 }, 00:30:57.603 "driver_specific": { 00:30:57.603 "lvol": { 00:30:57.603 "lvol_store_uuid": "1e46e3e0-9aa2-4866-b5f1-d796494f2365", 00:30:57.603 "base_bdev": "Nvme0n1", 00:30:57.603 "thin_provision": true, 00:30:57.603 "num_allocated_clusters": 0, 00:30:57.603 "snapshot": false, 00:30:57.603 "clone": false, 00:30:57.603 "esnap_clone": false 00:30:57.603 } 00:30:57.603 } 
00:30:57.603 } 00:30:57.603 ] 00:30:57.603 20:44:49 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:30:57.603 20:44:49 compress_compdev -- compress/compress.sh@41 -- # '[' -z '' ']' 00:30:57.603 20:44:49 compress_compdev -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:30:57.861 [2024-07-15 20:44:50.115343] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:30:57.861 COMP_lvs0/lv0 00:30:57.861 20:44:50 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:30:57.861 20:44:50 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:30:57.861 20:44:50 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:57.861 20:44:50 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:30:57.861 20:44:50 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:57.861 20:44:50 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:57.861 20:44:50 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:58.119 20:44:50 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:30:58.378 [ 00:30:58.378 { 00:30:58.378 "name": "COMP_lvs0/lv0", 00:30:58.378 "aliases": [ 00:30:58.378 "74032eef-cb92-57b3-9d14-35adbc53c25d" 00:30:58.378 ], 00:30:58.378 "product_name": "compress", 00:30:58.378 "block_size": 512, 00:30:58.378 "num_blocks": 200704, 00:30:58.378 "uuid": "74032eef-cb92-57b3-9d14-35adbc53c25d", 00:30:58.378 "assigned_rate_limits": { 00:30:58.378 "rw_ios_per_sec": 0, 00:30:58.378 "rw_mbytes_per_sec": 0, 00:30:58.378 "r_mbytes_per_sec": 0, 00:30:58.378 "w_mbytes_per_sec": 0 00:30:58.378 
}, 00:30:58.378 "claimed": false, 00:30:58.378 "zoned": false, 00:30:58.378 "supported_io_types": { 00:30:58.378 "read": true, 00:30:58.378 "write": true, 00:30:58.378 "unmap": false, 00:30:58.378 "flush": false, 00:30:58.378 "reset": false, 00:30:58.378 "nvme_admin": false, 00:30:58.378 "nvme_io": false, 00:30:58.378 "nvme_io_md": false, 00:30:58.378 "write_zeroes": true, 00:30:58.378 "zcopy": false, 00:30:58.378 "get_zone_info": false, 00:30:58.378 "zone_management": false, 00:30:58.378 "zone_append": false, 00:30:58.378 "compare": false, 00:30:58.378 "compare_and_write": false, 00:30:58.378 "abort": false, 00:30:58.378 "seek_hole": false, 00:30:58.378 "seek_data": false, 00:30:58.378 "copy": false, 00:30:58.378 "nvme_iov_md": false 00:30:58.378 }, 00:30:58.378 "driver_specific": { 00:30:58.378 "compress": { 00:30:58.378 "name": "COMP_lvs0/lv0", 00:30:58.378 "base_bdev_name": "3c0d824c-7544-48cf-a559-34faeaaca763" 00:30:58.378 } 00:30:58.378 } 00:30:58.378 } 00:30:58.378 ] 00:30:58.378 20:44:50 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:30:58.378 20:44:50 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:30:58.636 [2024-07-15 20:44:50.785834] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7fd0181b15c0 PMD being used: compress_qat 00:30:58.636 [2024-07-15 20:44:50.789027] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x133c770 PMD being used: compress_qat 00:30:58.636 Running I/O for 3 seconds... 
00:31:01.921 00:31:01.921 Latency(us) 00:31:01.921 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:01.921 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:31:01.921 Verification LBA range: start 0x0 length 0x3100 00:31:01.921 COMP_lvs0/lv0 : 3.01 1681.76 6.57 0.00 0.00 18937.83 1909.09 16868.40 00:31:01.921 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:31:01.921 Verification LBA range: start 0x3100 length 0x3100 00:31:01.921 COMP_lvs0/lv0 : 3.01 1780.10 6.95 0.00 0.00 17861.96 1481.68 14816.83 00:31:01.921 =================================================================================================================== 00:31:01.921 Total : 3461.86 13.52 0.00 0.00 18384.61 1481.68 16868.40 00:31:01.921 0 00:31:01.921 20:44:53 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:31:01.921 20:44:53 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:31:02.179 20:44:54 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:31:02.437 20:44:54 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:31:02.437 20:44:54 compress_compdev -- compress/compress.sh@78 -- # killprocess 1521928 00:31:02.437 20:44:54 compress_compdev -- common/autotest_common.sh@948 -- # '[' -z 1521928 ']' 00:31:02.437 20:44:54 compress_compdev -- common/autotest_common.sh@952 -- # kill -0 1521928 00:31:02.437 20:44:54 compress_compdev -- common/autotest_common.sh@953 -- # uname 00:31:02.437 20:44:54 compress_compdev -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:31:02.437 20:44:54 compress_compdev -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1521928 00:31:02.437 20:44:54 compress_compdev -- common/autotest_common.sh@954 -- # 
process_name=reactor_1 00:31:02.437 20:44:54 compress_compdev -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:31:02.437 20:44:54 compress_compdev -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1521928' 00:31:02.437 killing process with pid 1521928 00:31:02.437 20:44:54 compress_compdev -- common/autotest_common.sh@967 -- # kill 1521928 00:31:02.437 Received shutdown signal, test time was about 3.000000 seconds 00:31:02.437 00:31:02.437 Latency(us) 00:31:02.437 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:02.437 =================================================================================================================== 00:31:02.437 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:31:02.437 20:44:54 compress_compdev -- common/autotest_common.sh@972 -- # wait 1521928 00:31:05.761 20:44:57 compress_compdev -- compress/compress.sh@87 -- # run_bdevperf 32 4096 3 512 00:31:05.761 20:44:57 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:31:05.761 20:44:57 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=1523696 00:31:05.761 20:44:57 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:31:05.761 20:44:57 compress_compdev -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:31:05.761 20:44:57 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 1523696 00:31:05.761 20:44:57 compress_compdev -- common/autotest_common.sh@829 -- # '[' -z 1523696 ']' 00:31:05.761 20:44:57 compress_compdev -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:05.761 20:44:57 compress_compdev -- common/autotest_common.sh@834 -- # local max_retries=100 00:31:05.761 20:44:57 compress_compdev -- 
common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:31:05.761 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:31:05.761 20:44:57 compress_compdev -- common/autotest_common.sh@838 -- # xtrace_disable 00:31:05.761 20:44:57 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:31:05.761 [2024-07-15 20:44:57.829788] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 00:31:05.761 [2024-07-15 20:44:57.829940] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1523696 ] 00:31:05.761 [2024-07-15 20:44:58.032023] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:31:06.020 [2024-07-15 20:44:58.177505] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:31:06.020 [2024-07-15 20:44:58.177514] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:31:06.955 [2024-07-15 20:44:59.168528] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:31:06.955 20:44:59 compress_compdev -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:31:06.955 20:44:59 compress_compdev -- common/autotest_common.sh@862 -- # return 0 00:31:06.955 20:44:59 compress_compdev -- compress/compress.sh@74 -- # create_vols 512 00:31:06.955 20:44:59 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:31:06.955 20:44:59 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:31:07.891 [2024-07-15 20:45:00.104279] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1b0a3c0 PMD being used: compress_qat 00:31:07.891 20:45:00 
compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:31:07.891 20:45:00 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:31:07.891 20:45:00 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:07.891 20:45:00 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:31:07.891 20:45:00 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:07.891 20:45:00 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:07.891 20:45:00 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:08.150 20:45:00 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:31:08.408 [ 00:31:08.408 { 00:31:08.408 "name": "Nvme0n1", 00:31:08.408 "aliases": [ 00:31:08.408 "01000000-0000-0000-5cd2-e43197705251" 00:31:08.408 ], 00:31:08.408 "product_name": "NVMe disk", 00:31:08.408 "block_size": 512, 00:31:08.408 "num_blocks": 15002931888, 00:31:08.408 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:31:08.408 "assigned_rate_limits": { 00:31:08.408 "rw_ios_per_sec": 0, 00:31:08.408 "rw_mbytes_per_sec": 0, 00:31:08.408 "r_mbytes_per_sec": 0, 00:31:08.408 "w_mbytes_per_sec": 0 00:31:08.408 }, 00:31:08.408 "claimed": false, 00:31:08.408 "zoned": false, 00:31:08.408 "supported_io_types": { 00:31:08.408 "read": true, 00:31:08.408 "write": true, 00:31:08.408 "unmap": true, 00:31:08.408 "flush": true, 00:31:08.408 "reset": true, 00:31:08.408 "nvme_admin": true, 00:31:08.408 "nvme_io": true, 00:31:08.408 "nvme_io_md": false, 00:31:08.408 "write_zeroes": true, 00:31:08.408 "zcopy": false, 00:31:08.408 "get_zone_info": false, 00:31:08.408 "zone_management": false, 00:31:08.408 "zone_append": false, 00:31:08.408 "compare": false, 00:31:08.408 "compare_and_write": false, 00:31:08.408 
"abort": true, 00:31:08.408 "seek_hole": false, 00:31:08.408 "seek_data": false, 00:31:08.408 "copy": false, 00:31:08.408 "nvme_iov_md": false 00:31:08.408 }, 00:31:08.408 "driver_specific": { 00:31:08.408 "nvme": [ 00:31:08.408 { 00:31:08.408 "pci_address": "0000:5e:00.0", 00:31:08.408 "trid": { 00:31:08.408 "trtype": "PCIe", 00:31:08.408 "traddr": "0000:5e:00.0" 00:31:08.408 }, 00:31:08.408 "ctrlr_data": { 00:31:08.408 "cntlid": 0, 00:31:08.408 "vendor_id": "0x8086", 00:31:08.408 "model_number": "INTEL SSDPF2KX076TZO", 00:31:08.408 "serial_number": "PHAC0301002G7P6CGN", 00:31:08.408 "firmware_revision": "JCV10200", 00:31:08.408 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:31:08.408 "oacs": { 00:31:08.408 "security": 1, 00:31:08.408 "format": 1, 00:31:08.408 "firmware": 1, 00:31:08.408 "ns_manage": 1 00:31:08.408 }, 00:31:08.408 "multi_ctrlr": false, 00:31:08.408 "ana_reporting": false 00:31:08.408 }, 00:31:08.408 "vs": { 00:31:08.408 "nvme_version": "1.3" 00:31:08.408 }, 00:31:08.408 "ns_data": { 00:31:08.408 "id": 1, 00:31:08.408 "can_share": false 00:31:08.408 }, 00:31:08.408 "security": { 00:31:08.408 "opal": true 00:31:08.408 } 00:31:08.408 } 00:31:08.408 ], 00:31:08.408 "mp_policy": "active_passive" 00:31:08.408 } 00:31:08.408 } 00:31:08.408 ] 00:31:08.408 20:45:00 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:31:08.408 20:45:00 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:31:08.666 [2024-07-15 20:45:00.882808] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x196f0d0 PMD being used: compress_qat 00:31:11.198 15a08cef-53cc-4e3d-be4e-ab499facd28c 00:31:11.198 20:45:03 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:31:11.456 59cc4456-4aad-4227-9572-af1121486dc2 00:31:11.456 20:45:03 
compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:31:11.456 20:45:03 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:31:11.456 20:45:03 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:11.456 20:45:03 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:31:11.456 20:45:03 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:11.456 20:45:03 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:11.456 20:45:03 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:11.714 20:45:03 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:31:11.972 [ 00:31:11.972 { 00:31:11.972 "name": "59cc4456-4aad-4227-9572-af1121486dc2", 00:31:11.972 "aliases": [ 00:31:11.972 "lvs0/lv0" 00:31:11.972 ], 00:31:11.972 "product_name": "Logical Volume", 00:31:11.972 "block_size": 512, 00:31:11.972 "num_blocks": 204800, 00:31:11.972 "uuid": "59cc4456-4aad-4227-9572-af1121486dc2", 00:31:11.972 "assigned_rate_limits": { 00:31:11.972 "rw_ios_per_sec": 0, 00:31:11.972 "rw_mbytes_per_sec": 0, 00:31:11.972 "r_mbytes_per_sec": 0, 00:31:11.972 "w_mbytes_per_sec": 0 00:31:11.972 }, 00:31:11.972 "claimed": false, 00:31:11.972 "zoned": false, 00:31:11.972 "supported_io_types": { 00:31:11.972 "read": true, 00:31:11.972 "write": true, 00:31:11.972 "unmap": true, 00:31:11.972 "flush": false, 00:31:11.972 "reset": true, 00:31:11.972 "nvme_admin": false, 00:31:11.972 "nvme_io": false, 00:31:11.972 "nvme_io_md": false, 00:31:11.972 "write_zeroes": true, 00:31:11.972 "zcopy": false, 00:31:11.972 "get_zone_info": false, 00:31:11.972 "zone_management": false, 00:31:11.972 "zone_append": false, 00:31:11.972 "compare": false, 00:31:11.972 "compare_and_write": false, 00:31:11.972 
"abort": false, 00:31:11.972 "seek_hole": true, 00:31:11.972 "seek_data": true, 00:31:11.972 "copy": false, 00:31:11.973 "nvme_iov_md": false 00:31:11.973 }, 00:31:11.973 "driver_specific": { 00:31:11.973 "lvol": { 00:31:11.973 "lvol_store_uuid": "15a08cef-53cc-4e3d-be4e-ab499facd28c", 00:31:11.973 "base_bdev": "Nvme0n1", 00:31:11.973 "thin_provision": true, 00:31:11.973 "num_allocated_clusters": 0, 00:31:11.973 "snapshot": false, 00:31:11.973 "clone": false, 00:31:11.973 "esnap_clone": false 00:31:11.973 } 00:31:11.973 } 00:31:11.973 } 00:31:11.973 ] 00:31:11.973 20:45:04 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:31:11.973 20:45:04 compress_compdev -- compress/compress.sh@41 -- # '[' -z 512 ']' 00:31:11.973 20:45:04 compress_compdev -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 512 00:31:11.973 [2024-07-15 20:45:04.324300] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:31:11.973 COMP_lvs0/lv0 00:31:12.231 20:45:04 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:31:12.231 20:45:04 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:31:12.231 20:45:04 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:12.231 20:45:04 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:31:12.231 20:45:04 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:12.231 20:45:04 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:12.231 20:45:04 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:12.231 20:45:04 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b 
COMP_lvs0/lv0 -t 2000 00:31:12.798 [ 00:31:12.799 { 00:31:12.799 "name": "COMP_lvs0/lv0", 00:31:12.799 "aliases": [ 00:31:12.799 "34a5b845-e354-595c-b45e-9b596346a59e" 00:31:12.799 ], 00:31:12.799 "product_name": "compress", 00:31:12.799 "block_size": 512, 00:31:12.799 "num_blocks": 200704, 00:31:12.799 "uuid": "34a5b845-e354-595c-b45e-9b596346a59e", 00:31:12.799 "assigned_rate_limits": { 00:31:12.799 "rw_ios_per_sec": 0, 00:31:12.799 "rw_mbytes_per_sec": 0, 00:31:12.799 "r_mbytes_per_sec": 0, 00:31:12.799 "w_mbytes_per_sec": 0 00:31:12.799 }, 00:31:12.799 "claimed": false, 00:31:12.799 "zoned": false, 00:31:12.799 "supported_io_types": { 00:31:12.799 "read": true, 00:31:12.799 "write": true, 00:31:12.799 "unmap": false, 00:31:12.799 "flush": false, 00:31:12.799 "reset": false, 00:31:12.799 "nvme_admin": false, 00:31:12.799 "nvme_io": false, 00:31:12.799 "nvme_io_md": false, 00:31:12.799 "write_zeroes": true, 00:31:12.799 "zcopy": false, 00:31:12.799 "get_zone_info": false, 00:31:12.799 "zone_management": false, 00:31:12.799 "zone_append": false, 00:31:12.799 "compare": false, 00:31:12.799 "compare_and_write": false, 00:31:12.799 "abort": false, 00:31:12.799 "seek_hole": false, 00:31:12.799 "seek_data": false, 00:31:12.799 "copy": false, 00:31:12.799 "nvme_iov_md": false 00:31:12.799 }, 00:31:12.799 "driver_specific": { 00:31:12.799 "compress": { 00:31:12.799 "name": "COMP_lvs0/lv0", 00:31:12.799 "base_bdev_name": "59cc4456-4aad-4227-9572-af1121486dc2" 00:31:12.799 } 00:31:12.799 } 00:31:12.799 } 00:31:12.799 ] 00:31:12.799 20:45:05 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:31:12.799 20:45:05 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:31:13.058 [2024-07-15 20:45:05.344760] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f07481b15c0 PMD being used: compress_qat 00:31:13.058 [2024-07-15 20:45:05.348246] 
accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1b06900 PMD being used: compress_qat 00:31:13.058 Running I/O for 3 seconds... 00:31:16.337 00:31:16.337 Latency(us) 00:31:16.337 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:16.337 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:31:16.338 Verification LBA range: start 0x0 length 0x3100 00:31:16.338 COMP_lvs0/lv0 : 3.01 1674.39 6.54 0.00 0.00 19025.02 1560.04 16868.40 00:31:16.338 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:31:16.338 Verification LBA range: start 0x3100 length 0x3100 00:31:16.338 COMP_lvs0/lv0 : 3.01 1773.38 6.93 0.00 0.00 17929.37 1367.71 14930.81 00:31:16.338 =================================================================================================================== 00:31:16.338 Total : 3447.76 13.47 0.00 0.00 18461.46 1367.71 16868.40 00:31:16.338 0 00:31:16.338 20:45:08 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:31:16.338 20:45:08 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:31:16.338 20:45:08 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:31:16.904 20:45:09 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:31:16.904 20:45:09 compress_compdev -- compress/compress.sh@78 -- # killprocess 1523696 00:31:16.904 20:45:09 compress_compdev -- common/autotest_common.sh@948 -- # '[' -z 1523696 ']' 00:31:16.904 20:45:09 compress_compdev -- common/autotest_common.sh@952 -- # kill -0 1523696 00:31:16.904 20:45:09 compress_compdev -- common/autotest_common.sh@953 -- # uname 00:31:16.904 20:45:09 compress_compdev -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:31:16.904 20:45:09 compress_compdev -- 
common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1523696 00:31:16.904 20:45:09 compress_compdev -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:31:16.904 20:45:09 compress_compdev -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:31:16.904 20:45:09 compress_compdev -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1523696' 00:31:16.904 killing process with pid 1523696 00:31:16.904 20:45:09 compress_compdev -- common/autotest_common.sh@967 -- # kill 1523696 00:31:16.904 Received shutdown signal, test time was about 3.000000 seconds 00:31:16.904 00:31:16.904 Latency(us) 00:31:16.904 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:16.904 =================================================================================================================== 00:31:16.904 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:31:16.904 20:45:09 compress_compdev -- common/autotest_common.sh@972 -- # wait 1523696 00:31:20.188 20:45:12 compress_compdev -- compress/compress.sh@88 -- # run_bdevperf 32 4096 3 4096 00:31:20.188 20:45:12 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:31:20.188 20:45:12 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=1525977 00:31:20.188 20:45:12 compress_compdev -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:31:20.188 20:45:12 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:31:20.188 20:45:12 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 1525977 00:31:20.188 20:45:12 compress_compdev -- common/autotest_common.sh@829 -- # '[' -z 1525977 ']' 00:31:20.188 20:45:12 compress_compdev -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 
00:31:20.188 20:45:12 compress_compdev -- common/autotest_common.sh@834 -- # local max_retries=100 00:31:20.188 20:45:12 compress_compdev -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:31:20.188 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:31:20.188 20:45:12 compress_compdev -- common/autotest_common.sh@838 -- # xtrace_disable 00:31:20.188 20:45:12 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:31:20.188 [2024-07-15 20:45:12.413425] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 00:31:20.188 [2024-07-15 20:45:12.413566] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1525977 ] 00:31:20.447 [2024-07-15 20:45:12.616396] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:31:20.447 [2024-07-15 20:45:12.749758] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:31:20.447 [2024-07-15 20:45:12.749763] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:31:21.380 [2024-07-15 20:45:13.718853] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:31:21.638 20:45:13 compress_compdev -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:31:21.638 20:45:13 compress_compdev -- common/autotest_common.sh@862 -- # return 0 00:31:21.638 20:45:13 compress_compdev -- compress/compress.sh@74 -- # create_vols 4096 00:31:21.638 20:45:13 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:31:21.638 20:45:13 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:31:22.205 [2024-07-15 
20:45:14.408603] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x259e3c0 PMD being used: compress_qat 00:31:22.205 20:45:14 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:31:22.205 20:45:14 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:31:22.205 20:45:14 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:22.205 20:45:14 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:31:22.205 20:45:14 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:22.205 20:45:14 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:22.205 20:45:14 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:22.464 20:45:14 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:31:22.722 [ 00:31:22.722 { 00:31:22.722 "name": "Nvme0n1", 00:31:22.722 "aliases": [ 00:31:22.722 "01000000-0000-0000-5cd2-e43197705251" 00:31:22.722 ], 00:31:22.722 "product_name": "NVMe disk", 00:31:22.722 "block_size": 512, 00:31:22.722 "num_blocks": 15002931888, 00:31:22.722 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:31:22.722 "assigned_rate_limits": { 00:31:22.722 "rw_ios_per_sec": 0, 00:31:22.722 "rw_mbytes_per_sec": 0, 00:31:22.722 "r_mbytes_per_sec": 0, 00:31:22.722 "w_mbytes_per_sec": 0 00:31:22.722 }, 00:31:22.722 "claimed": false, 00:31:22.722 "zoned": false, 00:31:22.722 "supported_io_types": { 00:31:22.722 "read": true, 00:31:22.722 "write": true, 00:31:22.722 "unmap": true, 00:31:22.722 "flush": true, 00:31:22.722 "reset": true, 00:31:22.722 "nvme_admin": true, 00:31:22.722 "nvme_io": true, 00:31:22.722 "nvme_io_md": false, 00:31:22.722 "write_zeroes": true, 00:31:22.722 "zcopy": false, 00:31:22.722 "get_zone_info": false, 00:31:22.722 
"zone_management": false, 00:31:22.722 "zone_append": false, 00:31:22.722 "compare": false, 00:31:22.722 "compare_and_write": false, 00:31:22.722 "abort": true, 00:31:22.722 "seek_hole": false, 00:31:22.722 "seek_data": false, 00:31:22.722 "copy": false, 00:31:22.722 "nvme_iov_md": false 00:31:22.722 }, 00:31:22.722 "driver_specific": { 00:31:22.722 "nvme": [ 00:31:22.722 { 00:31:22.722 "pci_address": "0000:5e:00.0", 00:31:22.722 "trid": { 00:31:22.722 "trtype": "PCIe", 00:31:22.722 "traddr": "0000:5e:00.0" 00:31:22.722 }, 00:31:22.722 "ctrlr_data": { 00:31:22.722 "cntlid": 0, 00:31:22.722 "vendor_id": "0x8086", 00:31:22.722 "model_number": "INTEL SSDPF2KX076TZO", 00:31:22.722 "serial_number": "PHAC0301002G7P6CGN", 00:31:22.722 "firmware_revision": "JCV10200", 00:31:22.722 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:31:22.722 "oacs": { 00:31:22.722 "security": 1, 00:31:22.722 "format": 1, 00:31:22.722 "firmware": 1, 00:31:22.722 "ns_manage": 1 00:31:22.722 }, 00:31:22.722 "multi_ctrlr": false, 00:31:22.722 "ana_reporting": false 00:31:22.722 }, 00:31:22.722 "vs": { 00:31:22.722 "nvme_version": "1.3" 00:31:22.722 }, 00:31:22.722 "ns_data": { 00:31:22.722 "id": 1, 00:31:22.723 "can_share": false 00:31:22.723 }, 00:31:22.723 "security": { 00:31:22.723 "opal": true 00:31:22.723 } 00:31:22.723 } 00:31:22.723 ], 00:31:22.723 "mp_policy": "active_passive" 00:31:22.723 } 00:31:22.723 } 00:31:22.723 ] 00:31:22.723 20:45:14 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:31:22.723 20:45:14 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:31:22.980 [2024-07-15 20:45:15.131410] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x24030d0 PMD being used: compress_qat 00:31:24.956 a1186373-3992-49b4-8fbd-d4095e54fe59 00:31:25.215 20:45:17 compress_compdev -- compress/compress.sh@38 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:31:25.215 1ce007c8-ab00-4731-938f-a660a025f73f 00:31:25.215 20:45:17 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:31:25.215 20:45:17 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:31:25.215 20:45:17 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:25.215 20:45:17 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:31:25.215 20:45:17 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:25.215 20:45:17 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:25.215 20:45:17 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:25.473 20:45:17 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:31:25.730 [ 00:31:25.730 { 00:31:25.730 "name": "1ce007c8-ab00-4731-938f-a660a025f73f", 00:31:25.730 "aliases": [ 00:31:25.730 "lvs0/lv0" 00:31:25.730 ], 00:31:25.730 "product_name": "Logical Volume", 00:31:25.730 "block_size": 512, 00:31:25.730 "num_blocks": 204800, 00:31:25.730 "uuid": "1ce007c8-ab00-4731-938f-a660a025f73f", 00:31:25.730 "assigned_rate_limits": { 00:31:25.730 "rw_ios_per_sec": 0, 00:31:25.730 "rw_mbytes_per_sec": 0, 00:31:25.730 "r_mbytes_per_sec": 0, 00:31:25.730 "w_mbytes_per_sec": 0 00:31:25.730 }, 00:31:25.730 "claimed": false, 00:31:25.730 "zoned": false, 00:31:25.730 "supported_io_types": { 00:31:25.730 "read": true, 00:31:25.730 "write": true, 00:31:25.730 "unmap": true, 00:31:25.730 "flush": false, 00:31:25.730 "reset": true, 00:31:25.730 "nvme_admin": false, 00:31:25.730 "nvme_io": false, 00:31:25.730 "nvme_io_md": false, 00:31:25.730 "write_zeroes": true, 00:31:25.730 "zcopy": false, 00:31:25.730 
"get_zone_info": false, 00:31:25.730 "zone_management": false, 00:31:25.730 "zone_append": false, 00:31:25.730 "compare": false, 00:31:25.730 "compare_and_write": false, 00:31:25.730 "abort": false, 00:31:25.730 "seek_hole": true, 00:31:25.730 "seek_data": true, 00:31:25.730 "copy": false, 00:31:25.730 "nvme_iov_md": false 00:31:25.730 }, 00:31:25.730 "driver_specific": { 00:31:25.730 "lvol": { 00:31:25.730 "lvol_store_uuid": "a1186373-3992-49b4-8fbd-d4095e54fe59", 00:31:25.730 "base_bdev": "Nvme0n1", 00:31:25.730 "thin_provision": true, 00:31:25.730 "num_allocated_clusters": 0, 00:31:25.730 "snapshot": false, 00:31:25.730 "clone": false, 00:31:25.730 "esnap_clone": false 00:31:25.730 } 00:31:25.730 } 00:31:25.730 } 00:31:25.730 ] 00:31:25.730 20:45:18 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:31:25.730 20:45:18 compress_compdev -- compress/compress.sh@41 -- # '[' -z 4096 ']' 00:31:25.730 20:45:18 compress_compdev -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 4096 00:31:26.296 [2024-07-15 20:45:18.525318] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:31:26.296 COMP_lvs0/lv0 00:31:26.296 20:45:18 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:31:26.296 20:45:18 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:31:26.296 20:45:18 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:26.296 20:45:18 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:31:26.296 20:45:18 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:26.296 20:45:18 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:26.296 20:45:18 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
bdev_wait_for_examine 00:31:26.554 20:45:18 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:31:26.811 [ 00:31:26.811 { 00:31:26.811 "name": "COMP_lvs0/lv0", 00:31:26.811 "aliases": [ 00:31:26.811 "b5e35888-a7ac-5fb3-a566-1d9966fd95af" 00:31:26.811 ], 00:31:26.811 "product_name": "compress", 00:31:26.811 "block_size": 4096, 00:31:26.811 "num_blocks": 25088, 00:31:26.811 "uuid": "b5e35888-a7ac-5fb3-a566-1d9966fd95af", 00:31:26.811 "assigned_rate_limits": { 00:31:26.811 "rw_ios_per_sec": 0, 00:31:26.811 "rw_mbytes_per_sec": 0, 00:31:26.811 "r_mbytes_per_sec": 0, 00:31:26.811 "w_mbytes_per_sec": 0 00:31:26.811 }, 00:31:26.811 "claimed": false, 00:31:26.811 "zoned": false, 00:31:26.811 "supported_io_types": { 00:31:26.811 "read": true, 00:31:26.811 "write": true, 00:31:26.811 "unmap": false, 00:31:26.811 "flush": false, 00:31:26.811 "reset": false, 00:31:26.811 "nvme_admin": false, 00:31:26.811 "nvme_io": false, 00:31:26.811 "nvme_io_md": false, 00:31:26.811 "write_zeroes": true, 00:31:26.811 "zcopy": false, 00:31:26.811 "get_zone_info": false, 00:31:26.811 "zone_management": false, 00:31:26.811 "zone_append": false, 00:31:26.811 "compare": false, 00:31:26.811 "compare_and_write": false, 00:31:26.811 "abort": false, 00:31:26.811 "seek_hole": false, 00:31:26.811 "seek_data": false, 00:31:26.811 "copy": false, 00:31:26.811 "nvme_iov_md": false 00:31:26.811 }, 00:31:26.811 "driver_specific": { 00:31:26.811 "compress": { 00:31:26.811 "name": "COMP_lvs0/lv0", 00:31:26.811 "base_bdev_name": "1ce007c8-ab00-4731-938f-a660a025f73f" 00:31:26.811 } 00:31:26.811 } 00:31:26.811 } 00:31:26.811 ] 00:31:26.811 20:45:18 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:31:26.811 20:45:18 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:31:26.811 [2024-07-15 
20:45:19.105304] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x259b030 PMD being used: compress_qat 00:31:26.811 [2024-07-15 20:45:19.109387] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f69b419bc10 PMD being used: compress_qat 00:31:26.811 Running I/O for 3 seconds... 00:31:30.096 00:31:30.096 Latency(us) 00:31:30.096 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:30.096 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:31:30.096 Verification LBA range: start 0x0 length 0x3100 00:31:30.096 COMP_lvs0/lv0 : 3.01 2807.37 10.97 0.00 0.00 11312.71 968.79 10029.86 00:31:30.096 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:31:30.096 Verification LBA range: start 0x3100 length 0x3100 00:31:30.096 COMP_lvs0/lv0 : 3.01 2631.25 10.28 0.00 0.00 12027.88 1111.26 11340.58 00:31:30.096 =================================================================================================================== 00:31:30.096 Total : 5438.62 21.24 0.00 0.00 11658.80 968.79 11340.58 00:31:30.096 0 00:31:30.096 20:45:22 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:31:30.096 20:45:22 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:31:30.096 20:45:22 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:31:30.663 20:45:22 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:31:30.663 20:45:22 compress_compdev -- compress/compress.sh@78 -- # killprocess 1525977 00:31:30.663 20:45:22 compress_compdev -- common/autotest_common.sh@948 -- # '[' -z 1525977 ']' 00:31:30.663 20:45:22 compress_compdev -- common/autotest_common.sh@952 -- # kill -0 1525977 00:31:30.663 20:45:22 compress_compdev -- common/autotest_common.sh@953 -- # uname 
00:31:30.663 20:45:22 compress_compdev -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:31:30.663 20:45:22 compress_compdev -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1525977 00:31:30.663 20:45:22 compress_compdev -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:31:30.663 20:45:22 compress_compdev -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:31:30.663 20:45:22 compress_compdev -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1525977' 00:31:30.663 killing process with pid 1525977 00:31:30.663 20:45:22 compress_compdev -- common/autotest_common.sh@967 -- # kill 1525977 00:31:30.663 Received shutdown signal, test time was about 3.000000 seconds 00:31:30.663 00:31:30.663 Latency(us) 00:31:30.663 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:30.663 =================================================================================================================== 00:31:30.663 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:31:30.663 20:45:22 compress_compdev -- common/autotest_common.sh@972 -- # wait 1525977 00:31:33.947 20:45:25 compress_compdev -- compress/compress.sh@89 -- # run_bdevio 00:31:33.947 20:45:25 compress_compdev -- compress/compress.sh@50 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:31:33.947 20:45:26 compress_compdev -- compress/compress.sh@55 -- # bdevio_pid=1527753 00:31:33.947 20:45:26 compress_compdev -- compress/compress.sh@56 -- # trap 'killprocess $bdevio_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:31:33.947 20:45:26 compress_compdev -- compress/compress.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json -w 00:31:33.947 20:45:26 compress_compdev -- compress/compress.sh@57 -- # waitforlisten 1527753 00:31:33.947 20:45:26 compress_compdev -- common/autotest_common.sh@829 -- # '[' -z 1527753 ']' 00:31:33.947 20:45:26 
compress_compdev -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:33.947 20:45:26 compress_compdev -- common/autotest_common.sh@834 -- # local max_retries=100 00:31:33.947 20:45:26 compress_compdev -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:31:33.947 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:31:33.947 20:45:26 compress_compdev -- common/autotest_common.sh@838 -- # xtrace_disable 00:31:33.947 20:45:26 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:31:33.947 [2024-07-15 20:45:26.105662] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 00:31:33.947 [2024-07-15 20:45:26.105801] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1527753 ] 00:31:33.947 [2024-07-15 20:45:26.301680] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:31:34.205 [2024-07-15 20:45:26.406431] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:31:34.205 [2024-07-15 20:45:26.406532] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:31:34.205 [2024-07-15 20:45:26.406533] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:35.140 [2024-07-15 20:45:27.162520] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:31:35.140 20:45:27 compress_compdev -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:31:35.140 20:45:27 compress_compdev -- common/autotest_common.sh@862 -- # return 0 00:31:35.140 20:45:27 compress_compdev -- compress/compress.sh@58 -- # create_vols 00:31:35.140 20:45:27 compress_compdev -- compress/compress.sh@34 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:31:35.140 20:45:27 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:31:35.713 [2024-07-15 20:45:27.815993] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1615f20 PMD being used: compress_qat 00:31:35.713 20:45:27 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:31:35.713 20:45:27 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:31:35.713 20:45:27 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:35.713 20:45:27 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:31:35.713 20:45:27 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:35.713 20:45:27 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:35.713 20:45:27 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:35.975 20:45:28 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:31:35.975 [ 00:31:35.975 { 00:31:35.975 "name": "Nvme0n1", 00:31:35.975 "aliases": [ 00:31:35.975 "01000000-0000-0000-5cd2-e43197705251" 00:31:35.975 ], 00:31:35.975 "product_name": "NVMe disk", 00:31:35.975 "block_size": 512, 00:31:35.975 "num_blocks": 15002931888, 00:31:35.975 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:31:35.975 "assigned_rate_limits": { 00:31:35.975 "rw_ios_per_sec": 0, 00:31:35.975 "rw_mbytes_per_sec": 0, 00:31:35.975 "r_mbytes_per_sec": 0, 00:31:35.975 "w_mbytes_per_sec": 0 00:31:35.975 }, 00:31:35.975 "claimed": false, 00:31:35.975 "zoned": false, 00:31:35.975 "supported_io_types": { 00:31:35.975 "read": true, 00:31:35.975 "write": true, 00:31:35.975 "unmap": true, 00:31:35.975 "flush": true, 
00:31:35.975 "reset": true, 00:31:35.975 "nvme_admin": true, 00:31:35.975 "nvme_io": true, 00:31:35.975 "nvme_io_md": false, 00:31:35.975 "write_zeroes": true, 00:31:35.975 "zcopy": false, 00:31:35.975 "get_zone_info": false, 00:31:35.975 "zone_management": false, 00:31:35.975 "zone_append": false, 00:31:35.975 "compare": false, 00:31:35.975 "compare_and_write": false, 00:31:35.975 "abort": true, 00:31:35.975 "seek_hole": false, 00:31:35.975 "seek_data": false, 00:31:35.975 "copy": false, 00:31:35.975 "nvme_iov_md": false 00:31:35.975 }, 00:31:35.975 "driver_specific": { 00:31:35.975 "nvme": [ 00:31:35.975 { 00:31:35.975 "pci_address": "0000:5e:00.0", 00:31:35.975 "trid": { 00:31:35.975 "trtype": "PCIe", 00:31:35.975 "traddr": "0000:5e:00.0" 00:31:35.975 }, 00:31:35.975 "ctrlr_data": { 00:31:35.975 "cntlid": 0, 00:31:35.975 "vendor_id": "0x8086", 00:31:35.975 "model_number": "INTEL SSDPF2KX076TZO", 00:31:35.975 "serial_number": "PHAC0301002G7P6CGN", 00:31:35.975 "firmware_revision": "JCV10200", 00:31:35.975 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:31:35.975 "oacs": { 00:31:35.975 "security": 1, 00:31:35.975 "format": 1, 00:31:35.975 "firmware": 1, 00:31:35.975 "ns_manage": 1 00:31:35.975 }, 00:31:35.975 "multi_ctrlr": false, 00:31:35.975 "ana_reporting": false 00:31:35.975 }, 00:31:35.975 "vs": { 00:31:35.975 "nvme_version": "1.3" 00:31:35.975 }, 00:31:35.975 "ns_data": { 00:31:35.975 "id": 1, 00:31:35.975 "can_share": false 00:31:35.975 }, 00:31:35.975 "security": { 00:31:35.975 "opal": true 00:31:35.975 } 00:31:35.975 } 00:31:35.975 ], 00:31:35.975 "mp_policy": "active_passive" 00:31:35.975 } 00:31:35.975 } 00:31:35.975 ] 00:31:36.232 20:45:28 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:31:36.232 20:45:28 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:31:36.232 [2024-07-15 20:45:28.593656] 
accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x144ddf0 PMD being used: compress_qat 00:31:38.759 0df59223-029c-4b07-ba6c-430bb5456808 00:31:38.759 20:45:30 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:31:38.759 215a0c22-7aa0-4716-9d8c-ff607fd6257c 00:31:38.759 20:45:31 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:31:38.759 20:45:31 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:31:38.759 20:45:31 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:38.759 20:45:31 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:31:38.759 20:45:31 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:38.759 20:45:31 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:38.759 20:45:31 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:39.017 20:45:31 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:31:39.275 [ 00:31:39.275 { 00:31:39.275 "name": "215a0c22-7aa0-4716-9d8c-ff607fd6257c", 00:31:39.275 "aliases": [ 00:31:39.275 "lvs0/lv0" 00:31:39.275 ], 00:31:39.275 "product_name": "Logical Volume", 00:31:39.275 "block_size": 512, 00:31:39.275 "num_blocks": 204800, 00:31:39.275 "uuid": "215a0c22-7aa0-4716-9d8c-ff607fd6257c", 00:31:39.275 "assigned_rate_limits": { 00:31:39.275 "rw_ios_per_sec": 0, 00:31:39.275 "rw_mbytes_per_sec": 0, 00:31:39.275 "r_mbytes_per_sec": 0, 00:31:39.275 "w_mbytes_per_sec": 0 00:31:39.275 }, 00:31:39.275 "claimed": false, 00:31:39.275 "zoned": false, 00:31:39.275 "supported_io_types": { 00:31:39.275 "read": true, 00:31:39.275 "write": true, 00:31:39.275 "unmap": true, 00:31:39.275 "flush": 
false, 00:31:39.275 "reset": true, 00:31:39.275 "nvme_admin": false, 00:31:39.275 "nvme_io": false, 00:31:39.275 "nvme_io_md": false, 00:31:39.275 "write_zeroes": true, 00:31:39.275 "zcopy": false, 00:31:39.275 "get_zone_info": false, 00:31:39.275 "zone_management": false, 00:31:39.275 "zone_append": false, 00:31:39.275 "compare": false, 00:31:39.275 "compare_and_write": false, 00:31:39.275 "abort": false, 00:31:39.275 "seek_hole": true, 00:31:39.275 "seek_data": true, 00:31:39.275 "copy": false, 00:31:39.275 "nvme_iov_md": false 00:31:39.275 }, 00:31:39.275 "driver_specific": { 00:31:39.275 "lvol": { 00:31:39.275 "lvol_store_uuid": "0df59223-029c-4b07-ba6c-430bb5456808", 00:31:39.275 "base_bdev": "Nvme0n1", 00:31:39.275 "thin_provision": true, 00:31:39.275 "num_allocated_clusters": 0, 00:31:39.275 "snapshot": false, 00:31:39.275 "clone": false, 00:31:39.275 "esnap_clone": false 00:31:39.275 } 00:31:39.275 } 00:31:39.275 } 00:31:39.275 ] 00:31:39.275 20:45:31 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:31:39.275 20:45:31 compress_compdev -- compress/compress.sh@41 -- # '[' -z '' ']' 00:31:39.275 20:45:31 compress_compdev -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:31:39.533 [2024-07-15 20:45:31.822423] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:31:39.533 COMP_lvs0/lv0 00:31:39.533 20:45:31 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:31:39.533 20:45:31 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:31:39.533 20:45:31 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:39.533 20:45:31 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:31:39.533 20:45:31 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:39.533 20:45:31 compress_compdev -- 
common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:39.533 20:45:31 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:39.791 20:45:32 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:31:40.049 [ 00:31:40.049 { 00:31:40.049 "name": "COMP_lvs0/lv0", 00:31:40.049 "aliases": [ 00:31:40.049 "52cf6601-1770-51fc-a1bc-a7ad8b16c531" 00:31:40.049 ], 00:31:40.049 "product_name": "compress", 00:31:40.049 "block_size": 512, 00:31:40.049 "num_blocks": 200704, 00:31:40.049 "uuid": "52cf6601-1770-51fc-a1bc-a7ad8b16c531", 00:31:40.049 "assigned_rate_limits": { 00:31:40.049 "rw_ios_per_sec": 0, 00:31:40.049 "rw_mbytes_per_sec": 0, 00:31:40.049 "r_mbytes_per_sec": 0, 00:31:40.049 "w_mbytes_per_sec": 0 00:31:40.049 }, 00:31:40.049 "claimed": false, 00:31:40.049 "zoned": false, 00:31:40.049 "supported_io_types": { 00:31:40.049 "read": true, 00:31:40.049 "write": true, 00:31:40.049 "unmap": false, 00:31:40.049 "flush": false, 00:31:40.049 "reset": false, 00:31:40.049 "nvme_admin": false, 00:31:40.049 "nvme_io": false, 00:31:40.049 "nvme_io_md": false, 00:31:40.049 "write_zeroes": true, 00:31:40.049 "zcopy": false, 00:31:40.049 "get_zone_info": false, 00:31:40.049 "zone_management": false, 00:31:40.049 "zone_append": false, 00:31:40.049 "compare": false, 00:31:40.049 "compare_and_write": false, 00:31:40.049 "abort": false, 00:31:40.049 "seek_hole": false, 00:31:40.049 "seek_data": false, 00:31:40.049 "copy": false, 00:31:40.049 "nvme_iov_md": false 00:31:40.049 }, 00:31:40.049 "driver_specific": { 00:31:40.049 "compress": { 00:31:40.049 "name": "COMP_lvs0/lv0", 00:31:40.049 "base_bdev_name": "215a0c22-7aa0-4716-9d8c-ff607fd6257c" 00:31:40.049 } 00:31:40.049 } 00:31:40.049 } 00:31:40.049 ] 00:31:40.049 20:45:32 compress_compdev -- common/autotest_common.sh@905 -- # return 0 
00:31:40.049 20:45:32 compress_compdev -- compress/compress.sh@59 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:31:40.049 [2024-07-15 20:45:32.407142] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f2bb81b1350 PMD being used: compress_qat 00:31:40.049 I/O targets: 00:31:40.049 COMP_lvs0/lv0: 200704 blocks of 512 bytes (98 MiB) 00:31:40.049 00:31:40.049 00:31:40.049 CUnit - A unit testing framework for C - Version 2.1-3 00:31:40.049 http://cunit.sourceforge.net/ 00:31:40.049 00:31:40.049 00:31:40.049 Suite: bdevio tests on: COMP_lvs0/lv0 00:31:40.049 Test: blockdev write read block ...passed 00:31:40.049 Test: blockdev write zeroes read block ...passed 00:31:40.049 Test: blockdev write zeroes read no split ...passed 00:31:40.307 Test: blockdev write zeroes read split ...passed 00:31:40.307 Test: blockdev write zeroes read split partial ...passed 00:31:40.307 Test: blockdev reset ...[2024-07-15 20:45:32.510638] vbdev_compress.c: 252:vbdev_compress_submit_request: *ERROR*: Unknown I/O type 5 00:31:40.307 passed 00:31:40.307 Test: blockdev write read 8 blocks ...passed 00:31:40.307 Test: blockdev write read size > 128k ...passed 00:31:40.307 Test: blockdev write read invalid size ...passed 00:31:40.307 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:31:40.307 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:31:40.307 Test: blockdev write read max offset ...passed 00:31:40.307 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:31:40.307 Test: blockdev writev readv 8 blocks ...passed 00:31:40.307 Test: blockdev writev readv 30 x 1block ...passed 00:31:40.307 Test: blockdev writev readv block ...passed 00:31:40.307 Test: blockdev writev readv size > 128k ...passed 00:31:40.307 Test: blockdev writev readv size > 128k in two iovs ...passed 00:31:40.307 Test: blockdev comparev and writev ...passed 00:31:40.307 Test: blockdev nvme 
passthru rw ...passed 00:31:40.307 Test: blockdev nvme passthru vendor specific ...passed 00:31:40.307 Test: blockdev nvme admin passthru ...passed 00:31:40.307 Test: blockdev copy ...passed 00:31:40.307 00:31:40.307 Run Summary: Type Total Ran Passed Failed Inactive 00:31:40.307 suites 1 1 n/a 0 0 00:31:40.307 tests 23 23 23 0 0 00:31:40.307 asserts 130 130 130 0 n/a 00:31:40.307 00:31:40.307 Elapsed time = 0.235 seconds 00:31:40.307 0 00:31:40.307 20:45:32 compress_compdev -- compress/compress.sh@60 -- # destroy_vols 00:31:40.307 20:45:32 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:31:40.565 20:45:32 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:31:40.823 20:45:32 compress_compdev -- compress/compress.sh@61 -- # trap - SIGINT SIGTERM EXIT 00:31:40.823 20:45:32 compress_compdev -- compress/compress.sh@62 -- # killprocess 1527753 00:31:40.823 20:45:32 compress_compdev -- common/autotest_common.sh@948 -- # '[' -z 1527753 ']' 00:31:40.823 20:45:32 compress_compdev -- common/autotest_common.sh@952 -- # kill -0 1527753 00:31:40.823 20:45:32 compress_compdev -- common/autotest_common.sh@953 -- # uname 00:31:40.823 20:45:33 compress_compdev -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:31:40.823 20:45:33 compress_compdev -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1527753 00:31:40.823 20:45:33 compress_compdev -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:31:40.823 20:45:33 compress_compdev -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:31:40.823 20:45:33 compress_compdev -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1527753' 00:31:40.823 killing process with pid 1527753 00:31:40.823 20:45:33 compress_compdev -- common/autotest_common.sh@967 -- # kill 1527753 00:31:40.823 
20:45:33 compress_compdev -- common/autotest_common.sh@972 -- # wait 1527753 00:31:44.176 20:45:36 compress_compdev -- compress/compress.sh@91 -- # '[' 0 -eq 1 ']' 00:31:44.176 20:45:36 compress_compdev -- compress/compress.sh@120 -- # rm -rf /tmp/pmem 00:31:44.177 00:31:44.177 real 0m52.035s 00:31:44.177 user 1m59.493s 00:31:44.177 sys 0m6.879s 00:31:44.177 20:45:36 compress_compdev -- common/autotest_common.sh@1124 -- # xtrace_disable 00:31:44.177 20:45:36 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:31:44.177 ************************************ 00:31:44.177 END TEST compress_compdev 00:31:44.177 ************************************ 00:31:44.177 20:45:36 -- common/autotest_common.sh@1142 -- # return 0 00:31:44.177 20:45:36 -- spdk/autotest.sh@349 -- # run_test compress_isal /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh isal 00:31:44.177 20:45:36 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:31:44.177 20:45:36 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:31:44.177 20:45:36 -- common/autotest_common.sh@10 -- # set +x 00:31:44.177 ************************************ 00:31:44.177 START TEST compress_isal 00:31:44.177 ************************************ 00:31:44.177 20:45:36 compress_isal -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh isal 00:31:44.177 * Looking for test storage... 
00:31:44.177 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress 00:31:44.177 20:45:36 compress_isal -- compress/compress.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:31:44.177 20:45:36 compress_isal -- nvmf/common.sh@7 -- # uname -s 00:31:44.177 20:45:36 compress_isal -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:31:44.177 20:45:36 compress_isal -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:31:44.177 20:45:36 compress_isal -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:31:44.177 20:45:36 compress_isal -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:31:44.177 20:45:36 compress_isal -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:31:44.177 20:45:36 compress_isal -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:31:44.177 20:45:36 compress_isal -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:31:44.177 20:45:36 compress_isal -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:31:44.177 20:45:36 compress_isal -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:31:44.177 20:45:36 compress_isal -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:31:44.177 20:45:36 compress_isal -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562 00:31:44.177 20:45:36 compress_isal -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562 00:31:44.177 20:45:36 compress_isal -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:31:44.177 20:45:36 compress_isal -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:31:44.177 20:45:36 compress_isal -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:31:44.177 20:45:36 compress_isal -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:31:44.177 20:45:36 compress_isal -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:31:44.177 20:45:36 compress_isal -- 
scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:31:44.177 20:45:36 compress_isal -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:31:44.177 20:45:36 compress_isal -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:31:44.177 20:45:36 compress_isal -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:44.177 20:45:36 compress_isal -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:44.177 20:45:36 compress_isal -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:44.177 20:45:36 compress_isal -- paths/export.sh@5 -- # export PATH 00:31:44.177 20:45:36 compress_isal -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:44.177 20:45:36 compress_isal -- nvmf/common.sh@47 -- # : 0 00:31:44.177 20:45:36 compress_isal -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:31:44.177 20:45:36 compress_isal -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:31:44.177 20:45:36 compress_isal -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:31:44.177 20:45:36 compress_isal -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:31:44.177 20:45:36 compress_isal -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:31:44.177 20:45:36 compress_isal -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:31:44.177 20:45:36 compress_isal -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:31:44.177 20:45:36 compress_isal -- nvmf/common.sh@51 -- # have_pci_nics=0 00:31:44.177 20:45:36 compress_isal -- compress/compress.sh@17 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:31:44.177 20:45:36 compress_isal -- compress/compress.sh@81 -- # mkdir -p /tmp/pmem 00:31:44.177 20:45:36 compress_isal -- compress/compress.sh@82 -- # test_type=isal 00:31:44.177 20:45:36 compress_isal -- compress/compress.sh@86 -- # run_bdevperf 32 4096 3 00:31:44.177 20:45:36 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:31:44.177 20:45:36 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=1529105 00:31:44.177 20:45:36 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:31:44.177 20:45:36 compress_isal -- 
compress/compress.sh@73 -- # waitforlisten 1529105 00:31:44.177 20:45:36 compress_isal -- common/autotest_common.sh@829 -- # '[' -z 1529105 ']' 00:31:44.177 20:45:36 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:31:44.177 20:45:36 compress_isal -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:44.177 20:45:36 compress_isal -- common/autotest_common.sh@834 -- # local max_retries=100 00:31:44.177 20:45:36 compress_isal -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:31:44.177 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:31:44.177 20:45:36 compress_isal -- common/autotest_common.sh@838 -- # xtrace_disable 00:31:44.177 20:45:36 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:31:44.177 [2024-07-15 20:45:36.373026] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
00:31:44.177 [2024-07-15 20:45:36.373098] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1529105 ] 00:31:44.177 [2024-07-15 20:45:36.507243] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:31:44.435 [2024-07-15 20:45:36.625944] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:31:44.435 [2024-07-15 20:45:36.625952] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:31:45.002 20:45:37 compress_isal -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:31:45.002 20:45:37 compress_isal -- common/autotest_common.sh@862 -- # return 0 00:31:45.002 20:45:37 compress_isal -- compress/compress.sh@74 -- # create_vols 00:31:45.002 20:45:37 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:31:45.002 20:45:37 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:31:45.937 20:45:38 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:31:45.937 20:45:38 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:31:45.937 20:45:38 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:45.937 20:45:38 compress_isal -- common/autotest_common.sh@899 -- # local i 00:31:45.937 20:45:38 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:45.937 20:45:38 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:45.937 20:45:38 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:46.196 20:45:38 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
bdev_get_bdevs -b Nvme0n1 -t 2000 00:31:46.196 [ 00:31:46.196 { 00:31:46.196 "name": "Nvme0n1", 00:31:46.196 "aliases": [ 00:31:46.196 "01000000-0000-0000-5cd2-e43197705251" 00:31:46.196 ], 00:31:46.196 "product_name": "NVMe disk", 00:31:46.196 "block_size": 512, 00:31:46.196 "num_blocks": 15002931888, 00:31:46.196 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:31:46.196 "assigned_rate_limits": { 00:31:46.196 "rw_ios_per_sec": 0, 00:31:46.196 "rw_mbytes_per_sec": 0, 00:31:46.196 "r_mbytes_per_sec": 0, 00:31:46.196 "w_mbytes_per_sec": 0 00:31:46.196 }, 00:31:46.196 "claimed": false, 00:31:46.196 "zoned": false, 00:31:46.196 "supported_io_types": { 00:31:46.196 "read": true, 00:31:46.196 "write": true, 00:31:46.196 "unmap": true, 00:31:46.196 "flush": true, 00:31:46.196 "reset": true, 00:31:46.196 "nvme_admin": true, 00:31:46.196 "nvme_io": true, 00:31:46.196 "nvme_io_md": false, 00:31:46.196 "write_zeroes": true, 00:31:46.196 "zcopy": false, 00:31:46.196 "get_zone_info": false, 00:31:46.196 "zone_management": false, 00:31:46.196 "zone_append": false, 00:31:46.196 "compare": false, 00:31:46.196 "compare_and_write": false, 00:31:46.196 "abort": true, 00:31:46.196 "seek_hole": false, 00:31:46.196 "seek_data": false, 00:31:46.196 "copy": false, 00:31:46.196 "nvme_iov_md": false 00:31:46.196 }, 00:31:46.196 "driver_specific": { 00:31:46.196 "nvme": [ 00:31:46.196 { 00:31:46.196 "pci_address": "0000:5e:00.0", 00:31:46.196 "trid": { 00:31:46.196 "trtype": "PCIe", 00:31:46.196 "traddr": "0000:5e:00.0" 00:31:46.196 }, 00:31:46.196 "ctrlr_data": { 00:31:46.196 "cntlid": 0, 00:31:46.196 "vendor_id": "0x8086", 00:31:46.196 "model_number": "INTEL SSDPF2KX076TZO", 00:31:46.196 "serial_number": "PHAC0301002G7P6CGN", 00:31:46.196 "firmware_revision": "JCV10200", 00:31:46.196 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:31:46.196 "oacs": { 00:31:46.196 "security": 1, 00:31:46.196 "format": 1, 00:31:46.196 "firmware": 1, 00:31:46.196 "ns_manage": 1 00:31:46.196 }, 
00:31:46.196 "multi_ctrlr": false, 00:31:46.196 "ana_reporting": false 00:31:46.196 }, 00:31:46.196 "vs": { 00:31:46.196 "nvme_version": "1.3" 00:31:46.196 }, 00:31:46.196 "ns_data": { 00:31:46.196 "id": 1, 00:31:46.196 "can_share": false 00:31:46.196 }, 00:31:46.196 "security": { 00:31:46.196 "opal": true 00:31:46.196 } 00:31:46.196 } 00:31:46.196 ], 00:31:46.196 "mp_policy": "active_passive" 00:31:46.196 } 00:31:46.196 } 00:31:46.196 ] 00:31:46.454 20:45:38 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:31:46.454 20:45:38 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:31:48.983 82aad719-3156-4a4c-956b-5fa212235be7 00:31:48.983 20:45:41 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:31:48.983 f6592183-6837-45a9-972a-3a4d413e7d9b 00:31:48.983 20:45:41 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:31:48.983 20:45:41 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:31:48.983 20:45:41 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:48.983 20:45:41 compress_isal -- common/autotest_common.sh@899 -- # local i 00:31:48.983 20:45:41 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:48.983 20:45:41 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:48.983 20:45:41 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:49.240 20:45:41 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:31:49.498 [ 00:31:49.498 { 00:31:49.498 "name": "f6592183-6837-45a9-972a-3a4d413e7d9b", 00:31:49.498 "aliases": [ 00:31:49.498 "lvs0/lv0" 
00:31:49.498 ], 00:31:49.498 "product_name": "Logical Volume", 00:31:49.498 "block_size": 512, 00:31:49.498 "num_blocks": 204800, 00:31:49.498 "uuid": "f6592183-6837-45a9-972a-3a4d413e7d9b", 00:31:49.498 "assigned_rate_limits": { 00:31:49.498 "rw_ios_per_sec": 0, 00:31:49.498 "rw_mbytes_per_sec": 0, 00:31:49.498 "r_mbytes_per_sec": 0, 00:31:49.498 "w_mbytes_per_sec": 0 00:31:49.498 }, 00:31:49.498 "claimed": false, 00:31:49.498 "zoned": false, 00:31:49.498 "supported_io_types": { 00:31:49.498 "read": true, 00:31:49.498 "write": true, 00:31:49.498 "unmap": true, 00:31:49.498 "flush": false, 00:31:49.498 "reset": true, 00:31:49.498 "nvme_admin": false, 00:31:49.498 "nvme_io": false, 00:31:49.498 "nvme_io_md": false, 00:31:49.498 "write_zeroes": true, 00:31:49.498 "zcopy": false, 00:31:49.498 "get_zone_info": false, 00:31:49.498 "zone_management": false, 00:31:49.498 "zone_append": false, 00:31:49.498 "compare": false, 00:31:49.498 "compare_and_write": false, 00:31:49.498 "abort": false, 00:31:49.498 "seek_hole": true, 00:31:49.498 "seek_data": true, 00:31:49.498 "copy": false, 00:31:49.498 "nvme_iov_md": false 00:31:49.498 }, 00:31:49.498 "driver_specific": { 00:31:49.498 "lvol": { 00:31:49.498 "lvol_store_uuid": "82aad719-3156-4a4c-956b-5fa212235be7", 00:31:49.498 "base_bdev": "Nvme0n1", 00:31:49.498 "thin_provision": true, 00:31:49.498 "num_allocated_clusters": 0, 00:31:49.498 "snapshot": false, 00:31:49.498 "clone": false, 00:31:49.498 "esnap_clone": false 00:31:49.498 } 00:31:49.498 } 00:31:49.498 } 00:31:49.498 ] 00:31:49.498 20:45:41 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:31:49.498 20:45:41 compress_isal -- compress/compress.sh@41 -- # '[' -z '' ']' 00:31:49.498 20:45:41 compress_isal -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:31:49.498 [2024-07-15 20:45:41.857603] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered 
io_device and virtual bdev for: COMP_lvs0/lv0 00:31:49.498 COMP_lvs0/lv0 00:31:49.756 20:45:41 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:31:49.756 20:45:41 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:31:49.756 20:45:41 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:49.756 20:45:41 compress_isal -- common/autotest_common.sh@899 -- # local i 00:31:49.756 20:45:41 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:49.756 20:45:41 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:49.756 20:45:41 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:50.014 20:45:42 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:31:50.014 [ 00:31:50.014 { 00:31:50.014 "name": "COMP_lvs0/lv0", 00:31:50.014 "aliases": [ 00:31:50.014 "fc3fd9f7-f834-5d97-8a70-91d22f2e8874" 00:31:50.014 ], 00:31:50.014 "product_name": "compress", 00:31:50.014 "block_size": 512, 00:31:50.014 "num_blocks": 200704, 00:31:50.014 "uuid": "fc3fd9f7-f834-5d97-8a70-91d22f2e8874", 00:31:50.014 "assigned_rate_limits": { 00:31:50.014 "rw_ios_per_sec": 0, 00:31:50.014 "rw_mbytes_per_sec": 0, 00:31:50.014 "r_mbytes_per_sec": 0, 00:31:50.014 "w_mbytes_per_sec": 0 00:31:50.014 }, 00:31:50.014 "claimed": false, 00:31:50.014 "zoned": false, 00:31:50.014 "supported_io_types": { 00:31:50.014 "read": true, 00:31:50.014 "write": true, 00:31:50.014 "unmap": false, 00:31:50.014 "flush": false, 00:31:50.014 "reset": false, 00:31:50.014 "nvme_admin": false, 00:31:50.014 "nvme_io": false, 00:31:50.014 "nvme_io_md": false, 00:31:50.014 "write_zeroes": true, 00:31:50.014 "zcopy": false, 00:31:50.014 "get_zone_info": false, 00:31:50.014 "zone_management": false, 00:31:50.014 "zone_append": 
false, 00:31:50.014 "compare": false, 00:31:50.014 "compare_and_write": false, 00:31:50.014 "abort": false, 00:31:50.014 "seek_hole": false, 00:31:50.014 "seek_data": false, 00:31:50.014 "copy": false, 00:31:50.014 "nvme_iov_md": false 00:31:50.014 }, 00:31:50.014 "driver_specific": { 00:31:50.014 "compress": { 00:31:50.014 "name": "COMP_lvs0/lv0", 00:31:50.014 "base_bdev_name": "f6592183-6837-45a9-972a-3a4d413e7d9b" 00:31:50.014 } 00:31:50.014 } 00:31:50.014 } 00:31:50.014 ] 00:31:50.272 20:45:42 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:31:50.272 20:45:42 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:31:50.272 Running I/O for 3 seconds... 00:31:53.552 00:31:53.552 Latency(us) 00:31:53.552 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:53.552 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:31:53.552 Verification LBA range: start 0x0 length 0x3100 00:31:53.552 COMP_lvs0/lv0 : 3.01 1263.53 4.94 0.00 0.00 25216.46 2478.97 21883.33 00:31:53.552 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:31:53.552 Verification LBA range: start 0x3100 length 0x3100 00:31:53.552 COMP_lvs0/lv0 : 3.01 1265.44 4.94 0.00 0.00 25146.90 1503.05 20629.59 00:31:53.552 =================================================================================================================== 00:31:53.552 Total : 2528.98 9.88 0.00 0.00 25181.65 1503.05 21883.33 00:31:53.552 0 00:31:53.552 20:45:45 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:31:53.552 20:45:45 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:31:53.810 20:45:46 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 
00:31:54.068 20:45:46 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:31:54.068 20:45:46 compress_isal -- compress/compress.sh@78 -- # killprocess 1529105 00:31:54.068 20:45:46 compress_isal -- common/autotest_common.sh@948 -- # '[' -z 1529105 ']' 00:31:54.068 20:45:46 compress_isal -- common/autotest_common.sh@952 -- # kill -0 1529105 00:31:54.068 20:45:46 compress_isal -- common/autotest_common.sh@953 -- # uname 00:31:54.068 20:45:46 compress_isal -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:31:54.068 20:45:46 compress_isal -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1529105 00:31:54.068 20:45:46 compress_isal -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:31:54.068 20:45:46 compress_isal -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:31:54.068 20:45:46 compress_isal -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1529105' 00:31:54.068 killing process with pid 1529105 00:31:54.068 20:45:46 compress_isal -- common/autotest_common.sh@967 -- # kill 1529105 00:31:54.068 Received shutdown signal, test time was about 3.000000 seconds 00:31:54.068 00:31:54.068 Latency(us) 00:31:54.068 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:54.068 =================================================================================================================== 00:31:54.068 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:31:54.068 20:45:46 compress_isal -- common/autotest_common.sh@972 -- # wait 1529105 00:31:57.351 20:45:49 compress_isal -- compress/compress.sh@87 -- # run_bdevperf 32 4096 3 512 00:31:57.351 20:45:49 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:31:57.351 20:45:49 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=1530818 00:31:57.351 20:45:49 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:31:57.351 
20:45:49 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:31:57.351 20:45:49 compress_isal -- compress/compress.sh@73 -- # waitforlisten 1530818 00:31:57.351 20:45:49 compress_isal -- common/autotest_common.sh@829 -- # '[' -z 1530818 ']' 00:31:57.351 20:45:49 compress_isal -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:57.351 20:45:49 compress_isal -- common/autotest_common.sh@834 -- # local max_retries=100 00:31:57.351 20:45:49 compress_isal -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:31:57.351 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:31:57.351 20:45:49 compress_isal -- common/autotest_common.sh@838 -- # xtrace_disable 00:31:57.351 20:45:49 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:31:57.351 [2024-07-15 20:45:49.565362] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
00:31:57.351 [2024-07-15 20:45:49.565513] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1530818 ] 00:31:57.610 [2024-07-15 20:45:49.766906] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:31:57.610 [2024-07-15 20:45:49.897264] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:31:57.610 [2024-07-15 20:45:49.897271] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:31:58.174 20:45:50 compress_isal -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:31:58.174 20:45:50 compress_isal -- common/autotest_common.sh@862 -- # return 0 00:31:58.174 20:45:50 compress_isal -- compress/compress.sh@74 -- # create_vols 512 00:31:58.174 20:45:50 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:31:58.174 20:45:50 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:31:58.739 20:45:51 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:31:58.739 20:45:51 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:31:58.739 20:45:51 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:58.739 20:45:51 compress_isal -- common/autotest_common.sh@899 -- # local i 00:31:58.739 20:45:51 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:58.739 20:45:51 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:58.739 20:45:51 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:59.303 20:45:51 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
bdev_get_bdevs -b Nvme0n1 -t 2000 00:31:59.560 [ 00:31:59.560 { 00:31:59.560 "name": "Nvme0n1", 00:31:59.560 "aliases": [ 00:31:59.560 "01000000-0000-0000-5cd2-e43197705251" 00:31:59.560 ], 00:31:59.560 "product_name": "NVMe disk", 00:31:59.560 "block_size": 512, 00:31:59.560 "num_blocks": 15002931888, 00:31:59.560 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:31:59.560 "assigned_rate_limits": { 00:31:59.560 "rw_ios_per_sec": 0, 00:31:59.560 "rw_mbytes_per_sec": 0, 00:31:59.560 "r_mbytes_per_sec": 0, 00:31:59.560 "w_mbytes_per_sec": 0 00:31:59.560 }, 00:31:59.560 "claimed": false, 00:31:59.560 "zoned": false, 00:31:59.560 "supported_io_types": { 00:31:59.560 "read": true, 00:31:59.560 "write": true, 00:31:59.560 "unmap": true, 00:31:59.560 "flush": true, 00:31:59.560 "reset": true, 00:31:59.560 "nvme_admin": true, 00:31:59.560 "nvme_io": true, 00:31:59.560 "nvme_io_md": false, 00:31:59.560 "write_zeroes": true, 00:31:59.560 "zcopy": false, 00:31:59.560 "get_zone_info": false, 00:31:59.560 "zone_management": false, 00:31:59.560 "zone_append": false, 00:31:59.560 "compare": false, 00:31:59.560 "compare_and_write": false, 00:31:59.560 "abort": true, 00:31:59.560 "seek_hole": false, 00:31:59.560 "seek_data": false, 00:31:59.560 "copy": false, 00:31:59.560 "nvme_iov_md": false 00:31:59.560 }, 00:31:59.560 "driver_specific": { 00:31:59.560 "nvme": [ 00:31:59.560 { 00:31:59.560 "pci_address": "0000:5e:00.0", 00:31:59.560 "trid": { 00:31:59.560 "trtype": "PCIe", 00:31:59.560 "traddr": "0000:5e:00.0" 00:31:59.560 }, 00:31:59.560 "ctrlr_data": { 00:31:59.560 "cntlid": 0, 00:31:59.560 "vendor_id": "0x8086", 00:31:59.560 "model_number": "INTEL SSDPF2KX076TZO", 00:31:59.560 "serial_number": "PHAC0301002G7P6CGN", 00:31:59.560 "firmware_revision": "JCV10200", 00:31:59.560 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:31:59.560 "oacs": { 00:31:59.560 "security": 1, 00:31:59.560 "format": 1, 00:31:59.560 "firmware": 1, 00:31:59.560 "ns_manage": 1 00:31:59.560 }, 
00:31:59.560 "multi_ctrlr": false, 00:31:59.560 "ana_reporting": false 00:31:59.560 }, 00:31:59.560 "vs": { 00:31:59.560 "nvme_version": "1.3" 00:31:59.560 }, 00:31:59.560 "ns_data": { 00:31:59.560 "id": 1, 00:31:59.560 "can_share": false 00:31:59.560 }, 00:31:59.560 "security": { 00:31:59.560 "opal": true 00:31:59.560 } 00:31:59.560 } 00:31:59.560 ], 00:31:59.560 "mp_policy": "active_passive" 00:31:59.560 } 00:31:59.560 } 00:31:59.560 ] 00:31:59.560 20:45:51 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:31:59.560 20:45:51 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:32:02.090 35304b44-e94f-4f62-a3d0-e7a447aba9e7 00:32:02.090 20:45:54 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:32:02.657 fe5b98a9-e148-4ecd-9a73-106dae7c141e 00:32:02.657 20:45:54 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:32:02.657 20:45:54 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:32:02.657 20:45:54 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:32:02.657 20:45:54 compress_isal -- common/autotest_common.sh@899 -- # local i 00:32:02.657 20:45:54 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:32:02.657 20:45:54 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:32:02.657 20:45:54 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:32:02.915 20:45:55 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:32:03.513 [ 00:32:03.513 { 00:32:03.513 "name": "fe5b98a9-e148-4ecd-9a73-106dae7c141e", 00:32:03.513 "aliases": [ 00:32:03.513 "lvs0/lv0" 
00:32:03.513 ], 00:32:03.513 "product_name": "Logical Volume", 00:32:03.513 "block_size": 512, 00:32:03.513 "num_blocks": 204800, 00:32:03.513 "uuid": "fe5b98a9-e148-4ecd-9a73-106dae7c141e", 00:32:03.513 "assigned_rate_limits": { 00:32:03.513 "rw_ios_per_sec": 0, 00:32:03.513 "rw_mbytes_per_sec": 0, 00:32:03.513 "r_mbytes_per_sec": 0, 00:32:03.513 "w_mbytes_per_sec": 0 00:32:03.513 }, 00:32:03.513 "claimed": false, 00:32:03.513 "zoned": false, 00:32:03.513 "supported_io_types": { 00:32:03.513 "read": true, 00:32:03.513 "write": true, 00:32:03.513 "unmap": true, 00:32:03.513 "flush": false, 00:32:03.513 "reset": true, 00:32:03.513 "nvme_admin": false, 00:32:03.513 "nvme_io": false, 00:32:03.513 "nvme_io_md": false, 00:32:03.513 "write_zeroes": true, 00:32:03.513 "zcopy": false, 00:32:03.513 "get_zone_info": false, 00:32:03.513 "zone_management": false, 00:32:03.513 "zone_append": false, 00:32:03.513 "compare": false, 00:32:03.513 "compare_and_write": false, 00:32:03.513 "abort": false, 00:32:03.513 "seek_hole": true, 00:32:03.513 "seek_data": true, 00:32:03.513 "copy": false, 00:32:03.513 "nvme_iov_md": false 00:32:03.513 }, 00:32:03.513 "driver_specific": { 00:32:03.513 "lvol": { 00:32:03.513 "lvol_store_uuid": "35304b44-e94f-4f62-a3d0-e7a447aba9e7", 00:32:03.513 "base_bdev": "Nvme0n1", 00:32:03.513 "thin_provision": true, 00:32:03.513 "num_allocated_clusters": 0, 00:32:03.513 "snapshot": false, 00:32:03.513 "clone": false, 00:32:03.513 "esnap_clone": false 00:32:03.513 } 00:32:03.513 } 00:32:03.513 } 00:32:03.513 ] 00:32:03.513 20:45:55 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:32:03.513 20:45:55 compress_isal -- compress/compress.sh@41 -- # '[' -z 512 ']' 00:32:03.513 20:45:55 compress_isal -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 512 00:32:03.513 [2024-07-15 20:45:55.873943] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered 
io_device and virtual bdev for: COMP_lvs0/lv0 00:32:03.513 COMP_lvs0/lv0 00:32:03.772 20:45:55 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:32:03.772 20:45:55 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:32:03.772 20:45:55 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:32:03.772 20:45:55 compress_isal -- common/autotest_common.sh@899 -- # local i 00:32:03.772 20:45:55 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:32:03.772 20:45:55 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:32:03.772 20:45:55 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:32:04.031 20:45:56 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:32:04.031 [ 00:32:04.031 { 00:32:04.031 "name": "COMP_lvs0/lv0", 00:32:04.031 "aliases": [ 00:32:04.031 "43804798-9033-5b6f-8e2b-a52985cc7d35" 00:32:04.031 ], 00:32:04.031 "product_name": "compress", 00:32:04.031 "block_size": 512, 00:32:04.031 "num_blocks": 200704, 00:32:04.031 "uuid": "43804798-9033-5b6f-8e2b-a52985cc7d35", 00:32:04.031 "assigned_rate_limits": { 00:32:04.031 "rw_ios_per_sec": 0, 00:32:04.031 "rw_mbytes_per_sec": 0, 00:32:04.031 "r_mbytes_per_sec": 0, 00:32:04.031 "w_mbytes_per_sec": 0 00:32:04.031 }, 00:32:04.031 "claimed": false, 00:32:04.031 "zoned": false, 00:32:04.031 "supported_io_types": { 00:32:04.031 "read": true, 00:32:04.031 "write": true, 00:32:04.031 "unmap": false, 00:32:04.031 "flush": false, 00:32:04.031 "reset": false, 00:32:04.031 "nvme_admin": false, 00:32:04.031 "nvme_io": false, 00:32:04.031 "nvme_io_md": false, 00:32:04.031 "write_zeroes": true, 00:32:04.031 "zcopy": false, 00:32:04.031 "get_zone_info": false, 00:32:04.031 "zone_management": false, 00:32:04.031 "zone_append": 
false, 00:32:04.031 "compare": false, 00:32:04.031 "compare_and_write": false, 00:32:04.031 "abort": false, 00:32:04.031 "seek_hole": false, 00:32:04.031 "seek_data": false, 00:32:04.031 "copy": false, 00:32:04.031 "nvme_iov_md": false 00:32:04.031 }, 00:32:04.031 "driver_specific": { 00:32:04.031 "compress": { 00:32:04.031 "name": "COMP_lvs0/lv0", 00:32:04.031 "base_bdev_name": "fe5b98a9-e148-4ecd-9a73-106dae7c141e" 00:32:04.031 } 00:32:04.031 } 00:32:04.031 } 00:32:04.031 ] 00:32:04.031 20:45:56 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:32:04.032 20:45:56 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:32:04.290 Running I/O for 3 seconds... 00:32:07.570 00:32:07.570 Latency(us) 00:32:07.570 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:32:07.570 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:32:07.570 Verification LBA range: start 0x0 length 0x3100 00:32:07.570 COMP_lvs0/lv0 : 3.01 1267.66 4.95 0.00 0.00 25137.46 2137.04 21769.35 00:32:07.570 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:32:07.570 Verification LBA range: start 0x3100 length 0x3100 00:32:07.570 COMP_lvs0/lv0 : 3.01 1268.95 4.96 0.00 0.00 25075.41 1510.18 20173.69 00:32:07.570 =================================================================================================================== 00:32:07.570 Total : 2536.62 9.91 0.00 0.00 25106.42 1510.18 21769.35 00:32:07.570 0 00:32:07.570 20:45:59 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:32:07.570 20:45:59 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:32:07.570 20:45:59 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 
00:32:07.828 20:46:00 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:32:07.828 20:46:00 compress_isal -- compress/compress.sh@78 -- # killprocess 1530818 00:32:07.828 20:46:00 compress_isal -- common/autotest_common.sh@948 -- # '[' -z 1530818 ']' 00:32:07.828 20:46:00 compress_isal -- common/autotest_common.sh@952 -- # kill -0 1530818 00:32:07.828 20:46:00 compress_isal -- common/autotest_common.sh@953 -- # uname 00:32:07.828 20:46:00 compress_isal -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:32:07.828 20:46:00 compress_isal -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1530818 00:32:07.828 20:46:00 compress_isal -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:32:07.828 20:46:00 compress_isal -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:32:07.828 20:46:00 compress_isal -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1530818' 00:32:07.828 killing process with pid 1530818 00:32:07.828 20:46:00 compress_isal -- common/autotest_common.sh@967 -- # kill 1530818 00:32:07.828 Received shutdown signal, test time was about 3.000000 seconds 00:32:07.828 00:32:07.828 Latency(us) 00:32:07.828 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:32:07.828 =================================================================================================================== 00:32:07.828 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:32:07.828 20:46:00 compress_isal -- common/autotest_common.sh@972 -- # wait 1530818 00:32:11.108 20:46:03 compress_isal -- compress/compress.sh@88 -- # run_bdevperf 32 4096 3 4096 00:32:11.108 20:46:03 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:32:11.108 20:46:03 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=1532595 00:32:11.108 20:46:03 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 
00:32:11.108 20:46:03 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:32:11.108 20:46:03 compress_isal -- compress/compress.sh@73 -- # waitforlisten 1532595 00:32:11.108 20:46:03 compress_isal -- common/autotest_common.sh@829 -- # '[' -z 1532595 ']' 00:32:11.108 20:46:03 compress_isal -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:32:11.108 20:46:03 compress_isal -- common/autotest_common.sh@834 -- # local max_retries=100 00:32:11.108 20:46:03 compress_isal -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:32:11.108 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:32:11.108 20:46:03 compress_isal -- common/autotest_common.sh@838 -- # xtrace_disable 00:32:11.108 20:46:03 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:32:11.108 [2024-07-15 20:46:03.177713] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
00:32:11.108 [2024-07-15 20:46:03.177789] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1532595 ] 00:32:11.108 [2024-07-15 20:46:03.313961] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:32:11.108 [2024-07-15 20:46:03.434756] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:32:11.108 [2024-07-15 20:46:03.434762] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:32:12.042 20:46:04 compress_isal -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:32:12.042 20:46:04 compress_isal -- common/autotest_common.sh@862 -- # return 0 00:32:12.042 20:46:04 compress_isal -- compress/compress.sh@74 -- # create_vols 4096 00:32:12.042 20:46:04 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:32:12.042 20:46:04 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:32:12.609 20:46:04 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:32:12.609 20:46:04 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:32:12.609 20:46:04 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:32:12.609 20:46:04 compress_isal -- common/autotest_common.sh@899 -- # local i 00:32:12.609 20:46:04 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:32:12.609 20:46:04 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:32:12.609 20:46:04 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:32:12.868 20:46:05 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
bdev_get_bdevs -b Nvme0n1 -t 2000 00:32:12.868 [ 00:32:12.868 { 00:32:12.868 "name": "Nvme0n1", 00:32:12.868 "aliases": [ 00:32:12.868 "01000000-0000-0000-5cd2-e43197705251" 00:32:12.868 ], 00:32:12.868 "product_name": "NVMe disk", 00:32:12.868 "block_size": 512, 00:32:12.868 "num_blocks": 15002931888, 00:32:12.868 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:32:12.868 "assigned_rate_limits": { 00:32:12.868 "rw_ios_per_sec": 0, 00:32:12.868 "rw_mbytes_per_sec": 0, 00:32:12.868 "r_mbytes_per_sec": 0, 00:32:12.868 "w_mbytes_per_sec": 0 00:32:12.868 }, 00:32:12.868 "claimed": false, 00:32:12.868 "zoned": false, 00:32:12.868 "supported_io_types": { 00:32:12.868 "read": true, 00:32:12.868 "write": true, 00:32:12.868 "unmap": true, 00:32:12.868 "flush": true, 00:32:12.868 "reset": true, 00:32:12.868 "nvme_admin": true, 00:32:12.868 "nvme_io": true, 00:32:12.868 "nvme_io_md": false, 00:32:12.868 "write_zeroes": true, 00:32:12.868 "zcopy": false, 00:32:12.868 "get_zone_info": false, 00:32:12.868 "zone_management": false, 00:32:12.868 "zone_append": false, 00:32:12.868 "compare": false, 00:32:12.868 "compare_and_write": false, 00:32:12.868 "abort": true, 00:32:12.868 "seek_hole": false, 00:32:12.868 "seek_data": false, 00:32:12.868 "copy": false, 00:32:12.868 "nvme_iov_md": false 00:32:12.868 }, 00:32:12.868 "driver_specific": { 00:32:12.868 "nvme": [ 00:32:12.868 { 00:32:12.868 "pci_address": "0000:5e:00.0", 00:32:12.868 "trid": { 00:32:12.868 "trtype": "PCIe", 00:32:12.868 "traddr": "0000:5e:00.0" 00:32:12.868 }, 00:32:12.868 "ctrlr_data": { 00:32:12.868 "cntlid": 0, 00:32:12.868 "vendor_id": "0x8086", 00:32:12.868 "model_number": "INTEL SSDPF2KX076TZO", 00:32:12.868 "serial_number": "PHAC0301002G7P6CGN", 00:32:12.868 "firmware_revision": "JCV10200", 00:32:12.868 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:32:12.868 "oacs": { 00:32:12.868 "security": 1, 00:32:12.868 "format": 1, 00:32:12.868 "firmware": 1, 00:32:12.868 "ns_manage": 1 00:32:12.868 }, 
00:32:12.868 "multi_ctrlr": false, 00:32:12.868 "ana_reporting": false 00:32:12.868 }, 00:32:12.868 "vs": { 00:32:12.868 "nvme_version": "1.3" 00:32:12.868 }, 00:32:12.868 "ns_data": { 00:32:12.868 "id": 1, 00:32:12.868 "can_share": false 00:32:12.868 }, 00:32:12.868 "security": { 00:32:12.868 "opal": true 00:32:12.868 } 00:32:12.868 } 00:32:12.868 ], 00:32:12.868 "mp_policy": "active_passive" 00:32:12.868 } 00:32:12.868 } 00:32:12.868 ] 00:32:13.128 20:46:05 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:32:13.128 20:46:05 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:32:15.666 ab2451c2-c8ca-4440-b2e6-9b91ee5424d1 00:32:15.666 20:46:07 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:32:15.666 f9c574f2-fae3-4bf9-91da-8e906b095687 00:32:15.666 20:46:07 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:32:15.666 20:46:07 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:32:15.666 20:46:07 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:32:15.666 20:46:07 compress_isal -- common/autotest_common.sh@899 -- # local i 00:32:15.666 20:46:07 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:32:15.666 20:46:07 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:32:15.666 20:46:07 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:32:16.233 20:46:08 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:32:16.492 [ 00:32:16.492 { 00:32:16.492 "name": "f9c574f2-fae3-4bf9-91da-8e906b095687", 00:32:16.492 "aliases": [ 00:32:16.492 "lvs0/lv0" 
00:32:16.492 ], 00:32:16.492 "product_name": "Logical Volume", 00:32:16.492 "block_size": 512, 00:32:16.492 "num_blocks": 204800, 00:32:16.492 "uuid": "f9c574f2-fae3-4bf9-91da-8e906b095687", 00:32:16.492 "assigned_rate_limits": { 00:32:16.492 "rw_ios_per_sec": 0, 00:32:16.492 "rw_mbytes_per_sec": 0, 00:32:16.492 "r_mbytes_per_sec": 0, 00:32:16.492 "w_mbytes_per_sec": 0 00:32:16.492 }, 00:32:16.492 "claimed": false, 00:32:16.492 "zoned": false, 00:32:16.492 "supported_io_types": { 00:32:16.492 "read": true, 00:32:16.492 "write": true, 00:32:16.492 "unmap": true, 00:32:16.492 "flush": false, 00:32:16.492 "reset": true, 00:32:16.492 "nvme_admin": false, 00:32:16.492 "nvme_io": false, 00:32:16.492 "nvme_io_md": false, 00:32:16.492 "write_zeroes": true, 00:32:16.492 "zcopy": false, 00:32:16.492 "get_zone_info": false, 00:32:16.492 "zone_management": false, 00:32:16.492 "zone_append": false, 00:32:16.492 "compare": false, 00:32:16.492 "compare_and_write": false, 00:32:16.492 "abort": false, 00:32:16.492 "seek_hole": true, 00:32:16.492 "seek_data": true, 00:32:16.492 "copy": false, 00:32:16.492 "nvme_iov_md": false 00:32:16.492 }, 00:32:16.492 "driver_specific": { 00:32:16.492 "lvol": { 00:32:16.492 "lvol_store_uuid": "ab2451c2-c8ca-4440-b2e6-9b91ee5424d1", 00:32:16.492 "base_bdev": "Nvme0n1", 00:32:16.492 "thin_provision": true, 00:32:16.492 "num_allocated_clusters": 0, 00:32:16.492 "snapshot": false, 00:32:16.492 "clone": false, 00:32:16.492 "esnap_clone": false 00:32:16.492 } 00:32:16.492 } 00:32:16.492 } 00:32:16.492 ] 00:32:16.492 20:46:08 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:32:16.492 20:46:08 compress_isal -- compress/compress.sh@41 -- # '[' -z 4096 ']' 00:32:16.492 20:46:08 compress_isal -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 4096 00:32:16.750 [2024-07-15 20:46:08.989136] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: 
registered io_device and virtual bdev for: COMP_lvs0/lv0 00:32:16.750 COMP_lvs0/lv0 00:32:16.750 20:46:09 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:32:16.750 20:46:09 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:32:16.750 20:46:09 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:32:16.750 20:46:09 compress_isal -- common/autotest_common.sh@899 -- # local i 00:32:16.750 20:46:09 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:32:16.750 20:46:09 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:32:16.750 20:46:09 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:32:17.008 20:46:09 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:32:17.266 [ 00:32:17.266 { 00:32:17.266 "name": "COMP_lvs0/lv0", 00:32:17.266 "aliases": [ 00:32:17.266 "caa1f62a-c468-5d00-b978-00d15d90c091" 00:32:17.266 ], 00:32:17.266 "product_name": "compress", 00:32:17.266 "block_size": 4096, 00:32:17.266 "num_blocks": 25088, 00:32:17.266 "uuid": "caa1f62a-c468-5d00-b978-00d15d90c091", 00:32:17.266 "assigned_rate_limits": { 00:32:17.266 "rw_ios_per_sec": 0, 00:32:17.266 "rw_mbytes_per_sec": 0, 00:32:17.266 "r_mbytes_per_sec": 0, 00:32:17.266 "w_mbytes_per_sec": 0 00:32:17.266 }, 00:32:17.266 "claimed": false, 00:32:17.266 "zoned": false, 00:32:17.266 "supported_io_types": { 00:32:17.266 "read": true, 00:32:17.266 "write": true, 00:32:17.266 "unmap": false, 00:32:17.266 "flush": false, 00:32:17.266 "reset": false, 00:32:17.266 "nvme_admin": false, 00:32:17.266 "nvme_io": false, 00:32:17.266 "nvme_io_md": false, 00:32:17.266 "write_zeroes": true, 00:32:17.266 "zcopy": false, 00:32:17.266 "get_zone_info": false, 00:32:17.266 "zone_management": false, 00:32:17.266 
"zone_append": false, 00:32:17.266 "compare": false, 00:32:17.266 "compare_and_write": false, 00:32:17.266 "abort": false, 00:32:17.266 "seek_hole": false, 00:32:17.266 "seek_data": false, 00:32:17.266 "copy": false, 00:32:17.266 "nvme_iov_md": false 00:32:17.266 }, 00:32:17.266 "driver_specific": { 00:32:17.266 "compress": { 00:32:17.266 "name": "COMP_lvs0/lv0", 00:32:17.266 "base_bdev_name": "f9c574f2-fae3-4bf9-91da-8e906b095687" 00:32:17.266 } 00:32:17.266 } 00:32:17.266 } 00:32:17.266 ] 00:32:17.266 20:46:09 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:32:17.266 20:46:09 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:32:17.524 Running I/O for 3 seconds... 00:32:20.809 00:32:20.809 Latency(us) 00:32:20.809 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:32:20.809 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:32:20.809 Verification LBA range: start 0x0 length 0x3100 00:32:20.809 COMP_lvs0/lv0 : 3.01 1289.52 5.04 0.00 0.00 24701.36 2222.53 21313.45 00:32:20.809 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:32:20.809 Verification LBA range: start 0x3100 length 0x3100 00:32:20.809 COMP_lvs0/lv0 : 3.01 1292.13 5.05 0.00 0.00 24627.94 1481.68 20173.69 00:32:20.809 =================================================================================================================== 00:32:20.809 Total : 2581.65 10.08 0.00 0.00 24664.61 1481.68 21313.45 00:32:20.809 0 00:32:20.809 20:46:12 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:32:20.809 20:46:12 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:32:20.809 20:46:12 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
bdev_lvol_delete_lvstore -l lvs0 00:32:21.068 20:46:13 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:32:21.068 20:46:13 compress_isal -- compress/compress.sh@78 -- # killprocess 1532595 00:32:21.068 20:46:13 compress_isal -- common/autotest_common.sh@948 -- # '[' -z 1532595 ']' 00:32:21.068 20:46:13 compress_isal -- common/autotest_common.sh@952 -- # kill -0 1532595 00:32:21.068 20:46:13 compress_isal -- common/autotest_common.sh@953 -- # uname 00:32:21.068 20:46:13 compress_isal -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:32:21.068 20:46:13 compress_isal -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1532595 00:32:21.068 20:46:13 compress_isal -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:32:21.068 20:46:13 compress_isal -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:32:21.068 20:46:13 compress_isal -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1532595' 00:32:21.068 killing process with pid 1532595 00:32:21.068 20:46:13 compress_isal -- common/autotest_common.sh@967 -- # kill 1532595 00:32:21.068 Received shutdown signal, test time was about 3.000000 seconds 00:32:21.068 00:32:21.068 Latency(us) 00:32:21.068 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:32:21.068 =================================================================================================================== 00:32:21.068 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:32:21.068 20:46:13 compress_isal -- common/autotest_common.sh@972 -- # wait 1532595 00:32:24.348 20:46:16 compress_isal -- compress/compress.sh@89 -- # run_bdevio 00:32:24.348 20:46:16 compress_isal -- compress/compress.sh@50 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:32:24.348 20:46:16 compress_isal -- compress/compress.sh@55 -- # bdevio_pid=1534201 00:32:24.348 20:46:16 compress_isal -- compress/compress.sh@56 -- # trap 'killprocess $bdevio_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 
00:32:24.348 20:46:16 compress_isal -- compress/compress.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w 00:32:24.348 20:46:16 compress_isal -- compress/compress.sh@57 -- # waitforlisten 1534201 00:32:24.348 20:46:16 compress_isal -- common/autotest_common.sh@829 -- # '[' -z 1534201 ']' 00:32:24.348 20:46:16 compress_isal -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:32:24.348 20:46:16 compress_isal -- common/autotest_common.sh@834 -- # local max_retries=100 00:32:24.348 20:46:16 compress_isal -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:32:24.348 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:32:24.348 20:46:16 compress_isal -- common/autotest_common.sh@838 -- # xtrace_disable 00:32:24.348 20:46:16 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:32:24.348 [2024-07-15 20:46:16.382750] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
00:32:24.348 [2024-07-15 20:46:16.382826] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1534201 ] 00:32:24.348 [2024-07-15 20:46:16.511279] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:32:24.348 [2024-07-15 20:46:16.621676] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:32:24.348 [2024-07-15 20:46:16.621778] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:32:24.348 [2024-07-15 20:46:16.621779] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:25.297 20:46:17 compress_isal -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:32:25.297 20:46:17 compress_isal -- common/autotest_common.sh@862 -- # return 0 00:32:25.297 20:46:17 compress_isal -- compress/compress.sh@58 -- # create_vols 00:32:25.297 20:46:17 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:32:25.297 20:46:17 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:32:25.862 20:46:17 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:32:25.862 20:46:17 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:32:25.862 20:46:17 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:32:25.862 20:46:17 compress_isal -- common/autotest_common.sh@899 -- # local i 00:32:25.862 20:46:17 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:32:25.862 20:46:17 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:32:25.862 20:46:17 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:32:25.862 20:46:18 compress_isal -- 
common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:32:26.119 [ 00:32:26.119 { 00:32:26.119 "name": "Nvme0n1", 00:32:26.119 "aliases": [ 00:32:26.119 "01000000-0000-0000-5cd2-e43197705251" 00:32:26.119 ], 00:32:26.119 "product_name": "NVMe disk", 00:32:26.119 "block_size": 512, 00:32:26.120 "num_blocks": 15002931888, 00:32:26.120 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:32:26.120 "assigned_rate_limits": { 00:32:26.120 "rw_ios_per_sec": 0, 00:32:26.120 "rw_mbytes_per_sec": 0, 00:32:26.120 "r_mbytes_per_sec": 0, 00:32:26.120 "w_mbytes_per_sec": 0 00:32:26.120 }, 00:32:26.120 "claimed": false, 00:32:26.120 "zoned": false, 00:32:26.120 "supported_io_types": { 00:32:26.120 "read": true, 00:32:26.120 "write": true, 00:32:26.120 "unmap": true, 00:32:26.120 "flush": true, 00:32:26.120 "reset": true, 00:32:26.120 "nvme_admin": true, 00:32:26.120 "nvme_io": true, 00:32:26.120 "nvme_io_md": false, 00:32:26.120 "write_zeroes": true, 00:32:26.120 "zcopy": false, 00:32:26.120 "get_zone_info": false, 00:32:26.120 "zone_management": false, 00:32:26.120 "zone_append": false, 00:32:26.120 "compare": false, 00:32:26.120 "compare_and_write": false, 00:32:26.120 "abort": true, 00:32:26.120 "seek_hole": false, 00:32:26.120 "seek_data": false, 00:32:26.120 "copy": false, 00:32:26.120 "nvme_iov_md": false 00:32:26.120 }, 00:32:26.120 "driver_specific": { 00:32:26.120 "nvme": [ 00:32:26.120 { 00:32:26.120 "pci_address": "0000:5e:00.0", 00:32:26.120 "trid": { 00:32:26.120 "trtype": "PCIe", 00:32:26.120 "traddr": "0000:5e:00.0" 00:32:26.120 }, 00:32:26.120 "ctrlr_data": { 00:32:26.120 "cntlid": 0, 00:32:26.120 "vendor_id": "0x8086", 00:32:26.120 "model_number": "INTEL SSDPF2KX076TZO", 00:32:26.120 "serial_number": "PHAC0301002G7P6CGN", 00:32:26.120 "firmware_revision": "JCV10200", 00:32:26.120 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:32:26.120 "oacs": { 00:32:26.120 "security": 1, 
00:32:26.120 "format": 1, 00:32:26.120 "firmware": 1, 00:32:26.120 "ns_manage": 1 00:32:26.120 }, 00:32:26.120 "multi_ctrlr": false, 00:32:26.120 "ana_reporting": false 00:32:26.120 }, 00:32:26.120 "vs": { 00:32:26.120 "nvme_version": "1.3" 00:32:26.120 }, 00:32:26.120 "ns_data": { 00:32:26.120 "id": 1, 00:32:26.120 "can_share": false 00:32:26.120 }, 00:32:26.120 "security": { 00:32:26.120 "opal": true 00:32:26.120 } 00:32:26.120 } 00:32:26.120 ], 00:32:26.120 "mp_policy": "active_passive" 00:32:26.120 } 00:32:26.120 } 00:32:26.120 ] 00:32:26.120 20:46:18 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:32:26.120 20:46:18 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:32:28.648 d75eec1e-bbe1-4bb6-8212-fabbca185bb1 00:32:28.648 20:46:20 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:32:28.906 ca7234bd-e7b0-49ff-b6b9-133d819ef453 00:32:28.906 20:46:21 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:32:28.906 20:46:21 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:32:28.906 20:46:21 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:32:28.906 20:46:21 compress_isal -- common/autotest_common.sh@899 -- # local i 00:32:28.906 20:46:21 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:32:28.906 20:46:21 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:32:28.906 20:46:21 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:32:29.164 20:46:21 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:32:29.164 [ 00:32:29.164 { 00:32:29.164 
"name": "ca7234bd-e7b0-49ff-b6b9-133d819ef453", 00:32:29.164 "aliases": [ 00:32:29.164 "lvs0/lv0" 00:32:29.164 ], 00:32:29.164 "product_name": "Logical Volume", 00:32:29.164 "block_size": 512, 00:32:29.164 "num_blocks": 204800, 00:32:29.164 "uuid": "ca7234bd-e7b0-49ff-b6b9-133d819ef453", 00:32:29.164 "assigned_rate_limits": { 00:32:29.164 "rw_ios_per_sec": 0, 00:32:29.164 "rw_mbytes_per_sec": 0, 00:32:29.164 "r_mbytes_per_sec": 0, 00:32:29.164 "w_mbytes_per_sec": 0 00:32:29.164 }, 00:32:29.164 "claimed": false, 00:32:29.164 "zoned": false, 00:32:29.164 "supported_io_types": { 00:32:29.164 "read": true, 00:32:29.164 "write": true, 00:32:29.164 "unmap": true, 00:32:29.164 "flush": false, 00:32:29.164 "reset": true, 00:32:29.164 "nvme_admin": false, 00:32:29.164 "nvme_io": false, 00:32:29.164 "nvme_io_md": false, 00:32:29.164 "write_zeroes": true, 00:32:29.164 "zcopy": false, 00:32:29.164 "get_zone_info": false, 00:32:29.164 "zone_management": false, 00:32:29.164 "zone_append": false, 00:32:29.164 "compare": false, 00:32:29.164 "compare_and_write": false, 00:32:29.164 "abort": false, 00:32:29.164 "seek_hole": true, 00:32:29.164 "seek_data": true, 00:32:29.164 "copy": false, 00:32:29.164 "nvme_iov_md": false 00:32:29.164 }, 00:32:29.164 "driver_specific": { 00:32:29.164 "lvol": { 00:32:29.164 "lvol_store_uuid": "d75eec1e-bbe1-4bb6-8212-fabbca185bb1", 00:32:29.164 "base_bdev": "Nvme0n1", 00:32:29.164 "thin_provision": true, 00:32:29.164 "num_allocated_clusters": 0, 00:32:29.164 "snapshot": false, 00:32:29.164 "clone": false, 00:32:29.164 "esnap_clone": false 00:32:29.164 } 00:32:29.164 } 00:32:29.164 } 00:32:29.164 ] 00:32:29.164 20:46:21 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:32:29.164 20:46:21 compress_isal -- compress/compress.sh@41 -- # '[' -z '' ']' 00:32:29.164 20:46:21 compress_isal -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:32:29.422 
[2024-07-15 20:46:21.769110] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:32:29.422 COMP_lvs0/lv0 00:32:29.422 20:46:21 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:32:29.422 20:46:21 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:32:29.680 20:46:21 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:32:29.680 20:46:21 compress_isal -- common/autotest_common.sh@899 -- # local i 00:32:29.680 20:46:21 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:32:29.680 20:46:21 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:32:29.680 20:46:21 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:32:29.938 20:46:22 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:32:29.938 [ 00:32:29.938 { 00:32:29.938 "name": "COMP_lvs0/lv0", 00:32:29.938 "aliases": [ 00:32:29.938 "dd41bec2-7f0e-59dc-8381-80321c3b4a0e" 00:32:29.938 ], 00:32:29.938 "product_name": "compress", 00:32:29.938 "block_size": 512, 00:32:29.938 "num_blocks": 200704, 00:32:29.938 "uuid": "dd41bec2-7f0e-59dc-8381-80321c3b4a0e", 00:32:29.938 "assigned_rate_limits": { 00:32:29.938 "rw_ios_per_sec": 0, 00:32:29.938 "rw_mbytes_per_sec": 0, 00:32:29.938 "r_mbytes_per_sec": 0, 00:32:29.938 "w_mbytes_per_sec": 0 00:32:29.938 }, 00:32:29.938 "claimed": false, 00:32:29.938 "zoned": false, 00:32:29.938 "supported_io_types": { 00:32:29.938 "read": true, 00:32:29.938 "write": true, 00:32:29.938 "unmap": false, 00:32:29.938 "flush": false, 00:32:29.938 "reset": false, 00:32:29.938 "nvme_admin": false, 00:32:29.938 "nvme_io": false, 00:32:29.938 "nvme_io_md": false, 00:32:29.938 "write_zeroes": true, 00:32:29.938 "zcopy": false, 00:32:29.938 
"get_zone_info": false, 00:32:29.938 "zone_management": false, 00:32:29.938 "zone_append": false, 00:32:29.938 "compare": false, 00:32:29.938 "compare_and_write": false, 00:32:29.938 "abort": false, 00:32:29.938 "seek_hole": false, 00:32:29.938 "seek_data": false, 00:32:29.938 "copy": false, 00:32:29.938 "nvme_iov_md": false 00:32:29.938 }, 00:32:29.938 "driver_specific": { 00:32:29.938 "compress": { 00:32:29.938 "name": "COMP_lvs0/lv0", 00:32:29.938 "base_bdev_name": "ca7234bd-e7b0-49ff-b6b9-133d819ef453" 00:32:29.938 } 00:32:29.938 } 00:32:29.938 } 00:32:29.938 ] 00:32:29.938 20:46:22 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:32:29.938 20:46:22 compress_isal -- compress/compress.sh@59 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:32:30.196 I/O targets: 00:32:30.196 COMP_lvs0/lv0: 200704 blocks of 512 bytes (98 MiB) 00:32:30.196 00:32:30.196 00:32:30.196 CUnit - A unit testing framework for C - Version 2.1-3 00:32:30.196 http://cunit.sourceforge.net/ 00:32:30.196 00:32:30.196 00:32:30.196 Suite: bdevio tests on: COMP_lvs0/lv0 00:32:30.196 Test: blockdev write read block ...passed 00:32:30.196 Test: blockdev write zeroes read block ...passed 00:32:30.196 Test: blockdev write zeroes read no split ...passed 00:32:30.196 Test: blockdev write zeroes read split ...passed 00:32:30.196 Test: blockdev write zeroes read split partial ...passed 00:32:30.197 Test: blockdev reset ...[2024-07-15 20:46:22.505061] vbdev_compress.c: 252:vbdev_compress_submit_request: *ERROR*: Unknown I/O type 5 00:32:30.197 passed 00:32:30.197 Test: blockdev write read 8 blocks ...passed 00:32:30.197 Test: blockdev write read size > 128k ...passed 00:32:30.197 Test: blockdev write read invalid size ...passed 00:32:30.197 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:32:30.197 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:32:30.197 Test: blockdev write read max offset 
...passed 00:32:30.197 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:32:30.197 Test: blockdev writev readv 8 blocks ...passed 00:32:30.197 Test: blockdev writev readv 30 x 1block ...passed 00:32:30.197 Test: blockdev writev readv block ...passed 00:32:30.197 Test: blockdev writev readv size > 128k ...passed 00:32:30.197 Test: blockdev writev readv size > 128k in two iovs ...passed 00:32:30.197 Test: blockdev comparev and writev ...passed 00:32:30.197 Test: blockdev nvme passthru rw ...passed 00:32:30.197 Test: blockdev nvme passthru vendor specific ...passed 00:32:30.197 Test: blockdev nvme admin passthru ...passed 00:32:30.197 Test: blockdev copy ...passed 00:32:30.197 00:32:30.197 Run Summary: Type Total Ran Passed Failed Inactive 00:32:30.197 suites 1 1 n/a 0 0 00:32:30.197 tests 23 23 23 0 0 00:32:30.197 asserts 130 130 130 0 n/a 00:32:30.197 00:32:30.197 Elapsed time = 0.291 seconds 00:32:30.197 0 00:32:30.197 20:46:22 compress_isal -- compress/compress.sh@60 -- # destroy_vols 00:32:30.197 20:46:22 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:32:30.455 20:46:22 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:32:30.713 20:46:23 compress_isal -- compress/compress.sh@61 -- # trap - SIGINT SIGTERM EXIT 00:32:30.713 20:46:23 compress_isal -- compress/compress.sh@62 -- # killprocess 1534201 00:32:30.713 20:46:23 compress_isal -- common/autotest_common.sh@948 -- # '[' -z 1534201 ']' 00:32:30.713 20:46:23 compress_isal -- common/autotest_common.sh@952 -- # kill -0 1534201 00:32:30.713 20:46:23 compress_isal -- common/autotest_common.sh@953 -- # uname 00:32:30.713 20:46:23 compress_isal -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:32:30.713 20:46:23 compress_isal -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 
1534201 00:32:30.971 20:46:23 compress_isal -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:32:30.971 20:46:23 compress_isal -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:32:30.971 20:46:23 compress_isal -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1534201' 00:32:30.971 killing process with pid 1534201 00:32:30.971 20:46:23 compress_isal -- common/autotest_common.sh@967 -- # kill 1534201 00:32:30.971 20:46:23 compress_isal -- common/autotest_common.sh@972 -- # wait 1534201 00:32:34.250 20:46:26 compress_isal -- compress/compress.sh@91 -- # '[' 0 -eq 1 ']' 00:32:34.250 20:46:26 compress_isal -- compress/compress.sh@120 -- # rm -rf /tmp/pmem 00:32:34.250 00:32:34.250 real 0m49.944s 00:32:34.250 user 1m56.779s 00:32:34.250 sys 0m4.789s 00:32:34.250 20:46:26 compress_isal -- common/autotest_common.sh@1124 -- # xtrace_disable 00:32:34.250 20:46:26 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:32:34.250 ************************************ 00:32:34.250 END TEST compress_isal 00:32:34.250 ************************************ 00:32:34.250 20:46:26 -- common/autotest_common.sh@1142 -- # return 0 00:32:34.250 20:46:26 -- spdk/autotest.sh@352 -- # '[' 0 -eq 1 ']' 00:32:34.250 20:46:26 -- spdk/autotest.sh@356 -- # '[' 1 -eq 1 ']' 00:32:34.250 20:46:26 -- spdk/autotest.sh@357 -- # run_test blockdev_crypto_aesni /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_aesni 00:32:34.250 20:46:26 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:32:34.250 20:46:26 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:34.250 20:46:26 -- common/autotest_common.sh@10 -- # set +x 00:32:34.250 ************************************ 00:32:34.250 START TEST blockdev_crypto_aesni 00:32:34.250 ************************************ 00:32:34.250 20:46:26 blockdev_crypto_aesni -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh 
crypto_aesni 00:32:34.250 * Looking for test storage... 00:32:34.250 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:32:34.250 20:46:26 blockdev_crypto_aesni -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:32:34.250 20:46:26 blockdev_crypto_aesni -- bdev/nbd_common.sh@6 -- # set -e 00:32:34.250 20:46:26 blockdev_crypto_aesni -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:32:34.250 20:46:26 blockdev_crypto_aesni -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:32:34.250 20:46:26 blockdev_crypto_aesni -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:32:34.250 20:46:26 blockdev_crypto_aesni -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:32:34.250 20:46:26 blockdev_crypto_aesni -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:32:34.250 20:46:26 blockdev_crypto_aesni -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:32:34.250 20:46:26 blockdev_crypto_aesni -- bdev/blockdev.sh@20 -- # : 00:32:34.250 20:46:26 blockdev_crypto_aesni -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:32:34.250 20:46:26 blockdev_crypto_aesni -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:32:34.250 20:46:26 blockdev_crypto_aesni -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 00:32:34.250 20:46:26 blockdev_crypto_aesni -- bdev/blockdev.sh@674 -- # uname -s 00:32:34.250 20:46:26 blockdev_crypto_aesni -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:32:34.250 20:46:26 blockdev_crypto_aesni -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:32:34.250 20:46:26 blockdev_crypto_aesni -- bdev/blockdev.sh@682 -- # test_type=crypto_aesni 00:32:34.250 20:46:26 blockdev_crypto_aesni -- bdev/blockdev.sh@683 -- # crypto_device= 00:32:34.250 20:46:26 blockdev_crypto_aesni -- 
bdev/blockdev.sh@684 -- # dek= 00:32:34.250 20:46:26 blockdev_crypto_aesni -- bdev/blockdev.sh@685 -- # env_ctx= 00:32:34.250 20:46:26 blockdev_crypto_aesni -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:32:34.250 20:46:26 blockdev_crypto_aesni -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:32:34.250 20:46:26 blockdev_crypto_aesni -- bdev/blockdev.sh@690 -- # [[ crypto_aesni == bdev ]] 00:32:34.250 20:46:26 blockdev_crypto_aesni -- bdev/blockdev.sh@690 -- # [[ crypto_aesni == crypto_* ]] 00:32:34.250 20:46:26 blockdev_crypto_aesni -- bdev/blockdev.sh@691 -- # wait_for_rpc=--wait-for-rpc 00:32:34.250 20:46:26 blockdev_crypto_aesni -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:32:34.250 20:46:26 blockdev_crypto_aesni -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=1535527 00:32:34.250 20:46:26 blockdev_crypto_aesni -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:32:34.250 20:46:26 blockdev_crypto_aesni -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:32:34.250 20:46:26 blockdev_crypto_aesni -- bdev/blockdev.sh@49 -- # waitforlisten 1535527 00:32:34.250 20:46:26 blockdev_crypto_aesni -- common/autotest_common.sh@829 -- # '[' -z 1535527 ']' 00:32:34.250 20:46:26 blockdev_crypto_aesni -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:32:34.250 20:46:26 blockdev_crypto_aesni -- common/autotest_common.sh@834 -- # local max_retries=100 00:32:34.250 20:46:26 blockdev_crypto_aesni -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:32:34.250 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:32:34.250 20:46:26 blockdev_crypto_aesni -- common/autotest_common.sh@838 -- # xtrace_disable 00:32:34.250 20:46:26 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:32:34.250 [2024-07-15 20:46:26.378013] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 00:32:34.250 [2024-07-15 20:46:26.378086] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1535527 ] 00:32:34.250 [2024-07-15 20:46:26.505749] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:32:34.250 [2024-07-15 20:46:26.608980] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:35.184 20:46:27 blockdev_crypto_aesni -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:32:35.184 20:46:27 blockdev_crypto_aesni -- common/autotest_common.sh@862 -- # return 0 00:32:35.184 20:46:27 blockdev_crypto_aesni -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:32:35.184 20:46:27 blockdev_crypto_aesni -- bdev/blockdev.sh@705 -- # setup_crypto_aesni_conf 00:32:35.184 20:46:27 blockdev_crypto_aesni -- bdev/blockdev.sh@146 -- # rpc_cmd 00:32:35.184 20:46:27 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:35.184 20:46:27 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:32:35.184 [2024-07-15 20:46:27.315201] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:32:35.184 [2024-07-15 20:46:27.323237] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:32:35.184 [2024-07-15 20:46:27.331251] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:32:35.184 [2024-07-15 20:46:27.405095] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: 
*NOTICE*: Found crypto devices: 97 00:32:37.712 true 00:32:37.712 true 00:32:37.712 true 00:32:37.712 true 00:32:37.712 Malloc0 00:32:37.712 Malloc1 00:32:37.712 Malloc2 00:32:37.712 Malloc3 00:32:37.712 [2024-07-15 20:46:29.824929] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:32:37.712 crypto_ram 00:32:37.712 [2024-07-15 20:46:29.832944] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:32:37.712 crypto_ram2 00:32:37.712 [2024-07-15 20:46:29.840968] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:32:37.712 crypto_ram3 00:32:37.712 [2024-07-15 20:46:29.848984] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:32:37.712 crypto_ram4 00:32:37.712 20:46:29 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:37.712 20:46:29 blockdev_crypto_aesni -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:32:37.712 20:46:29 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:37.712 20:46:29 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:32:37.712 20:46:29 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:37.712 20:46:29 blockdev_crypto_aesni -- bdev/blockdev.sh@740 -- # cat 00:32:37.712 20:46:29 blockdev_crypto_aesni -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:32:37.712 20:46:29 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:37.712 20:46:29 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:32:37.712 20:46:29 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:37.712 20:46:29 blockdev_crypto_aesni -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n bdev 00:32:37.712 20:46:29 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:37.712 20:46:29 
blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:32:37.712 20:46:29 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:37.712 20:46:29 blockdev_crypto_aesni -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:32:37.712 20:46:29 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:37.712 20:46:29 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:32:37.712 20:46:29 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:37.712 20:46:29 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:32:37.712 20:46:29 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:32:37.712 20:46:29 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:32:37.712 20:46:29 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:37.712 20:46:29 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:32:37.712 20:46:30 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:37.712 20:46:30 blockdev_crypto_aesni -- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:32:37.712 20:46:30 blockdev_crypto_aesni -- bdev/blockdev.sh@749 -- # jq -r .name 00:32:37.713 20:46:30 blockdev_crypto_aesni -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "c839f62c-9ec8-5cd1-986e-2c365898e4ff"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "c839f62c-9ec8-5cd1-986e-2c365898e4ff",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": 
true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "c4307c9b-ef12-54ee-ad36-209480910fdf"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "c4307c9b-ef12-54ee-ad36-209480910fdf",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' 
"afaf475a-3ce0-54d2-85ba-0566b967f86b"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "afaf475a-3ce0-54d2-85ba-0566b967f86b",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' "73e3642a-0078-54e7-88a8-035ca71f7a4c"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "73e3642a-0078-54e7-88a8-035ca71f7a4c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": 
false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:32:37.970 20:46:30 blockdev_crypto_aesni -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:32:37.970 20:46:30 blockdev_crypto_aesni -- bdev/blockdev.sh@752 -- # hello_world_bdev=crypto_ram 00:32:37.970 20:46:30 blockdev_crypto_aesni -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT 00:32:37.970 20:46:30 blockdev_crypto_aesni -- bdev/blockdev.sh@754 -- # killprocess 1535527 00:32:37.970 20:46:30 blockdev_crypto_aesni -- common/autotest_common.sh@948 -- # '[' -z 1535527 ']' 00:32:37.970 20:46:30 blockdev_crypto_aesni -- common/autotest_common.sh@952 -- # kill -0 1535527 00:32:37.970 20:46:30 blockdev_crypto_aesni -- common/autotest_common.sh@953 -- # uname 00:32:37.970 20:46:30 blockdev_crypto_aesni -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:32:37.970 20:46:30 blockdev_crypto_aesni -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1535527 00:32:37.970 20:46:30 blockdev_crypto_aesni -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:32:37.970 20:46:30 blockdev_crypto_aesni -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:32:37.970 20:46:30 blockdev_crypto_aesni -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1535527' 00:32:37.970 killing process with pid 1535527 00:32:37.970 20:46:30 blockdev_crypto_aesni -- common/autotest_common.sh@967 -- # kill 1535527 00:32:37.970 20:46:30 blockdev_crypto_aesni -- common/autotest_common.sh@972 -- # wait 1535527 00:32:38.535 20:46:30 
blockdev_crypto_aesni -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:32:38.535 20:46:30 blockdev_crypto_aesni -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:32:38.535 20:46:30 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:32:38.535 20:46:30 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:38.535 20:46:30 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:32:38.535 ************************************ 00:32:38.535 START TEST bdev_hello_world 00:32:38.535 ************************************ 00:32:38.535 20:46:30 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:32:38.535 [2024-07-15 20:46:30.876511] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
00:32:38.535 [2024-07-15 20:46:30.876582] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1536203 ] 00:32:38.792 [2024-07-15 20:46:31.003668] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:32:38.792 [2024-07-15 20:46:31.110622] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:38.792 [2024-07-15 20:46:31.131903] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:32:38.792 [2024-07-15 20:46:31.139934] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:32:38.792 [2024-07-15 20:46:31.147959] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:32:39.050 [2024-07-15 20:46:31.256547] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:32:41.575 [2024-07-15 20:46:33.490973] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:32:41.575 [2024-07-15 20:46:33.491047] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:32:41.575 [2024-07-15 20:46:33.491063] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:41.575 [2024-07-15 20:46:33.498984] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:32:41.575 [2024-07-15 20:46:33.499003] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:32:41.575 [2024-07-15 20:46:33.499015] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:41.575 [2024-07-15 20:46:33.507004] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found 
key "test_dek_aesni_cbc_3" 00:32:41.575 [2024-07-15 20:46:33.507027] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:32:41.575 [2024-07-15 20:46:33.507038] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:41.575 [2024-07-15 20:46:33.515025] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:32:41.575 [2024-07-15 20:46:33.515043] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:32:41.575 [2024-07-15 20:46:33.515055] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:41.575 [2024-07-15 20:46:33.587946] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:32:41.575 [2024-07-15 20:46:33.587990] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:32:41.575 [2024-07-15 20:46:33.588009] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:32:41.575 [2024-07-15 20:46:33.589311] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:32:41.575 [2024-07-15 20:46:33.589388] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:32:41.575 [2024-07-15 20:46:33.589404] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:32:41.575 [2024-07-15 20:46:33.589450] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 
00:32:41.575 00:32:41.575 [2024-07-15 20:46:33.589469] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:32:41.834 00:32:41.834 real 0m3.153s 00:32:41.834 user 0m2.726s 00:32:41.834 sys 0m0.387s 00:32:41.834 20:46:33 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@1124 -- # xtrace_disable 00:32:41.834 20:46:33 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:32:41.834 ************************************ 00:32:41.834 END TEST bdev_hello_world 00:32:41.834 ************************************ 00:32:41.834 20:46:34 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 00:32:41.834 20:46:34 blockdev_crypto_aesni -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:32:41.834 20:46:34 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:32:41.834 20:46:34 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:41.834 20:46:34 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:32:41.834 ************************************ 00:32:41.834 START TEST bdev_bounds 00:32:41.834 ************************************ 00:32:41.834 20:46:34 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@1123 -- # bdev_bounds '' 00:32:41.834 20:46:34 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=1536610 00:32:41.834 20:46:34 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:32:41.834 20:46:34 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:32:41.834 20:46:34 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 1536610' 00:32:41.834 Process bdevio pid: 1536610 00:32:41.834 20:46:34 
blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@293 -- # waitforlisten 1536610 00:32:41.834 20:46:34 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@829 -- # '[' -z 1536610 ']' 00:32:41.834 20:46:34 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:32:41.834 20:46:34 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@834 -- # local max_retries=100 00:32:41.834 20:46:34 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:32:41.834 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:32:41.834 20:46:34 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@838 -- # xtrace_disable 00:32:41.834 20:46:34 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:32:41.834 [2024-07-15 20:46:34.157264] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
00:32:41.834 [2024-07-15 20:46:34.157405] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1536610 ] 00:32:42.092 [2024-07-15 20:46:34.354349] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:32:42.092 [2024-07-15 20:46:34.459069] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:32:42.092 [2024-07-15 20:46:34.459168] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:32:42.092 [2024-07-15 20:46:34.459171] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:42.350 [2024-07-15 20:46:34.480661] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:32:42.350 [2024-07-15 20:46:34.488680] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:32:42.350 [2024-07-15 20:46:34.496712] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:32:42.350 [2024-07-15 20:46:34.603955] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:32:44.921 [2024-07-15 20:46:36.813858] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:32:44.921 [2024-07-15 20:46:36.813949] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:32:44.921 [2024-07-15 20:46:36.813965] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:44.921 [2024-07-15 20:46:36.821875] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:32:44.921 [2024-07-15 20:46:36.821895] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:32:44.921 [2024-07-15 
20:46:36.821907] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:44.921 [2024-07-15 20:46:36.829897] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:32:44.921 [2024-07-15 20:46:36.829919] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:32:44.921 [2024-07-15 20:46:36.829939] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:44.921 [2024-07-15 20:46:36.837922] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:32:44.921 [2024-07-15 20:46:36.837945] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:32:44.921 [2024-07-15 20:46:36.837957] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:44.921 20:46:36 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:32:44.921 20:46:36 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@862 -- # return 0 00:32:44.921 20:46:36 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:32:44.921 I/O targets: 00:32:44.921 crypto_ram: 65536 blocks of 512 bytes (32 MiB) 00:32:44.921 crypto_ram2: 65536 blocks of 512 bytes (32 MiB) 00:32:44.921 crypto_ram3: 8192 blocks of 4096 bytes (32 MiB) 00:32:44.921 crypto_ram4: 8192 blocks of 4096 bytes (32 MiB) 00:32:44.921 00:32:44.921 00:32:44.921 CUnit - A unit testing framework for C - Version 2.1-3 00:32:44.921 http://cunit.sourceforge.net/ 00:32:44.921 00:32:44.921 00:32:44.921 Suite: bdevio tests on: crypto_ram4 00:32:44.921 Test: blockdev write read block ...passed 00:32:44.921 Test: blockdev write zeroes read block ...passed 00:32:44.921 Test: blockdev write zeroes read no split ...passed 00:32:44.921 Test: blockdev 
write zeroes read split ...passed 00:32:44.921 Test: blockdev write zeroes read split partial ...passed 00:32:44.921 Test: blockdev reset ...passed 00:32:44.921 Test: blockdev write read 8 blocks ...passed 00:32:44.921 Test: blockdev write read size > 128k ...passed 00:32:44.921 Test: blockdev write read invalid size ...passed 00:32:44.921 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:32:44.921 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:32:44.921 Test: blockdev write read max offset ...passed 00:32:44.921 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:32:44.921 Test: blockdev writev readv 8 blocks ...passed 00:32:44.921 Test: blockdev writev readv 30 x 1block ...passed 00:32:44.921 Test: blockdev writev readv block ...passed 00:32:44.921 Test: blockdev writev readv size > 128k ...passed 00:32:44.921 Test: blockdev writev readv size > 128k in two iovs ...passed 00:32:44.921 Test: blockdev comparev and writev ...passed 00:32:44.921 Test: blockdev nvme passthru rw ...passed 00:32:44.921 Test: blockdev nvme passthru vendor specific ...passed 00:32:44.921 Test: blockdev nvme admin passthru ...passed 00:32:44.921 Test: blockdev copy ...passed 00:32:44.921 Suite: bdevio tests on: crypto_ram3 00:32:44.921 Test: blockdev write read block ...passed 00:32:44.921 Test: blockdev write zeroes read block ...passed 00:32:44.921 Test: blockdev write zeroes read no split ...passed 00:32:44.921 Test: blockdev write zeroes read split ...passed 00:32:44.921 Test: blockdev write zeroes read split partial ...passed 00:32:44.922 Test: blockdev reset ...passed 00:32:44.922 Test: blockdev write read 8 blocks ...passed 00:32:44.922 Test: blockdev write read size > 128k ...passed 00:32:44.922 Test: blockdev write read invalid size ...passed 00:32:44.922 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:32:44.922 Test: blockdev write read offset + nbytes > size of blockdev 
...passed 00:32:44.922 Test: blockdev write read max offset ...passed 00:32:44.922 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:32:44.922 Test: blockdev writev readv 8 blocks ...passed 00:32:44.922 Test: blockdev writev readv 30 x 1block ...passed 00:32:44.922 Test: blockdev writev readv block ...passed 00:32:44.922 Test: blockdev writev readv size > 128k ...passed 00:32:44.922 Test: blockdev writev readv size > 128k in two iovs ...passed 00:32:44.922 Test: blockdev comparev and writev ...passed 00:32:44.922 Test: blockdev nvme passthru rw ...passed 00:32:44.922 Test: blockdev nvme passthru vendor specific ...passed 00:32:44.922 Test: blockdev nvme admin passthru ...passed 00:32:44.922 Test: blockdev copy ...passed 00:32:44.922 Suite: bdevio tests on: crypto_ram2 00:32:44.922 Test: blockdev write read block ...passed 00:32:44.922 Test: blockdev write zeroes read block ...passed 00:32:44.922 Test: blockdev write zeroes read no split ...passed 00:32:45.180 Test: blockdev write zeroes read split ...passed 00:32:45.180 Test: blockdev write zeroes read split partial ...passed 00:32:45.180 Test: blockdev reset ...passed 00:32:45.180 Test: blockdev write read 8 blocks ...passed 00:32:45.180 Test: blockdev write read size > 128k ...passed 00:32:45.180 Test: blockdev write read invalid size ...passed 00:32:45.180 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:32:45.180 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:32:45.180 Test: blockdev write read max offset ...passed 00:32:45.180 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:32:45.180 Test: blockdev writev readv 8 blocks ...passed 00:32:45.180 Test: blockdev writev readv 30 x 1block ...passed 00:32:45.180 Test: blockdev writev readv block ...passed 00:32:45.180 Test: blockdev writev readv size > 128k ...passed 00:32:45.180 Test: blockdev writev readv size > 128k in two iovs ...passed 00:32:45.180 Test: 
blockdev comparev and writev ...passed 00:32:45.180 Test: blockdev nvme passthru rw ...passed 00:32:45.180 Test: blockdev nvme passthru vendor specific ...passed 00:32:45.180 Test: blockdev nvme admin passthru ...passed 00:32:45.180 Test: blockdev copy ...passed 00:32:45.180 Suite: bdevio tests on: crypto_ram 00:32:45.180 Test: blockdev write read block ...passed 00:32:45.180 Test: blockdev write zeroes read block ...passed 00:32:45.438 Test: blockdev write zeroes read no split ...passed 00:32:45.438 Test: blockdev write zeroes read split ...passed 00:32:45.697 Test: blockdev write zeroes read split partial ...passed 00:32:45.697 Test: blockdev reset ...passed 00:32:45.697 Test: blockdev write read 8 blocks ...passed 00:32:45.697 Test: blockdev write read size > 128k ...passed 00:32:45.697 Test: blockdev write read invalid size ...passed 00:32:45.697 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:32:45.697 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:32:45.697 Test: blockdev write read max offset ...passed 00:32:45.697 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:32:45.697 Test: blockdev writev readv 8 blocks ...passed 00:32:45.697 Test: blockdev writev readv 30 x 1block ...passed 00:32:45.697 Test: blockdev writev readv block ...passed 00:32:45.697 Test: blockdev writev readv size > 128k ...passed 00:32:45.697 Test: blockdev writev readv size > 128k in two iovs ...passed 00:32:45.697 Test: blockdev comparev and writev ...passed 00:32:45.697 Test: blockdev nvme passthru rw ...passed 00:32:45.697 Test: blockdev nvme passthru vendor specific ...passed 00:32:45.697 Test: blockdev nvme admin passthru ...passed 00:32:45.697 Test: blockdev copy ...passed 00:32:45.697 00:32:45.697 Run Summary: Type Total Ran Passed Failed Inactive 00:32:45.697 suites 4 4 n/a 0 0 00:32:45.697 tests 92 92 92 0 0 00:32:45.697 asserts 520 520 520 0 n/a 00:32:45.697 00:32:45.697 Elapsed time = 1.638 
seconds 00:32:45.697 0 00:32:45.697 20:46:37 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 1536610 00:32:45.697 20:46:37 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@948 -- # '[' -z 1536610 ']' 00:32:45.697 20:46:37 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@952 -- # kill -0 1536610 00:32:45.697 20:46:37 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@953 -- # uname 00:32:45.697 20:46:37 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:32:45.697 20:46:37 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1536610 00:32:45.697 20:46:37 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:32:45.697 20:46:37 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:32:45.697 20:46:37 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1536610' 00:32:45.697 killing process with pid 1536610 00:32:45.697 20:46:37 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@967 -- # kill 1536610 00:32:45.697 20:46:37 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@972 -- # wait 1536610 00:32:45.957 20:46:38 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:32:45.957 00:32:45.957 real 0m4.275s 00:32:45.957 user 0m11.265s 00:32:45.957 sys 0m0.634s 00:32:45.957 20:46:38 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@1124 -- # xtrace_disable 00:32:45.957 20:46:38 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:32:45.957 ************************************ 00:32:45.957 END TEST bdev_bounds 00:32:45.957 ************************************ 00:32:46.216 20:46:38 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 00:32:46.216 20:46:38 
blockdev_crypto_aesni -- bdev/blockdev.sh@762 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '' 00:32:46.216 20:46:38 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:32:46.216 20:46:38 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:46.216 20:46:38 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:32:46.216 ************************************ 00:32:46.216 START TEST bdev_nbd 00:32:46.216 ************************************ 00:32:46.216 20:46:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@1123 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '' 00:32:46.216 20:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:32:46.216 20:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:32:46.216 20:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:46.216 20:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:32:46.216 20:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:32:46.216 20:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:32:46.216 20:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=4 00:32:46.216 20:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]] 00:32:46.216 20:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' 
'/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:32:46.216 20:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:32:46.216 20:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@312 -- # bdev_num=4 00:32:46.216 20:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:32:46.216 20:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@314 -- # local nbd_list 00:32:46.216 20:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:32:46.216 20:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:32:46.216 20:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=1537168 00:32:46.216 20:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:32:46.216 20:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:32:46.216 20:46:38 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 1537168 /var/tmp/spdk-nbd.sock 00:32:46.216 20:46:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@829 -- # '[' -z 1537168 ']' 00:32:46.216 20:46:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:32:46.216 20:46:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@834 -- # local max_retries=100 00:32:46.216 20:46:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 
00:32:46.216 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:32:46.216 20:46:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@838 -- # xtrace_disable 00:32:46.216 20:46:38 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:32:46.216 [2024-07-15 20:46:38.477550] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 00:32:46.216 [2024-07-15 20:46:38.477616] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:32:46.475 [2024-07-15 20:46:38.606140] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:32:46.475 [2024-07-15 20:46:38.710579] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:46.475 [2024-07-15 20:46:38.731850] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:32:46.475 [2024-07-15 20:46:38.739871] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:32:46.475 [2024-07-15 20:46:38.747889] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:32:46.475 [2024-07-15 20:46:38.853555] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:32:49.007 [2024-07-15 20:46:41.090467] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:32:49.007 [2024-07-15 20:46:41.090536] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:32:49.007 [2024-07-15 20:46:41.090551] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:49.007 [2024-07-15 20:46:41.098487] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: 
Found key "test_dek_aesni_cbc_2" 00:32:49.007 [2024-07-15 20:46:41.098506] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:32:49.007 [2024-07-15 20:46:41.098517] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:49.007 [2024-07-15 20:46:41.106507] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:32:49.007 [2024-07-15 20:46:41.106524] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:32:49.007 [2024-07-15 20:46:41.106535] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:49.007 [2024-07-15 20:46:41.114528] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:32:49.007 [2024-07-15 20:46:41.114545] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:32:49.008 [2024-07-15 20:46:41.114556] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:49.008 20:46:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:32:49.008 20:46:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@862 -- # return 0 00:32:49.008 20:46:41 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' 00:32:49.008 20:46:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:49.008 20:46:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:32:49.008 20:46:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:32:49.008 20:46:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx 
/var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' 00:32:49.008 20:46:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:49.008 20:46:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:32:49.008 20:46:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:32:49.008 20:46:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:32:49.008 20:46:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:32:49.008 20:46:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:32:49.008 20:46:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:32:49.008 20:46:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:32:49.267 20:46:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:32:49.267 20:46:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:32:49.267 20:46:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:32:49.267 20:46:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:32:49.267 20:46:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:32:49.267 20:46:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:32:49.267 20:46:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:32:49.267 20:46:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:32:49.267 20:46:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:32:49.267 20:46:41 
blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:32:49.267 20:46:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:32:49.267 20:46:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:32:49.267 1+0 records in 00:32:49.267 1+0 records out 00:32:49.267 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000289937 s, 14.1 MB/s 00:32:49.267 20:46:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:49.267 20:46:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:32:49.267 20:46:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:49.267 20:46:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:32:49.267 20:46:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:32:49.267 20:46:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:32:49.267 20:46:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:32:49.267 20:46:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 00:32:49.526 20:46:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:32:49.526 20:46:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:32:49.526 20:46:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:32:49.526 20:46:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:32:49.526 20:46:41 
blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:32:49.526 20:46:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:32:49.526 20:46:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:32:49.526 20:46:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:32:49.526 20:46:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:32:49.526 20:46:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:32:49.526 20:46:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:32:49.526 20:46:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:32:49.526 1+0 records in 00:32:49.526 1+0 records out 00:32:49.526 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000307248 s, 13.3 MB/s 00:32:49.526 20:46:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:49.526 20:46:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:32:49.526 20:46:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:49.526 20:46:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:32:49.526 20:46:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:32:49.526 20:46:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:32:49.526 20:46:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:32:49.526 20:46:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 00:32:49.784 20:46:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:32:49.784 20:46:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd2 00:32:49.784 20:46:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:32:49.784 20:46:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd2 00:32:49.784 20:46:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:32:49.784 20:46:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:32:49.784 20:46:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:32:49.784 20:46:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd2 /proc/partitions 00:32:49.784 20:46:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:32:49.785 20:46:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:32:49.785 20:46:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:32:49.785 20:46:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:32:49.785 1+0 records in 00:32:49.785 1+0 records out 00:32:49.785 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000299832 s, 13.7 MB/s 00:32:49.785 20:46:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:49.785 20:46:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:32:49.785 20:46:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:49.785 20:46:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:32:49.785 20:46:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:32:49.785 20:46:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:32:49.785 20:46:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:32:49.785 20:46:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram4 00:32:50.043 20:46:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:32:50.043 20:46:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:32:50.043 20:46:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:32:50.043 20:46:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd3 00:32:50.043 20:46:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:32:50.043 20:46:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:32:50.043 20:46:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:32:50.043 20:46:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd3 /proc/partitions 00:32:50.043 20:46:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:32:50.043 20:46:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:32:50.043 20:46:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:32:50.043 20:46:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 
count=1 iflag=direct 00:32:50.043 1+0 records in 00:32:50.043 1+0 records out 00:32:50.043 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000333645 s, 12.3 MB/s 00:32:50.043 20:46:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:50.043 20:46:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:32:50.043 20:46:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:50.043 20:46:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:32:50.043 20:46:42 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:32:50.043 20:46:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:32:50.043 20:46:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:32:50.300 20:46:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:32:50.300 20:46:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:32:50.300 { 00:32:50.300 "nbd_device": "/dev/nbd0", 00:32:50.300 "bdev_name": "crypto_ram" 00:32:50.300 }, 00:32:50.300 { 00:32:50.300 "nbd_device": "/dev/nbd1", 00:32:50.300 "bdev_name": "crypto_ram2" 00:32:50.300 }, 00:32:50.300 { 00:32:50.300 "nbd_device": "/dev/nbd2", 00:32:50.300 "bdev_name": "crypto_ram3" 00:32:50.300 }, 00:32:50.300 { 00:32:50.300 "nbd_device": "/dev/nbd3", 00:32:50.300 "bdev_name": "crypto_ram4" 00:32:50.300 } 00:32:50.300 ]' 00:32:50.300 20:46:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:32:50.300 20:46:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:32:50.300 { 
00:32:50.300 "nbd_device": "/dev/nbd0", 00:32:50.300 "bdev_name": "crypto_ram" 00:32:50.300 }, 00:32:50.300 { 00:32:50.300 "nbd_device": "/dev/nbd1", 00:32:50.300 "bdev_name": "crypto_ram2" 00:32:50.300 }, 00:32:50.300 { 00:32:50.300 "nbd_device": "/dev/nbd2", 00:32:50.300 "bdev_name": "crypto_ram3" 00:32:50.300 }, 00:32:50.300 { 00:32:50.300 "nbd_device": "/dev/nbd3", 00:32:50.301 "bdev_name": "crypto_ram4" 00:32:50.301 } 00:32:50.301 ]' 00:32:50.301 20:46:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:32:50.301 20:46:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3' 00:32:50.301 20:46:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:50.301 20:46:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3') 00:32:50.301 20:46:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:32:50.301 20:46:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:32:50.301 20:46:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:50.301 20:46:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:32:50.558 20:46:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:32:50.558 20:46:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:32:50.558 20:46:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:32:50.558 20:46:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:50.558 20:46:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:50.558 20:46:42 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:32:50.558 20:46:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:50.558 20:46:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:50.558 20:46:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:50.558 20:46:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:32:50.815 20:46:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:32:51.073 20:46:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:32:51.073 20:46:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:32:51.073 20:46:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:51.073 20:46:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:51.073 20:46:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:32:51.073 20:46:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:51.073 20:46:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:51.073 20:46:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:51.073 20:46:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:32:51.331 20:46:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:32:51.331 20:46:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:32:51.331 20:46:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 
00:32:51.331 20:46:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:51.331 20:46:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:51.331 20:46:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:32:51.331 20:46:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:51.331 20:46:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:51.331 20:46:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:51.331 20:46:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:32:51.331 20:46:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:32:51.331 20:46:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:32:51.331 20:46:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:32:51.331 20:46:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:51.331 20:46:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:51.331 20:46:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:32:51.331 20:46:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:51.331 20:46:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:51.331 20:46:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:32:51.331 20:46:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:51.331 20:46:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-nbd.sock nbd_get_disks 00:32:51.590 20:46:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:32:51.590 20:46:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:32:51.590 20:46:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:32:51.849 20:46:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:32:51.849 20:46:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:32:51.849 20:46:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:32:51.849 20:46:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:32:51.849 20:46:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:32:51.849 20:46:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:32:51.849 20:46:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:32:51.849 20:46:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:32:51.849 20:46:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:32:51.849 20:46:43 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:32:51.849 20:46:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:51.849 20:46:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:32:51.849 20:46:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:32:51.849 20:46:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:32:51.849 20:46:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@92 
-- # local nbd_list 00:32:51.849 20:46:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:32:51.849 20:46:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:51.849 20:46:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:32:51.849 20:46:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:32:51.849 20:46:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:32:51.849 20:46:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:32:51.849 20:46:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:32:51.849 20:46:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:32:51.849 20:46:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:32:51.849 20:46:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:32:52.108 /dev/nbd0 00:32:52.108 20:46:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:32:52.108 20:46:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:32:52.108 20:46:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:32:52.108 20:46:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:32:52.108 20:46:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:32:52.108 20:46:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:32:52.108 
20:46:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:32:52.108 20:46:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:32:52.108 20:46:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:32:52.108 20:46:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:32:52.108 20:46:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:32:52.108 1+0 records in 00:32:52.108 1+0 records out 00:32:52.108 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000287887 s, 14.2 MB/s 00:32:52.108 20:46:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:52.108 20:46:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:32:52.108 20:46:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:52.108 20:46:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:32:52.108 20:46:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:32:52.108 20:46:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:32:52.108 20:46:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:32:52.108 20:46:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 /dev/nbd1 00:32:52.367 /dev/nbd1 00:32:52.367 20:46:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:32:52.367 20:46:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- 
# waitfornbd nbd1 00:32:52.367 20:46:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:32:52.367 20:46:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:32:52.367 20:46:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:32:52.367 20:46:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:32:52.367 20:46:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:32:52.367 20:46:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:32:52.367 20:46:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:32:52.367 20:46:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:32:52.367 20:46:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:32:52.367 1+0 records in 00:32:52.367 1+0 records out 00:32:52.367 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000326435 s, 12.5 MB/s 00:32:52.367 20:46:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:52.367 20:46:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:32:52.367 20:46:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:52.367 20:46:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:32:52.367 20:46:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:32:52.367 20:46:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:32:52.367 20:46:44 blockdev_crypto_aesni.bdev_nbd -- 
bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:32:52.367 20:46:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd10 00:32:52.625 /dev/nbd10 00:32:52.625 20:46:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:32:52.625 20:46:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:32:52.625 20:46:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd10 00:32:52.625 20:46:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:32:52.625 20:46:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:32:52.625 20:46:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:32:52.625 20:46:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd10 /proc/partitions 00:32:52.625 20:46:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:32:52.625 20:46:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:32:52.625 20:46:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:32:52.625 20:46:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:32:52.625 1+0 records in 00:32:52.625 1+0 records out 00:32:52.625 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000299645 s, 13.7 MB/s 00:32:52.625 20:46:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:52.625 20:46:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:32:52.625 20:46:44 blockdev_crypto_aesni.bdev_nbd -- 
common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:52.625 20:46:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:32:52.625 20:46:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:32:52.625 20:46:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:32:52.625 20:46:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:32:52.625 20:46:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram4 /dev/nbd11 00:32:52.884 /dev/nbd11 00:32:52.884 20:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:32:52.884 20:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11 00:32:52.884 20:46:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd11 00:32:52.884 20:46:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:32:52.884 20:46:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:32:52.884 20:46:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:32:52.884 20:46:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd11 /proc/partitions 00:32:52.884 20:46:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:32:52.884 20:46:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:32:52.884 20:46:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:32:52.884 20:46:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 
00:32:52.884 1+0 records in 00:32:52.884 1+0 records out 00:32:52.884 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000315632 s, 13.0 MB/s 00:32:52.884 20:46:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:52.884 20:46:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:32:52.884 20:46:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:52.884 20:46:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:32:52.884 20:46:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:32:52.884 20:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:32:52.884 20:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:32:52.884 20:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:32:52.884 20:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:52.884 20:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:32:53.142 20:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:32:53.142 { 00:32:53.142 "nbd_device": "/dev/nbd0", 00:32:53.142 "bdev_name": "crypto_ram" 00:32:53.142 }, 00:32:53.142 { 00:32:53.142 "nbd_device": "/dev/nbd1", 00:32:53.142 "bdev_name": "crypto_ram2" 00:32:53.142 }, 00:32:53.142 { 00:32:53.142 "nbd_device": "/dev/nbd10", 00:32:53.142 "bdev_name": "crypto_ram3" 00:32:53.142 }, 00:32:53.142 { 00:32:53.142 "nbd_device": "/dev/nbd11", 00:32:53.142 "bdev_name": "crypto_ram4" 00:32:53.142 } 00:32:53.142 ]' 00:32:53.142 20:46:45 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:32:53.142 { 00:32:53.142 "nbd_device": "/dev/nbd0", 00:32:53.142 "bdev_name": "crypto_ram" 00:32:53.142 }, 00:32:53.142 { 00:32:53.142 "nbd_device": "/dev/nbd1", 00:32:53.142 "bdev_name": "crypto_ram2" 00:32:53.142 }, 00:32:53.142 { 00:32:53.142 "nbd_device": "/dev/nbd10", 00:32:53.142 "bdev_name": "crypto_ram3" 00:32:53.142 }, 00:32:53.142 { 00:32:53.142 "nbd_device": "/dev/nbd11", 00:32:53.142 "bdev_name": "crypto_ram4" 00:32:53.142 } 00:32:53.142 ]' 00:32:53.142 20:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:32:53.142 20:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:32:53.142 /dev/nbd1 00:32:53.142 /dev/nbd10 00:32:53.142 /dev/nbd11' 00:32:53.142 20:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:32:53.142 /dev/nbd1 00:32:53.142 /dev/nbd10 00:32:53.142 /dev/nbd11' 00:32:53.142 20:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:32:53.142 20:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=4 00:32:53.142 20:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 4 00:32:53.142 20:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=4 00:32:53.142 20:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 4 -ne 4 ']' 00:32:53.142 20:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' write 00:32:53.142 20:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:32:53.142 20:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:32:53.142 20:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:32:53.142 20:46:45 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:32:53.142 20:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:32:53.142 20:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:32:53.142 256+0 records in 00:32:53.142 256+0 records out 00:32:53.142 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00767379 s, 137 MB/s 00:32:53.142 20:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:32:53.142 20:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:32:53.401 256+0 records in 00:32:53.401 256+0 records out 00:32:53.401 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0619984 s, 16.9 MB/s 00:32:53.401 20:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:32:53.401 20:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:32:53.401 256+0 records in 00:32:53.401 256+0 records out 00:32:53.401 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0666368 s, 15.7 MB/s 00:32:53.401 20:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:32:53.401 20:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:32:53.401 256+0 records in 00:32:53.401 256+0 records out 00:32:53.401 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.060877 s, 17.2 MB/s 00:32:53.401 20:46:45 blockdev_crypto_aesni.bdev_nbd -- 
bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:32:53.401 20:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:32:53.401 256+0 records in 00:32:53.401 256+0 records out 00:32:53.401 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0567561 s, 18.5 MB/s 00:32:53.401 20:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' verify 00:32:53.401 20:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:32:53.401 20:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:32:53.401 20:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:32:53.401 20:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:32:53.401 20:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:32:53.401 20:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:32:53.401 20:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:32:53.401 20:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:32:53.660 20:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:32:53.660 20:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:32:53.660 20:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:32:53.660 20:46:45 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:32:53.660 20:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:32:53.660 20:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:32:53.660 20:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:32:53.660 20:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:32:53.660 20:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:53.660 20:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:32:53.660 20:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:32:53.660 20:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:32:53.660 20:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:53.660 20:46:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:32:53.919 20:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:32:53.919 20:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:32:53.919 20:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:32:53.919 20:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:53.919 20:46:46 blockdev_crypto_aesni.bdev_nbd -- 
bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:53.919 20:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:32:53.919 20:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:53.919 20:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:53.919 20:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:53.919 20:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:32:54.177 20:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:32:54.177 20:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:32:54.177 20:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:32:54.177 20:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:54.177 20:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:54.177 20:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:32:54.177 20:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:54.177 20:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:54.177 20:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:54.177 20:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:32:54.434 20:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:32:54.434 20:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:32:54.434 20:46:46 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:32:54.434 20:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:54.434 20:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:54.434 20:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:32:54.434 20:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:54.434 20:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:54.434 20:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:54.434 20:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:32:54.692 20:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:32:54.692 20:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:32:54.692 20:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:32:54.692 20:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:54.692 20:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:54.692 20:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:32:54.692 20:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:54.692 20:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:54.692 20:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:32:54.692 20:46:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:54.692 20:46:46 blockdev_crypto_aesni.bdev_nbd -- 
bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:32:54.950 20:46:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:32:54.950 20:46:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:32:54.950 20:46:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:32:54.950 20:46:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:32:54.950 20:46:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:32:54.950 20:46:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:32:54.950 20:46:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:32:54.950 20:46:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:32:54.950 20:46:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:32:54.950 20:46:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:32:54.950 20:46:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:32:54.950 20:46:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:32:54.950 20:46:47 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:32:54.950 20:46:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:54.950 20:46:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:32:54.950 20:46:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:32:54.950 20:46:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:32:54.950 20:46:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@135 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:32:55.209 malloc_lvol_verify 00:32:55.209 20:46:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:32:55.468 daeaddde-b78b-4021-8aaa-ad594b06e897 00:32:55.468 20:46:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:32:55.727 1c4f63fc-bd91-467c-9087-2aa6cbe730e7 00:32:55.727 20:46:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:32:55.985 /dev/nbd0 00:32:55.985 20:46:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:32:55.985 mke2fs 1.46.5 (30-Dec-2021) 00:32:55.985 Discarding device blocks: 0/4096 done 00:32:55.985 Creating filesystem with 4096 1k blocks and 1024 inodes 00:32:55.985 00:32:55.985 Allocating group tables: 0/1 done 00:32:55.985 Writing inode tables: 0/1 done 00:32:55.985 Creating journal (1024 blocks): done 00:32:55.985 Writing superblocks and filesystem accounting information: 0/1 done 00:32:55.985 00:32:55.985 20:46:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:32:55.985 20:46:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:32:55.985 20:46:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:55.985 20:46:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:32:55.985 20:46:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 
00:32:55.985 20:46:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:32:55.985 20:46:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:55.985 20:46:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:32:56.244 20:46:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:32:56.244 20:46:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:32:56.244 20:46:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:32:56.244 20:46:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:56.244 20:46:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:56.244 20:46:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:32:56.244 20:46:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:56.244 20:46:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:56.244 20:46:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:32:56.244 20:46:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:32:56.244 20:46:48 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 1537168 00:32:56.244 20:46:48 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@948 -- # '[' -z 1537168 ']' 00:32:56.244 20:46:48 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@952 -- # kill -0 1537168 00:32:56.244 20:46:48 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@953 -- # uname 00:32:56.244 20:46:48 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:32:56.244 20:46:48 blockdev_crypto_aesni.bdev_nbd -- 
common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1537168 00:32:56.244 20:46:48 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:32:56.244 20:46:48 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:32:56.244 20:46:48 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1537168' 00:32:56.244 killing process with pid 1537168 00:32:56.244 20:46:48 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@967 -- # kill 1537168 00:32:56.244 20:46:48 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@972 -- # wait 1537168 00:32:56.811 20:46:48 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT 00:32:56.811 00:32:56.811 real 0m10.474s 00:32:56.811 user 0m13.742s 00:32:56.811 sys 0m4.222s 00:32:56.811 20:46:48 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@1124 -- # xtrace_disable 00:32:56.811 20:46:48 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:32:56.811 ************************************ 00:32:56.811 END TEST bdev_nbd 00:32:56.811 ************************************ 00:32:56.811 20:46:48 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 00:32:56.811 20:46:48 blockdev_crypto_aesni -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:32:56.811 20:46:48 blockdev_crypto_aesni -- bdev/blockdev.sh@764 -- # '[' crypto_aesni = nvme ']' 00:32:56.811 20:46:48 blockdev_crypto_aesni -- bdev/blockdev.sh@764 -- # '[' crypto_aesni = gpt ']' 00:32:56.811 20:46:48 blockdev_crypto_aesni -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:32:56.811 20:46:48 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:32:56.811 20:46:48 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:56.811 20:46:48 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 
00:32:56.811 ************************************ 00:32:56.811 START TEST bdev_fio 00:32:56.811 ************************************ 00:32:56.811 20:46:48 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1123 -- # fio_test_suite '' 00:32:56.811 20:46:48 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:32:56.811 20:46:48 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:32:56.811 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:32:56.811 20:46:48 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:32:56.811 20:46:48 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:32:56.811 20:46:48 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:32:56.812 20:46:48 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 00:32:56.812 20:46:48 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:32:56.812 20:46:48 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:32:56.812 20:46:48 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=verify 00:32:56.812 20:46:48 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:32:56.812 20:46:48 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:32:56.812 20:46:48 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:32:56.812 20:46:48 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:32:56.812 20:46:48 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:32:56.812 20:46:48 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:32:56.812 20:46:48 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:32:56.812 20:46:48 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:32:56.812 20:46:48 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:32:56.812 20:46:48 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:32:56.812 20:46:48 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:32:56.812 20:46:48 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:32:56.812 20:46:49 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:32:56.812 20:46:49 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:32:56.812 20:46:49 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:32:56.812 20:46:49 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram]' 00:32:56.812 20:46:49 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram 00:32:56.812 20:46:49 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:32:56.812 20:46:49 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram2]' 00:32:56.812 20:46:49 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram2 00:32:56.812 20:46:49 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 
00:32:56.812 20:46:49 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram3]' 00:32:56.812 20:46:49 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram3 00:32:56.812 20:46:49 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:32:56.812 20:46:49 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram4]' 00:32:56.812 20:46:49 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram4 00:32:56.812 20:46:49 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:32:56.812 20:46:49 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:56.812 20:46:49 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:32:56.812 20:46:49 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:56.812 20:46:49 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:32:56.812 ************************************ 00:32:56.812 START TEST bdev_fio_rw_verify 00:32:56.812 ************************************ 00:32:56.812 20:46:49 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 
--verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:56.812 20:46:49 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:56.812 20:46:49 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:32:56.812 20:46:49 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:32:56.812 20:46:49 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:32:56.812 20:46:49 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:32:56.812 20:46:49 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:32:56.812 20:46:49 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 00:32:56.812 20:46:49 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:32:56.812 20:46:49 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:32:56.812 20:46:49 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 
00:32:56.812 20:46:49 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:32:56.812 20:46:49 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:32:56.812 20:46:49 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:32:56.812 20:46:49 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:32:56.812 20:46:49 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:32:56.812 20:46:49 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:32:56.812 20:46:49 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:32:56.812 20:46:49 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:32:56.812 20:46:49 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:32:56.812 20:46:49 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:32:56.812 20:46:49 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:57.376 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:32:57.376 
job_crypto_ram2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:32:57.376 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:32:57.376 job_crypto_ram4: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:32:57.376 fio-3.35 00:32:57.376 Starting 4 threads 00:33:12.318 00:33:12.318 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=1539207: Mon Jul 15 20:47:02 2024 00:33:12.318 read: IOPS=21.2k, BW=82.9MiB/s (87.0MB/s)(829MiB/10001msec) 00:33:12.318 slat (usec): min=10, max=435, avg=64.45, stdev=40.44 00:33:12.318 clat (usec): min=13, max=1564, avg=337.27, stdev=248.39 00:33:12.318 lat (usec): min=55, max=1760, avg=401.73, stdev=275.39 00:33:12.318 clat percentiles (usec): 00:33:12.318 | 50.000th=[ 265], 99.000th=[ 1156], 99.900th=[ 1352], 99.990th=[ 1401], 00:33:12.318 | 99.999th=[ 1549] 00:33:12.318 write: IOPS=23.4k, BW=91.3MiB/s (95.8MB/s)(890MiB/9746msec); 0 zone resets 00:33:12.318 slat (usec): min=18, max=1277, avg=76.80, stdev=41.17 00:33:12.318 clat (usec): min=34, max=2678, avg=409.12, stdev=291.07 00:33:12.318 lat (usec): min=60, max=2856, avg=485.91, stdev=318.73 00:33:12.318 clat percentiles (usec): 00:33:12.318 | 50.000th=[ 338], 99.000th=[ 1418], 99.900th=[ 1680], 99.990th=[ 1745], 00:33:12.318 | 99.999th=[ 2057] 00:33:12.318 bw ( KiB/s): min=72160, max=111496, per=97.64%, avg=91332.68, stdev=2787.00, samples=76 00:33:12.318 iops : min=18040, max=27874, avg=22833.16, stdev=696.75, samples=76 00:33:12.318 lat (usec) : 20=0.01%, 50=0.37%, 100=8.23%, 250=31.43%, 500=36.21% 00:33:12.318 lat (usec) : 750=13.52%, 1000=5.97% 00:33:12.318 lat (msec) : 2=4.28%, 4=0.01% 00:33:12.318 cpu : usr=99.62%, sys=0.00%, ctx=55, majf=0, minf=272 00:33:12.318 IO depths : 1=10.4%, 2=25.5%, 4=51.0%, 8=13.1%, 16=0.0%, 32=0.0%, >=64=0.0% 00:33:12.318 submit : 0=0.0%, 4=100.0%, 8=0.0%, 
16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:33:12.318 complete : 0=0.0%, 4=88.7%, 8=11.3%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:33:12.318 issued rwts: total=212349,227901,0,0 short=0,0,0,0 dropped=0,0,0,0 00:33:12.318 latency : target=0, window=0, percentile=100.00%, depth=8 00:33:12.318 00:33:12.318 Run status group 0 (all jobs): 00:33:12.318 READ: bw=82.9MiB/s (87.0MB/s), 82.9MiB/s-82.9MiB/s (87.0MB/s-87.0MB/s), io=829MiB (870MB), run=10001-10001msec 00:33:12.318 WRITE: bw=91.3MiB/s (95.8MB/s), 91.3MiB/s-91.3MiB/s (95.8MB/s-95.8MB/s), io=890MiB (933MB), run=9746-9746msec 00:33:12.318 00:33:12.318 real 0m13.558s 00:33:12.318 user 0m46.021s 00:33:12.318 sys 0m0.526s 00:33:12.318 20:47:02 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:12.318 20:47:02 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:33:12.318 ************************************ 00:33:12.318 END TEST bdev_fio_rw_verify 00:33:12.318 ************************************ 00:33:12.318 20:47:02 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:33:12.318 20:47:02 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:33:12.318 20:47:02 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:12.318 20:47:02 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:33:12.318 20:47:02 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:12.318 20:47:02 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:33:12.318 20:47:02 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:33:12.318 20:47:02 
blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:33:12.318 20:47:02 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:33:12.318 20:47:02 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:33:12.318 20:47:02 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:33:12.318 20:47:02 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:33:12.318 20:47:02 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:12.318 20:47:02 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:33:12.318 20:47:02 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:33:12.318 20:47:02 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:33:12.318 20:47:02 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:33:12.318 20:47:02 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:33:12.319 20:47:02 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "c839f62c-9ec8-5cd1-986e-2c365898e4ff"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "c839f62c-9ec8-5cd1-986e-2c365898e4ff",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": 
false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "c4307c9b-ef12-54ee-ad36-209480910fdf"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "c4307c9b-ef12-54ee-ad36-209480910fdf",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": 
"crypto_ram3",' ' "aliases": [' ' "afaf475a-3ce0-54d2-85ba-0566b967f86b"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "afaf475a-3ce0-54d2-85ba-0566b967f86b",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' "73e3642a-0078-54e7-88a8-035ca71f7a4c"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "73e3642a-0078-54e7-88a8-035ca71f7a4c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' 
"compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:33:12.319 20:47:02 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n crypto_ram 00:33:12.319 crypto_ram2 00:33:12.319 crypto_ram3 00:33:12.319 crypto_ram4 ]] 00:33:12.319 20:47:02 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "c839f62c-9ec8-5cd1-986e-2c365898e4ff"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "c839f62c-9ec8-5cd1-986e-2c365898e4ff",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": 
"Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "c4307c9b-ef12-54ee-ad36-209480910fdf"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "c4307c9b-ef12-54ee-ad36-209480910fdf",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "afaf475a-3ce0-54d2-85ba-0566b967f86b"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "afaf475a-3ce0-54d2-85ba-0566b967f86b",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' 
"get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' "73e3642a-0078-54e7-88a8-035ca71f7a4c"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "73e3642a-0078-54e7-88a8-035ca71f7a4c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:33:12.319 20:47:02 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- 
# jq -r 'select(.supported_io_types.unmap == true) | .name' 00:33:12.319 20:47:02 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:33:12.319 20:47:02 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram]' 00:33:12.319 20:47:02 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram 00:33:12.319 20:47:02 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:33:12.319 20:47:02 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram2]' 00:33:12.319 20:47:02 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram2 00:33:12.319 20:47:02 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:33:12.319 20:47:02 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram3]' 00:33:12.319 20:47:02 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram3 00:33:12.319 20:47:02 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:33:12.319 20:47:02 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram4]' 00:33:12.319 20:47:02 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram4 00:33:12.319 20:47:02 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 
--verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:12.319 20:47:02 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:33:12.319 20:47:02 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:12.319 20:47:02 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:33:12.319 ************************************ 00:33:12.319 START TEST bdev_fio_trim 00:33:12.319 ************************************ 00:33:12.319 20:47:02 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:12.319 20:47:02 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:12.319 20:47:02 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:33:12.319 20:47:02 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:33:12.319 20:47:02 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:33:12.319 20:47:02 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local 
plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:33:12.319 20:47:02 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:33:12.319 20:47:02 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:33:12.319 20:47:02 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:33:12.319 20:47:02 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:33:12.319 20:47:02 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan 00:33:12.319 20:47:02 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:33:12.319 20:47:02 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:33:12.319 20:47:02 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:33:12.319 20:47:02 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:33:12.319 20:47:02 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:33:12.319 20:47:02 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:33:12.319 20:47:02 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:33:12.319 20:47:02 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:33:12.319 20:47:02 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:33:12.319 20:47:02 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- 
common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:33:12.319 20:47:02 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:12.319 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:12.319 job_crypto_ram2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:12.319 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:12.319 job_crypto_ram4: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:12.319 fio-3.35 00:33:12.319 Starting 4 threads 00:33:24.518 00:33:24.518 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=1541059: Mon Jul 15 20:47:15 2024 00:33:24.518 write: IOPS=31.7k, BW=124MiB/s (130MB/s)(1239MiB/10001msec); 0 zone resets 00:33:24.518 slat (usec): min=17, max=1337, avg=70.92, stdev=38.53 00:33:24.518 clat (usec): min=56, max=2137, avg=322.79, stdev=197.85 00:33:24.518 lat (usec): min=81, max=2495, avg=393.71, stdev=221.70 00:33:24.518 clat percentiles (usec): 00:33:24.518 | 50.000th=[ 269], 99.000th=[ 955], 99.900th=[ 1123], 99.990th=[ 1319], 00:33:24.518 | 99.999th=[ 1942] 00:33:24.518 bw ( KiB/s): min=115248, max=180166, per=100.00%, avg=127469.37, stdev=6448.22, samples=76 00:33:24.518 iops : min=28812, max=45043, avg=31867.32, stdev=1612.16, samples=76 00:33:24.518 trim: IOPS=31.7k, BW=124MiB/s (130MB/s)(1239MiB/10001msec); 0 zone resets 00:33:24.518 slat 
(usec): min=6, max=404, avg=19.54, stdev= 8.39 00:33:24.518 clat (usec): min=65, max=1749, avg=303.98, stdev=150.23 00:33:24.518 lat (usec): min=76, max=1771, avg=323.53, stdev=154.28 00:33:24.518 clat percentiles (usec): 00:33:24.518 | 50.000th=[ 273], 99.000th=[ 725], 99.900th=[ 799], 99.990th=[ 963], 00:33:24.518 | 99.999th=[ 1352] 00:33:24.518 bw ( KiB/s): min=115240, max=180198, per=100.00%, avg=127471.05, stdev=6449.17, samples=76 00:33:24.518 iops : min=28810, max=45049, avg=31867.74, stdev=1612.29, samples=76 00:33:24.518 lat (usec) : 100=4.45%, 250=39.82%, 500=40.60%, 750=12.83%, 1000=1.99% 00:33:24.518 lat (msec) : 2=0.31%, 4=0.01% 00:33:24.518 cpu : usr=99.55%, sys=0.00%, ctx=74, majf=0, minf=98 00:33:24.518 IO depths : 1=7.5%, 2=26.4%, 4=52.9%, 8=13.2%, 16=0.0%, 32=0.0%, >=64=0.0% 00:33:24.518 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:33:24.518 complete : 0=0.0%, 4=88.3%, 8=11.7%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:33:24.518 issued rwts: total=0,317216,317216,0 short=0,0,0,0 dropped=0,0,0,0 00:33:24.518 latency : target=0, window=0, percentile=100.00%, depth=8 00:33:24.518 00:33:24.518 Run status group 0 (all jobs): 00:33:24.518 WRITE: bw=124MiB/s (130MB/s), 124MiB/s-124MiB/s (130MB/s-130MB/s), io=1239MiB (1299MB), run=10001-10001msec 00:33:24.518 TRIM: bw=124MiB/s (130MB/s), 124MiB/s-124MiB/s (130MB/s-130MB/s), io=1239MiB (1299MB), run=10001-10001msec 00:33:24.518 00:33:24.518 real 0m13.637s 00:33:24.518 user 0m46.560s 00:33:24.518 sys 0m0.525s 00:33:24.518 20:47:16 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:24.518 20:47:16 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:33:24.518 ************************************ 00:33:24.518 END TEST bdev_fio_trim 00:33:24.518 ************************************ 00:33:24.518 20:47:16 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 
00:33:24.518 20:47:16 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f 00:33:24.518 20:47:16 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:24.518 20:47:16 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@370 -- # popd 00:33:24.518 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:33:24.518 20:47:16 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT 00:33:24.518 00:33:24.518 real 0m27.555s 00:33:24.518 user 1m32.754s 00:33:24.518 sys 0m1.261s 00:33:24.518 20:47:16 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:24.518 20:47:16 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:33:24.518 ************************************ 00:33:24.518 END TEST bdev_fio 00:33:24.518 ************************************ 00:33:24.518 20:47:16 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 00:33:24.518 20:47:16 blockdev_crypto_aesni -- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT 00:33:24.518 20:47:16 blockdev_crypto_aesni -- bdev/blockdev.sh@777 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:33:24.518 20:47:16 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:33:24.518 20:47:16 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:24.518 20:47:16 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:33:24.518 ************************************ 00:33:24.518 START TEST bdev_verify 00:33:24.518 ************************************ 00:33:24.518 20:47:16 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@1123 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:33:24.518 [2024-07-15 20:47:16.670535] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 00:33:24.518 [2024-07-15 20:47:16.670598] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1542479 ] 00:33:24.518 [2024-07-15 20:47:16.800052] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:33:24.777 [2024-07-15 20:47:16.905267] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:33:24.777 [2024-07-15 20:47:16.905272] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:24.777 [2024-07-15 20:47:16.926626] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:33:24.777 [2024-07-15 20:47:16.934654] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:33:24.777 [2024-07-15 20:47:16.942680] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:33:24.777 [2024-07-15 20:47:17.050151] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:33:27.310 [2024-07-15 20:47:19.265031] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:33:27.310 [2024-07-15 20:47:19.265113] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:33:27.310 [2024-07-15 20:47:19.265129] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:27.310 [2024-07-15 20:47:19.273048] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: 
*NOTICE*: Found key "test_dek_aesni_cbc_2" 00:33:27.310 [2024-07-15 20:47:19.273068] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:33:27.310 [2024-07-15 20:47:19.273080] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:27.310 [2024-07-15 20:47:19.281070] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:33:27.310 [2024-07-15 20:47:19.281089] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:33:27.310 [2024-07-15 20:47:19.281100] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:27.310 [2024-07-15 20:47:19.289093] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:33:27.310 [2024-07-15 20:47:19.289111] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:33:27.310 [2024-07-15 20:47:19.289122] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:27.310 Running I/O for 5 seconds... 
00:33:32.607
00:33:32.607                                                             Latency(us)
00:33:32.607 Device Information                                                            : runtime(s)       IOPS      MiB/s    Fail/s     TO/s    Average        min        max
00:33:32.607 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:33:32.607 	 Verification LBA range: start 0x0 length 0x1000
00:33:32.607 	 crypto_ram                         :       5.07     480.05       1.88      0.00      0.00  266107.96    5157.40  161389.52
00:33:32.607 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:33:32.607 	 Verification LBA range: start 0x1000 length 0x1000
00:33:32.607 	 crypto_ram                         :       5.08     382.83       1.50      0.00      0.00  332523.07    1787.99  200597.15
00:33:32.607 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:33:32.607 	 Verification LBA range: start 0x0 length 0x1000
00:33:32.607 	 crypto_ram2                        :       5.07     479.77       1.87      0.00      0.00  265361.23    5328.36  149536.06
00:33:32.607 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:33:32.607 	 Verification LBA range: start 0x1000 length 0x1000
00:33:32.607 	 crypto_ram2                        :       5.08     385.81       1.51      0.00      0.00  329107.28    2407.74  182361.04
00:33:32.607 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:33:32.607 	 Verification LBA range: start 0x0 length 0x1000
00:33:32.607 	 crypto_ram3                        :       5.05    3701.97      14.46      0.00      0.00   34269.13    7123.48   26100.42
00:33:32.607 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:33:32.607 	 Verification LBA range: start 0x1000 length 0x1000
00:33:32.607 	 crypto_ram3                        :       5.05    2987.94      11.67      0.00      0.00   42357.13    9289.02   30773.43
00:33:32.607 Job: crypto_ram4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:33:32.607 	 Verification LBA range: start 0x0 length 0x1000
00:33:32.607 	 crypto_ram4                        :       5.05    3709.01      14.49      0.00      0.00   34121.60    1659.77   25758.50
00:33:32.607 Job: crypto_ram4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:33:32.607 	 Verification LBA range: start 0x1000 length 0x1000
00:33:32.607 	 crypto_ram4                        :       5.07    3006.82      11.75      0.00      0.00   42031.54    2934.87   30317.52
00:33:32.607 ===================================================================================================================
00:33:32.607 Total                                :              15134.20      59.12      0.00      0.00   67202.77    1659.77  200597.15
00:33:32.607
00:33:32.607 real	0m8.259s
00:33:32.607 user	0m15.624s
00:33:32.607 sys	0m0.395s
00:33:32.607 20:47:24 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@1124 -- # xtrace_disable
00:33:32.607 20:47:24 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@10 -- # set +x
00:33:32.607 ************************************
00:33:32.607 END TEST bdev_verify
00:33:32.607 ************************************
00:33:32.607 20:47:24 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0
00:33:32.607 20:47:24 blockdev_crypto_aesni -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 ''
00:33:32.607 20:47:24 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']'
00:33:32.607 20:47:24 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable
00:33:32.607 20:47:24 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x
00:33:32.607 ************************************
00:33:32.607 START TEST bdev_verify_big_io
00:33:32.607 ************************************
00:33:32.607 20:47:24 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 ''
00:33:32.866 [2024-07-15 20:47:25.013595] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization...
00:33:32.866 [2024-07-15 20:47:25.013656] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1543540 ]
00:33:32.866 [2024-07-15 20:47:25.142497] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2
00:33:32.866 [2024-07-15 20:47:25.240028] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1
00:33:32.866 [2024-07-15 20:47:25.240034] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:33:33.125 [2024-07-15 20:47:25.261680] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb
00:33:33.125 [2024-07-15 20:47:25.269711] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev
00:33:33.125 [2024-07-15 20:47:25.277742] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev
00:33:33.125 [2024-07-15 20:47:25.385689] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97
00:33:35.657 [2024-07-15 20:47:27.619367] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1"
00:33:35.657 [2024-07-15 20:47:27.619457] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0
00:33:35.657 [2024-07-15 20:47:27.619473] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:33:35.657 [2024-07-15 20:47:27.627384] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2"
00:33:35.657 [2024-07-15 20:47:27.627404] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:33:35.657 [2024-07-15 20:47:27.627416] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:33:35.657 [2024-07-15 20:47:27.635407] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3"
00:33:35.657 [2024-07-15 20:47:27.635428] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:33:35.657 [2024-07-15 20:47:27.635440] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:33:35.657 [2024-07-15 20:47:27.643429] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4"
00:33:35.657 [2024-07-15 20:47:27.643448] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:33:35.657 [2024-07-15 20:47:27.643460] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:33:35.657 Running I/O for 5 seconds...
00:33:36.591 [2024-07-15 20:47:28.615488] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:33:36.591 [... same dst_mbufs *ERROR* repeated 4 more times, 20:47:28.616058 through 20:47:28.616440 ...]
00:33:36.591 [2024-07-15 20:47:28.616906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:36.593 [... same src_mbufs *ERROR* repeated continuously, 20:47:28.618291 through 20:47:28.682752 ...]
00:33:36.593 [2024-07-15 20:47:28.683329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.593 [2024-07-15 20:47:28.683397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.593 [2024-07-15 20:47:28.683451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.593 [2024-07-15 20:47:28.683507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.593 [2024-07-15 20:47:28.685548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.593 [2024-07-15 20:47:28.685610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.593 [2024-07-15 20:47:28.685662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.593 [2024-07-15 20:47:28.685713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.593 [2024-07-15 20:47:28.686295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.593 [2024-07-15 20:47:28.686357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.593 [2024-07-15 20:47:28.686414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.593 [2024-07-15 20:47:28.686465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.593 [2024-07-15 20:47:28.688009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:36.593 [2024-07-15 20:47:28.688076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.593 [2024-07-15 20:47:28.688127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.593 [2024-07-15 20:47:28.688178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.593 [2024-07-15 20:47:28.688723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.593 [2024-07-15 20:47:28.688795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.593 [2024-07-15 20:47:28.688851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.593 [2024-07-15 20:47:28.688902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.593 [2024-07-15 20:47:28.690666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.593 [2024-07-15 20:47:28.690730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.593 [2024-07-15 20:47:28.690782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.593 [2024-07-15 20:47:28.690839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.593 [2024-07-15 20:47:28.691342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.593 [2024-07-15 20:47:28.691404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:36.593 [2024-07-15 20:47:28.691455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.593 [2024-07-15 20:47:28.691506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.593 [2024-07-15 20:47:28.693117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.593 [2024-07-15 20:47:28.693179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.593 [2024-07-15 20:47:28.693231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.593 [2024-07-15 20:47:28.693288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.593 [2024-07-15 20:47:28.693780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.593 [2024-07-15 20:47:28.693842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.593 [2024-07-15 20:47:28.693893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.593 [2024-07-15 20:47:28.693954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.593 [2024-07-15 20:47:28.695517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.593 [2024-07-15 20:47:28.695580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.593 [2024-07-15 20:47:28.695638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:36.593 [2024-07-15 20:47:28.695706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.593 [2024-07-15 20:47:28.696197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.593 [2024-07-15 20:47:28.696257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.593 [2024-07-15 20:47:28.696308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.593 [2024-07-15 20:47:28.696360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.593 [2024-07-15 20:47:28.697933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.593 [2024-07-15 20:47:28.699681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.593 [2024-07-15 20:47:28.701430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.593 [2024-07-15 20:47:28.701959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.593 [2024-07-15 20:47:28.702650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.593 [2024-07-15 20:47:28.704454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.593 [2024-07-15 20:47:28.706184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.593 [2024-07-15 20:47:28.707906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:36.593 [2024-07-15 20:47:28.711038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.593 [2024-07-15 20:47:28.712615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.593 [2024-07-15 20:47:28.713108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.593 [2024-07-15 20:47:28.714318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.593 [2024-07-15 20:47:28.716540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.593 [2024-07-15 20:47:28.718311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.593 [2024-07-15 20:47:28.719597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.593 [2024-07-15 20:47:28.721325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.593 [2024-07-15 20:47:28.723324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.593 [2024-07-15 20:47:28.725265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.593 [2024-07-15 20:47:28.727196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.593 [2024-07-15 20:47:28.729082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.593 [2024-07-15 20:47:28.731324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:36.593 [2024-07-15 20:47:28.733263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.593 [2024-07-15 20:47:28.735195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.593 [2024-07-15 20:47:28.737110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.593 [2024-07-15 20:47:28.740738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.593 [2024-07-15 20:47:28.742498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.593 [2024-07-15 20:47:28.743848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.593 [2024-07-15 20:47:28.745592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.593 [2024-07-15 20:47:28.747806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.593 [2024-07-15 20:47:28.749159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.593 [2024-07-15 20:47:28.749652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.593 [2024-07-15 20:47:28.751168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.593 [2024-07-15 20:47:28.754285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.593 [2024-07-15 20:47:28.756063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:36.593 [2024-07-15 20:47:28.757968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.593 [2024-07-15 20:47:28.759908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.593 [2024-07-15 20:47:28.761031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.594 [2024-07-15 20:47:28.762756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.594 [2024-07-15 20:47:28.764507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.594 [2024-07-15 20:47:28.766255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.594 [2024-07-15 20:47:28.769443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.594 [2024-07-15 20:47:28.770824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.594 [2024-07-15 20:47:28.771323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.594 [2024-07-15 20:47:28.772805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.594 [2024-07-15 20:47:28.775035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.594 [2024-07-15 20:47:28.776822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.594 [2024-07-15 20:47:28.778315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:36.594 [2024-07-15 20:47:28.780039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.594 [2024-07-15 20:47:28.782047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.594 [2024-07-15 20:47:28.783798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.594 [2024-07-15 20:47:28.785542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.594 [2024-07-15 20:47:28.787289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.594 [2024-07-15 20:47:28.789665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.594 [2024-07-15 20:47:28.791430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.594 [2024-07-15 20:47:28.793169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.594 [2024-07-15 20:47:28.794596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.594 [2024-07-15 20:47:28.797993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.594 [2024-07-15 20:47:28.799748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.594 [2024-07-15 20:47:28.801050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.594 [2024-07-15 20:47:28.802773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:36.594 [2024-07-15 20:47:28.805013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.594 [2024-07-15 20:47:28.805837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.594 [2024-07-15 20:47:28.806336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.594 [2024-07-15 20:47:28.808273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.594 [2024-07-15 20:47:28.811416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.594 [2024-07-15 20:47:28.813375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.594 [2024-07-15 20:47:28.815314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.594 [2024-07-15 20:47:28.817217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.594 [2024-07-15 20:47:28.818507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.594 [2024-07-15 20:47:28.820227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.594 [2024-07-15 20:47:28.821964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.594 [2024-07-15 20:47:28.823705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.594 [2024-07-15 20:47:28.826875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:36.594 [2024-07-15 20:47:28.828023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.594 [2024-07-15 20:47:28.828522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.594 [2024-07-15 20:47:28.830246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.594 [2024-07-15 20:47:28.832570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.594 [2024-07-15 20:47:28.834519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.594 [2024-07-15 20:47:28.836197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.594 [2024-07-15 20:47:28.837954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.594 [2024-07-15 20:47:28.840070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.594 [2024-07-15 20:47:28.841802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.594 [2024-07-15 20:47:28.843551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.594 [2024-07-15 20:47:28.845306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.594 [2024-07-15 20:47:28.847501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.594 [2024-07-15 20:47:28.849246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:36.594 [2024-07-15 20:47:28.850996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.594 [2024-07-15 20:47:28.852210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.594 [2024-07-15 20:47:28.855457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.594 [2024-07-15 20:47:28.857212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.594 [2024-07-15 20:47:28.858572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.594 [2024-07-15 20:47:28.860297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.594 [2024-07-15 20:47:28.862527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.594 [2024-07-15 20:47:28.863041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.594 [2024-07-15 20:47:28.863593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.594 [2024-07-15 20:47:28.865301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.594 [2024-07-15 20:47:28.868430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.594 [2024-07-15 20:47:28.870214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.594 [2024-07-15 20:47:28.871989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:36.594 [2024-07-15 20:47:28.873204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.594 [2024-07-15 20:47:28.875157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.594 [2024-07-15 20:47:28.876890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.594 [2024-07-15 20:47:28.878634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.594 [2024-07-15 20:47:28.880376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.594 [2024-07-15 20:47:28.883670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.594 [2024-07-15 20:47:28.884186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.594 [2024-07-15 20:47:28.884681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.594 [2024-07-15 20:47:28.886388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.594 [2024-07-15 20:47:28.888506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.594 [2024-07-15 20:47:28.889943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.594 [2024-07-15 20:47:28.891876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.594 [2024-07-15 20:47:28.893818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:36.594 [2024-07-15 20:47:28.895820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.594 [2024-07-15 20:47:28.897596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.594 [2024-07-15 20:47:28.898110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.594 [2024-07-15 20:47:28.899838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.594 [2024-07-15 20:47:28.900989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.594 [2024-07-15 20:47:28.901496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.594 [2024-07-15 20:47:28.901999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.594 [2024-07-15 20:47:28.902491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.594 [2024-07-15 20:47:28.904940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.594 [2024-07-15 20:47:28.905459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.594 [2024-07-15 20:47:28.905964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.594 [2024-07-15 20:47:28.906456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.594 [2024-07-15 20:47:28.907530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:36.594 [2024-07-15 20:47:28.908039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:36.858 (previous message repeated many times between 20:47:28.908039 and 20:47:29.122508; duplicates elided)
00:33:36.858 [2024-07-15 20:47:29.122508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:36.858 [2024-07-15 20:47:29.122687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.858 [2024-07-15 20:47:29.122744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.858 [2024-07-15 20:47:29.122798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.858 [2024-07-15 20:47:29.122870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.858 [2024-07-15 20:47:29.124292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.858 [2024-07-15 20:47:29.124355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.858 [2024-07-15 20:47:29.124408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.858 [2024-07-15 20:47:29.124460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.858 [2024-07-15 20:47:29.124984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.858 [2024-07-15 20:47:29.125162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.858 [2024-07-15 20:47:29.125247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.858 [2024-07-15 20:47:29.125301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.858 [2024-07-15 20:47:29.125353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:36.858 [2024-07-15 20:47:29.126822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.858 [2024-07-15 20:47:29.126891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.858 [2024-07-15 20:47:29.126959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.858 [2024-07-15 20:47:29.127013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.858 [2024-07-15 20:47:29.127350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.858 [2024-07-15 20:47:29.127530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.858 [2024-07-15 20:47:29.127600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.858 [2024-07-15 20:47:29.127656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.858 [2024-07-15 20:47:29.127707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.858 [2024-07-15 20:47:29.129252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.129315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.129369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.129421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:36.859 [2024-07-15 20:47:29.129982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.130164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.130221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.130273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.130324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.131844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.131909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.131968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.132031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.132367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.132546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.132603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.132654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:36.859 [2024-07-15 20:47:29.132705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.134330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.134410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.134463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.134515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.134932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.135109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.135170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.135222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.135282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.136700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.136777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.136831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:36.859 [2024-07-15 20:47:29.136884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.137268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.137447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.137504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.137555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.137615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.139203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.139266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.139318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.139371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.139705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.139882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.139954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:36.859 [2024-07-15 20:47:29.140041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.140093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.141588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.141652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.141705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.141761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.142184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.142363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.142427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.142482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.142539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.144322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.144387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:36.859 [2024-07-15 20:47:29.144439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.144492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.144827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.145029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.145096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.145148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.145199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.146755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.146819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.146871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.146923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.147278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.147463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:36.859 [2024-07-15 20:47:29.147525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.147576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.147628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.149236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.149305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.149359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.149415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.149772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.149964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.150022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.150076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.150128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.151680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:36.859 [2024-07-15 20:47:29.151746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.151807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.151869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.152230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.152408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.152464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.152516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.152568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.154248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.154316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.154368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.154419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.154826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:36.859 [2024-07-15 20:47:29.155015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.155077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.155137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.155190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.156606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.859 [2024-07-15 20:47:29.156673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.860 [2024-07-15 20:47:29.156724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.860 [2024-07-15 20:47:29.156776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.860 [2024-07-15 20:47:29.157164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.860 [2024-07-15 20:47:29.157348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.860 [2024-07-15 20:47:29.157405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.860 [2024-07-15 20:47:29.157457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.860 [2024-07-15 20:47:29.157535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:36.860 [2024-07-15 20:47:29.159497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.860 [2024-07-15 20:47:29.159559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.860 [2024-07-15 20:47:29.159611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.860 [2024-07-15 20:47:29.159662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.860 [2024-07-15 20:47:29.160041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.860 [2024-07-15 20:47:29.160219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.860 [2024-07-15 20:47:29.160283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.860 [2024-07-15 20:47:29.160353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.860 [2024-07-15 20:47:29.160409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.860 [2024-07-15 20:47:29.161817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.860 [2024-07-15 20:47:29.161879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.860 [2024-07-15 20:47:29.163608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.860 [2024-07-15 20:47:29.163873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:36.860 [2024-07-15 20:47:29.164060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.860 [2024-07-15 20:47:29.164118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.860 [2024-07-15 20:47:29.164170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.860 [2024-07-15 20:47:29.164229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.860 [2024-07-15 20:47:29.165939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.860 [2024-07-15 20:47:29.167693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.860 [2024-07-15 20:47:29.169856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.860 [2024-07-15 20:47:29.170944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.860 [2024-07-15 20:47:29.171330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.860 [2024-07-15 20:47:29.171510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.860 [2024-07-15 20:47:29.173246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.860 [2024-07-15 20:47:29.175211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.860 [2024-07-15 20:47:29.175831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:36.860 [2024-07-15 20:47:29.179294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.860 [2024-07-15 20:47:29.181042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.860 [2024-07-15 20:47:29.182803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.860 [2024-07-15 20:47:29.184653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.860 [2024-07-15 20:47:29.185006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.860 [2024-07-15 20:47:29.186857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.860 [2024-07-15 20:47:29.187363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.860 [2024-07-15 20:47:29.188360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.860 [2024-07-15 20:47:29.190082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.860 [2024-07-15 20:47:29.193294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.860 [2024-07-15 20:47:29.195054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.860 [2024-07-15 20:47:29.197012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:36.860 [2024-07-15 20:47:29.197812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:36.860 [2024-07-15 20:47:29.198231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:37.447 [2024-07-15 20:47:29.548489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:37.447 [2024-07-15 20:47:29.548541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.447 [2024-07-15 20:47:29.548593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.447 [2024-07-15 20:47:29.550087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.447 [2024-07-15 20:47:29.550150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.447 [2024-07-15 20:47:29.550201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.447 [2024-07-15 20:47:29.550261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.447 [2024-07-15 20:47:29.550597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.447 [2024-07-15 20:47:29.550776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.447 [2024-07-15 20:47:29.550856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.447 [2024-07-15 20:47:29.550911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.447 [2024-07-15 20:47:29.550970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.447 [2024-07-15 20:47:29.552512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.447 [2024-07-15 20:47:29.552574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:37.447 [2024-07-15 20:47:29.552634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.447 [2024-07-15 20:47:29.552693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.447 [2024-07-15 20:47:29.553043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.447 [2024-07-15 20:47:29.553226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.447 [2024-07-15 20:47:29.553282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.447 [2024-07-15 20:47:29.553334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.447 [2024-07-15 20:47:29.553386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.447 [2024-07-15 20:47:29.554946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.447 [2024-07-15 20:47:29.555009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.447 [2024-07-15 20:47:29.555068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.447 [2024-07-15 20:47:29.555135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.447 [2024-07-15 20:47:29.555472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.447 [2024-07-15 20:47:29.555670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:37.447 [2024-07-15 20:47:29.555729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.447 [2024-07-15 20:47:29.555781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.447 [2024-07-15 20:47:29.555833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.447 [2024-07-15 20:47:29.557384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.447 [2024-07-15 20:47:29.557453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.447 [2024-07-15 20:47:29.557511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.447 [2024-07-15 20:47:29.557563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.447 [2024-07-15 20:47:29.557976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.447 [2024-07-15 20:47:29.558159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.447 [2024-07-15 20:47:29.558219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.447 [2024-07-15 20:47:29.558271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.447 [2024-07-15 20:47:29.558323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.447 [2024-07-15 20:47:29.559876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:37.447 [2024-07-15 20:47:29.559954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.447 [2024-07-15 20:47:29.560009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.447 [2024-07-15 20:47:29.560064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.447 [2024-07-15 20:47:29.560403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.447 [2024-07-15 20:47:29.560583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.447 [2024-07-15 20:47:29.560641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.447 [2024-07-15 20:47:29.560693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.447 [2024-07-15 20:47:29.560744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.447 [2024-07-15 20:47:29.562334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.447 [2024-07-15 20:47:29.562400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.447 [2024-07-15 20:47:29.562456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.447 [2024-07-15 20:47:29.562514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.447 [2024-07-15 20:47:29.562884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:37.447 [2024-07-15 20:47:29.563079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.447 [2024-07-15 20:47:29.563137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.447 [2024-07-15 20:47:29.563188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.447 [2024-07-15 20:47:29.563247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.447 [2024-07-15 20:47:29.564756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.447 [2024-07-15 20:47:29.564825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.447 [2024-07-15 20:47:29.564882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.447 [2024-07-15 20:47:29.564942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.447 [2024-07-15 20:47:29.565352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.565534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.565590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.565642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.565695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:37.448 [2024-07-15 20:47:29.567296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.567363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.567414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.567466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.567874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.568061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.568123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.568177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.568237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.569748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.569828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.569884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.569945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:37.448 [2024-07-15 20:47:29.570330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.570510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.570567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.570626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.570679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.572309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.572374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.572426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.572478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.572911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.573102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.573159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.573210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:37.448 [2024-07-15 20:47:29.573261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.574895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.574965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.575020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.575073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.575613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.575797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.575856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.575920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.575995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.577516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.577594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.577649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:37.448 [2024-07-15 20:47:29.577701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.578141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.578323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.578385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.578437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.578494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.580108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.580171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.580230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.580282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.580726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.580905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.580969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:37.448 [2024-07-15 20:47:29.581035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.581099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.582809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.582894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.582955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.583022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.583517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.583699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.583756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.583833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.583898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.585676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.585741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:37.448 [2024-07-15 20:47:29.585795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.585849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.586408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.586600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.586669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.586734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.586815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.588492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.588555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.588644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.588697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.589178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.589363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:37.448 [2024-07-15 20:47:29.589458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.589524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.589577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.591506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.591581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.591647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.591702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.592237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.592422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.592480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.592557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.592624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.594502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:37.448 [2024-07-15 20:47:29.594565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.594619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.594671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.595241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.448 [2024-07-15 20:47:29.595447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.449 [2024-07-15 20:47:29.595516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.449 [2024-07-15 20:47:29.595581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.449 [2024-07-15 20:47:29.595661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.449 [2024-07-15 20:47:29.597320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.449 [2024-07-15 20:47:29.597383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.449 [2024-07-15 20:47:29.597440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.449 [2024-07-15 20:47:29.597505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.449 [2024-07-15 20:47:29.597983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:37.449 [2024-07-15 20:47:29.598169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [... identical error repeated through 2024-07-15 20:47:29.867394; duplicate log lines elided ...]
00:33:37.712 [2024-07-15 20:47:29.869100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:29.871151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:29.872732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:29.874455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:29.876269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:29.876618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:29.878371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:29.880160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:29.882044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:29.883873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:29.887411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:29.889324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:29.890646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:37.712 [2024-07-15 20:47:29.892513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:29.892854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:29.894947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:29.896219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:29.896715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:29.897872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:29.900564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:29.902286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:29.904030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:29.906000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:29.906460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:29.907080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:29.909028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:37.712 [2024-07-15 20:47:29.910969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:29.912954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:29.916305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:29.917849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:29.918370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:29.919247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:29.919627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:29.921480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:29.923450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:29.924514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:29.926246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:29.928345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:29.930192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:37.712 [2024-07-15 20:47:29.932099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:29.934037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:29.934379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:29.936229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:29.938105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:29.940045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:29.941741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:29.945362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:29.947345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:29.948476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:29.950202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:29.950576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:29.952659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:37.712 [2024-07-15 20:47:29.953700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:29.954201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:29.955605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:29.958516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:29.960230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:29.961957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:29.963886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:29.964357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:29.964979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:29.966943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:29.968786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:29.970757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:29.974175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:37.712 [2024-07-15 20:47:29.975598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:29.976103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:29.977107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:29.977494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:29.979348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:29.981331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:29.982340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:29.984050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:29.986134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:29.986641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:29.987145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:29.987644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:29.988069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:37.712 [2024-07-15 20:47:29.989921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:29.991316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:29.992962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:29.994712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:29.997937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:29.999777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:30.000962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:30.002694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:30.003109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:30.003959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:30.004463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:30.004962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:30.005458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:37.712 [2024-07-15 20:47:30.007916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:30.008429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:30.008935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:30.009432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:30.009959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:30.010578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:30.011089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:30.011591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:30.012105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.712 [2024-07-15 20:47:30.014506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.713 [2024-07-15 20:47:30.015022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.713 [2024-07-15 20:47:30.015514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.713 [2024-07-15 20:47:30.016018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:37.713 [2024-07-15 20:47:30.016560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.713 [2024-07-15 20:47:30.017322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.713 [2024-07-15 20:47:30.017830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.713 [2024-07-15 20:47:30.018367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.713 [2024-07-15 20:47:30.018864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.713 [2024-07-15 20:47:30.021011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.713 [2024-07-15 20:47:30.021516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.713 [2024-07-15 20:47:30.022021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.713 [2024-07-15 20:47:30.022518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.713 [2024-07-15 20:47:30.022999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.713 [2024-07-15 20:47:30.023613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.713 [2024-07-15 20:47:30.024133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.713 [2024-07-15 20:47:30.024631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:37.713 [2024-07-15 20:47:30.025155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.713 [2024-07-15 20:47:30.027522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.713 [2024-07-15 20:47:30.028038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.713 [2024-07-15 20:47:30.028536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.713 [2024-07-15 20:47:30.028599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.713 [2024-07-15 20:47:30.029089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.713 [2024-07-15 20:47:30.029708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.713 [2024-07-15 20:47:30.030217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.713 [2024-07-15 20:47:30.030726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.713 [2024-07-15 20:47:30.031226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.713 [2024-07-15 20:47:30.034893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.713 [2024-07-15 20:47:30.034977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.713 [2024-07-15 20:47:30.035031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:37.713 [2024-07-15 20:47:30.035084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.713 [2024-07-15 20:47:30.035507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.713 [2024-07-15 20:47:30.036140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.713 [2024-07-15 20:47:30.036208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.713 [2024-07-15 20:47:30.036265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.713 [2024-07-15 20:47:30.036321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.713 [2024-07-15 20:47:30.037971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.713 [2024-07-15 20:47:30.038048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.713 [2024-07-15 20:47:30.038104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.713 [2024-07-15 20:47:30.038169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.713 [2024-07-15 20:47:30.038600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.713 [2024-07-15 20:47:30.038786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.713 [2024-07-15 20:47:30.038855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:37.713 [2024-07-15 20:47:30.038908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.713 [2024-07-15 20:47:30.038967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.713 [2024-07-15 20:47:30.040681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.713 [2024-07-15 20:47:30.040745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.713 [2024-07-15 20:47:30.040797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.713 [2024-07-15 20:47:30.040862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.713 [2024-07-15 20:47:30.041367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.713 [2024-07-15 20:47:30.041558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.713 [2024-07-15 20:47:30.041628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.713 [2024-07-15 20:47:30.041704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.713 [2024-07-15 20:47:30.041769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.713 [2024-07-15 20:47:30.043524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.713 [2024-07-15 20:47:30.043587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:37.713 [2024-07-15 20:47:30.043641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.713 [2024-07-15 20:47:30.043694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.713 [2024-07-15 20:47:30.044148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.713 [2024-07-15 20:47:30.044340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.713 [2024-07-15 20:47:30.044410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.713 [2024-07-15 20:47:30.044487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.713 [2024-07-15 20:47:30.044543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.713 [2024-07-15 20:47:30.046250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.713 [2024-07-15 20:47:30.046314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.713 [2024-07-15 20:47:30.046367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.713 [2024-07-15 20:47:30.046421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.713 [2024-07-15 20:47:30.046853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.713 [2024-07-15 20:47:30.047047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:37.713 [2024-07-15 20:47:30.047110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:37.713 [... identical "Failed to get src_mbufs!" message repeated for timestamps 20:47:30.047176 through 20:47:30.267634 ...]
00:33:37.976 [2024-07-15 20:47:30.269575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:37.976 [2024-07-15 20:47:30.271324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.976 [2024-07-15 20:47:30.274509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.976 [2024-07-15 20:47:30.275017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.976 [2024-07-15 20:47:30.275752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.976 [2024-07-15 20:47:30.277466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.976 [2024-07-15 20:47:30.279865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.976 [2024-07-15 20:47:30.280756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.976 [2024-07-15 20:47:30.282461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.976 [2024-07-15 20:47:30.284425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.976 [2024-07-15 20:47:30.287648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.976 [2024-07-15 20:47:30.289419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.976 [2024-07-15 20:47:30.291382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.976 [2024-07-15 20:47:30.292981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:37.976 [2024-07-15 20:47:30.295134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.976 [2024-07-15 20:47:30.297085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.976 [2024-07-15 20:47:30.298744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.976 [2024-07-15 20:47:30.299241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.976 [2024-07-15 20:47:30.302822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.976 [2024-07-15 20:47:30.303676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.976 [2024-07-15 20:47:30.305397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.976 [2024-07-15 20:47:30.307348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.976 [2024-07-15 20:47:30.308444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.976 [2024-07-15 20:47:30.308956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.976 [2024-07-15 20:47:30.310746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.976 [2024-07-15 20:47:30.312618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.976 [2024-07-15 20:47:30.315993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:37.976 [2024-07-15 20:47:30.317970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.976 [2024-07-15 20:47:30.319312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.976 [2024-07-15 20:47:30.319822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.976 [2024-07-15 20:47:30.322053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.976 [2024-07-15 20:47:30.324042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.976 [2024-07-15 20:47:30.326004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.976 [2024-07-15 20:47:30.326936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.976 [2024-07-15 20:47:30.328867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.976 [2024-07-15 20:47:30.329374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.976 [2024-07-15 20:47:30.331257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.976 [2024-07-15 20:47:30.333215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.976 [2024-07-15 20:47:30.334821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.976 [2024-07-15 20:47:30.336717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:37.976 [2024-07-15 20:47:30.338667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.976 [2024-07-15 20:47:30.340626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.976 [2024-07-15 20:47:30.344318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.976 [2024-07-15 20:47:30.346308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.976 [2024-07-15 20:47:30.348277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:37.976 [2024-07-15 20:47:30.349193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.237 [2024-07-15 20:47:30.351647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.237 [2024-07-15 20:47:30.353544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.237 [2024-07-15 20:47:30.354054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.237 [2024-07-15 20:47:30.354545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.237 [2024-07-15 20:47:30.357471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.237 [2024-07-15 20:47:30.359202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.237 [2024-07-15 20:47:30.361146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:38.237 [2024-07-15 20:47:30.361646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.237 [2024-07-15 20:47:30.362804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.237 [2024-07-15 20:47:30.363321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.237 [2024-07-15 20:47:30.363810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.237 [2024-07-15 20:47:30.365186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.237 [2024-07-15 20:47:30.368604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.237 [2024-07-15 20:47:30.370459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.237 [2024-07-15 20:47:30.371350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.237 [2024-07-15 20:47:30.371840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.237 [2024-07-15 20:47:30.373989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.237 [2024-07-15 20:47:30.375933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.237 [2024-07-15 20:47:30.376774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.237 [2024-07-15 20:47:30.378489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:38.237 [2024-07-15 20:47:30.381003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.237 [2024-07-15 20:47:30.382243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.237 [2024-07-15 20:47:30.383533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.237 [2024-07-15 20:47:30.384347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.237 [2024-07-15 20:47:30.385443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.237 [2024-07-15 20:47:30.385957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.237 [2024-07-15 20:47:30.386450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.237 [2024-07-15 20:47:30.386949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.237 [2024-07-15 20:47:30.389301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.237 [2024-07-15 20:47:30.389808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.237 [2024-07-15 20:47:30.390308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.237 [2024-07-15 20:47:30.390800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.237 [2024-07-15 20:47:30.391808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:38.237 [2024-07-15 20:47:30.392320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.237 [2024-07-15 20:47:30.392828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.237 [2024-07-15 20:47:30.393331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.237 [2024-07-15 20:47:30.395578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.237 [2024-07-15 20:47:30.396093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.237 [2024-07-15 20:47:30.396590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.237 [2024-07-15 20:47:30.397095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.237 [2024-07-15 20:47:30.398222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.237 [2024-07-15 20:47:30.398741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.237 [2024-07-15 20:47:30.399241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.237 [2024-07-15 20:47:30.399732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.237 [2024-07-15 20:47:30.402203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.237 [2024-07-15 20:47:30.402707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:38.237 [2024-07-15 20:47:30.403224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.237 [2024-07-15 20:47:30.403720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.237 [2024-07-15 20:47:30.404778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.237 [2024-07-15 20:47:30.405290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.237 [2024-07-15 20:47:30.405786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.237 [2024-07-15 20:47:30.406301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.237 [2024-07-15 20:47:30.408692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.237 [2024-07-15 20:47:30.409205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.237 [2024-07-15 20:47:30.409704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.237 [2024-07-15 20:47:30.410208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.237 [2024-07-15 20:47:30.411188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.237 [2024-07-15 20:47:30.411689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.237 [2024-07-15 20:47:30.412192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:38.237 [2024-07-15 20:47:30.412697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.237 [2024-07-15 20:47:30.414901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.237 [2024-07-15 20:47:30.415409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.237 [2024-07-15 20:47:30.415903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.237 [2024-07-15 20:47:30.416402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.237 [2024-07-15 20:47:30.417620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.237 [2024-07-15 20:47:30.418141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.237 [2024-07-15 20:47:30.418648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.237 [2024-07-15 20:47:30.419142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.237 [2024-07-15 20:47:30.421606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.237 [2024-07-15 20:47:30.422129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.237 [2024-07-15 20:47:30.422624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.237 [2024-07-15 20:47:30.423123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:38.237 [2024-07-15 20:47:30.424222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.237 [2024-07-15 20:47:30.424724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.237 [2024-07-15 20:47:30.425224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.237 [2024-07-15 20:47:30.425718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.237 [2024-07-15 20:47:30.427947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.237 [2024-07-15 20:47:30.428468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.237 [2024-07-15 20:47:30.428998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.237 [2024-07-15 20:47:30.429503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.237 [2024-07-15 20:47:30.430607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.237 [2024-07-15 20:47:30.431122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.237 [2024-07-15 20:47:30.431617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.237 [2024-07-15 20:47:30.432115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.237 [2024-07-15 20:47:30.436092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:38.237 [2024-07-15 20:47:30.437800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.237 [2024-07-15 20:47:30.438918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.237 [2024-07-15 20:47:30.440872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.237 [2024-07-15 20:47:30.441924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.237 [2024-07-15 20:47:30.443543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.237 [2024-07-15 20:47:30.445269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.237 [2024-07-15 20:47:30.447208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.238 [2024-07-15 20:47:30.447240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.238 [2024-07-15 20:47:30.447654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.238 [2024-07-15 20:47:30.449307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.238 [2024-07-15 20:47:30.449379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.238 [2024-07-15 20:47:30.449870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.238 [2024-07-15 20:47:30.451786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:38.238 [2024-07-15 20:47:30.454260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.238 [2024-07-15 20:47:30.454760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.238 [2024-07-15 20:47:30.455267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.238 [2024-07-15 20:47:30.457162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.238 [2024-07-15 20:47:30.457472] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:38.238 [2024-07-15 20:47:30.458732] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:38.238 [2024-07-15 20:47:30.459260] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:38.238 [2024-07-15 20:47:30.459328] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:38.238 [2024-07-15 20:47:30.459822] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:38.238 [2024-07-15 20:47:30.462104] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:38.238 [2024-07-15 20:47:30.463852] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:38.238 [2024-07-15 20:47:30.463938] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:38.238 [2024-07-15 20:47:30.465904] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:38.238 [2024-07-15 20:47:30.466426] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:38.238 [2024-07-15 20:47:30.467920] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:38.238 [2024-07-15 20:47:30.469892] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:38.238 [2024-07-15 20:47:30.469968] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:38.238 [2024-07-15 20:47:30.470901] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:38.238 [2024-07-15 20:47:30.471468] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:38.238 [2024-07-15 20:47:30.473445] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:38.238 [2024-07-15 20:47:30.473514] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:38.238 [2024-07-15 20:47:30.474414] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:38.238 [2024-07-15 20:47:30.474865] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:38.238 [2024-07-15 20:47:30.476134] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:38.238 [2024-07-15 20:47:30.478125] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:38.238 [2024-07-15 20:47:30.478194] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:38.238 [2024-07-15 20:47:30.479184] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:33:38.238 [... previous "Failed to get dst_mbufs!" message repeated through 2024-07-15 20:47:30.498968 ...]
00:33:38.238 [2024-07-15 20:47:30.499515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:38.500 [... previous "Failed to get src_mbufs!" message repeated through 2024-07-15 20:47:30.628521 ...]
00:33:38.500 [2024-07-15 20:47:30.629022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.500 [2024-07-15 20:47:30.629289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.500 [2024-07-15 20:47:30.634118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.500 [2024-07-15 20:47:30.635965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.500 [2024-07-15 20:47:30.636032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.500 [2024-07-15 20:47:30.637993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.500 [2024-07-15 20:47:30.638576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.500 [2024-07-15 20:47:30.639085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.500 [2024-07-15 20:47:30.639151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.500 [2024-07-15 20:47:30.640706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.500 [2024-07-15 20:47:30.641094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.500 [2024-07-15 20:47:30.641596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.500 [2024-07-15 20:47:30.643536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:38.500 [2024-07-15 20:47:30.643599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.500 [2024-07-15 20:47:30.645171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.500 [2024-07-15 20:47:30.645706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.500 [2024-07-15 20:47:30.647685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.500 [2024-07-15 20:47:30.647746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.500 [2024-07-15 20:47:30.648966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.500 [2024-07-15 20:47:30.649524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.500 [2024-07-15 20:47:30.650838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.500 [2024-07-15 20:47:30.652782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.500 [2024-07-15 20:47:30.652842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.500 [2024-07-15 20:47:30.652889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.500 [2024-07-15 20:47:30.653461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.500 [2024-07-15 20:47:30.655326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:38.500 [2024-07-15 20:47:30.655385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.500 [2024-07-15 20:47:30.657339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.500 [2024-07-15 20:47:30.657698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.500 [2024-07-15 20:47:30.660426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.500 [2024-07-15 20:47:30.660495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.500 [2024-07-15 20:47:30.662309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.500 [2024-07-15 20:47:30.662391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.500 [2024-07-15 20:47:30.662889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.500 [2024-07-15 20:47:30.663982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.500 [2024-07-15 20:47:30.664046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.500 [2024-07-15 20:47:30.665744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.500 [2024-07-15 20:47:30.666090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.501 [2024-07-15 20:47:30.667279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:38.501 [2024-07-15 20:47:30.667786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.501 [2024-07-15 20:47:30.667841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.501 [2024-07-15 20:47:30.669765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.501 [2024-07-15 20:47:30.670270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.501 [2024-07-15 20:47:30.672244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.501 [2024-07-15 20:47:30.672435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.501 [2024-07-15 20:47:30.673657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.501 [2024-07-15 20:47:30.675605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.501 [2024-07-15 20:47:30.677245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.501 [2024-07-15 20:47:30.677731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.501 [2024-07-15 20:47:30.678397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.501 [2024-07-15 20:47:30.680111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.501 [2024-07-15 20:47:30.681256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:38.501 [2024-07-15 20:47:30.683172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.501 [2024-07-15 20:47:30.683518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.501 [2024-07-15 20:47:30.685966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.501 [2024-07-15 20:47:30.686832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.501 [2024-07-15 20:47:30.688636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.501 [2024-07-15 20:47:30.689148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.501 [2024-07-15 20:47:30.690546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.501 [2024-07-15 20:47:30.692178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.501 [2024-07-15 20:47:30.694139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.501 [2024-07-15 20:47:30.695028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.501 [2024-07-15 20:47:30.695371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.501 [2024-07-15 20:47:30.696942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.501 [2024-07-15 20:47:30.698047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:38.501 [2024-07-15 20:47:30.699750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.501 [2024-07-15 20:47:30.700667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.501 [2024-07-15 20:47:30.703079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.501 [2024-07-15 20:47:30.703584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.501 [2024-07-15 20:47:30.704090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.501 [2024-07-15 20:47:30.706031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.501 [2024-07-15 20:47:30.706378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.501 [2024-07-15 20:47:30.708647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.501 [2024-07-15 20:47:30.709157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.501 [2024-07-15 20:47:30.709650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.501 [2024-07-15 20:47:30.710153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.501 [2024-07-15 20:47:30.711323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.501 [2024-07-15 20:47:30.711833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:38.501 [2024-07-15 20:47:30.712338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.501 [2024-07-15 20:47:30.712827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.501 [2024-07-15 20:47:30.713342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.501 [2024-07-15 20:47:30.715362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.501 [2024-07-15 20:47:30.715869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.501 [2024-07-15 20:47:30.716370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.501 [2024-07-15 20:47:30.716868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.501 [2024-07-15 20:47:30.717938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.501 [2024-07-15 20:47:30.718437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.501 [2024-07-15 20:47:30.718938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.501 [2024-07-15 20:47:30.719433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.501 [2024-07-15 20:47:30.719919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.501 [2024-07-15 20:47:30.721704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:38.501 [2024-07-15 20:47:30.722211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.501 [2024-07-15 20:47:30.722711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.501 [2024-07-15 20:47:30.723211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.501 [2024-07-15 20:47:30.724262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.501 [2024-07-15 20:47:30.724773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.501 [2024-07-15 20:47:30.725285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.501 [2024-07-15 20:47:30.725779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.501 [2024-07-15 20:47:30.726259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.501 [2024-07-15 20:47:30.728080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.501 [2024-07-15 20:47:30.728591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.501 [2024-07-15 20:47:30.729094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.501 [2024-07-15 20:47:30.729587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.501 [2024-07-15 20:47:30.730632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:38.501 [2024-07-15 20:47:30.731145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.501 [2024-07-15 20:47:30.731644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.501 [2024-07-15 20:47:30.732143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.501 [2024-07-15 20:47:30.732670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.501 [2024-07-15 20:47:30.734483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.501 [2024-07-15 20:47:30.734994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.501 [2024-07-15 20:47:30.735493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.501 [2024-07-15 20:47:30.735991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.501 [2024-07-15 20:47:30.737050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.501 [2024-07-15 20:47:30.737558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.501 [2024-07-15 20:47:30.738051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.501 [2024-07-15 20:47:30.738548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.501 [2024-07-15 20:47:30.739071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:38.762 [2024-07-15 20:47:30.901470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.762 [2024-07-15 20:47:30.901569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.762 [2024-07-15 20:47:30.903354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.762 [2024-07-15 20:47:30.909872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.762 [2024-07-15 20:47:30.910275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.762 [2024-07-15 20:47:30.910662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.762 [2024-07-15 20:47:30.912333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.762 [2024-07-15 20:47:30.912399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.762 [2024-07-15 20:47:30.914027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.762 [2024-07-15 20:47:30.914081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.762 [2024-07-15 20:47:30.915678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.762 [2024-07-15 20:47:30.915733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.762 [2024-07-15 20:47:30.917266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:38.762 [2024-07-15 20:47:30.917681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.762 [2024-07-15 20:47:30.917706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.762 [2024-07-15 20:47:30.917720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.762 [2024-07-15 20:47:30.923369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.762 [2024-07-15 20:47:30.923766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.762 [2024-07-15 20:47:30.924160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.762 [2024-07-15 20:47:30.924546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.762 [2024-07-15 20:47:30.926191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.762 [2024-07-15 20:47:30.927836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.762 [2024-07-15 20:47:30.929481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.762 [2024-07-15 20:47:30.930934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.762 [2024-07-15 20:47:30.931303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.762 [2024-07-15 20:47:30.931319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:38.762 [2024-07-15 20:47:30.931334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.762 [2024-07-15 20:47:30.936900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.762 [2024-07-15 20:47:30.937302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.762 [2024-07-15 20:47:30.937689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.762 [2024-07-15 20:47:30.938080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.762 [2024-07-15 20:47:30.939724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.762 [2024-07-15 20:47:30.941357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.762 [2024-07-15 20:47:30.942987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.762 [2024-07-15 20:47:30.944458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.762 [2024-07-15 20:47:30.944854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.762 [2024-07-15 20:47:30.944870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.762 [2024-07-15 20:47:30.944885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.762 [2024-07-15 20:47:30.950572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:38.762 [2024-07-15 20:47:30.950974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.762 [2024-07-15 20:47:30.951362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.762 [2024-07-15 20:47:30.951749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.762 [2024-07-15 20:47:30.953825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.762 [2024-07-15 20:47:30.955015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.762 [2024-07-15 20:47:30.955897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.763 [2024-07-15 20:47:30.957245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.763 [2024-07-15 20:47:30.957514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.763 [2024-07-15 20:47:30.957531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.763 [2024-07-15 20:47:30.957545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.763 [2024-07-15 20:47:30.958992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.763 [2024-07-15 20:47:30.960780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.763 [2024-07-15 20:47:30.961934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:38.763 [2024-07-15 20:47:30.963033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:38.763 [last message repeated for each subsequent allocation attempt through 2024-07-15 20:47:31.124620] 
00:33:38.765 [2024-07-15 20:47:31.125031] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:38.765 [last message repeated for each subsequent allocation attempt through 2024-07-15 20:47:31.136913] 00:33:39.023 [2024-07-15 20:47:31.138736] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:39.023 [2024-07-15 20:47:31.140429] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.024 [2024-07-15 20:47:31.141151] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.024 [2024-07-15 20:47:31.144532] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.024 [2024-07-15 20:47:31.145629] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.024 [2024-07-15 20:47:31.147284] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.024 [2024-07-15 20:47:31.148683] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.024 [2024-07-15 20:47:31.149799] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.024 [2024-07-15 20:47:31.151578] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.024 [2024-07-15 20:47:31.153534] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.024 [2024-07-15 20:47:31.154967] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.024 [2024-07-15 20:47:31.157342] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.024 [2024-07-15 20:47:31.159200] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.024 [2024-07-15 20:47:31.160651] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:39.024 [2024-07-15 20:47:31.162614] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.024 [2024-07-15 20:47:31.164641] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.024 [2024-07-15 20:47:31.165167] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.024 [2024-07-15 20:47:31.166458] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.024 [2024-07-15 20:47:31.168300] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.024 [2024-07-15 20:47:31.170502] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.024 [2024-07-15 20:47:31.171024] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.024 [2024-07-15 20:47:31.172977] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.024 [2024-07-15 20:47:31.174161] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.024 [2024-07-15 20:47:31.176598] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.024 [2024-07-15 20:47:31.177143] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.024 [2024-07-15 20:47:31.177655] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.024 [2024-07-15 20:47:31.178915] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:39.024 [2024-07-15 20:47:31.181279] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.024 [2024-07-15 20:47:31.182112] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.024 [2024-07-15 20:47:31.183398] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.024 [2024-07-15 20:47:31.185065] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.024 [2024-07-15 20:47:31.186184] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.024 [2024-07-15 20:47:31.186959] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.024 [2024-07-15 20:47:31.188295] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.024 [2024-07-15 20:47:31.189986] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.024 [2024-07-15 20:47:31.193614] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.024 [2024-07-15 20:47:31.194783] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.024 [2024-07-15 20:47:31.195722] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.024 [2024-07-15 20:47:31.196228] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.024 [2024-07-15 20:47:31.198782] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:39.024 [2024-07-15 20:47:31.199914] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.024 [2024-07-15 20:47:31.200896] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.024 [2024-07-15 20:47:31.201406] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.024 [2024-07-15 20:47:31.204501] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.024 [2024-07-15 20:47:31.205018] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.024 [2024-07-15 20:47:31.205523] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.024 [2024-07-15 20:47:31.207482] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.024 [2024-07-15 20:47:31.209552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.024 [2024-07-15 20:47:31.210058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.024 [2024-07-15 20:47:31.210553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.024 [2024-07-15 20:47:31.212400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.024 [2024-07-15 20:47:31.214128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.024 [2024-07-15 20:47:31.214172] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:39.024 [2024-07-15 20:47:31.215194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.024 [2024-07-15 20:47:31.216252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.024 [2024-07-15 20:47:31.218044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.024 [2024-07-15 20:47:31.218453] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.024 [2024-07-15 20:47:31.219733] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.024 [2024-07-15 20:47:31.220247] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.024 [2024-07-15 20:47:31.222175] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.024 [2024-07-15 20:47:31.224076] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.024 [2024-07-15 20:47:31.226106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.024 [2024-07-15 20:47:31.226773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.024 [2024-07-15 20:47:31.228384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.024 [2024-07-15 20:47:31.229816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.024 [2024-07-15 20:47:31.230163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:39.024 [2024-07-15 20:47:31.230357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.024 [2024-07-15 20:47:31.231487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.024 [2024-07-15 20:47:31.232469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.024 [2024-07-15 20:47:31.232503] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.024 [2024-07-15 20:47:31.233000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.024 [2024-07-15 20:47:31.233308] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.024 [2024-07-15 20:47:31.235608] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.024 [2024-07-15 20:47:31.237569] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.024 [2024-07-15 20:47:31.238083] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.024 [2024-07-15 20:47:31.238674] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.024 [2024-07-15 20:47:31.239024] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.024 [2024-07-15 20:47:31.240386] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.024 [2024-07-15 20:47:31.242195] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:39.024 [2024-07-15 20:47:31.243455] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.024 [2024-07-15 20:47:31.244305] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.024 [2024-07-15 20:47:31.244760] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.024 [2024-07-15 20:47:31.247489] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.024 [2024-07-15 20:47:31.249131] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.024 [2024-07-15 20:47:31.249644] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.024 [2024-07-15 20:47:31.250163] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.024 [2024-07-15 20:47:31.250507] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.024 [2024-07-15 20:47:31.252275] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.024 [2024-07-15 20:47:31.252949] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.024 [2024-07-15 20:47:31.254376] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.024 [2024-07-15 20:47:31.255333] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.024 [2024-07-15 20:47:31.255749] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:39.024 [2024-07-15 20:47:31.257975] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.024 [2024-07-15 20:47:31.258852] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.024 [2024-07-15 20:47:31.260781] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.024 [2024-07-15 20:47:31.261693] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.024 [2024-07-15 20:47:31.262080] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.024 [2024-07-15 20:47:31.263339] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.024 [2024-07-15 20:47:31.265287] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.024 [2024-07-15 20:47:31.267234] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.024 [2024-07-15 20:47:31.268687] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.025 [2024-07-15 20:47:31.269253] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.025 [2024-07-15 20:47:31.272059] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.025 [2024-07-15 20:47:31.273324] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.025 [2024-07-15 20:47:31.275258] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:39.025 [2024-07-15 20:47:31.277173] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.025 [2024-07-15 20:47:31.277652] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.025 [2024-07-15 20:47:31.278806] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.025 [2024-07-15 20:47:31.280787] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.025 [2024-07-15 20:47:31.281316] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.025 [2024-07-15 20:47:31.282981] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.025 [2024-07-15 20:47:31.283353] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.025 [2024-07-15 20:47:31.286141] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.025 [2024-07-15 20:47:31.287607] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.025 [2024-07-15 20:47:31.288257] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.025 [2024-07-15 20:47:31.290116] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.025 [2024-07-15 20:47:31.290460] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.025 [2024-07-15 20:47:31.291903] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:39.025 [2024-07-15 20:47:31.293534] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.025 [2024-07-15 20:47:31.294469] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.025 [2024-07-15 20:47:31.295769] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.025 [2024-07-15 20:47:31.296122] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.025 [2024-07-15 20:47:31.299358] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.025 [2024-07-15 20:47:31.300569] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.025 [2024-07-15 20:47:31.302193] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.025 [2024-07-15 20:47:31.303122] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.025 [2024-07-15 20:47:31.303478] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.025 [2024-07-15 20:47:31.305109] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.025 [2024-07-15 20:47:31.305660] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.025 [2024-07-15 20:47:31.307536] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.025 [2024-07-15 20:47:31.308536] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:39.025 [2024-07-15 20:47:31.308938] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.025 [2024-07-15 20:47:31.311020] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.025 [2024-07-15 20:47:31.312022] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.025 [2024-07-15 20:47:31.313540] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.025 [2024-07-15 20:47:31.314193] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.025 [2024-07-15 20:47:31.314558] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.025 [2024-07-15 20:47:31.316171] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.025 [2024-07-15 20:47:31.318090] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.025 [2024-07-15 20:47:31.318965] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.025 [2024-07-15 20:47:31.320193] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.025 [2024-07-15 20:47:31.320592] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.025 [2024-07-15 20:47:31.323060] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.025 [2024-07-15 20:47:31.325050] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:39.025 [2024-07-15 20:47:31.326044] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.025 [2024-07-15 20:47:31.327172] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.025 [2024-07-15 20:47:31.327570] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.025 [2024-07-15 20:47:31.329635] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.025 [2024-07-15 20:47:31.331212] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.025 [2024-07-15 20:47:31.333154] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.025 [2024-07-15 20:47:31.335073] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.025 [2024-07-15 20:47:31.335556] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.025 [2024-07-15 20:47:31.337198] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.025 [2024-07-15 20:47:31.338935] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.025 [2024-07-15 20:47:31.340673] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.025 [2024-07-15 20:47:31.342638] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.025 [2024-07-15 20:47:31.343080] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:39.025 [2024-07-15 20:47:31.344831] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.025 [2024-07-15 20:47:31.346163] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.025 [2024-07-15 20:47:31.348057] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.025 [2024-07-15 20:47:31.348935] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.025 [2024-07-15 20:47:31.349352] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.025 [2024-07-15 20:47:31.352171] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.025 [2024-07-15 20:47:31.354121] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.025 [2024-07-15 20:47:31.355233] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.025 [2024-07-15 20:47:31.357087] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.025 [2024-07-15 20:47:31.357630] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.025 [2024-07-15 20:47:31.359224] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.025 [2024-07-15 20:47:31.360832] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:39.025 [2024-07-15 20:47:31.361343] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:39.025 [2024-07-15 20:47:31.363067] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
[... identical "Failed to get dst_mbufs!" errors (timestamps 20:47:31.363424 to 20:47:31.504069) elided ...]
00:33:39.287 [2024-07-15 20:47:31.504451] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:33:39.287 [2024-07-15 20:47:31.505637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[... identical "Failed to get src_mbufs!" errors (timestamps 20:47:31.505700 to 20:47:31.509953) elided ...]
[... identical "Failed to get src_mbufs!" errors (timestamps 20:47:31.511097 to 20:47:31.553772) elided ...]
00:33:39.289 [2024-07-15 20:47:31.553826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:39.289 [2024-07-15 20:47:31.553879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.289 [2024-07-15 20:47:31.554263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.289 [2024-07-15 20:47:31.555464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.289 [2024-07-15 20:47:31.555541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.289 [2024-07-15 20:47:31.555596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.289 [2024-07-15 20:47:31.555651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.289 [2024-07-15 20:47:31.555997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.289 [2024-07-15 20:47:31.556184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.289 [2024-07-15 20:47:31.556240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.289 [2024-07-15 20:47:31.556298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.289 [2024-07-15 20:47:31.556350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.289 [2024-07-15 20:47:31.556742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.289 [2024-07-15 20:47:31.557865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:39.289 [2024-07-15 20:47:31.557940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.289 [2024-07-15 20:47:31.557994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.289 [2024-07-15 20:47:31.558047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.289 [2024-07-15 20:47:31.558386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.289 [2024-07-15 20:47:31.558576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.289 [2024-07-15 20:47:31.558641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.289 [2024-07-15 20:47:31.558693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.289 [2024-07-15 20:47:31.558745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.289 [2024-07-15 20:47:31.559097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.289 [2024-07-15 20:47:31.560271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.289 [2024-07-15 20:47:31.560337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.289 [2024-07-15 20:47:31.560389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.289 [2024-07-15 20:47:31.560442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:39.289 [2024-07-15 20:47:31.560895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.289 [2024-07-15 20:47:31.561087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.289 [2024-07-15 20:47:31.561146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.289 [2024-07-15 20:47:31.561198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.289 [2024-07-15 20:47:31.561255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.289 [2024-07-15 20:47:31.561712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.289 [2024-07-15 20:47:31.562857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.289 [2024-07-15 20:47:31.562920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.289 [2024-07-15 20:47:31.562983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.289 [2024-07-15 20:47:31.563036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.289 [2024-07-15 20:47:31.563369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.289 [2024-07-15 20:47:31.563551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.289 [2024-07-15 20:47:31.563616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:39.289 [2024-07-15 20:47:31.563671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.289 [2024-07-15 20:47:31.563727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.289 [2024-07-15 20:47:31.564112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.289 [2024-07-15 20:47:31.565209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.289 [2024-07-15 20:47:31.565286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.289 [2024-07-15 20:47:31.565346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.289 [2024-07-15 20:47:31.565402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.289 [2024-07-15 20:47:31.565739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.289 [2024-07-15 20:47:31.565924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.289 [2024-07-15 20:47:31.565996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.289 [2024-07-15 20:47:31.566049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.289 [2024-07-15 20:47:31.566101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.289 [2024-07-15 20:47:31.566537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:39.289 [2024-07-15 20:47:31.567870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.289 [2024-07-15 20:47:31.567950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.289 [2024-07-15 20:47:31.568004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.289 [2024-07-15 20:47:31.568056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.289 [2024-07-15 20:47:31.568423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.289 [2024-07-15 20:47:31.568607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.289 [2024-07-15 20:47:31.568663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.289 [2024-07-15 20:47:31.568726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.289 [2024-07-15 20:47:31.568805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.289 [2024-07-15 20:47:31.569158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.289 [2024-07-15 20:47:31.570377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.289 [2024-07-15 20:47:31.570442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.289 [2024-07-15 20:47:31.570508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:39.289 [2024-07-15 20:47:31.570564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.289 [2024-07-15 20:47:31.570900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.289 [2024-07-15 20:47:31.571097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.289 [2024-07-15 20:47:31.571158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.289 [2024-07-15 20:47:31.571211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.289 [2024-07-15 20:47:31.571268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.289 [2024-07-15 20:47:31.571607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.289 [2024-07-15 20:47:31.572963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.289 [2024-07-15 20:47:31.573034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.289 [2024-07-15 20:47:31.573100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.289 [2024-07-15 20:47:31.573155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.289 [2024-07-15 20:47:31.573494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.289 [2024-07-15 20:47:31.573680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:39.289 [2024-07-15 20:47:31.573742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.289 [2024-07-15 20:47:31.573794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.290 [2024-07-15 20:47:31.573846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.290 [2024-07-15 20:47:31.574243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.290 [2024-07-15 20:47:31.575287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.290 [2024-07-15 20:47:31.575358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.290 [2024-07-15 20:47:31.576417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.290 [2024-07-15 20:47:31.576477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.290 [2024-07-15 20:47:31.576985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.290 [2024-07-15 20:47:31.577170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.290 [2024-07-15 20:47:31.577235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.290 [2024-07-15 20:47:31.577299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.290 [2024-07-15 20:47:31.577358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:39.290 [2024-07-15 20:47:31.577707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.290 [2024-07-15 20:47:31.578799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.290 [2024-07-15 20:47:31.580656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.290 [2024-07-15 20:47:31.580734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.290 [2024-07-15 20:47:31.582232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.290 [2024-07-15 20:47:31.582752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.290 [2024-07-15 20:47:31.582952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.290 [2024-07-15 20:47:31.583011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.290 [2024-07-15 20:47:31.583065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.290 [2024-07-15 20:47:31.583120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.290 [2024-07-15 20:47:31.583602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.290 [2024-07-15 20:47:31.588009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.290 [2024-07-15 20:47:31.589314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:39.290 [2024-07-15 20:47:31.589377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.290 [2024-07-15 20:47:31.590424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.290 [2024-07-15 20:47:31.590768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.290 [2024-07-15 20:47:31.590958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.290 [2024-07-15 20:47:31.592648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.290 [2024-07-15 20:47:31.592732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.290 [2024-07-15 20:47:31.592786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.290 [2024-07-15 20:47:31.593239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.290 [2024-07-15 20:47:31.594318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.290 [2024-07-15 20:47:31.595394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.290 [2024-07-15 20:47:31.595455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.290 [2024-07-15 20:47:31.597005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.290 [2024-07-15 20:47:31.597343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:39.290 [2024-07-15 20:47:31.597526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.290 [2024-07-15 20:47:31.597590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.290 [2024-07-15 20:47:31.597659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.290 [2024-07-15 20:47:31.597711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.290 [2024-07-15 20:47:31.598060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.290 [2024-07-15 20:47:31.599244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.290 [2024-07-15 20:47:31.599748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.290 [2024-07-15 20:47:31.599810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.290 [2024-07-15 20:47:31.601241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.290 [2024-07-15 20:47:31.601582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.290 [2024-07-15 20:47:31.603508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.290 [2024-07-15 20:47:31.603586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.290 [2024-07-15 20:47:31.604585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:39.290 [2024-07-15 20:47:31.604643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.290 [2024-07-15 20:47:31.605160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.290 [2024-07-15 20:47:31.608219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.290 [2024-07-15 20:47:31.608290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.290 [2024-07-15 20:47:31.609058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.290 [2024-07-15 20:47:31.609121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.290 [2024-07-15 20:47:31.609454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.290 [2024-07-15 20:47:31.610706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.290 [2024-07-15 20:47:31.610774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.290 [2024-07-15 20:47:31.611465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.290 [2024-07-15 20:47:31.611527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.290 [2024-07-15 20:47:31.611866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.290 [2024-07-15 20:47:31.617567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:39.290 [2024-07-15 20:47:31.617640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.290 [2024-07-15 20:47:31.617693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.290 [2024-07-15 20:47:31.618192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.290 [2024-07-15 20:47:31.618597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.290 [2024-07-15 20:47:31.620521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.290 [2024-07-15 20:47:31.620589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.290 [2024-07-15 20:47:31.621791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.290 [2024-07-15 20:47:31.621849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.290 [2024-07-15 20:47:31.622361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.290 [2024-07-15 20:47:31.625982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.290 [2024-07-15 20:47:31.627622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.290 [2024-07-15 20:47:31.627676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.290 [2024-07-15 20:47:31.627950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:39.290 [2024-07-15 20:47:31.628857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.290 [2024-07-15 20:47:31.628924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.290 [2024-07-15 20:47:31.629998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.290 [2024-07-15 20:47:31.630057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.290 [2024-07-15 20:47:31.630488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.290 [2024-07-15 20:47:31.634180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.291 [2024-07-15 20:47:31.636162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.291 [2024-07-15 20:47:31.636668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.291 [2024-07-15 20:47:31.637269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.291 [2024-07-15 20:47:31.637651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.291 [2024-07-15 20:47:31.639284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.291 [2024-07-15 20:47:31.639356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.291 [2024-07-15 20:47:31.641008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:39.291 [2024-07-15 20:47:31.641327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:39.817 [... identical "Failed to get src_mbufs!" errors repeated through 2024-07-15 20:47:32.021836 ...]
00:33:39.817 [2024-07-15 20:47:32.023680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.817 [2024-07-15 20:47:32.024041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.817 [2024-07-15 20:47:32.025578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.817 [2024-07-15 20:47:32.026239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.817 [2024-07-15 20:47:32.026302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.817 [2024-07-15 20:47:32.026792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.817 [2024-07-15 20:47:32.027146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.817 [2024-07-15 20:47:32.031236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.817 [2024-07-15 20:47:32.031976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.817 [2024-07-15 20:47:32.032473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.817 [2024-07-15 20:47:32.032531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.817 [2024-07-15 20:47:32.032906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.817 [2024-07-15 20:47:32.033100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:39.817 [2024-07-15 20:47:32.035034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.817 [2024-07-15 20:47:32.035095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.817 [2024-07-15 20:47:32.036558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.817 [2024-07-15 20:47:32.036906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.817 [2024-07-15 20:47:32.041886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.817 [2024-07-15 20:47:32.041961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.817 [2024-07-15 20:47:32.042896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.817 [2024-07-15 20:47:32.044295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.817 [2024-07-15 20:47:32.044651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.817 [2024-07-15 20:47:32.044847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.817 [2024-07-15 20:47:32.045354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.817 [2024-07-15 20:47:32.045411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.817 [2024-07-15 20:47:32.046956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:39.817 [2024-07-15 20:47:32.047392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.817 [2024-07-15 20:47:32.051212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.817 [2024-07-15 20:47:32.052427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.817 [2024-07-15 20:47:32.052491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.817 [2024-07-15 20:47:32.053383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.817 [2024-07-15 20:47:32.053794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.817 [2024-07-15 20:47:32.053995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.817 [2024-07-15 20:47:32.055247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.817 [2024-07-15 20:47:32.055310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.817 [2024-07-15 20:47:32.055793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.817 [2024-07-15 20:47:32.056191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.817 [2024-07-15 20:47:32.061254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.817 [2024-07-15 20:47:32.061324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:39.817 [2024-07-15 20:47:32.061377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.817 [2024-07-15 20:47:32.061430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.817 [2024-07-15 20:47:32.061953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.817 [2024-07-15 20:47:32.062144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.817 [2024-07-15 20:47:32.063809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.817 [2024-07-15 20:47:32.063878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.817 [2024-07-15 20:47:32.064374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.817 [2024-07-15 20:47:32.064934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.817 [2024-07-15 20:47:32.068511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.817 [2024-07-15 20:47:32.068578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.817 [2024-07-15 20:47:32.068633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.817 [2024-07-15 20:47:32.068686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.817 [2024-07-15 20:47:32.069110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:39.817 [2024-07-15 20:47:32.069302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.817 [2024-07-15 20:47:32.070259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.817 [2024-07-15 20:47:32.070322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.817 [2024-07-15 20:47:32.070374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.817 [2024-07-15 20:47:32.070756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.817 [2024-07-15 20:47:32.073310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.817 [2024-07-15 20:47:32.073381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.817 [2024-07-15 20:47:32.073440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.817 [2024-07-15 20:47:32.073503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.817 [2024-07-15 20:47:32.073848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.817 [2024-07-15 20:47:32.074051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.817 [2024-07-15 20:47:32.074130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.817 [2024-07-15 20:47:32.074183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:39.817 [2024-07-15 20:47:32.074236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.817 [2024-07-15 20:47:32.074712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.817 [2024-07-15 20:47:32.078125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.817 [2024-07-15 20:47:32.078190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.817 [2024-07-15 20:47:32.078242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.817 [2024-07-15 20:47:32.078295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.817 [2024-07-15 20:47:32.078856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.817 [2024-07-15 20:47:32.079068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.817 [2024-07-15 20:47:32.079139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.817 [2024-07-15 20:47:32.079205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.817 [2024-07-15 20:47:32.079281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.817 [2024-07-15 20:47:32.079717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.817 [2024-07-15 20:47:32.082859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:39.817 [2024-07-15 20:47:32.082954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.817 [2024-07-15 20:47:32.083028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.817 [2024-07-15 20:47:32.083082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.817 [2024-07-15 20:47:32.083650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.818 [2024-07-15 20:47:32.083833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.818 [2024-07-15 20:47:32.083891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.818 [2024-07-15 20:47:32.083951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.818 [2024-07-15 20:47:32.084004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.818 [2024-07-15 20:47:32.084436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.818 [2024-07-15 20:47:32.086805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.818 [2024-07-15 20:47:32.086888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.818 [2024-07-15 20:47:32.086971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.818 [2024-07-15 20:47:32.087036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:39.818 [2024-07-15 20:47:32.087554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.818 [2024-07-15 20:47:32.087750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.818 [2024-07-15 20:47:32.087810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.818 [2024-07-15 20:47:32.087863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.818 [2024-07-15 20:47:32.087915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.818 [2024-07-15 20:47:32.088302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.818 [2024-07-15 20:47:32.090658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.818 [2024-07-15 20:47:32.090738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.818 [2024-07-15 20:47:32.090794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.818 [2024-07-15 20:47:32.090864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.818 [2024-07-15 20:47:32.091431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.818 [2024-07-15 20:47:32.091621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.818 [2024-07-15 20:47:32.091684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:39.818 [2024-07-15 20:47:32.091737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.818 [2024-07-15 20:47:32.091790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.818 [2024-07-15 20:47:32.092211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.818 [2024-07-15 20:47:32.094551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.818 [2024-07-15 20:47:32.094622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.818 [2024-07-15 20:47:32.094688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.818 [2024-07-15 20:47:32.095200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.818 [2024-07-15 20:47:32.095733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.818 [2024-07-15 20:47:32.095933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.818 [2024-07-15 20:47:32.096005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.818 [2024-07-15 20:47:32.096073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.818 [2024-07-15 20:47:32.096128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.818 [2024-07-15 20:47:32.096608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:39.818 [2024-07-15 20:47:32.100500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.818 [2024-07-15 20:47:32.100571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.818 [2024-07-15 20:47:32.100630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.818 [2024-07-15 20:47:32.100683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.818 [2024-07-15 20:47:32.101106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.818 [2024-07-15 20:47:32.101294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.818 [2024-07-15 20:47:32.101351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.818 [2024-07-15 20:47:32.101413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.818 [2024-07-15 20:47:32.101467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.818 [2024-07-15 20:47:32.102029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.818 [2024-07-15 20:47:32.105650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.818 [2024-07-15 20:47:32.105716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.818 [2024-07-15 20:47:32.105793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:39.818 [2024-07-15 20:47:32.105847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.818 [2024-07-15 20:47:32.106313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.818 [2024-07-15 20:47:32.106498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.818 [2024-07-15 20:47:32.107011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.818 [2024-07-15 20:47:32.107074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.818 [2024-07-15 20:47:32.107127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.818 [2024-07-15 20:47:32.107597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.818 [2024-07-15 20:47:32.110087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.818 [2024-07-15 20:47:32.110158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.818 [2024-07-15 20:47:32.110225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.818 [2024-07-15 20:47:32.110290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.818 [2024-07-15 20:47:32.110631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.818 [2024-07-15 20:47:32.111265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:39.818 [2024-07-15 20:47:32.111331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.818 [2024-07-15 20:47:32.111383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.818 [2024-07-15 20:47:32.111437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.818 [2024-07-15 20:47:32.111882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.818 [2024-07-15 20:47:32.114673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.818 [2024-07-15 20:47:32.114750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.818 [2024-07-15 20:47:32.114803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.818 [2024-07-15 20:47:32.114860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.818 [2024-07-15 20:47:32.115283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.818 [2024-07-15 20:47:32.115471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.818 [2024-07-15 20:47:32.115533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.818 [2024-07-15 20:47:32.115585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.818 [2024-07-15 20:47:32.115638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:39.818 [2024-07-15 20:47:32.116088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.818 [2024-07-15 20:47:32.119348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.818 [2024-07-15 20:47:32.119414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.818 [2024-07-15 20:47:32.119466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.818 [2024-07-15 20:47:32.119518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.818 [2024-07-15 20:47:32.119976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.818 [2024-07-15 20:47:32.120163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.818 [2024-07-15 20:47:32.120220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.818 [2024-07-15 20:47:32.120272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.818 [2024-07-15 20:47:32.120324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.818 [2024-07-15 20:47:32.120702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.818 [2024-07-15 20:47:32.123935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.818 [2024-07-15 20:47:32.124003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:39.818 [2024-07-15 20:47:32.124066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:39.818 [... identical error repeated continuously from 20:47:32.124066 through 20:47:32.379419 ...] 
00:33:40.081 [2024-07-15 20:47:32.379419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:40.081 [2024-07-15 20:47:32.381198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.081 [2024-07-15 20:47:32.381643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.081 [2024-07-15 20:47:32.386004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.081 [2024-07-15 20:47:32.387404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.081 [2024-07-15 20:47:32.388987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.081 [2024-07-15 20:47:32.389478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.081 [2024-07-15 20:47:32.389955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.081 [2024-07-15 20:47:32.391777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.081 [2024-07-15 20:47:32.393678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.081 [2024-07-15 20:47:32.395432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.081 [2024-07-15 20:47:32.396944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.081 [2024-07-15 20:47:32.397449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.081 [2024-07-15 20:47:32.401659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:40.081 [2024-07-15 20:47:32.403075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.081 [2024-07-15 20:47:32.404270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.081 [2024-07-15 20:47:32.404757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.081 [2024-07-15 20:47:32.405201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.081 [2024-07-15 20:47:32.405383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.081 [2024-07-15 20:47:32.407099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.081 [2024-07-15 20:47:32.408830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.081 [2024-07-15 20:47:32.409154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.081 [2024-07-15 20:47:32.414653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.081 [2024-07-15 20:47:32.416601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.081 [2024-07-15 20:47:32.417914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.081 [2024-07-15 20:47:32.419301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.081 [2024-07-15 20:47:32.419726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:40.081 [2024-07-15 20:47:32.419911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.081 [2024-07-15 20:47:32.420413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.081 [2024-07-15 20:47:32.422056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.081 [2024-07-15 20:47:32.423484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.081 [2024-07-15 20:47:32.423919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.081 [2024-07-15 20:47:32.429092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.081 [2024-07-15 20:47:32.430902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.081 [2024-07-15 20:47:32.432201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.082 [2024-07-15 20:47:32.433586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.082 [2024-07-15 20:47:32.433936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.082 [2024-07-15 20:47:32.434697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.082 [2024-07-15 20:47:32.435204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.082 [2024-07-15 20:47:32.436589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:40.082 [2024-07-15 20:47:32.437319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.082 [2024-07-15 20:47:32.437800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.082 [2024-07-15 20:47:32.441364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.082 [2024-07-15 20:47:32.441871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.082 [2024-07-15 20:47:32.442368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.082 [2024-07-15 20:47:32.443627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.082 [2024-07-15 20:47:32.444100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.082 [2024-07-15 20:47:32.444720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.082 [2024-07-15 20:47:32.445228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.082 [2024-07-15 20:47:32.446810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.082 [2024-07-15 20:47:32.448617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.082 [2024-07-15 20:47:32.449171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.082 [2024-07-15 20:47:32.452912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:40.082 [2024-07-15 20:47:32.453418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.082 [2024-07-15 20:47:32.453916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.082 [2024-07-15 20:47:32.455598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.082 [2024-07-15 20:47:32.455950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.082 [2024-07-15 20:47:32.456566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.082 [2024-07-15 20:47:32.457073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.082 [2024-07-15 20:47:32.457572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.343 [2024-07-15 20:47:32.459491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.343 [2024-07-15 20:47:32.460068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.343 [2024-07-15 20:47:32.464295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.343 [2024-07-15 20:47:32.464800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.343 [2024-07-15 20:47:32.465325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.343 [2024-07-15 20:47:32.465820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:40.343 [2024-07-15 20:47:32.466325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.343 [2024-07-15 20:47:32.466952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.343 [2024-07-15 20:47:32.467467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.343 [2024-07-15 20:47:32.467967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.343 [2024-07-15 20:47:32.468468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.343 [2024-07-15 20:47:32.468970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.343 [2024-07-15 20:47:32.471641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.343 [2024-07-15 20:47:32.472156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.343 [2024-07-15 20:47:32.472655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.343 [2024-07-15 20:47:32.473161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.343 [2024-07-15 20:47:32.473674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.343 [2024-07-15 20:47:32.474967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.343 [2024-07-15 20:47:32.475868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:40.343 [2024-07-15 20:47:32.476364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.344 [2024-07-15 20:47:32.476859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.344 [2024-07-15 20:47:32.477259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.344 [2024-07-15 20:47:32.482007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.344 [2024-07-15 20:47:32.483666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.344 [2024-07-15 20:47:32.484262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.344 [2024-07-15 20:47:32.484752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.344 [2024-07-15 20:47:32.485217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.344 [2024-07-15 20:47:32.487043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.344 [2024-07-15 20:47:32.487549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.344 [2024-07-15 20:47:32.488049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.344 [2024-07-15 20:47:32.488545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.344 [2024-07-15 20:47:32.488894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:40.344 [2024-07-15 20:47:32.494260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.344 [2024-07-15 20:47:32.495881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.344 [2024-07-15 20:47:32.496439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.344 [2024-07-15 20:47:32.496940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.344 [2024-07-15 20:47:32.497437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.344 [2024-07-15 20:47:32.499296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.344 [2024-07-15 20:47:32.499803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.344 [2024-07-15 20:47:32.500306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.344 [2024-07-15 20:47:32.501669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.344 [2024-07-15 20:47:32.502152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.344 [2024-07-15 20:47:32.507640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.344 [2024-07-15 20:47:32.509600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.344 [2024-07-15 20:47:32.510660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:40.344 [2024-07-15 20:47:32.512314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.344 [2024-07-15 20:47:32.512798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.344 [2024-07-15 20:47:32.513422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.344 [2024-07-15 20:47:32.515029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.344 [2024-07-15 20:47:32.516419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.344 [2024-07-15 20:47:32.517146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.344 [2024-07-15 20:47:32.517499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.344 [2024-07-15 20:47:32.522065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.344 [2024-07-15 20:47:32.523700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.344 [2024-07-15 20:47:32.524853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.344 [2024-07-15 20:47:32.525354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.344 [2024-07-15 20:47:32.525778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.344 [2024-07-15 20:47:32.527328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:40.344 [2024-07-15 20:47:32.528685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.344 [2024-07-15 20:47:32.530634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.344 [2024-07-15 20:47:32.532575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.344 [2024-07-15 20:47:32.533060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.344 [2024-07-15 20:47:32.537754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.344 [2024-07-15 20:47:32.539156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.344 [2024-07-15 20:47:32.539651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.344 [2024-07-15 20:47:32.540150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.344 [2024-07-15 20:47:32.540499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.344 [2024-07-15 20:47:32.542011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.344 [2024-07-15 20:47:32.543649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.344 [2024-07-15 20:47:32.544300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.344 [2024-07-15 20:47:32.544795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:40.344 [2024-07-15 20:47:32.545231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.344 [2024-07-15 20:47:32.549517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.344 [2024-07-15 20:47:32.551328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.344 [2024-07-15 20:47:32.551830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.344 [2024-07-15 20:47:32.553465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.344 [2024-07-15 20:47:32.553816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.344 [2024-07-15 20:47:32.554442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.344 [2024-07-15 20:47:32.555358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.344 [2024-07-15 20:47:32.556769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.344 [2024-07-15 20:47:32.558171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.344 [2024-07-15 20:47:32.558553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.344 [2024-07-15 20:47:32.563699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.344 [2024-07-15 20:47:32.565643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:40.344 [2024-07-15 20:47:32.566606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.344 [2024-07-15 20:47:32.568248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.344 [2024-07-15 20:47:32.568669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.344 [2024-07-15 20:47:32.569298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.344 [2024-07-15 20:47:32.570025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.344 [2024-07-15 20:47:32.571746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.344 [2024-07-15 20:47:32.573023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.344 [2024-07-15 20:47:32.573411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.344 [2024-07-15 20:47:32.578765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.344 [2024-07-15 20:47:32.579890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.344 [2024-07-15 20:47:32.581614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.344 [2024-07-15 20:47:32.581673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.344 [2024-07-15 20:47:32.582056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:40.344 [2024-07-15 20:47:32.584117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.344 [2024-07-15 20:47:32.584621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.344 [2024-07-15 20:47:32.585146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.344 [2024-07-15 20:47:32.586723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.344 [2024-07-15 20:47:32.587144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.344 [2024-07-15 20:47:32.592466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.344 [2024-07-15 20:47:32.594290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.344 [2024-07-15 20:47:32.594352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.344 [2024-07-15 20:47:32.595528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.344 [2024-07-15 20:47:32.595903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.344 [2024-07-15 20:47:32.597680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.344 [2024-07-15 20:47:32.598946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.344 [2024-07-15 20:47:32.599465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:40.344 [2024-07-15 20:47:32.600490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:40.609 [2024-07-15 20:47:32.776890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
(identical *ERROR* lines between the two timestamps above omitted)
00:33:40.609 [2024-07-15 20:47:32.776962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.609 [2024-07-15 20:47:32.777020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.609 [2024-07-15 20:47:32.777095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.609 [2024-07-15 20:47:32.777593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.609 [2024-07-15 20:47:32.777779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.609 [2024-07-15 20:47:32.777845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.609 [2024-07-15 20:47:32.777915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.609 [2024-07-15 20:47:32.777976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.609 [2024-07-15 20:47:32.778398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.609 [2024-07-15 20:47:32.779607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.609 [2024-07-15 20:47:32.779672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.779724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.779775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:40.610 [2024-07-15 20:47:32.780198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.780385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.780442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.780495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.780554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.781057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.782446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.782510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.782561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.782614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.782993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.783186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.783250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:40.610 [2024-07-15 20:47:32.783304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.783368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.783714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.785338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.785407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.785460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.787173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.787596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.787786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.787854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.787906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.787965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.788377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:40.610 [2024-07-15 20:47:32.789876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.789947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.790599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.790658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.791042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.791227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.791285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.791337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.791397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.791810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.795421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.795499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.796991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:40.610 [2024-07-15 20:47:32.797048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.797603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.797789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.797848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.797902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.797962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.798461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.800643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.800722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.801226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.801298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.801748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.801946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:40.610 [2024-07-15 20:47:32.802019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.802095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.803770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.804284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.807478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.807556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.809251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.809312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.809880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.810074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.810133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.810186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.810243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:40.610 [2024-07-15 20:47:32.810712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.812453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.812523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.812576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.813077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.813428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.813613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.813679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.815473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.815543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.816047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.817382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.818899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:40.610 [2024-07-15 20:47:32.818970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.819039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.819537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.820164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.820233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.820772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.820832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.821232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.822983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.823053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.823547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.823607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.824043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:40.610 [2024-07-15 20:47:32.824661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.824729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.825231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.825298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.825793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.827145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.827656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.610 [2024-07-15 20:47:32.828164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.611 [2024-07-15 20:47:32.828665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.611 [2024-07-15 20:47:32.829212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.611 [2024-07-15 20:47:32.829830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.611 [2024-07-15 20:47:32.829896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.611 [2024-07-15 20:47:32.830398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:40.611 [2024-07-15 20:47:32.830465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.611 [2024-07-15 20:47:32.830853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.611 [2024-07-15 20:47:32.832581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.611 [2024-07-15 20:47:32.833096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.611 [2024-07-15 20:47:32.834832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.611 [2024-07-15 20:47:32.836564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.611 [2024-07-15 20:47:32.837059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.611 [2024-07-15 20:47:32.837673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.611 [2024-07-15 20:47:32.837739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.611 [2024-07-15 20:47:32.838246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.611 [2024-07-15 20:47:32.838314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.611 [2024-07-15 20:47:32.838659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.611 [2024-07-15 20:47:32.840313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:40.611 [2024-07-15 20:47:32.840834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.611 [2024-07-15 20:47:32.842700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.611 [2024-07-15 20:47:32.843208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.611 [2024-07-15 20:47:32.843698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.611 [2024-07-15 20:47:32.844319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.611 [2024-07-15 20:47:32.844439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.611 [2024-07-15 20:47:32.846033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.611 [2024-07-15 20:47:32.846539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.611 [2024-07-15 20:47:32.847411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.611 [2024-07-15 20:47:32.848654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.611 [2024-07-15 20:47:32.849183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.611 [2024-07-15 20:47:32.849255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.611 [2024-07-15 20:47:32.849752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:40.611 [2024-07-15 20:47:32.851020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.611 [2024-07-15 20:47:32.852284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.611 [2024-07-15 20:47:32.852676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.611 [2024-07-15 20:47:32.854405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.611 [2024-07-15 20:47:32.856255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.611 [2024-07-15 20:47:32.858194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.611 [2024-07-15 20:47:32.859437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.611 [2024-07-15 20:47:32.859816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.611 [2024-07-15 20:47:32.860783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.611 [2024-07-15 20:47:32.861303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.611 [2024-07-15 20:47:32.862640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.611 [2024-07-15 20:47:32.864046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.611 [2024-07-15 20:47:32.864463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:40.611 [2024-07-15 20:47:32.866012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.611 [2024-07-15 20:47:32.866518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.611 [2024-07-15 20:47:32.868148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.611 [2024-07-15 20:47:32.869111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.611 [2024-07-15 20:47:32.869505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.611 [2024-07-15 20:47:32.870758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.611 [2024-07-15 20:47:32.871268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.611 [2024-07-15 20:47:32.871964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.611 [2024-07-15 20:47:32.873370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.611 [2024-07-15 20:47:32.873751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.611 [2024-07-15 20:47:32.875472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.611 [2024-07-15 20:47:32.877212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.611 [2024-07-15 20:47:32.878707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:40.611 [2024-07-15 20:47:32.879338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:40.874 [2024-07-15 20:47:33.182835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:40.874 [2024-07-15 20:47:33.184366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:40.874 [2024-07-15 20:47:33.186209] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:33:41.464
00:33:41.464 Latency(us)
00:33:41.464 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:33:41.464 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:33:41.464 Verification LBA range: start 0x0 length 0x100
00:33:41.464 crypto_ram : 5.79 44.21 2.76 0.00 0.00 2808871.18 74768.03 2465521.31
00:33:41.464 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:33:41.464 Verification LBA range: start 0x100 length 0x100
00:33:41.464 crypto_ram : 6.03 41.43 2.59 0.00 0.00 2978400.80 141329.81 3165787.71
00:33:41.464 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:33:41.464 Verification LBA range: start 0x0 length 0x100
00:33:41.464 crypto_ram2 : 5.79 44.20 2.76 0.00 0.00 2707571.76 74312.13 2465521.31
00:33:41.464 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:33:41.464 Verification LBA range: start 0x100 length 0x100
00:33:41.464 crypto_ram2 : 6.04 42.07 2.63 0.00 0.00 2826060.40 60179.14 3165787.71
00:33:41.464 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:33:41.464 Verification LBA range: start 0x0 length 0x100
00:33:41.464 crypto_ram3 : 5.60 293.10 18.32 0.00 0.00 390048.37 57215.78 554377.57
00:33:41.464 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:33:41.464 Verification LBA range: start 0x100 length 0x100
00:33:41.464 crypto_ram3 : 5.67 225.79 14.11 0.00 0.00 495722.95 7237.45 590849.78
00:33:41.464 Job:
crypto_ram4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:33:41.464 Verification LBA range: start 0x0 length 0x100
00:33:41.464 crypto_ram4 : 5.69 309.39 19.34 0.00 0.00 358662.21 11682.50 503316.48
00:33:41.464 Job: crypto_ram4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:33:41.464 Verification LBA range: start 0x100 length 0x100
00:33:41.464 crypto_ram4 : 5.80 242.60 15.16 0.00 0.00 449475.94 4758.48 499669.26
00:33:41.464 ===================================================================================================================
00:33:41.464 Total : 1242.78 77.67 0.00 0.00 762049.24 4758.48 3165787.71
00:33:42.031
00:33:42.031 real 0m9.263s
00:33:42.031 user 0m17.564s
00:33:42.031 sys 0m0.465s
00:33:42.031 20:47:34 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # xtrace_disable
00:33:42.031 20:47:34 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x
00:33:42.031 ************************************
00:33:42.031 END TEST bdev_verify_big_io
00:33:42.031 ************************************
00:33:42.031 20:47:34 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0
00:33:42.031 20:47:34 blockdev_crypto_aesni -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:33:42.031 20:47:34 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:33:42.031 20:47:34 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable
00:33:42.031 20:47:34 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x
00:33:42.031 ************************************
00:33:42.031 START TEST bdev_write_zeroes
00:33:42.031 ************************************
00:33:42.031 20:47:34 blockdev_crypto_aesni.bdev_write_zeroes --
common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:33:42.031 [2024-07-15 20:47:34.364368] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 00:33:42.031 [2024-07-15 20:47:34.364432] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1544613 ] 00:33:42.290 [2024-07-15 20:47:34.492823] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:42.290 [2024-07-15 20:47:34.589789] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:42.290 [2024-07-15 20:47:34.611066] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:33:42.290 [2024-07-15 20:47:34.619094] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:33:42.290 [2024-07-15 20:47:34.627112] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:33:42.549 [2024-07-15 20:47:34.728694] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:33:45.080 [2024-07-15 20:47:36.925339] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:33:45.080 [2024-07-15 20:47:36.925411] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:33:45.080 [2024-07-15 20:47:36.925426] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:45.080 [2024-07-15 20:47:36.933362] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:33:45.080 [2024-07-15 
20:47:36.933381] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:33:45.080 [2024-07-15 20:47:36.933393] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:45.080 [2024-07-15 20:47:36.941384] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:33:45.080 [2024-07-15 20:47:36.941402] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:33:45.080 [2024-07-15 20:47:36.941413] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:45.080 [2024-07-15 20:47:36.949403] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:33:45.080 [2024-07-15 20:47:36.949420] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:33:45.080 [2024-07-15 20:47:36.949432] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:45.080 Running I/O for 1 seconds... 
00:33:46.014
00:33:46.014 Latency(us)
00:33:46.014 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:33:46.014 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:33:46.014 crypto_ram : 1.03 1972.71 7.71 0.00 0.00 64438.31 5442.34 77047.54
00:33:46.014 Job: crypto_ram2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:33:46.014 crypto_ram2 : 1.03 1978.43 7.73 0.00 0.00 63898.93 5413.84 72032.61
00:33:46.014 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:33:46.014 crypto_ram3 : 1.02 15167.81 59.25 0.00 0.00 8317.16 2464.72 10770.70
00:33:46.014 Job: crypto_ram4 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:33:46.014 crypto_ram4 : 1.02 15152.52 59.19 0.00 0.00 8288.70 2464.72 8662.15
00:33:46.014 ===================================================================================================================
00:33:46.014 Total : 34271.48 133.87 0.00 0.00 14770.76 2464.72 77047.54
00:33:46.272
00:33:46.272 real 0m4.195s
00:33:46.272 user 0m3.800s
00:33:46.272 sys 0m0.356s
00:33:46.272 20:47:38 blockdev_crypto_aesni.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # xtrace_disable
00:33:46.272 20:47:38 blockdev_crypto_aesni.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x
00:33:46.272 ************************************
00:33:46.272 END TEST bdev_write_zeroes
00:33:46.272 ************************************
00:33:46.272 20:47:38 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0
00:33:46.272 20:47:38 blockdev_crypto_aesni -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:33:46.272 20:47:38 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:33:46.272
20:47:38 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:46.272 20:47:38 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:33:46.272 ************************************ 00:33:46.272 START TEST bdev_json_nonenclosed 00:33:46.272 ************************************ 00:33:46.272 20:47:38 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:33:46.272 [2024-07-15 20:47:38.649834] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 00:33:46.273 [2024-07-15 20:47:38.649899] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1545170 ] 00:33:46.530 [2024-07-15 20:47:38.776285] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:46.530 [2024-07-15 20:47:38.873246] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:46.530 [2024-07-15 20:47:38.873318] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 
00:33:46.530 [2024-07-15 20:47:38.873339] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:33:46.530 [2024-07-15 20:47:38.873351] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:33:46.788 00:33:46.788 real 0m0.386s 00:33:46.788 user 0m0.235s 00:33:46.788 sys 0m0.148s 00:33:46.788 20:47:38 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # es=234 00:33:46.788 20:47:38 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:46.788 20:47:38 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:33:46.788 ************************************ 00:33:46.788 END TEST bdev_json_nonenclosed 00:33:46.788 ************************************ 00:33:46.788 20:47:39 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 234 00:33:46.788 20:47:39 blockdev_crypto_aesni -- bdev/blockdev.sh@782 -- # true 00:33:46.788 20:47:39 blockdev_crypto_aesni -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:33:46.788 20:47:39 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:33:46.788 20:47:39 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:46.788 20:47:39 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:33:46.788 ************************************ 00:33:46.788 START TEST bdev_json_nonarray 00:33:46.788 ************************************ 00:33:46.788 20:47:39 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 
00:33:46.788 [2024-07-15 20:47:39.122429] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 00:33:46.788 [2024-07-15 20:47:39.122489] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1545340 ] 00:33:47.046 [2024-07-15 20:47:39.248989] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:47.046 [2024-07-15 20:47:39.345203] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:47.046 [2024-07-15 20:47:39.345275] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 00:33:47.046 [2024-07-15 20:47:39.345295] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:33:47.046 [2024-07-15 20:47:39.345307] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:33:47.303 00:33:47.303 real 0m0.383s 00:33:47.303 user 0m0.230s 00:33:47.303 sys 0m0.151s 00:33:47.303 20:47:39 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # es=234 00:33:47.303 20:47:39 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:47.304 20:47:39 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:33:47.304 ************************************ 00:33:47.304 END TEST bdev_json_nonarray 00:33:47.304 ************************************ 00:33:47.304 20:47:39 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 234 00:33:47.304 20:47:39 blockdev_crypto_aesni -- bdev/blockdev.sh@785 -- # true 00:33:47.304 20:47:39 blockdev_crypto_aesni -- bdev/blockdev.sh@787 -- # [[ crypto_aesni == bdev ]] 00:33:47.304 20:47:39 blockdev_crypto_aesni -- bdev/blockdev.sh@794 -- # [[ crypto_aesni == gpt ]] 00:33:47.304 20:47:39 blockdev_crypto_aesni 
-- bdev/blockdev.sh@798 -- # [[ crypto_aesni == crypto_sw ]] 00:33:47.304 20:47:39 blockdev_crypto_aesni -- bdev/blockdev.sh@810 -- # trap - SIGINT SIGTERM EXIT 00:33:47.304 20:47:39 blockdev_crypto_aesni -- bdev/blockdev.sh@811 -- # cleanup 00:33:47.304 20:47:39 blockdev_crypto_aesni -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:33:47.304 20:47:39 blockdev_crypto_aesni -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:33:47.304 20:47:39 blockdev_crypto_aesni -- bdev/blockdev.sh@26 -- # [[ crypto_aesni == rbd ]] 00:33:47.304 20:47:39 blockdev_crypto_aesni -- bdev/blockdev.sh@30 -- # [[ crypto_aesni == daos ]] 00:33:47.304 20:47:39 blockdev_crypto_aesni -- bdev/blockdev.sh@34 -- # [[ crypto_aesni = \g\p\t ]] 00:33:47.304 20:47:39 blockdev_crypto_aesni -- bdev/blockdev.sh@40 -- # [[ crypto_aesni == xnvme ]] 00:33:47.304 00:33:47.304 real 1m13.317s 00:33:47.304 user 2m42.775s 00:33:47.304 sys 0m9.393s 00:33:47.304 20:47:39 blockdev_crypto_aesni -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:47.304 20:47:39 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:33:47.304 ************************************ 00:33:47.304 END TEST blockdev_crypto_aesni 00:33:47.304 ************************************ 00:33:47.304 20:47:39 -- common/autotest_common.sh@1142 -- # return 0 00:33:47.304 20:47:39 -- spdk/autotest.sh@358 -- # run_test blockdev_crypto_sw /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_sw 00:33:47.304 20:47:39 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:33:47.304 20:47:39 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:47.304 20:47:39 -- common/autotest_common.sh@10 -- # set +x 00:33:47.304 ************************************ 00:33:47.304 START TEST blockdev_crypto_sw 00:33:47.304 ************************************ 00:33:47.304 20:47:39 blockdev_crypto_sw -- 
common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_sw 00:33:47.304 * Looking for test storage... 00:33:47.561 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:33:47.561 20:47:39 blockdev_crypto_sw -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:33:47.561 20:47:39 blockdev_crypto_sw -- bdev/nbd_common.sh@6 -- # set -e 00:33:47.561 20:47:39 blockdev_crypto_sw -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:33:47.561 20:47:39 blockdev_crypto_sw -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:33:47.561 20:47:39 blockdev_crypto_sw -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:33:47.561 20:47:39 blockdev_crypto_sw -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:33:47.561 20:47:39 blockdev_crypto_sw -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:33:47.561 20:47:39 blockdev_crypto_sw -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:33:47.561 20:47:39 blockdev_crypto_sw -- bdev/blockdev.sh@20 -- # : 00:33:47.561 20:47:39 blockdev_crypto_sw -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:33:47.561 20:47:39 blockdev_crypto_sw -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:33:47.561 20:47:39 blockdev_crypto_sw -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 00:33:47.561 20:47:39 blockdev_crypto_sw -- bdev/blockdev.sh@674 -- # uname -s 00:33:47.561 20:47:39 blockdev_crypto_sw -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:33:47.561 20:47:39 blockdev_crypto_sw -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:33:47.561 20:47:39 blockdev_crypto_sw -- bdev/blockdev.sh@682 -- # test_type=crypto_sw 00:33:47.561 20:47:39 blockdev_crypto_sw -- bdev/blockdev.sh@683 -- # crypto_device= 
00:33:47.561 20:47:39 blockdev_crypto_sw -- bdev/blockdev.sh@684 -- # dek= 00:33:47.561 20:47:39 blockdev_crypto_sw -- bdev/blockdev.sh@685 -- # env_ctx= 00:33:47.561 20:47:39 blockdev_crypto_sw -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:33:47.561 20:47:39 blockdev_crypto_sw -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:33:47.561 20:47:39 blockdev_crypto_sw -- bdev/blockdev.sh@690 -- # [[ crypto_sw == bdev ]] 00:33:47.561 20:47:39 blockdev_crypto_sw -- bdev/blockdev.sh@690 -- # [[ crypto_sw == crypto_* ]] 00:33:47.561 20:47:39 blockdev_crypto_sw -- bdev/blockdev.sh@691 -- # wait_for_rpc=--wait-for-rpc 00:33:47.561 20:47:39 blockdev_crypto_sw -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:33:47.561 20:47:39 blockdev_crypto_sw -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=1545406 00:33:47.561 20:47:39 blockdev_crypto_sw -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:33:47.561 20:47:39 blockdev_crypto_sw -- bdev/blockdev.sh@49 -- # waitforlisten 1545406 00:33:47.561 20:47:39 blockdev_crypto_sw -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:33:47.561 20:47:39 blockdev_crypto_sw -- common/autotest_common.sh@829 -- # '[' -z 1545406 ']' 00:33:47.561 20:47:39 blockdev_crypto_sw -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:33:47.561 20:47:39 blockdev_crypto_sw -- common/autotest_common.sh@834 -- # local max_retries=100 00:33:47.561 20:47:39 blockdev_crypto_sw -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:33:47.561 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:33:47.561 20:47:39 blockdev_crypto_sw -- common/autotest_common.sh@838 -- # xtrace_disable 00:33:47.561 20:47:39 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:47.561 [2024-07-15 20:47:39.763497] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 00:33:47.561 [2024-07-15 20:47:39.763550] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1545406 ] 00:33:47.561 [2024-07-15 20:47:39.875709] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:47.818 [2024-07-15 20:47:39.989385] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:48.075 20:47:40 blockdev_crypto_sw -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:33:48.075 20:47:40 blockdev_crypto_sw -- common/autotest_common.sh@862 -- # return 0 00:33:48.075 20:47:40 blockdev_crypto_sw -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:33:48.075 20:47:40 blockdev_crypto_sw -- bdev/blockdev.sh@711 -- # setup_crypto_sw_conf 00:33:48.075 20:47:40 blockdev_crypto_sw -- bdev/blockdev.sh@193 -- # rpc_cmd 00:33:48.075 20:47:40 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:48.075 20:47:40 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:48.332 Malloc0 00:33:48.332 Malloc1 00:33:48.332 true 00:33:48.332 true 00:33:48.332 true 00:33:48.332 [2024-07-15 20:47:40.523268] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:33:48.332 crypto_ram 00:33:48.332 [2024-07-15 20:47:40.531300] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:33:48.332 crypto_ram2 00:33:48.332 [2024-07-15 20:47:40.539318] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:33:48.332 crypto_ram3 00:33:48.332 [ 00:33:48.332 { 
00:33:48.332 "name": "Malloc1", 00:33:48.332 "aliases": [ 00:33:48.332 "32ae6ac0-3ff4-4334-ae81-2dd15c693af8" 00:33:48.332 ], 00:33:48.332 "product_name": "Malloc disk", 00:33:48.332 "block_size": 4096, 00:33:48.332 "num_blocks": 4096, 00:33:48.332 "uuid": "32ae6ac0-3ff4-4334-ae81-2dd15c693af8", 00:33:48.332 "assigned_rate_limits": { 00:33:48.332 "rw_ios_per_sec": 0, 00:33:48.332 "rw_mbytes_per_sec": 0, 00:33:48.332 "r_mbytes_per_sec": 0, 00:33:48.332 "w_mbytes_per_sec": 0 00:33:48.332 }, 00:33:48.332 "claimed": true, 00:33:48.332 "claim_type": "exclusive_write", 00:33:48.332 "zoned": false, 00:33:48.332 "supported_io_types": { 00:33:48.332 "read": true, 00:33:48.332 "write": true, 00:33:48.332 "unmap": true, 00:33:48.332 "flush": true, 00:33:48.332 "reset": true, 00:33:48.332 "nvme_admin": false, 00:33:48.332 "nvme_io": false, 00:33:48.332 "nvme_io_md": false, 00:33:48.332 "write_zeroes": true, 00:33:48.332 "zcopy": true, 00:33:48.332 "get_zone_info": false, 00:33:48.332 "zone_management": false, 00:33:48.332 "zone_append": false, 00:33:48.332 "compare": false, 00:33:48.332 "compare_and_write": false, 00:33:48.332 "abort": true, 00:33:48.332 "seek_hole": false, 00:33:48.332 "seek_data": false, 00:33:48.332 "copy": true, 00:33:48.332 "nvme_iov_md": false 00:33:48.332 }, 00:33:48.332 "memory_domains": [ 00:33:48.332 { 00:33:48.332 "dma_device_id": "system", 00:33:48.332 "dma_device_type": 1 00:33:48.332 }, 00:33:48.332 { 00:33:48.332 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:48.332 "dma_device_type": 2 00:33:48.332 } 00:33:48.332 ], 00:33:48.332 "driver_specific": {} 00:33:48.332 } 00:33:48.332 ] 00:33:48.332 20:47:40 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:48.332 20:47:40 blockdev_crypto_sw -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:33:48.332 20:47:40 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:48.332 20:47:40 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 
00:33:48.332 20:47:40 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:48.332 20:47:40 blockdev_crypto_sw -- bdev/blockdev.sh@740 -- # cat 00:33:48.332 20:47:40 blockdev_crypto_sw -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:33:48.332 20:47:40 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:48.332 20:47:40 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:48.332 20:47:40 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:48.332 20:47:40 blockdev_crypto_sw -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n bdev 00:33:48.332 20:47:40 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:48.332 20:47:40 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:48.332 20:47:40 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:48.332 20:47:40 blockdev_crypto_sw -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:33:48.332 20:47:40 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:48.332 20:47:40 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:48.332 20:47:40 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:48.332 20:47:40 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:33:48.333 20:47:40 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:33:48.333 20:47:40 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:48.333 20:47:40 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:33:48.333 20:47:40 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:48.333 20:47:40 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:48.591 20:47:40 blockdev_crypto_sw -- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:33:48.591 20:47:40 blockdev_crypto_sw 
-- bdev/blockdev.sh@749 -- # jq -r .name 00:33:48.591 20:47:40 blockdev_crypto_sw -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "da79289d-cfb9-500a-b913-8901c5348b04"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "da79289d-cfb9-500a-b913-8901c5348b04",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "ffce3f75-38b1-576a-b216-9b190d539079"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": "ffce3f75-38b1-576a-b216-9b190d539079",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' 
"zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:33:48.591 20:47:40 blockdev_crypto_sw -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:33:48.591 20:47:40 blockdev_crypto_sw -- bdev/blockdev.sh@752 -- # hello_world_bdev=crypto_ram 00:33:48.591 20:47:40 blockdev_crypto_sw -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT 00:33:48.591 20:47:40 blockdev_crypto_sw -- bdev/blockdev.sh@754 -- # killprocess 1545406 00:33:48.591 20:47:40 blockdev_crypto_sw -- common/autotest_common.sh@948 -- # '[' -z 1545406 ']' 00:33:48.591 20:47:40 blockdev_crypto_sw -- common/autotest_common.sh@952 -- # kill -0 1545406 00:33:48.591 20:47:40 blockdev_crypto_sw -- common/autotest_common.sh@953 -- # uname 00:33:48.591 20:47:40 blockdev_crypto_sw -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:33:48.591 20:47:40 blockdev_crypto_sw -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1545406 00:33:48.591 20:47:40 blockdev_crypto_sw -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:33:48.591 20:47:40 blockdev_crypto_sw -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:33:48.591 20:47:40 blockdev_crypto_sw -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1545406' 00:33:48.591 killing process with pid 1545406 
00:33:48.591 20:47:40 blockdev_crypto_sw -- common/autotest_common.sh@967 -- # kill 1545406 00:33:48.591 20:47:40 blockdev_crypto_sw -- common/autotest_common.sh@972 -- # wait 1545406 00:33:48.850 20:47:41 blockdev_crypto_sw -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:33:48.850 20:47:41 blockdev_crypto_sw -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:33:48.850 20:47:41 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:33:48.850 20:47:41 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:48.850 20:47:41 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:49.108 ************************************ 00:33:49.108 START TEST bdev_hello_world 00:33:49.108 ************************************ 00:33:49.108 20:47:41 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:33:49.108 [2024-07-15 20:47:41.311954] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
00:33:49.108 [2024-07-15 20:47:41.312018] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1545608 ] 00:33:49.108 [2024-07-15 20:47:41.440457] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:49.366 [2024-07-15 20:47:41.549554] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:49.366 [2024-07-15 20:47:41.725797] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:33:49.366 [2024-07-15 20:47:41.725873] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:33:49.366 [2024-07-15 20:47:41.725889] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:49.366 [2024-07-15 20:47:41.733813] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:33:49.366 [2024-07-15 20:47:41.733834] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:33:49.366 [2024-07-15 20:47:41.733845] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:49.366 [2024-07-15 20:47:41.741835] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:33:49.366 [2024-07-15 20:47:41.741853] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:33:49.366 [2024-07-15 20:47:41.741865] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:49.625 [2024-07-15 20:47:41.783696] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:33:49.625 [2024-07-15 20:47:41.783742] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:33:49.625 [2024-07-15 20:47:41.783762] hello_bdev.c: 
244:hello_start: *NOTICE*: Opening io channel 00:33:49.625 [2024-07-15 20:47:41.785101] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:33:49.625 [2024-07-15 20:47:41.785191] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:33:49.625 [2024-07-15 20:47:41.785207] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:33:49.625 [2024-07-15 20:47:41.785242] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 00:33:49.625 00:33:49.625 [2024-07-15 20:47:41.785259] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:33:49.883 00:33:49.883 real 0m0.764s 00:33:49.883 user 0m0.506s 00:33:49.883 sys 0m0.241s 00:33:49.883 20:47:42 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:49.883 20:47:42 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:33:49.883 ************************************ 00:33:49.883 END TEST bdev_hello_world 00:33:49.883 ************************************ 00:33:49.883 20:47:42 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:33:49.883 20:47:42 blockdev_crypto_sw -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:33:49.883 20:47:42 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:33:49.883 20:47:42 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:49.883 20:47:42 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:49.883 ************************************ 00:33:49.883 START TEST bdev_bounds 00:33:49.883 ************************************ 00:33:49.883 20:47:42 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@1123 -- # bdev_bounds '' 00:33:49.883 20:47:42 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=1545795 00:33:49.883 20:47:42 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT 
SIGTERM EXIT 00:33:49.883 20:47:42 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:33:49.883 20:47:42 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 1545795' 00:33:49.883 Process bdevio pid: 1545795 00:33:49.883 20:47:42 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@293 -- # waitforlisten 1545795 00:33:49.883 20:47:42 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@829 -- # '[' -z 1545795 ']' 00:33:49.883 20:47:42 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:33:49.883 20:47:42 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@834 -- # local max_retries=100 00:33:49.883 20:47:42 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:33:49.883 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:33:49.883 20:47:42 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@838 -- # xtrace_disable 00:33:49.883 20:47:42 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:33:49.883 [2024-07-15 20:47:42.162728] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
00:33:49.883 [2024-07-15 20:47:42.162797] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1545795 ] 00:33:50.146 [2024-07-15 20:47:42.294383] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:33:50.146 [2024-07-15 20:47:42.395465] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:33:50.146 [2024-07-15 20:47:42.395563] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:33:50.146 [2024-07-15 20:47:42.395566] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:50.404 [2024-07-15 20:47:42.564408] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:33:50.404 [2024-07-15 20:47:42.564473] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:33:50.404 [2024-07-15 20:47:42.564488] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:50.404 [2024-07-15 20:47:42.572427] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:33:50.404 [2024-07-15 20:47:42.572445] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:33:50.404 [2024-07-15 20:47:42.572457] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:50.404 [2024-07-15 20:47:42.580451] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:33:50.404 [2024-07-15 20:47:42.580468] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:33:50.404 [2024-07-15 20:47:42.580480] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:50.971 20:47:43 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@858 
-- # (( i == 0 )) 00:33:50.971 20:47:43 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@862 -- # return 0 00:33:50.971 20:47:43 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:33:50.971 I/O targets: 00:33:50.971 crypto_ram: 32768 blocks of 512 bytes (16 MiB) 00:33:50.971 crypto_ram3: 4096 blocks of 4096 bytes (16 MiB) 00:33:50.971 00:33:50.971 00:33:50.971 CUnit - A unit testing framework for C - Version 2.1-3 00:33:50.971 http://cunit.sourceforge.net/ 00:33:50.971 00:33:50.971 00:33:50.971 Suite: bdevio tests on: crypto_ram3 00:33:50.971 Test: blockdev write read block ...passed 00:33:50.971 Test: blockdev write zeroes read block ...passed 00:33:50.971 Test: blockdev write zeroes read no split ...passed 00:33:50.971 Test: blockdev write zeroes read split ...passed 00:33:50.971 Test: blockdev write zeroes read split partial ...passed 00:33:50.971 Test: blockdev reset ...passed 00:33:50.971 Test: blockdev write read 8 blocks ...passed 00:33:50.971 Test: blockdev write read size > 128k ...passed 00:33:50.971 Test: blockdev write read invalid size ...passed 00:33:50.971 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:33:50.971 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:33:50.971 Test: blockdev write read max offset ...passed 00:33:50.971 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:33:50.971 Test: blockdev writev readv 8 blocks ...passed 00:33:50.971 Test: blockdev writev readv 30 x 1block ...passed 00:33:50.971 Test: blockdev writev readv block ...passed 00:33:50.971 Test: blockdev writev readv size > 128k ...passed 00:33:50.971 Test: blockdev writev readv size > 128k in two iovs ...passed 00:33:50.971 Test: blockdev comparev and writev ...passed 00:33:50.971 Test: blockdev nvme passthru rw ...passed 00:33:50.971 Test: blockdev nvme passthru vendor specific 
...passed 00:33:50.971 Test: blockdev nvme admin passthru ...passed 00:33:50.971 Test: blockdev copy ...passed 00:33:50.971 Suite: bdevio tests on: crypto_ram 00:33:50.971 Test: blockdev write read block ...passed 00:33:50.971 Test: blockdev write zeroes read block ...passed 00:33:50.971 Test: blockdev write zeroes read no split ...passed 00:33:50.971 Test: blockdev write zeroes read split ...passed 00:33:50.971 Test: blockdev write zeroes read split partial ...passed 00:33:50.971 Test: blockdev reset ...passed 00:33:50.971 Test: blockdev write read 8 blocks ...passed 00:33:50.971 Test: blockdev write read size > 128k ...passed 00:33:50.971 Test: blockdev write read invalid size ...passed 00:33:50.971 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:33:50.971 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:33:50.971 Test: blockdev write read max offset ...passed 00:33:50.971 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:33:50.971 Test: blockdev writev readv 8 blocks ...passed 00:33:50.971 Test: blockdev writev readv 30 x 1block ...passed 00:33:50.971 Test: blockdev writev readv block ...passed 00:33:50.971 Test: blockdev writev readv size > 128k ...passed 00:33:50.971 Test: blockdev writev readv size > 128k in two iovs ...passed 00:33:50.971 Test: blockdev comparev and writev ...passed 00:33:50.971 Test: blockdev nvme passthru rw ...passed 00:33:50.971 Test: blockdev nvme passthru vendor specific ...passed 00:33:50.971 Test: blockdev nvme admin passthru ...passed 00:33:50.971 Test: blockdev copy ...passed 00:33:50.971 00:33:50.971 Run Summary: Type Total Ran Passed Failed Inactive 00:33:50.971 suites 2 2 n/a 0 0 00:33:50.971 tests 46 46 46 0 0 00:33:50.971 asserts 260 260 260 0 n/a 00:33:50.971 00:33:50.971 Elapsed time = 0.194 seconds 00:33:50.971 0 00:33:50.971 20:47:43 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 1545795 00:33:50.971 20:47:43 
blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@948 -- # '[' -z 1545795 ']' 00:33:50.971 20:47:43 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@952 -- # kill -0 1545795 00:33:50.971 20:47:43 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@953 -- # uname 00:33:50.971 20:47:43 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:33:50.971 20:47:43 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1545795 00:33:51.229 20:47:43 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:33:51.229 20:47:43 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:33:51.229 20:47:43 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1545795' 00:33:51.229 killing process with pid 1545795 00:33:51.229 20:47:43 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@967 -- # kill 1545795 00:33:51.229 20:47:43 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@972 -- # wait 1545795 00:33:51.229 20:47:43 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:33:51.229 00:33:51.229 real 0m1.506s 00:33:51.229 user 0m3.888s 00:33:51.229 sys 0m0.382s 00:33:51.229 20:47:43 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:51.229 20:47:43 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:33:51.229 ************************************ 00:33:51.229 END TEST bdev_bounds 00:33:51.229 ************************************ 00:33:51.488 20:47:43 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:33:51.488 20:47:43 blockdev_crypto_sw -- bdev/blockdev.sh@762 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram3' '' 00:33:51.488 20:47:43 
blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:33:51.488 20:47:43 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:51.489 20:47:43 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:51.489 ************************************ 00:33:51.489 START TEST bdev_nbd 00:33:51.489 ************************************ 00:33:51.489 20:47:43 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@1123 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram3' '' 00:33:51.489 20:47:43 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:33:51.489 20:47:43 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:33:51.489 20:47:43 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:51.489 20:47:43 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:33:51.489 20:47:43 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('crypto_ram' 'crypto_ram3') 00:33:51.489 20:47:43 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:33:51.489 20:47:43 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=2 00:33:51.489 20:47:43 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]] 00:33:51.489 20:47:43 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:33:51.489 20:47:43 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:33:51.489 20:47:43 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@312 -- # bdev_num=2 00:33:51.489 20:47:43 blockdev_crypto_sw.bdev_nbd -- 
bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:33:51.489 20:47:43 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@314 -- # local nbd_list 00:33:51.489 20:47:43 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:33:51.489 20:47:43 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:33:51.489 20:47:43 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=1546000 00:33:51.489 20:47:43 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:33:51.489 20:47:43 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:33:51.489 20:47:43 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 1546000 /var/tmp/spdk-nbd.sock 00:33:51.489 20:47:43 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@829 -- # '[' -z 1546000 ']' 00:33:51.489 20:47:43 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:33:51.489 20:47:43 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@834 -- # local max_retries=100 00:33:51.489 20:47:43 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:33:51.489 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:33:51.489 20:47:43 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@838 -- # xtrace_disable 00:33:51.489 20:47:43 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:33:51.489 [2024-07-15 20:47:43.762738] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
00:33:51.489 [2024-07-15 20:47:43.762804] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:33:51.747 [2024-07-15 20:47:43.890539] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:51.747 [2024-07-15 20:47:43.993472] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:52.006 [2024-07-15 20:47:44.164391] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:33:52.006 [2024-07-15 20:47:44.164457] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:33:52.006 [2024-07-15 20:47:44.164472] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:52.006 [2024-07-15 20:47:44.172411] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:33:52.006 [2024-07-15 20:47:44.172432] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:33:52.006 [2024-07-15 20:47:44.172443] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:52.006 [2024-07-15 20:47:44.180431] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:33:52.006 [2024-07-15 20:47:44.180450] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:33:52.006 [2024-07-15 20:47:44.180461] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:52.573 20:47:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:33:52.573 20:47:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@862 -- # return 0 00:33:52.573 20:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify 
/var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' 00:33:52.573 20:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:52.573 20:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:33:52.573 20:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:33:52.573 20:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' 00:33:52.573 20:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:52.573 20:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:33:52.573 20:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:33:52.573 20:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:33:52.573 20:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:33:52.573 20:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:33:52.573 20:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 )) 00:33:52.573 20:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:33:52.573 20:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:33:52.573 20:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:33:52.832 20:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:33:52.832 20:47:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:33:52.832 20:47:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:33:52.832 20:47:44 blockdev_crypto_sw.bdev_nbd -- 
common/autotest_common.sh@869 -- # (( i = 1 )) 00:33:52.832 20:47:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:33:52.832 20:47:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:33:52.832 20:47:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:33:52.832 20:47:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:33:52.832 20:47:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:33:52.832 20:47:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:33:52.832 1+0 records in 00:33:52.832 1+0 records out 00:33:52.832 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000263508 s, 15.5 MB/s 00:33:52.832 20:47:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:52.832 20:47:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:33:52.832 20:47:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:52.832 20:47:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:33:52.832 20:47:44 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:33:52.832 20:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:33:52.832 20:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 )) 00:33:52.832 20:47:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 00:33:53.091 20:47:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # 
nbd_device=/dev/nbd1 00:33:53.091 20:47:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:33:53.091 20:47:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:33:53.091 20:47:45 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:33:53.091 20:47:45 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:33:53.091 20:47:45 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:33:53.091 20:47:45 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:33:53.091 20:47:45 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:33:53.091 20:47:45 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:33:53.091 20:47:45 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:33:53.091 20:47:45 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:33:53.091 20:47:45 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:33:53.091 1+0 records in 00:33:53.091 1+0 records out 00:33:53.091 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000343253 s, 11.9 MB/s 00:33:53.091 20:47:45 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:53.091 20:47:45 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:33:53.091 20:47:45 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:53.091 20:47:45 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:33:53.091 20:47:45 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # return 
0 00:33:53.091 20:47:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:33:53.091 20:47:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 )) 00:33:53.091 20:47:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:33:53.350 20:47:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:33:53.350 { 00:33:53.350 "nbd_device": "/dev/nbd0", 00:33:53.350 "bdev_name": "crypto_ram" 00:33:53.350 }, 00:33:53.350 { 00:33:53.350 "nbd_device": "/dev/nbd1", 00:33:53.350 "bdev_name": "crypto_ram3" 00:33:53.350 } 00:33:53.350 ]' 00:33:53.350 20:47:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:33:53.350 20:47:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:33:53.350 { 00:33:53.350 "nbd_device": "/dev/nbd0", 00:33:53.350 "bdev_name": "crypto_ram" 00:33:53.350 }, 00:33:53.350 { 00:33:53.350 "nbd_device": "/dev/nbd1", 00:33:53.350 "bdev_name": "crypto_ram3" 00:33:53.350 } 00:33:53.350 ]' 00:33:53.350 20:47:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:33:53.350 20:47:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:33:53.350 20:47:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:53.350 20:47:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:33:53.350 20:47:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:33:53.350 20:47:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:33:53.350 20:47:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:53.350 20:47:45 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:33:53.610 20:47:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:33:53.610 20:47:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:33:53.610 20:47:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:33:53.610 20:47:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:53.610 20:47:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:53.610 20:47:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:33:53.610 20:47:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:53.610 20:47:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:53.610 20:47:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:53.610 20:47:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:33:53.869 20:47:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:33:53.869 20:47:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:33:53.869 20:47:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:33:53.869 20:47:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:53.869 20:47:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:53.869 20:47:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:33:53.869 20:47:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:53.869 20:47:46 blockdev_crypto_sw.bdev_nbd -- 
bdev/nbd_common.sh@45 -- # return 0 00:33:53.869 20:47:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:33:53.869 20:47:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:53.869 20:47:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:33:54.437 20:47:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:33:54.437 20:47:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:33:54.437 20:47:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:33:54.437 20:47:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:33:54.437 20:47:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:33:54.437 20:47:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:33:54.438 20:47:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:33:54.438 20:47:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:33:54.438 20:47:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:33:54.438 20:47:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:33:54.438 20:47:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:33:54.438 20:47:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:33:54.438 20:47:46 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' '/dev/nbd0 /dev/nbd1' 00:33:54.438 20:47:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:54.438 20:47:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:33:54.438 20:47:46 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:33:54.438 20:47:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:33:54.438 20:47:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:33:54.438 20:47:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' '/dev/nbd0 /dev/nbd1' 00:33:54.438 20:47:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:54.438 20:47:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:33:54.438 20:47:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:33:54.438 20:47:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:33:54.438 20:47:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:33:54.438 20:47:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:33:54.438 20:47:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:33:54.438 20:47:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:33:54.438 20:47:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:33:54.697 /dev/nbd0 00:33:54.697 20:47:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:33:54.697 20:47:46 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:33:54.697 20:47:46 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:33:54.697 20:47:46 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:33:54.697 20:47:47 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 
00:33:54.697 20:47:47 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:33:54.697 20:47:47 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:33:54.697 20:47:47 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:33:54.697 20:47:47 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:33:54.697 20:47:47 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:33:54.697 20:47:47 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:33:54.697 1+0 records in 00:33:54.697 1+0 records out 00:33:54.697 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000252243 s, 16.2 MB/s 00:33:54.697 20:47:47 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:54.697 20:47:47 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:33:54.697 20:47:47 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:54.697 20:47:47 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:33:54.697 20:47:47 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:33:54.697 20:47:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:33:54.697 20:47:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:33:54.697 20:47:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd1 00:33:54.955 /dev/nbd1 00:33:54.955 20:47:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 
00:33:54.955 20:47:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:33:54.955 20:47:47 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:33:54.955 20:47:47 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:33:54.955 20:47:47 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:33:54.955 20:47:47 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:33:54.955 20:47:47 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:33:54.955 20:47:47 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:33:54.955 20:47:47 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:33:54.955 20:47:47 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:33:54.955 20:47:47 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:33:54.955 1+0 records in 00:33:54.955 1+0 records out 00:33:54.955 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000350542 s, 11.7 MB/s 00:33:54.955 20:47:47 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:54.955 20:47:47 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:33:54.955 20:47:47 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:54.955 20:47:47 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:33:54.955 20:47:47 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:33:54.955 20:47:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:33:54.955 20:47:47 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:33:54.955 20:47:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:33:54.955 20:47:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:54.956 20:47:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:33:55.214 20:47:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:33:55.214 { 00:33:55.214 "nbd_device": "/dev/nbd0", 00:33:55.214 "bdev_name": "crypto_ram" 00:33:55.214 }, 00:33:55.214 { 00:33:55.214 "nbd_device": "/dev/nbd1", 00:33:55.214 "bdev_name": "crypto_ram3" 00:33:55.214 } 00:33:55.214 ]' 00:33:55.214 20:47:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:33:55.214 { 00:33:55.214 "nbd_device": "/dev/nbd0", 00:33:55.214 "bdev_name": "crypto_ram" 00:33:55.214 }, 00:33:55.214 { 00:33:55.214 "nbd_device": "/dev/nbd1", 00:33:55.214 "bdev_name": "crypto_ram3" 00:33:55.214 } 00:33:55.214 ]' 00:33:55.214 20:47:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:33:55.473 20:47:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:33:55.473 /dev/nbd1' 00:33:55.473 20:47:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:33:55.473 /dev/nbd1' 00:33:55.473 20:47:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:33:55.473 20:47:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=2 00:33:55.473 20:47:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 2 00:33:55.473 20:47:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=2 00:33:55.473 20:47:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:33:55.473 20:47:47 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:33:55.473 20:47:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:33:55.473 20:47:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:33:55.473 20:47:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:33:55.473 20:47:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:33:55.473 20:47:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:33:55.473 20:47:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:33:55.473 256+0 records in 00:33:55.473 256+0 records out 00:33:55.473 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0104552 s, 100 MB/s 00:33:55.473 20:47:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:33:55.473 20:47:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:33:55.473 256+0 records in 00:33:55.473 256+0 records out 00:33:55.473 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0310312 s, 33.8 MB/s 00:33:55.473 20:47:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:33:55.473 20:47:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:33:55.473 256+0 records in 00:33:55.473 256+0 records out 00:33:55.473 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0466589 s, 22.5 MB/s 00:33:55.473 20:47:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify 
'/dev/nbd0 /dev/nbd1' verify 00:33:55.473 20:47:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:33:55.473 20:47:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:33:55.473 20:47:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:33:55.473 20:47:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:33:55.473 20:47:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:33:55.473 20:47:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:33:55.473 20:47:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:33:55.473 20:47:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:33:55.473 20:47:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:33:55.473 20:47:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:33:55.473 20:47:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:33:55.473 20:47:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:33:55.473 20:47:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:55.473 20:47:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:33:55.473 20:47:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:33:55.473 20:47:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:33:55.473 
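The `nbd_dd_data_verify` steps traced above follow a simple write-then-verify cycle: fill a scratch file with 1 MiB of random data, `dd` it onto each nbd device, then `cmp` each device back against the source. A self-contained sketch of that cycle, with regular temp files standing in for /dev/nbd0 and /dev/nbd1 (so `oflag=direct` is dropped):

```shell
#!/usr/bin/env bash
# Sketch of nbd_dd_data_verify: write random data to each device, then
# compare each device byte-for-byte against the source file.
src=$(mktemp)                         # plays the role of .../bdev/nbdrandtest
dd if=/dev/urandom of="$src" bs=4096 count=256 status=none
nbd_list=("$(mktemp)" "$(mktemp)")    # stand-ins for /dev/nbd0 /dev/nbd1
for dev in "${nbd_list[@]}"; do
    dd if="$src" of="$dev" bs=4096 count=256 status=none   # write phase
    cmp -b -n 1M "$src" "$dev"                             # verify phase
done
echo "all ${#nbd_list[@]} devices verified"
```

Any mismatch makes `cmp` exit nonzero, which is what fails the test in the real suite; the trace shows both phases completing cleanly for both devices.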
20:47:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:55.473 20:47:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:33:55.732 20:47:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:33:55.732 20:47:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:33:55.732 20:47:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:33:55.732 20:47:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:55.732 20:47:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:55.732 20:47:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:33:55.732 20:47:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:55.732 20:47:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:55.732 20:47:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:55.732 20:47:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:33:55.990 20:47:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:33:55.990 20:47:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:33:55.990 20:47:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:33:55.990 20:47:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:55.990 20:47:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:55.990 20:47:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:33:55.990 20:47:48 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:55.990 20:47:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:55.990 20:47:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:33:55.990 20:47:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:55.990 20:47:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:33:56.249 20:47:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:33:56.249 20:47:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:33:56.249 20:47:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:33:56.249 20:47:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:33:56.249 20:47:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:33:56.249 20:47:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:33:56.249 20:47:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:33:56.249 20:47:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:33:56.249 20:47:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:33:56.249 20:47:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:33:56.249 20:47:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:33:56.249 20:47:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:33:56.249 20:47:48 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:33:56.249 20:47:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:56.249 20:47:48 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:33:56.249 20:47:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:33:56.249 20:47:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:33:56.249 20:47:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:33:56.816 malloc_lvol_verify 00:33:56.816 20:47:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:33:57.075 5e32da87-ef9a-4acd-8609-7ed7f9a5d947 00:33:57.075 20:47:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:33:57.644 bac14f73-36fa-481d-8285-e4a80632a4e8 00:33:57.644 20:47:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:33:57.903 /dev/nbd0 00:33:57.903 20:47:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:33:57.903 mke2fs 1.46.5 (30-Dec-2021) 00:33:57.903 Discarding device blocks: 0/4096 done 00:33:57.903 Creating filesystem with 4096 1k blocks and 1024 inodes 00:33:57.903 00:33:57.903 Allocating group tables: 0/1 done 00:33:57.903 Writing inode tables: 0/1 done 00:33:57.903 Creating journal (1024 blocks): done 00:33:57.903 Writing superblocks and filesystem accounting information: 0/1 done 00:33:57.903 00:33:57.903 20:47:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:33:57.903 20:47:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks 
/var/tmp/spdk-nbd.sock /dev/nbd0 00:33:57.903 20:47:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:57.903 20:47:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:33:57.903 20:47:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:33:57.903 20:47:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:33:57.903 20:47:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:57.903 20:47:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:33:58.161 20:47:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:33:58.161 20:47:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:33:58.161 20:47:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:33:58.161 20:47:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:58.161 20:47:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:58.161 20:47:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:33:58.161 20:47:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:58.161 20:47:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:58.161 20:47:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:33:58.161 20:47:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:33:58.161 20:47:50 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 1546000 00:33:58.161 20:47:50 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@948 -- # '[' -z 1546000 ']' 00:33:58.161 20:47:50 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@952 -- # kill 
-0 1546000 00:33:58.161 20:47:50 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@953 -- # uname 00:33:58.161 20:47:50 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:33:58.161 20:47:50 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1546000 00:33:58.161 20:47:50 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:33:58.161 20:47:50 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:33:58.161 20:47:50 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1546000' 00:33:58.161 killing process with pid 1546000 00:33:58.161 20:47:50 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@967 -- # kill 1546000 00:33:58.161 20:47:50 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@972 -- # wait 1546000 00:33:58.420 20:47:50 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT 00:33:58.420 00:33:58.420 real 0m7.026s 00:33:58.420 user 0m10.374s 00:33:58.420 sys 0m2.666s 00:33:58.420 20:47:50 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:58.420 20:47:50 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:33:58.420 ************************************ 00:33:58.420 END TEST bdev_nbd 00:33:58.420 ************************************ 00:33:58.420 20:47:50 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:33:58.420 20:47:50 blockdev_crypto_sw -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:33:58.420 20:47:50 blockdev_crypto_sw -- bdev/blockdev.sh@764 -- # '[' crypto_sw = nvme ']' 00:33:58.420 20:47:50 blockdev_crypto_sw -- bdev/blockdev.sh@764 -- # '[' crypto_sw = gpt ']' 00:33:58.420 20:47:50 blockdev_crypto_sw -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:33:58.420 20:47:50 blockdev_crypto_sw -- 
common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:33:58.420 20:47:50 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:58.420 20:47:50 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:58.680 ************************************ 00:33:58.680 START TEST bdev_fio 00:33:58.680 ************************************ 00:33:58.680 20:47:50 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1123 -- # fio_test_suite '' 00:33:58.680 20:47:50 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:33:58.680 20:47:50 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:33:58.680 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:33:58.680 20:47:50 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:33:58.680 20:47:50 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:33:58.680 20:47:50 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:33:58.680 20:47:50 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 00:33:58.680 20:47:50 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:33:58.680 20:47:50 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:58.680 20:47:50 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=verify 00:33:58.680 20:47:50 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:33:58.680 20:47:50 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:33:58.680 20:47:50 blockdev_crypto_sw.bdev_fio -- 
common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:33:58.680 20:47:50 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:33:58.680 20:47:50 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:33:58.680 20:47:50 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:33:58.680 20:47:50 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:58.680 20:47:50 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:33:58.680 20:47:50 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:33:58.680 20:47:50 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:33:58.680 20:47:50 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:33:58.680 20:47:50 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:33:58.680 20:47:50 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:33:58.680 20:47:50 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:33:58.680 20:47:50 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:33:58.680 20:47:50 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram]' 00:33:58.680 20:47:50 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram 00:33:58.680 20:47:50 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:33:58.680 20:47:50 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram3]' 00:33:58.680 20:47:50 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram3 
00:33:58.680 20:47:50 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:33:58.680 20:47:50 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:58.680 20:47:50 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:33:58.680 20:47:50 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:58.680 20:47:50 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:33:58.680 ************************************ 00:33:58.680 START TEST bdev_fio_rw_verify 00:33:58.680 ************************************ 00:33:58.680 20:47:50 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:58.680 20:47:50 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 
--spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:58.680 20:47:50 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:33:58.680 20:47:50 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:33:58.680 20:47:50 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:33:58.680 20:47:50 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:33:58.680 20:47:50 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:33:58.680 20:47:50 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 00:33:58.680 20:47:50 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:33:58.680 20:47:50 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:33:58.680 20:47:50 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 00:33:58.680 20:47:50 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:33:58.680 20:47:50 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:33:58.680 20:47:50 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:33:58.680 20:47:50 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:33:58.680 20:47:50 
blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:33:58.680 20:47:50 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:33:58.680 20:47:50 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:33:58.680 20:47:51 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:33:58.680 20:47:51 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:33:58.680 20:47:51 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:33:58.680 20:47:51 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:58.939 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:58.939 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:58.939 fio-3.35 00:33:58.939 Starting 2 threads 00:34:11.159 00:34:11.159 job_crypto_ram: (groupid=0, jobs=2): err= 0: pid=1547276: Mon Jul 15 20:48:01 2024 00:34:11.159 read: IOPS=19.8k, BW=77.3MiB/s (81.0MB/s)(773MiB/10001msec) 00:34:11.159 slat (usec): min=14, max=175, avg=21.93, stdev= 5.34 00:34:11.159 clat (usec): min=7, max=1901, avg=160.41, stdev=75.07 00:34:11.159 lat (usec): min=23, max=1923, avg=182.35, stdev=78.26 00:34:11.159 clat 
percentiles (usec): 00:34:11.159 | 50.000th=[ 151], 99.000th=[ 334], 99.900th=[ 355], 99.990th=[ 416], 00:34:11.159 | 99.999th=[ 1876] 00:34:11.159 write: IOPS=23.8k, BW=92.9MiB/s (97.4MB/s)(881MiB/9479msec); 0 zone resets 00:34:11.159 slat (usec): min=14, max=606, avg=37.30, stdev= 8.14 00:34:11.159 clat (usec): min=25, max=1033, avg=216.45, stdev=113.32 00:34:11.159 lat (usec): min=55, max=1078, avg=253.75, stdev=118.19 00:34:11.159 clat percentiles (usec): 00:34:11.159 | 50.000th=[ 204], 99.000th=[ 486], 99.900th=[ 510], 99.990th=[ 660], 00:34:11.159 | 99.999th=[ 938] 00:34:11.159 bw ( KiB/s): min=83688, max=98120, per=94.71%, avg=90117.47, stdev=2044.41, samples=38 00:34:11.159 iops : min=20922, max=24530, avg=22529.37, stdev=511.10, samples=38 00:34:11.159 lat (usec) : 10=0.01%, 20=0.01%, 50=5.24%, 100=13.79%, 250=55.92% 00:34:11.159 lat (usec) : 500=24.90%, 750=0.14%, 1000=0.01% 00:34:11.159 lat (msec) : 2=0.01% 00:34:11.159 cpu : usr=99.55%, sys=0.00%, ctx=28, majf=0, minf=456 00:34:11.159 IO depths : 1=12.5%, 2=25.0%, 4=50.0%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:34:11.159 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:34:11.159 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:34:11.159 issued rwts: total=197844,225487,0,0 short=0,0,0,0 dropped=0,0,0,0 00:34:11.159 latency : target=0, window=0, percentile=100.00%, depth=8 00:34:11.159 00:34:11.159 Run status group 0 (all jobs): 00:34:11.159 READ: bw=77.3MiB/s (81.0MB/s), 77.3MiB/s-77.3MiB/s (81.0MB/s-81.0MB/s), io=773MiB (810MB), run=10001-10001msec 00:34:11.159 WRITE: bw=92.9MiB/s (97.4MB/s), 92.9MiB/s-92.9MiB/s (97.4MB/s-97.4MB/s), io=881MiB (924MB), run=9479-9479msec 00:34:11.159 00:34:11.159 real 0m11.111s 00:34:11.159 user 0m23.896s 00:34:11.159 sys 0m0.346s 00:34:11.159 20:48:02 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:11.159 20:48:02 
blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:34:11.159 ************************************ 00:34:11.159 END TEST bdev_fio_rw_verify 00:34:11.159 ************************************ 00:34:11.159 20:48:02 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:34:11.159 20:48:02 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:34:11.159 20:48:02 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:34:11.159 20:48:02 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:34:11.159 20:48:02 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:34:11.159 20:48:02 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:34:11.159 20:48:02 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:34:11.159 20:48:02 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:34:11.159 20:48:02 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:34:11.159 20:48:02 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:34:11.159 20:48:02 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:34:11.159 20:48:02 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:34:11.159 20:48:02 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:34:11.159 20:48:02 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:34:11.159 
20:48:02 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:34:11.159 20:48:02 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:34:11.159 20:48:02 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:34:11.159 20:48:02 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:34:11.160 20:48:02 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "da79289d-cfb9-500a-b913-8901c5348b04"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "da79289d-cfb9-500a-b913-8901c5348b04",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "ffce3f75-38b1-576a-b216-9b190d539079"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": 
"ffce3f75-38b1-576a-b216-9b190d539079",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:34:11.160 20:48:02 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n crypto_ram 00:34:11.160 crypto_ram3 ]] 00:34:11.160 20:48:02 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:34:11.160 20:48:02 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "da79289d-cfb9-500a-b913-8901c5348b04"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "da79289d-cfb9-500a-b913-8901c5348b04",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": 
true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "ffce3f75-38b1-576a-b216-9b190d539079"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": "ffce3f75-38b1-576a-b216-9b190d539079",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' 
' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:34:11.160 20:48:02 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:34:11.160 20:48:02 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram]' 00:34:11.160 20:48:02 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram 00:34:11.160 20:48:02 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:34:11.160 20:48:02 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram3]' 00:34:11.160 20:48:02 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram3 00:34:11.160 20:48:02 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:34:11.160 20:48:02 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:34:11.160 20:48:02 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:11.160 20:48:02 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:34:11.160 ************************************ 00:34:11.160 START TEST bdev_fio_trim 00:34:11.160 ************************************ 00:34:11.160 20:48:02 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:34:11.160 20:48:02 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:34:11.160 20:48:02 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:34:11.160 20:48:02 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:34:11.160 20:48:02 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:34:11.160 20:48:02 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:34:11.160 20:48:02 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:34:11.160 20:48:02 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:34:11.160 20:48:02 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:34:11.160 20:48:02 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:34:11.160 20:48:02 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep 
libasan 00:34:11.160 20:48:02 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:34:11.160 20:48:02 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:34:11.160 20:48:02 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:34:11.160 20:48:02 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:34:11.160 20:48:02 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:34:11.160 20:48:02 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:34:11.160 20:48:02 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:34:11.160 20:48:02 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:34:11.160 20:48:02 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:34:11.160 20:48:02 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:34:11.160 20:48:02 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:34:11.160 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:34:11.160 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 
4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:34:11.160 fio-3.35 00:34:11.160 Starting 2 threads 00:34:21.198 00:34:21.198 job_crypto_ram: (groupid=0, jobs=2): err= 0: pid=1548909: Mon Jul 15 20:48:13 2024 00:34:21.198 write: IOPS=24.5k, BW=95.6MiB/s (100MB/s)(956MiB/10001msec); 0 zone resets 00:34:21.198 slat (usec): min=14, max=2294, avg=36.18, stdev=12.42 00:34:21.198 clat (usec): min=37, max=1217, avg=264.44, stdev=148.95 00:34:21.198 lat (usec): min=51, max=2630, avg=300.63, stdev=155.94 00:34:21.198 clat percentiles (usec): 00:34:21.198 | 50.000th=[ 243], 99.000th=[ 619], 99.900th=[ 668], 99.990th=[ 898], 00:34:21.198 | 99.999th=[ 1106] 00:34:21.198 bw ( KiB/s): min=80120, max=117376, per=100.00%, avg=98098.11, stdev=9028.57, samples=38 00:34:21.198 iops : min=20030, max=29344, avg=24524.53, stdev=2257.14, samples=38 00:34:21.198 trim: IOPS=24.5k, BW=95.6MiB/s (100MB/s)(956MiB/10001msec); 0 zone resets 00:34:21.198 slat (usec): min=6, max=647, avg=17.50, stdev= 6.38 00:34:21.198 clat (usec): min=44, max=2630, avg=174.83, stdev=69.89 00:34:21.198 lat (usec): min=53, max=2644, avg=192.33, stdev=71.96 00:34:21.198 clat percentiles (usec): 00:34:21.198 | 50.000th=[ 163], 99.000th=[ 351], 99.900th=[ 375], 99.990th=[ 457], 00:34:21.198 | 99.999th=[ 635] 00:34:21.198 bw ( KiB/s): min=80152, max=117376, per=100.00%, avg=98099.79, stdev=9028.10, samples=38 00:34:21.198 iops : min=20038, max=29344, avg=24524.95, stdev=2257.02, samples=38 00:34:21.198 lat (usec) : 50=0.04%, 100=13.70%, 250=53.84%, 500=28.53%, 750=3.88% 00:34:21.198 lat (usec) : 1000=0.01% 00:34:21.198 lat (msec) : 2=0.01%, 4=0.01% 00:34:21.198 cpu : usr=99.53%, sys=0.01%, ctx=60, majf=0, minf=298 00:34:21.198 IO depths : 1=7.2%, 2=17.1%, 4=60.6%, 8=15.1%, 16=0.0%, 32=0.0%, >=64=0.0% 00:34:21.198 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:34:21.198 complete : 0=0.0%, 4=86.8%, 8=13.2%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:34:21.198 issued rwts: total=0,244697,244697,0 
short=0,0,0,0 dropped=0,0,0,0 00:34:21.198 latency : target=0, window=0, percentile=100.00%, depth=8 00:34:21.198 00:34:21.198 Run status group 0 (all jobs): 00:34:21.198 WRITE: bw=95.6MiB/s (100MB/s), 95.6MiB/s-95.6MiB/s (100MB/s-100MB/s), io=956MiB (1002MB), run=10001-10001msec 00:34:21.198 TRIM: bw=95.6MiB/s (100MB/s), 95.6MiB/s-95.6MiB/s (100MB/s-100MB/s), io=956MiB (1002MB), run=10001-10001msec 00:34:21.198 00:34:21.198 real 0m11.154s 00:34:21.198 user 0m23.903s 00:34:21.199 sys 0m0.376s 00:34:21.199 20:48:13 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:21.199 20:48:13 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:34:21.199 ************************************ 00:34:21.199 END TEST bdev_fio_trim 00:34:21.199 ************************************ 00:34:21.199 20:48:13 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:34:21.199 20:48:13 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f 00:34:21.199 20:48:13 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:34:21.199 20:48:13 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@370 -- # popd 00:34:21.199 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:34:21.199 20:48:13 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT 00:34:21.199 00:34:21.199 real 0m22.629s 00:34:21.199 user 0m47.990s 00:34:21.199 sys 0m0.918s 00:34:21.199 20:48:13 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:21.199 20:48:13 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:34:21.199 ************************************ 00:34:21.199 END TEST bdev_fio 00:34:21.199 ************************************ 00:34:21.199 20:48:13 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:34:21.199 20:48:13 blockdev_crypto_sw 
-- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT 00:34:21.199 20:48:13 blockdev_crypto_sw -- bdev/blockdev.sh@777 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:34:21.199 20:48:13 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:34:21.199 20:48:13 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:21.199 20:48:13 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:34:21.199 ************************************ 00:34:21.199 START TEST bdev_verify 00:34:21.199 ************************************ 00:34:21.199 20:48:13 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:34:21.456 [2024-07-15 20:48:13.580815] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
00:34:21.456 [2024-07-15 20:48:13.580881] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1550626 ] 00:34:21.456 [2024-07-15 20:48:13.709332] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:34:21.456 [2024-07-15 20:48:13.811038] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:34:21.456 [2024-07-15 20:48:13.811044] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:21.715 [2024-07-15 20:48:13.984718] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:34:21.715 [2024-07-15 20:48:13.984783] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:34:21.715 [2024-07-15 20:48:13.984799] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:21.715 [2024-07-15 20:48:13.992738] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:34:21.715 [2024-07-15 20:48:13.992757] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:34:21.715 [2024-07-15 20:48:13.992768] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:21.715 [2024-07-15 20:48:14.000762] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:34:21.715 [2024-07-15 20:48:14.000780] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:34:21.715 [2024-07-15 20:48:14.000791] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:21.715 Running I/O for 5 seconds... 
00:34:26.984 00:34:26.984 Latency(us) 00:34:26.984 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:34:26.984 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:34:26.984 Verification LBA range: start 0x0 length 0x800 00:34:26.984 crypto_ram : 5.02 5994.02 23.41 0.00 0.00 21272.12 2051.56 23478.98 00:34:26.984 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:34:26.984 Verification LBA range: start 0x800 length 0x800 00:34:26.984 crypto_ram : 5.02 4819.92 18.83 0.00 0.00 26446.58 2336.50 26898.25 00:34:26.984 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:34:26.984 Verification LBA range: start 0x0 length 0x800 00:34:26.984 crypto_ram3 : 5.03 3004.88 11.74 0.00 0.00 42371.75 2080.06 28379.94 00:34:26.984 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:34:26.984 Verification LBA range: start 0x800 length 0x800 00:34:26.984 crypto_ram3 : 5.03 2418.05 9.45 0.00 0.00 52606.72 2137.04 33736.79 00:34:26.984 =================================================================================================================== 00:34:26.984 Total : 16236.87 63.43 0.00 0.00 31389.10 2051.56 33736.79 00:34:26.984 00:34:26.984 real 0m5.817s 00:34:26.984 user 0m10.930s 00:34:26.984 sys 0m0.240s 00:34:26.984 20:48:19 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:26.984 20:48:19 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:34:26.984 ************************************ 00:34:26.984 END TEST bdev_verify 00:34:26.984 ************************************ 00:34:27.243 20:48:19 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:34:27.243 20:48:19 blockdev_crypto_sw -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:34:27.243 20:48:19 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:34:27.243 20:48:19 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:27.243 20:48:19 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:34:27.243 ************************************ 00:34:27.243 START TEST bdev_verify_big_io 00:34:27.243 ************************************ 00:34:27.243 20:48:19 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:34:27.243 [2024-07-15 20:48:19.476762] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 00:34:27.243 [2024-07-15 20:48:19.476823] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1551422 ] 00:34:27.243 [2024-07-15 20:48:19.604895] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:34:27.502 [2024-07-15 20:48:19.703738] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:34:27.502 [2024-07-15 20:48:19.703743] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:27.502 [2024-07-15 20:48:19.867691] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:34:27.502 [2024-07-15 20:48:19.867762] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:34:27.502 [2024-07-15 20:48:19.867777] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:27.502 [2024-07-15 20:48:19.875712] 
vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:34:27.502 [2024-07-15 20:48:19.875732] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:34:27.502 [2024-07-15 20:48:19.875743] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:27.760 [2024-07-15 20:48:19.883735] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:34:27.760 [2024-07-15 20:48:19.883752] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:34:27.760 [2024-07-15 20:48:19.883764] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:27.760 Running I/O for 5 seconds... 00:34:33.029 00:34:33.029 Latency(us) 00:34:33.029 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:34:33.029 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:34:33.029 Verification LBA range: start 0x0 length 0x80 00:34:33.029 crypto_ram : 5.23 440.28 27.52 0.00 0.00 283470.33 6040.71 415783.18 00:34:33.029 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:34:33.029 Verification LBA range: start 0x80 length 0x80 00:34:33.029 crypto_ram : 5.30 362.03 22.63 0.00 0.00 343500.62 7465.41 465020.66 00:34:33.029 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:34:33.029 Verification LBA range: start 0x0 length 0x80 00:34:33.029 crypto_ram3 : 5.24 219.99 13.75 0.00 0.00 547789.34 19603.81 430372.06 00:34:33.029 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:34:33.029 Verification LBA range: start 0x80 length 0x80 00:34:33.029 crypto_ram3 : 5.32 192.48 12.03 0.00 0.00 617574.82 6924.02 485080.38 00:34:33.029 =================================================================================================================== 00:34:33.029 Total 
: 1214.78 75.92 0.00 0.00 402513.56 6040.71 485080.38 00:34:33.288 00:34:33.288 real 0m6.097s 00:34:33.288 user 0m11.506s 00:34:33.288 sys 0m0.223s 00:34:33.288 20:48:25 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:33.288 20:48:25 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x 00:34:33.288 ************************************ 00:34:33.288 END TEST bdev_verify_big_io 00:34:33.288 ************************************ 00:34:33.288 20:48:25 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:34:33.288 20:48:25 blockdev_crypto_sw -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:34:33.288 20:48:25 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:34:33.288 20:48:25 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:33.288 20:48:25 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:34:33.288 ************************************ 00:34:33.288 START TEST bdev_write_zeroes 00:34:33.288 ************************************ 00:34:33.288 20:48:25 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:34:33.548 [2024-07-15 20:48:25.708141] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
00:34:33.548 [2024-07-15 20:48:25.708271] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1552141 ] 00:34:33.548 [2024-07-15 20:48:25.905061] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:33.808 [2024-07-15 20:48:26.016922] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:34.067 [2024-07-15 20:48:26.198494] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:34:34.067 [2024-07-15 20:48:26.198559] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:34:34.067 [2024-07-15 20:48:26.198575] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:34.067 [2024-07-15 20:48:26.206512] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:34:34.067 [2024-07-15 20:48:26.206533] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:34:34.067 [2024-07-15 20:48:26.206545] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:34.067 [2024-07-15 20:48:26.214533] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:34:34.067 [2024-07-15 20:48:26.214551] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:34:34.067 [2024-07-15 20:48:26.214564] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:34.067 Running I/O for 1 seconds... 
00:34:35.004 00:34:35.004 Latency(us) 00:34:35.004 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:34:35.004 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:34:35.004 crypto_ram : 1.01 26644.36 104.08 0.00 0.00 4790.67 1282.23 6582.09 00:34:35.004 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:34:35.004 crypto_ram3 : 1.01 13295.28 51.93 0.00 0.00 9557.98 5955.23 9858.89 00:34:35.004 =================================================================================================================== 00:34:35.004 Total : 39939.64 156.01 0.00 0.00 6379.77 1282.23 9858.89 00:34:35.264 00:34:35.264 real 0m1.890s 00:34:35.264 user 0m1.546s 00:34:35.264 sys 0m0.318s 00:34:35.264 20:48:27 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:35.264 20:48:27 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x 00:34:35.264 ************************************ 00:34:35.264 END TEST bdev_write_zeroes 00:34:35.264 ************************************ 00:34:35.264 20:48:27 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:34:35.264 20:48:27 blockdev_crypto_sw -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:34:35.264 20:48:27 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:34:35.264 20:48:27 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:35.264 20:48:27 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:34:35.264 ************************************ 00:34:35.264 START TEST bdev_json_nonenclosed 00:34:35.264 ************************************ 00:34:35.264 20:48:27 blockdev_crypto_sw.bdev_json_nonenclosed -- 
common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:34:35.264 [2024-07-15 20:48:27.637378] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 00:34:35.264 [2024-07-15 20:48:27.637443] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1552439 ] 00:34:35.523 [2024-07-15 20:48:27.765846] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:35.523 [2024-07-15 20:48:27.866272] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:35.523 [2024-07-15 20:48:27.866342] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 00:34:35.523 [2024-07-15 20:48:27.866362] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:34:35.523 [2024-07-15 20:48:27.866374] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:34:35.791 00:34:35.791 real 0m0.395s 00:34:35.791 user 0m0.243s 00:34:35.791 sys 0m0.150s 00:34:35.791 20:48:27 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # es=234 00:34:35.791 20:48:27 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:35.791 20:48:27 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:34:35.791 ************************************ 00:34:35.791 END TEST bdev_json_nonenclosed 00:34:35.791 ************************************ 00:34:35.791 20:48:28 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 234 00:34:35.791 20:48:28 blockdev_crypto_sw -- bdev/blockdev.sh@782 -- # true 00:34:35.791 20:48:28 
blockdev_crypto_sw -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:34:35.791 20:48:28 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:34:35.791 20:48:28 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:35.791 20:48:28 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:34:35.791 ************************************ 00:34:35.791 START TEST bdev_json_nonarray 00:34:35.791 ************************************ 00:34:35.791 20:48:28 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:34:35.791 [2024-07-15 20:48:28.118362] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 00:34:35.791 [2024-07-15 20:48:28.118422] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1552528 ] 00:34:36.050 [2024-07-15 20:48:28.246976] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:36.051 [2024-07-15 20:48:28.347231] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:36.051 [2024-07-15 20:48:28.347310] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 
00:34:36.051 [2024-07-15 20:48:28.347331] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:34:36.051 [2024-07-15 20:48:28.347344] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:34:36.310 00:34:36.310 real 0m0.394s 00:34:36.310 user 0m0.245s 00:34:36.310 sys 0m0.146s 00:34:36.310 20:48:28 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # es=234 00:34:36.310 20:48:28 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:36.310 20:48:28 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:34:36.310 ************************************ 00:34:36.310 END TEST bdev_json_nonarray 00:34:36.310 ************************************ 00:34:36.310 20:48:28 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 234 00:34:36.310 20:48:28 blockdev_crypto_sw -- bdev/blockdev.sh@785 -- # true 00:34:36.310 20:48:28 blockdev_crypto_sw -- bdev/blockdev.sh@787 -- # [[ crypto_sw == bdev ]] 00:34:36.310 20:48:28 blockdev_crypto_sw -- bdev/blockdev.sh@794 -- # [[ crypto_sw == gpt ]] 00:34:36.310 20:48:28 blockdev_crypto_sw -- bdev/blockdev.sh@798 -- # [[ crypto_sw == crypto_sw ]] 00:34:36.310 20:48:28 blockdev_crypto_sw -- bdev/blockdev.sh@799 -- # run_test bdev_crypto_enomem bdev_crypto_enomem 00:34:36.310 20:48:28 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:34:36.310 20:48:28 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:36.310 20:48:28 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:34:36.310 ************************************ 00:34:36.310 START TEST bdev_crypto_enomem 00:34:36.310 ************************************ 00:34:36.310 20:48:28 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@1123 -- # bdev_crypto_enomem 00:34:36.310 20:48:28 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@635 -- # local 
base_dev=base0 00:34:36.310 20:48:28 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@636 -- # local test_dev=crypt0 00:34:36.310 20:48:28 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@637 -- # local err_dev=EE_base0 00:34:36.310 20:48:28 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@638 -- # local qd=32 00:34:36.310 20:48:28 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@641 -- # ERR_PID=1552551 00:34:36.310 20:48:28 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@642 -- # trap 'cleanup; killprocess $ERR_PID; exit 1' SIGINT SIGTERM EXIT 00:34:36.310 20:48:28 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@640 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 32 -o 4096 -w randwrite -t 5 -f '' 00:34:36.310 20:48:28 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@643 -- # waitforlisten 1552551 00:34:36.310 20:48:28 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@829 -- # '[' -z 1552551 ']' 00:34:36.310 20:48:28 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:34:36.310 20:48:28 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@834 -- # local max_retries=100 00:34:36.310 20:48:28 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:34:36.310 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:34:36.310 20:48:28 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@838 -- # xtrace_disable 00:34:36.310 20:48:28 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:34:36.310 [2024-07-15 20:48:28.604346] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
00:34:36.310 [2024-07-15 20:48:28.604414] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1552551 ] 00:34:36.570 [2024-07-15 20:48:28.739120] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:36.570 [2024-07-15 20:48:28.854379] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:34:37.508 20:48:29 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:34:37.508 20:48:29 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@862 -- # return 0 00:34:37.508 20:48:29 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@645 -- # rpc_cmd 00:34:37.508 20:48:29 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:37.508 20:48:29 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:34:37.508 true 00:34:37.508 base0 00:34:37.508 true 00:34:37.508 [2024-07-15 20:48:29.574100] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:34:37.508 crypt0 00:34:37.508 20:48:29 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:37.508 20:48:29 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@652 -- # waitforbdev crypt0 00:34:37.508 20:48:29 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@897 -- # local bdev_name=crypt0 00:34:37.508 20:48:29 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:34:37.508 20:48:29 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@899 -- # local i 00:34:37.508 20:48:29 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:34:37.508 20:48:29 blockdev_crypto_sw.bdev_crypto_enomem -- 
common/autotest_common.sh@900 -- # bdev_timeout=2000 00:34:37.509 20:48:29 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:34:37.509 20:48:29 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:37.509 20:48:29 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:34:37.509 20:48:29 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:37.509 20:48:29 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b crypt0 -t 2000 00:34:37.509 20:48:29 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:37.509 20:48:29 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:34:37.509 [ 00:34:37.509 { 00:34:37.509 "name": "crypt0", 00:34:37.509 "aliases": [ 00:34:37.509 "a304fc29-0e26-5a50-b505-708733c5547a" 00:34:37.509 ], 00:34:37.509 "product_name": "crypto", 00:34:37.509 "block_size": 512, 00:34:37.509 "num_blocks": 2097152, 00:34:37.509 "uuid": "a304fc29-0e26-5a50-b505-708733c5547a", 00:34:37.509 "assigned_rate_limits": { 00:34:37.509 "rw_ios_per_sec": 0, 00:34:37.509 "rw_mbytes_per_sec": 0, 00:34:37.509 "r_mbytes_per_sec": 0, 00:34:37.509 "w_mbytes_per_sec": 0 00:34:37.509 }, 00:34:37.509 "claimed": false, 00:34:37.509 "zoned": false, 00:34:37.509 "supported_io_types": { 00:34:37.509 "read": true, 00:34:37.509 "write": true, 00:34:37.509 "unmap": false, 00:34:37.509 "flush": false, 00:34:37.509 "reset": true, 00:34:37.509 "nvme_admin": false, 00:34:37.509 "nvme_io": false, 00:34:37.509 "nvme_io_md": false, 00:34:37.509 "write_zeroes": true, 00:34:37.509 "zcopy": false, 00:34:37.509 "get_zone_info": false, 00:34:37.509 "zone_management": false, 00:34:37.509 "zone_append": false, 00:34:37.509 "compare": false, 00:34:37.509 "compare_and_write": false, 00:34:37.509 "abort": false, 
00:34:37.509 "seek_hole": false, 00:34:37.509 "seek_data": false, 00:34:37.509 "copy": false, 00:34:37.509 "nvme_iov_md": false 00:34:37.509 }, 00:34:37.509 "memory_domains": [ 00:34:37.509 { 00:34:37.509 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:34:37.509 "dma_device_type": 2 00:34:37.509 } 00:34:37.509 ], 00:34:37.509 "driver_specific": { 00:34:37.509 "crypto": { 00:34:37.509 "base_bdev_name": "EE_base0", 00:34:37.509 "name": "crypt0", 00:34:37.509 "key_name": "test_dek_sw" 00:34:37.509 } 00:34:37.509 } 00:34:37.509 } 00:34:37.509 ] 00:34:37.509 20:48:29 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:37.509 20:48:29 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@905 -- # return 0 00:34:37.509 20:48:29 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@655 -- # rpcpid=1552728 00:34:37.509 20:48:29 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@657 -- # sleep 1 00:34:37.509 20:48:29 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@654 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:34:37.509 Running I/O for 5 seconds... 
00:34:38.446 20:48:30 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@658 -- # rpc_cmd bdev_error_inject_error EE_base0 -n 5 -q 31 write nomem 00:34:38.446 20:48:30 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:38.446 20:48:30 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:34:38.446 20:48:30 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:38.446 20:48:30 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@660 -- # wait 1552728 00:34:42.632 00:34:42.632 Latency(us) 00:34:42.632 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:34:42.632 Job: crypt0 (Core Mask 0x2, workload: randwrite, depth: 32, IO size: 4096) 00:34:42.632 crypt0 : 5.00 28286.92 110.50 0.00 0.00 1126.67 527.14 1503.05 00:34:42.632 =================================================================================================================== 00:34:42.632 Total : 28286.92 110.50 0.00 0.00 1126.67 527.14 1503.05 00:34:42.632 0 00:34:42.632 20:48:34 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@662 -- # rpc_cmd bdev_crypto_delete crypt0 00:34:42.632 20:48:34 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:42.632 20:48:34 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:34:42.632 20:48:34 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:42.632 20:48:34 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@664 -- # killprocess 1552551 00:34:42.632 20:48:34 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@948 -- # '[' -z 1552551 ']' 00:34:42.632 20:48:34 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@952 -- # kill -0 1552551 00:34:42.632 20:48:34 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@953 -- # uname 00:34:42.632 20:48:34 
blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:34:42.632 20:48:34 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1552551 00:34:42.632 20:48:34 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:34:42.632 20:48:34 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:34:42.632 20:48:34 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1552551' 00:34:42.632 killing process with pid 1552551 00:34:42.632 20:48:34 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@967 -- # kill 1552551 00:34:42.632 Received shutdown signal, test time was about 5.000000 seconds 00:34:42.632 00:34:42.632 Latency(us) 00:34:42.632 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:34:42.632 =================================================================================================================== 00:34:42.632 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:34:42.632 20:48:34 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@972 -- # wait 1552551 00:34:42.917 20:48:35 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@665 -- # trap - SIGINT SIGTERM EXIT 00:34:42.917 00:34:42.917 real 0m6.568s 00:34:42.917 user 0m6.807s 00:34:42.917 sys 0m0.422s 00:34:42.917 20:48:35 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:42.917 20:48:35 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:34:42.917 ************************************ 00:34:42.917 END TEST bdev_crypto_enomem 00:34:42.917 ************************************ 00:34:42.917 20:48:35 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:34:42.917 20:48:35 blockdev_crypto_sw -- bdev/blockdev.sh@810 -- # trap - 
SIGINT SIGTERM EXIT 00:34:42.917 20:48:35 blockdev_crypto_sw -- bdev/blockdev.sh@811 -- # cleanup 00:34:42.917 20:48:35 blockdev_crypto_sw -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:34:42.917 20:48:35 blockdev_crypto_sw -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:34:42.917 20:48:35 blockdev_crypto_sw -- bdev/blockdev.sh@26 -- # [[ crypto_sw == rbd ]] 00:34:42.917 20:48:35 blockdev_crypto_sw -- bdev/blockdev.sh@30 -- # [[ crypto_sw == daos ]] 00:34:42.917 20:48:35 blockdev_crypto_sw -- bdev/blockdev.sh@34 -- # [[ crypto_sw = \g\p\t ]] 00:34:42.917 20:48:35 blockdev_crypto_sw -- bdev/blockdev.sh@40 -- # [[ crypto_sw == xnvme ]] 00:34:42.917 00:34:42.917 real 0m55.586s 00:34:42.917 user 1m36.140s 00:34:42.917 sys 0m6.939s 00:34:42.917 20:48:35 blockdev_crypto_sw -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:42.917 20:48:35 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:34:42.917 ************************************ 00:34:42.917 END TEST blockdev_crypto_sw 00:34:42.917 ************************************ 00:34:42.917 20:48:35 -- common/autotest_common.sh@1142 -- # return 0 00:34:42.917 20:48:35 -- spdk/autotest.sh@359 -- # run_test blockdev_crypto_qat /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_qat 00:34:42.917 20:48:35 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:34:42.917 20:48:35 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:42.917 20:48:35 -- common/autotest_common.sh@10 -- # set +x 00:34:42.917 ************************************ 00:34:42.917 START TEST blockdev_crypto_qat 00:34:42.917 ************************************ 00:34:42.917 20:48:35 blockdev_crypto_qat -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_qat 00:34:43.180 * Looking for test storage... 
00:34:43.180 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:34:43.180 20:48:35 blockdev_crypto_qat -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:34:43.180 20:48:35 blockdev_crypto_qat -- bdev/nbd_common.sh@6 -- # set -e 00:34:43.180 20:48:35 blockdev_crypto_qat -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:34:43.180 20:48:35 blockdev_crypto_qat -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:34:43.180 20:48:35 blockdev_crypto_qat -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:34:43.180 20:48:35 blockdev_crypto_qat -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:34:43.180 20:48:35 blockdev_crypto_qat -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:34:43.180 20:48:35 blockdev_crypto_qat -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:34:43.180 20:48:35 blockdev_crypto_qat -- bdev/blockdev.sh@20 -- # : 00:34:43.180 20:48:35 blockdev_crypto_qat -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:34:43.180 20:48:35 blockdev_crypto_qat -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:34:43.180 20:48:35 blockdev_crypto_qat -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 00:34:43.180 20:48:35 blockdev_crypto_qat -- bdev/blockdev.sh@674 -- # uname -s 00:34:43.180 20:48:35 blockdev_crypto_qat -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:34:43.180 20:48:35 blockdev_crypto_qat -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:34:43.180 20:48:35 blockdev_crypto_qat -- bdev/blockdev.sh@682 -- # test_type=crypto_qat 00:34:43.180 20:48:35 blockdev_crypto_qat -- bdev/blockdev.sh@683 -- # crypto_device= 00:34:43.180 20:48:35 blockdev_crypto_qat -- bdev/blockdev.sh@684 -- # dek= 00:34:43.180 20:48:35 blockdev_crypto_qat -- bdev/blockdev.sh@685 -- # 
env_ctx= 00:34:43.180 20:48:35 blockdev_crypto_qat -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:34:43.180 20:48:35 blockdev_crypto_qat -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:34:43.180 20:48:35 blockdev_crypto_qat -- bdev/blockdev.sh@690 -- # [[ crypto_qat == bdev ]] 00:34:43.180 20:48:35 blockdev_crypto_qat -- bdev/blockdev.sh@690 -- # [[ crypto_qat == crypto_* ]] 00:34:43.180 20:48:35 blockdev_crypto_qat -- bdev/blockdev.sh@691 -- # wait_for_rpc=--wait-for-rpc 00:34:43.180 20:48:35 blockdev_crypto_qat -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:34:43.180 20:48:35 blockdev_crypto_qat -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=1553492 00:34:43.180 20:48:35 blockdev_crypto_qat -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:34:43.180 20:48:35 blockdev_crypto_qat -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:34:43.180 20:48:35 blockdev_crypto_qat -- bdev/blockdev.sh@49 -- # waitforlisten 1553492 00:34:43.180 20:48:35 blockdev_crypto_qat -- common/autotest_common.sh@829 -- # '[' -z 1553492 ']' 00:34:43.180 20:48:35 blockdev_crypto_qat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:34:43.180 20:48:35 blockdev_crypto_qat -- common/autotest_common.sh@834 -- # local max_retries=100 00:34:43.180 20:48:35 blockdev_crypto_qat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:34:43.180 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:34:43.180 20:48:35 blockdev_crypto_qat -- common/autotest_common.sh@838 -- # xtrace_disable 00:34:43.180 20:48:35 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:34:43.180 [2024-07-15 20:48:35.445531] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
00:34:43.180 [2024-07-15 20:48:35.445605] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1553492 ] 00:34:43.439 [2024-07-15 20:48:35.564717] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:43.439 [2024-07-15 20:48:35.668295] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:44.376 20:48:36 blockdev_crypto_qat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:34:44.376 20:48:36 blockdev_crypto_qat -- common/autotest_common.sh@862 -- # return 0 00:34:44.376 20:48:36 blockdev_crypto_qat -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:34:44.376 20:48:36 blockdev_crypto_qat -- bdev/blockdev.sh@708 -- # setup_crypto_qat_conf 00:34:44.376 20:48:36 blockdev_crypto_qat -- bdev/blockdev.sh@170 -- # rpc_cmd 00:34:44.376 20:48:36 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:44.376 20:48:36 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:34:44.376 [2024-07-15 20:48:36.430672] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:34:44.376 [2024-07-15 20:48:36.438706] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:34:44.376 [2024-07-15 20:48:36.446723] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:34:44.376 [2024-07-15 20:48:36.514871] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:34:46.911 true 00:34:46.911 true 00:34:46.911 true 00:34:46.911 true 00:34:46.911 Malloc0 00:34:46.911 Malloc1 00:34:46.911 Malloc2 00:34:46.911 Malloc3 00:34:46.911 [2024-07-15 20:48:38.879107] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 
00:34:46.911 crypto_ram 00:34:46.911 [2024-07-15 20:48:38.887123] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:34:46.911 crypto_ram1 00:34:46.911 [2024-07-15 20:48:38.895146] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:34:46.911 crypto_ram2 00:34:46.911 [2024-07-15 20:48:38.903167] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:34:46.911 crypto_ram3 00:34:46.911 [ 00:34:46.911 { 00:34:46.911 "name": "Malloc1", 00:34:46.911 "aliases": [ 00:34:46.911 "3e07ee0b-5670-4201-a5d7-1a3f6437bb56" 00:34:46.911 ], 00:34:46.911 "product_name": "Malloc disk", 00:34:46.911 "block_size": 512, 00:34:46.911 "num_blocks": 65536, 00:34:46.911 "uuid": "3e07ee0b-5670-4201-a5d7-1a3f6437bb56", 00:34:46.911 "assigned_rate_limits": { 00:34:46.911 "rw_ios_per_sec": 0, 00:34:46.911 "rw_mbytes_per_sec": 0, 00:34:46.911 "r_mbytes_per_sec": 0, 00:34:46.911 "w_mbytes_per_sec": 0 00:34:46.911 }, 00:34:46.911 "claimed": true, 00:34:46.911 "claim_type": "exclusive_write", 00:34:46.911 "zoned": false, 00:34:46.911 "supported_io_types": { 00:34:46.911 "read": true, 00:34:46.911 "write": true, 00:34:46.911 "unmap": true, 00:34:46.911 "flush": true, 00:34:46.911 "reset": true, 00:34:46.911 "nvme_admin": false, 00:34:46.911 "nvme_io": false, 00:34:46.911 "nvme_io_md": false, 00:34:46.911 "write_zeroes": true, 00:34:46.911 "zcopy": true, 00:34:46.911 "get_zone_info": false, 00:34:46.911 "zone_management": false, 00:34:46.911 "zone_append": false, 00:34:46.911 "compare": false, 00:34:46.911 "compare_and_write": false, 00:34:46.911 "abort": true, 00:34:46.911 "seek_hole": false, 00:34:46.911 "seek_data": false, 00:34:46.911 "copy": true, 00:34:46.911 "nvme_iov_md": false 00:34:46.911 }, 00:34:46.911 "memory_domains": [ 00:34:46.911 { 00:34:46.911 "dma_device_id": "system", 00:34:46.911 "dma_device_type": 1 00:34:46.911 }, 00:34:46.911 { 00:34:46.911 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:34:46.911 "dma_device_type": 2 00:34:46.911 } 00:34:46.911 ], 00:34:46.911 "driver_specific": {} 00:34:46.911 } 00:34:46.911 ] 00:34:46.911 20:48:38 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:46.911 20:48:38 blockdev_crypto_qat -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:34:46.911 20:48:38 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:46.911 20:48:38 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:34:46.911 20:48:38 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:46.911 20:48:38 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # cat 00:34:46.911 20:48:38 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:34:46.911 20:48:38 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:46.911 20:48:38 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:34:46.911 20:48:38 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:46.911 20:48:38 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n bdev 00:34:46.911 20:48:38 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:46.911 20:48:38 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:34:46.911 20:48:39 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:46.911 20:48:39 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:34:46.911 20:48:39 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:46.912 20:48:39 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:34:46.912 20:48:39 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:46.912 20:48:39 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:34:46.912 20:48:39 blockdev_crypto_qat -- 
bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:34:46.912 20:48:39 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:34:46.912 20:48:39 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:46.912 20:48:39 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:34:46.912 20:48:39 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:46.912 20:48:39 blockdev_crypto_qat -- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:34:46.912 20:48:39 blockdev_crypto_qat -- bdev/blockdev.sh@749 -- # jq -r .name 00:34:46.912 20:48:39 blockdev_crypto_qat -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "de9fc9d0-7b57-561a-8902-a67803b920fd"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "de9fc9d0-7b57-561a-8902-a67803b920fd",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' "name": "crypto_ram1",' ' 
"aliases": [' ' "1efc1b38-69f7-55a3-a894-817e962a3265"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "1efc1b38-69f7-55a3-a894-817e962a3265",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "bb6d3ec8-8672-5575-9b5f-54fc01e0d7a9"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "bb6d3ec8-8672-5575-9b5f-54fc01e0d7a9",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' 
"abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "29dc53a1-e973-55d5-bb90-89ea620765b8"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "29dc53a1-e973-55d5-bb90-89ea620765b8",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:34:46.912 20:48:39 blockdev_crypto_qat -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:34:46.912 20:48:39 blockdev_crypto_qat -- bdev/blockdev.sh@752 -- # hello_world_bdev=crypto_ram 00:34:46.912 
20:48:39 blockdev_crypto_qat -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT 00:34:46.912 20:48:39 blockdev_crypto_qat -- bdev/blockdev.sh@754 -- # killprocess 1553492 00:34:46.912 20:48:39 blockdev_crypto_qat -- common/autotest_common.sh@948 -- # '[' -z 1553492 ']' 00:34:46.912 20:48:39 blockdev_crypto_qat -- common/autotest_common.sh@952 -- # kill -0 1553492 00:34:46.912 20:48:39 blockdev_crypto_qat -- common/autotest_common.sh@953 -- # uname 00:34:46.912 20:48:39 blockdev_crypto_qat -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:34:46.912 20:48:39 blockdev_crypto_qat -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1553492 00:34:46.912 20:48:39 blockdev_crypto_qat -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:34:46.912 20:48:39 blockdev_crypto_qat -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:34:46.912 20:48:39 blockdev_crypto_qat -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1553492' 00:34:46.912 killing process with pid 1553492 00:34:46.912 20:48:39 blockdev_crypto_qat -- common/autotest_common.sh@967 -- # kill 1553492 00:34:46.912 20:48:39 blockdev_crypto_qat -- common/autotest_common.sh@972 -- # wait 1553492 00:34:47.480 20:48:39 blockdev_crypto_qat -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:34:47.480 20:48:39 blockdev_crypto_qat -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:34:47.480 20:48:39 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:34:47.480 20:48:39 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:47.481 20:48:39 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:34:47.481 ************************************ 00:34:47.481 START TEST bdev_hello_world 00:34:47.481 
************************************ 00:34:47.481 20:48:39 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:34:47.739 [2024-07-15 20:48:39.880775] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 00:34:47.739 [2024-07-15 20:48:39.880834] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1554036 ] 00:34:47.739 [2024-07-15 20:48:40.007350] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:47.996 [2024-07-15 20:48:40.119450] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:47.996 [2024-07-15 20:48:40.140755] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:34:47.996 [2024-07-15 20:48:40.148784] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:34:47.996 [2024-07-15 20:48:40.156809] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:34:47.996 [2024-07-15 20:48:40.268774] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:34:50.529 [2024-07-15 20:48:42.484842] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:34:50.529 [2024-07-15 20:48:42.484911] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:34:50.529 [2024-07-15 20:48:42.484931] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:50.529 [2024-07-15 20:48:42.492859] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: 
*NOTICE*: Found key "test_dek_qat_xts" 00:34:50.529 [2024-07-15 20:48:42.492878] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:34:50.529 [2024-07-15 20:48:42.492890] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:50.529 [2024-07-15 20:48:42.500881] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:34:50.529 [2024-07-15 20:48:42.500898] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:34:50.529 [2024-07-15 20:48:42.500909] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:50.529 [2024-07-15 20:48:42.508902] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:34:50.529 [2024-07-15 20:48:42.508919] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:34:50.529 [2024-07-15 20:48:42.508934] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:50.529 [2024-07-15 20:48:42.586432] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:34:50.529 [2024-07-15 20:48:42.586477] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:34:50.529 [2024-07-15 20:48:42.586497] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:34:50.529 [2024-07-15 20:48:42.587823] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:34:50.529 [2024-07-15 20:48:42.587895] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:34:50.529 [2024-07-15 20:48:42.587911] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:34:50.529 [2024-07-15 20:48:42.587967] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 
00:34:50.529 00:34:50.529 [2024-07-15 20:48:42.587986] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:34:50.787 00:34:50.787 real 0m3.160s 00:34:50.787 user 0m2.741s 00:34:50.787 sys 0m0.382s 00:34:50.787 20:48:42 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:50.787 20:48:42 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:34:50.787 ************************************ 00:34:50.787 END TEST bdev_hello_world 00:34:50.787 ************************************ 00:34:50.787 20:48:43 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:34:50.787 20:48:43 blockdev_crypto_qat -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:34:50.787 20:48:43 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:34:50.787 20:48:43 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:50.787 20:48:43 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:34:50.787 ************************************ 00:34:50.787 START TEST bdev_bounds 00:34:50.787 ************************************ 00:34:50.787 20:48:43 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@1123 -- # bdev_bounds '' 00:34:50.787 20:48:43 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=1554543 00:34:50.787 20:48:43 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:34:50.787 20:48:43 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:34:50.787 20:48:43 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 1554543' 00:34:50.787 Process bdevio pid: 1554543 00:34:50.787 20:48:43 blockdev_crypto_qat.bdev_bounds -- 
bdev/blockdev.sh@293 -- # waitforlisten 1554543 00:34:50.787 20:48:43 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@829 -- # '[' -z 1554543 ']' 00:34:50.787 20:48:43 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:34:50.787 20:48:43 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@834 -- # local max_retries=100 00:34:50.787 20:48:43 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:34:50.787 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:34:50.787 20:48:43 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@838 -- # xtrace_disable 00:34:50.787 20:48:43 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:34:50.787 [2024-07-15 20:48:43.129574] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
00:34:50.787 [2024-07-15 20:48:43.129644] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1554543 ] 00:34:51.045 [2024-07-15 20:48:43.258798] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:34:51.045 [2024-07-15 20:48:43.363035] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:34:51.045 [2024-07-15 20:48:43.363136] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:34:51.045 [2024-07-15 20:48:43.363139] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:51.045 [2024-07-15 20:48:43.384555] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:34:51.045 [2024-07-15 20:48:43.392579] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:34:51.045 [2024-07-15 20:48:43.400598] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:34:51.302 [2024-07-15 20:48:43.501779] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:34:53.833 [2024-07-15 20:48:45.716513] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:34:53.833 [2024-07-15 20:48:45.716592] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:34:53.833 [2024-07-15 20:48:45.716608] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:53.833 [2024-07-15 20:48:45.724529] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:34:53.833 [2024-07-15 20:48:45.724553] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:34:53.833 [2024-07-15 20:48:45.724566] 
vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:53.833 [2024-07-15 20:48:45.732555] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:34:53.833 [2024-07-15 20:48:45.732573] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:34:53.833 [2024-07-15 20:48:45.732584] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:53.833 [2024-07-15 20:48:45.740575] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:34:53.833 [2024-07-15 20:48:45.740592] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:34:53.833 [2024-07-15 20:48:45.740604] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:53.833 20:48:45 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:34:53.833 20:48:45 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@862 -- # return 0 00:34:53.833 20:48:45 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:34:53.833 I/O targets: 00:34:53.833 crypto_ram: 65536 blocks of 512 bytes (32 MiB) 00:34:53.833 crypto_ram1: 65536 blocks of 512 bytes (32 MiB) 00:34:53.833 crypto_ram2: 8192 blocks of 4096 bytes (32 MiB) 00:34:53.833 crypto_ram3: 8192 blocks of 4096 bytes (32 MiB) 00:34:53.833 00:34:53.833 00:34:53.833 CUnit - A unit testing framework for C - Version 2.1-3 00:34:53.833 http://cunit.sourceforge.net/ 00:34:53.833 00:34:53.833 00:34:53.833 Suite: bdevio tests on: crypto_ram3 00:34:53.833 Test: blockdev write read block ...passed 00:34:53.833 Test: blockdev write zeroes read block ...passed 00:34:53.833 Test: blockdev write zeroes read no split ...passed 00:34:53.833 Test: blockdev write zeroes read split 
...passed 00:34:53.833 Test: blockdev write zeroes read split partial ...passed 00:34:53.833 Test: blockdev reset ...passed 00:34:53.833 Test: blockdev write read 8 blocks ...passed 00:34:53.833 Test: blockdev write read size > 128k ...passed 00:34:53.833 Test: blockdev write read invalid size ...passed 00:34:53.833 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:34:53.833 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:34:53.833 Test: blockdev write read max offset ...passed 00:34:53.833 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:34:53.833 Test: blockdev writev readv 8 blocks ...passed 00:34:53.833 Test: blockdev writev readv 30 x 1block ...passed 00:34:53.833 Test: blockdev writev readv block ...passed 00:34:53.833 Test: blockdev writev readv size > 128k ...passed 00:34:53.833 Test: blockdev writev readv size > 128k in two iovs ...passed 00:34:53.833 Test: blockdev comparev and writev ...passed 00:34:53.833 Test: blockdev nvme passthru rw ...passed 00:34:53.833 Test: blockdev nvme passthru vendor specific ...passed 00:34:53.833 Test: blockdev nvme admin passthru ...passed 00:34:53.833 Test: blockdev copy ...passed 00:34:53.833 Suite: bdevio tests on: crypto_ram2 00:34:53.833 Test: blockdev write read block ...passed 00:34:53.833 Test: blockdev write zeroes read block ...passed 00:34:53.833 Test: blockdev write zeroes read no split ...passed 00:34:53.833 Test: blockdev write zeroes read split ...passed 00:34:53.833 Test: blockdev write zeroes read split partial ...passed 00:34:53.833 Test: blockdev reset ...passed 00:34:53.833 Test: blockdev write read 8 blocks ...passed 00:34:53.833 Test: blockdev write read size > 128k ...passed 00:34:53.833 Test: blockdev write read invalid size ...passed 00:34:53.833 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:34:53.833 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:34:53.833 Test: 
blockdev write read max offset ...passed 00:34:53.833 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:34:53.833 Test: blockdev writev readv 8 blocks ...passed 00:34:53.833 Test: blockdev writev readv 30 x 1block ...passed 00:34:53.833 Test: blockdev writev readv block ...passed 00:34:53.833 Test: blockdev writev readv size > 128k ...passed 00:34:53.833 Test: blockdev writev readv size > 128k in two iovs ...passed 00:34:53.833 Test: blockdev comparev and writev ...passed 00:34:53.833 Test: blockdev nvme passthru rw ...passed 00:34:53.833 Test: blockdev nvme passthru vendor specific ...passed 00:34:53.833 Test: blockdev nvme admin passthru ...passed 00:34:53.833 Test: blockdev copy ...passed 00:34:53.833 Suite: bdevio tests on: crypto_ram1 00:34:53.833 Test: blockdev write read block ...passed 00:34:53.833 Test: blockdev write zeroes read block ...passed 00:34:53.833 Test: blockdev write zeroes read no split ...passed 00:34:54.092 Test: blockdev write zeroes read split ...passed 00:34:54.092 Test: blockdev write zeroes read split partial ...passed 00:34:54.092 Test: blockdev reset ...passed 00:34:54.092 Test: blockdev write read 8 blocks ...passed 00:34:54.092 Test: blockdev write read size > 128k ...passed 00:34:54.092 Test: blockdev write read invalid size ...passed 00:34:54.092 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:34:54.092 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:34:54.092 Test: blockdev write read max offset ...passed 00:34:54.092 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:34:54.092 Test: blockdev writev readv 8 blocks ...passed 00:34:54.092 Test: blockdev writev readv 30 x 1block ...passed 00:34:54.092 Test: blockdev writev readv block ...passed 00:34:54.092 Test: blockdev writev readv size > 128k ...passed 00:34:54.092 Test: blockdev writev readv size > 128k in two iovs ...passed 00:34:54.092 Test: blockdev comparev and writev 
...passed 00:34:54.092 Test: blockdev nvme passthru rw ...passed 00:34:54.092 Test: blockdev nvme passthru vendor specific ...passed 00:34:54.092 Test: blockdev nvme admin passthru ...passed 00:34:54.092 Test: blockdev copy ...passed 00:34:54.092 Suite: bdevio tests on: crypto_ram 00:34:54.092 Test: blockdev write read block ...passed 00:34:54.092 Test: blockdev write zeroes read block ...passed 00:34:54.092 Test: blockdev write zeroes read no split ...passed 00:34:54.351 Test: blockdev write zeroes read split ...passed 00:34:54.351 Test: blockdev write zeroes read split partial ...passed 00:34:54.351 Test: blockdev reset ...passed 00:34:54.351 Test: blockdev write read 8 blocks ...passed 00:34:54.351 Test: blockdev write read size > 128k ...passed 00:34:54.351 Test: blockdev write read invalid size ...passed 00:34:54.351 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:34:54.351 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:34:54.351 Test: blockdev write read max offset ...passed 00:34:54.351 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:34:54.351 Test: blockdev writev readv 8 blocks ...passed 00:34:54.351 Test: blockdev writev readv 30 x 1block ...passed 00:34:54.351 Test: blockdev writev readv block ...passed 00:34:54.609 Test: blockdev writev readv size > 128k ...passed 00:34:54.609 Test: blockdev writev readv size > 128k in two iovs ...passed 00:34:54.609 Test: blockdev comparev and writev ...passed 00:34:54.609 Test: blockdev nvme passthru rw ...passed 00:34:54.609 Test: blockdev nvme passthru vendor specific ...passed 00:34:54.609 Test: blockdev nvme admin passthru ...passed 00:34:54.609 Test: blockdev copy ...passed 00:34:54.609 00:34:54.609 Run Summary: Type Total Ran Passed Failed Inactive 00:34:54.609 suites 4 4 n/a 0 0 00:34:54.609 tests 92 92 92 0 0 00:34:54.609 asserts 520 520 520 0 n/a 00:34:54.609 00:34:54.609 Elapsed time = 1.569 seconds 00:34:54.609 0 00:34:54.609 
20:48:46 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 1554543 00:34:54.609 20:48:46 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@948 -- # '[' -z 1554543 ']' 00:34:54.609 20:48:46 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@952 -- # kill -0 1554543 00:34:54.609 20:48:46 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@953 -- # uname 00:34:54.609 20:48:46 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:34:54.609 20:48:46 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1554543 00:34:54.609 20:48:46 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:34:54.610 20:48:46 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:34:54.610 20:48:46 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1554543' 00:34:54.610 killing process with pid 1554543 00:34:54.610 20:48:46 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@967 -- # kill 1554543 00:34:54.610 20:48:46 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@972 -- # wait 1554543 00:34:54.869 20:48:47 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:34:54.869 00:34:54.869 real 0m4.159s 00:34:54.869 user 0m11.165s 00:34:54.869 sys 0m0.571s 00:34:54.869 20:48:47 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:54.869 20:48:47 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:34:54.869 ************************************ 00:34:54.869 END TEST bdev_bounds 00:34:54.869 ************************************ 00:34:55.127 20:48:47 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:34:55.127 20:48:47 blockdev_crypto_qat -- bdev/blockdev.sh@762 -- # run_test bdev_nbd 
nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '' 00:34:55.127 20:48:47 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:34:55.127 20:48:47 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:55.127 20:48:47 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:34:55.127 ************************************ 00:34:55.127 START TEST bdev_nbd 00:34:55.127 ************************************ 00:34:55.127 20:48:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@1123 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '' 00:34:55.127 20:48:47 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:34:55.127 20:48:47 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:34:55.127 20:48:47 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:55.127 20:48:47 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:34:55.127 20:48:47 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:34:55.127 20:48:47 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:34:55.127 20:48:47 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=4 00:34:55.127 20:48:47 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]] 00:34:55.127 20:48:47 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:34:55.127 20:48:47 
blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:34:55.127 20:48:47 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@312 -- # bdev_num=4 00:34:55.127 20:48:47 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:34:55.127 20:48:47 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@314 -- # local nbd_list 00:34:55.127 20:48:47 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:34:55.127 20:48:47 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:34:55.127 20:48:47 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=1555116 00:34:55.127 20:48:47 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:34:55.127 20:48:47 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:34:55.127 20:48:47 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 1555116 /var/tmp/spdk-nbd.sock 00:34:55.127 20:48:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@829 -- # '[' -z 1555116 ']' 00:34:55.127 20:48:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:34:55.127 20:48:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@834 -- # local max_retries=100 00:34:55.127 20:48:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:34:55.127 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
00:34:55.127 20:48:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@838 -- # xtrace_disable 00:34:55.127 20:48:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:34:55.127 [2024-07-15 20:48:47.389718] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 00:34:55.128 [2024-07-15 20:48:47.389794] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:34:55.386 [2024-07-15 20:48:47.520617] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:55.386 [2024-07-15 20:48:47.623781] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:55.386 [2024-07-15 20:48:47.645070] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:34:55.386 [2024-07-15 20:48:47.653091] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:34:55.386 [2024-07-15 20:48:47.661109] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:34:55.643 [2024-07-15 20:48:47.766628] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:34:58.172 [2024-07-15 20:48:49.979389] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:34:58.172 [2024-07-15 20:48:49.979443] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:34:58.172 [2024-07-15 20:48:49.979458] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:58.172 [2024-07-15 20:48:49.987408] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:34:58.172 [2024-07-15 20:48:49.987429] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently 
unable to find bdev with name: Malloc1 00:34:58.172 [2024-07-15 20:48:49.987442] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:58.172 [2024-07-15 20:48:49.995430] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:34:58.172 [2024-07-15 20:48:49.995448] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:34:58.172 [2024-07-15 20:48:49.995460] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:58.172 [2024-07-15 20:48:50.003451] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:34:58.172 [2024-07-15 20:48:50.003469] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:34:58.172 [2024-07-15 20:48:50.003481] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:58.172 20:48:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:34:58.172 20:48:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@862 -- # return 0 00:34:58.172 20:48:50 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' 00:34:58.172 20:48:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:58.173 20:48:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:34:58.173 20:48:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:34:58.173 20:48:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' 00:34:58.173 20:48:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@22 -- # 
local rpc_server=/var/tmp/spdk-nbd.sock 00:34:58.173 20:48:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:34:58.173 20:48:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:34:58.173 20:48:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:34:58.173 20:48:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:34:58.173 20:48:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:34:58.173 20:48:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:34:58.173 20:48:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:34:58.173 20:48:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:34:58.173 20:48:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:34:58.173 20:48:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:34:58.173 20:48:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:34:58.173 20:48:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:34:58.173 20:48:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:34:58.173 20:48:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:34:58.173 20:48:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:34:58.173 20:48:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:34:58.173 20:48:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:34:58.173 20:48:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:34:58.173 
20:48:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:34:58.173 1+0 records in 00:34:58.173 1+0 records out 00:34:58.173 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000306545 s, 13.4 MB/s 00:34:58.173 20:48:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:58.173 20:48:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:34:58.173 20:48:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:58.173 20:48:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:34:58.173 20:48:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:34:58.173 20:48:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:34:58.173 20:48:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:34:58.173 20:48:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram1 00:34:58.431 20:48:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:34:58.431 20:48:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:34:58.431 20:48:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:34:58.431 20:48:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:34:58.431 20:48:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:34:58.431 20:48:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:34:58.431 20:48:50 blockdev_crypto_qat.bdev_nbd -- 
common/autotest_common.sh@869 -- # (( i <= 20 )) 00:34:58.431 20:48:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:34:58.431 20:48:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:34:58.431 20:48:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:34:58.431 20:48:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:34:58.431 20:48:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:34:58.431 1+0 records in 00:34:58.431 1+0 records out 00:34:58.431 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000312601 s, 13.1 MB/s 00:34:58.431 20:48:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:58.431 20:48:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:34:58.431 20:48:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:58.431 20:48:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:34:58.431 20:48:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:34:58.431 20:48:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:34:58.431 20:48:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:34:58.431 20:48:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 00:34:58.689 20:48:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:34:58.689 20:48:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- 
# basename /dev/nbd2 00:34:58.689 20:48:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:34:58.689 20:48:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd2 00:34:58.689 20:48:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:34:58.689 20:48:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:34:58.689 20:48:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:34:58.689 20:48:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd2 /proc/partitions 00:34:58.689 20:48:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:34:58.689 20:48:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:34:58.689 20:48:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:34:58.689 20:48:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:34:58.689 1+0 records in 00:34:58.689 1+0 records out 00:34:58.689 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000334248 s, 12.3 MB/s 00:34:58.689 20:48:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:58.689 20:48:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:34:58.689 20:48:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:58.689 20:48:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:34:58.689 20:48:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:34:58.689 20:48:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # 
(( i++ )) 00:34:58.689 20:48:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:34:58.689 20:48:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 00:34:58.946 20:48:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:34:58.946 20:48:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:34:58.946 20:48:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:34:58.946 20:48:51 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd3 00:34:58.946 20:48:51 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:34:58.946 20:48:51 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:34:58.946 20:48:51 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:34:58.946 20:48:51 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd3 /proc/partitions 00:34:58.946 20:48:51 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:34:58.946 20:48:51 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:34:58.946 20:48:51 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:34:58.946 20:48:51 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:34:58.946 1+0 records in 00:34:58.946 1+0 records out 00:34:58.946 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000344196 s, 11.9 MB/s 00:34:58.946 20:48:51 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:58.946 20:48:51 blockdev_crypto_qat.bdev_nbd -- 
common/autotest_common.sh@884 -- # size=4096 00:34:58.946 20:48:51 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:58.946 20:48:51 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:34:58.946 20:48:51 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:34:58.946 20:48:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:34:58.946 20:48:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:34:58.946 20:48:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:34:59.204 20:48:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:34:59.204 { 00:34:59.204 "nbd_device": "/dev/nbd0", 00:34:59.204 "bdev_name": "crypto_ram" 00:34:59.204 }, 00:34:59.204 { 00:34:59.204 "nbd_device": "/dev/nbd1", 00:34:59.204 "bdev_name": "crypto_ram1" 00:34:59.204 }, 00:34:59.204 { 00:34:59.204 "nbd_device": "/dev/nbd2", 00:34:59.204 "bdev_name": "crypto_ram2" 00:34:59.204 }, 00:34:59.204 { 00:34:59.204 "nbd_device": "/dev/nbd3", 00:34:59.204 "bdev_name": "crypto_ram3" 00:34:59.204 } 00:34:59.204 ]' 00:34:59.204 20:48:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:34:59.204 20:48:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:34:59.204 { 00:34:59.204 "nbd_device": "/dev/nbd0", 00:34:59.204 "bdev_name": "crypto_ram" 00:34:59.204 }, 00:34:59.204 { 00:34:59.204 "nbd_device": "/dev/nbd1", 00:34:59.204 "bdev_name": "crypto_ram1" 00:34:59.204 }, 00:34:59.204 { 00:34:59.204 "nbd_device": "/dev/nbd2", 00:34:59.204 "bdev_name": "crypto_ram2" 00:34:59.204 }, 00:34:59.204 { 00:34:59.204 "nbd_device": "/dev/nbd3", 00:34:59.204 "bdev_name": 
"crypto_ram3" 00:34:59.204 } 00:34:59.204 ]' 00:34:59.204 20:48:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:34:59.462 20:48:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3' 00:34:59.462 20:48:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:59.462 20:48:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3') 00:34:59.462 20:48:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:34:59.462 20:48:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:34:59.462 20:48:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:34:59.462 20:48:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:34:59.719 20:48:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:34:59.719 20:48:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:34:59.719 20:48:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:34:59.719 20:48:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:34:59.719 20:48:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:34:59.719 20:48:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:34:59.719 20:48:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:34:59.719 20:48:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:34:59.719 20:48:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:34:59.719 20:48:51 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:34:59.977 20:48:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:34:59.977 20:48:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:34:59.977 20:48:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:34:59.977 20:48:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:34:59.977 20:48:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:34:59.977 20:48:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:34:59.978 20:48:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:34:59.978 20:48:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:34:59.978 20:48:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:34:59.978 20:48:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:35:00.235 20:48:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:35:00.235 20:48:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:35:00.235 20:48:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:35:00.235 20:48:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:35:00.235 20:48:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:35:00.235 20:48:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:35:00.235 20:48:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:35:00.235 20:48:52 blockdev_crypto_qat.bdev_nbd 
-- bdev/nbd_common.sh@45 -- # return 0 00:35:00.235 20:48:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:35:00.235 20:48:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:35:00.493 20:48:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:35:00.493 20:48:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:35:00.493 20:48:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:35:00.493 20:48:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:35:00.493 20:48:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:35:00.493 20:48:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:35:00.493 20:48:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:35:00.493 20:48:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:35:00.493 20:48:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:35:00.493 20:48:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:35:00.493 20:48:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:35:00.751 20:48:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:35:00.752 20:48:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:35:00.752 20:48:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:35:00.752 20:48:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:35:00.752 20:48:53 blockdev_crypto_qat.bdev_nbd -- 
bdev/nbd_common.sh@65 -- # echo '' 00:35:00.752 20:48:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:35:00.752 20:48:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:35:00.752 20:48:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:35:00.752 20:48:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:35:00.752 20:48:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:35:00.752 20:48:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:35:00.752 20:48:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:35:00.752 20:48:53 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:35:00.752 20:48:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:35:00.752 20:48:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:35:00.752 20:48:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:35:00.752 20:48:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:35:00.752 20:48:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:35:00.752 20:48:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:35:00.752 20:48:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:35:00.752 20:48:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:35:00.752 20:48:53 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:35:00.752 20:48:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:35:00.752 20:48:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:35:00.752 20:48:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:35:00.752 20:48:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:35:00.752 20:48:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:35:00.752 20:48:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:35:01.365 /dev/nbd0 00:35:01.365 20:48:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:35:01.365 20:48:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:35:01.365 20:48:53 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:35:01.365 20:48:53 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:35:01.365 20:48:53 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:35:01.365 20:48:53 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:35:01.365 20:48:53 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:35:01.365 20:48:53 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:35:01.365 20:48:53 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:35:01.365 20:48:53 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:35:01.365 20:48:53 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 
of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:35:01.365 1+0 records in 00:35:01.365 1+0 records out 00:35:01.365 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000336189 s, 12.2 MB/s 00:35:01.365 20:48:53 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:35:01.365 20:48:53 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:35:01.365 20:48:53 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:35:01.365 20:48:53 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:35:01.365 20:48:53 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:35:01.365 20:48:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:35:01.365 20:48:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:35:01.365 20:48:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram1 /dev/nbd1 00:35:01.624 /dev/nbd1 00:35:01.624 20:48:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:35:01.624 20:48:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:35:01.624 20:48:53 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:35:01.624 20:48:53 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:35:01.624 20:48:53 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:35:01.624 20:48:53 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:35:01.624 20:48:53 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 
/proc/partitions 00:35:01.624 20:48:53 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:35:01.624 20:48:53 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:35:01.624 20:48:53 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:35:01.624 20:48:53 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:35:01.624 1+0 records in 00:35:01.624 1+0 records out 00:35:01.624 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000327979 s, 12.5 MB/s 00:35:01.624 20:48:53 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:35:01.624 20:48:53 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:35:01.624 20:48:53 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:35:01.624 20:48:53 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:35:01.624 20:48:53 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:35:01.624 20:48:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:35:01.624 20:48:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:35:01.624 20:48:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 /dev/nbd10 00:35:01.882 /dev/nbd10 00:35:01.882 20:48:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:35:01.882 20:48:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:35:01.882 20:48:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local 
nbd_name=nbd10 00:35:01.882 20:48:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:35:01.882 20:48:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:35:01.882 20:48:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:35:01.882 20:48:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd10 /proc/partitions 00:35:01.882 20:48:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:35:01.882 20:48:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:35:01.882 20:48:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:35:01.882 20:48:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:35:01.882 1+0 records in 00:35:01.882 1+0 records out 00:35:01.882 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00029592 s, 13.8 MB/s 00:35:01.882 20:48:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:35:01.882 20:48:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:35:01.882 20:48:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:35:01.882 20:48:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:35:01.882 20:48:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:35:01.882 20:48:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:35:01.882 20:48:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:35:01.882 20:48:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd11 00:35:02.140 /dev/nbd11 00:35:02.140 20:48:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:35:02.140 20:48:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11 00:35:02.140 20:48:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd11 00:35:02.140 20:48:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:35:02.140 20:48:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:35:02.140 20:48:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:35:02.140 20:48:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd11 /proc/partitions 00:35:02.140 20:48:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:35:02.140 20:48:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:35:02.140 20:48:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:35:02.140 20:48:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:35:02.140 1+0 records in 00:35:02.140 1+0 records out 00:35:02.140 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000323006 s, 12.7 MB/s 00:35:02.140 20:48:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:35:02.140 20:48:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:35:02.140 20:48:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:35:02.140 20:48:54 blockdev_crypto_qat.bdev_nbd -- 
common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:35:02.140 20:48:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:35:02.140 20:48:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:35:02.140 20:48:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:35:02.140 20:48:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:35:02.140 20:48:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:35:02.140 20:48:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:35:02.398 20:48:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:35:02.398 { 00:35:02.398 "nbd_device": "/dev/nbd0", 00:35:02.398 "bdev_name": "crypto_ram" 00:35:02.398 }, 00:35:02.398 { 00:35:02.398 "nbd_device": "/dev/nbd1", 00:35:02.398 "bdev_name": "crypto_ram1" 00:35:02.398 }, 00:35:02.398 { 00:35:02.398 "nbd_device": "/dev/nbd10", 00:35:02.398 "bdev_name": "crypto_ram2" 00:35:02.398 }, 00:35:02.398 { 00:35:02.398 "nbd_device": "/dev/nbd11", 00:35:02.398 "bdev_name": "crypto_ram3" 00:35:02.398 } 00:35:02.398 ]' 00:35:02.398 20:48:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:35:02.398 { 00:35:02.398 "nbd_device": "/dev/nbd0", 00:35:02.398 "bdev_name": "crypto_ram" 00:35:02.398 }, 00:35:02.398 { 00:35:02.398 "nbd_device": "/dev/nbd1", 00:35:02.399 "bdev_name": "crypto_ram1" 00:35:02.399 }, 00:35:02.399 { 00:35:02.399 "nbd_device": "/dev/nbd10", 00:35:02.399 "bdev_name": "crypto_ram2" 00:35:02.399 }, 00:35:02.399 { 00:35:02.399 "nbd_device": "/dev/nbd11", 00:35:02.399 "bdev_name": "crypto_ram3" 00:35:02.399 } 00:35:02.399 ]' 00:35:02.399 20:48:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:35:02.399 20:48:54 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:35:02.399 /dev/nbd1 00:35:02.399 /dev/nbd10 00:35:02.399 /dev/nbd11' 00:35:02.399 20:48:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:35:02.399 /dev/nbd1 00:35:02.399 /dev/nbd10 00:35:02.399 /dev/nbd11' 00:35:02.399 20:48:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:35:02.399 20:48:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=4 00:35:02.399 20:48:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 4 00:35:02.399 20:48:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=4 00:35:02.399 20:48:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 4 -ne 4 ']' 00:35:02.399 20:48:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' write 00:35:02.399 20:48:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:35:02.399 20:48:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:35:02.399 20:48:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:35:02.399 20:48:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:35:02.399 20:48:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:35:02.399 20:48:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:35:02.399 256+0 records in 00:35:02.399 256+0 records out 00:35:02.399 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0104715 s, 100 MB/s 00:35:02.399 20:48:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:35:02.399 
20:48:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:35:02.657 256+0 records in 00:35:02.657 256+0 records out 00:35:02.657 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0838674 s, 12.5 MB/s 00:35:02.657 20:48:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:35:02.657 20:48:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:35:02.657 256+0 records in 00:35:02.657 256+0 records out 00:35:02.657 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0662819 s, 15.8 MB/s 00:35:02.657 20:48:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:35:02.657 20:48:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:35:02.657 256+0 records in 00:35:02.658 256+0 records out 00:35:02.658 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0586978 s, 17.9 MB/s 00:35:02.658 20:48:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:35:02.658 20:48:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:35:02.916 256+0 records in 00:35:02.916 256+0 records out 00:35:02.916 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0556697 s, 18.8 MB/s 00:35:02.916 20:48:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' verify 00:35:02.916 20:48:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:35:02.916 20:48:55 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:35:02.916 20:48:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:35:02.916 20:48:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:35:02.916 20:48:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:35:02.916 20:48:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:35:02.916 20:48:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:35:02.916 20:48:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:35:02.916 20:48:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:35:02.916 20:48:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:35:02.916 20:48:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:35:02.916 20:48:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:35:02.916 20:48:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:35:02.916 20:48:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:35:02.916 20:48:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:35:02.916 20:48:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 
00:35:02.916 20:48:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:35:02.916 20:48:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:35:02.916 20:48:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:35:02.916 20:48:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:35:02.916 20:48:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:35:02.916 20:48:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:35:03.175 20:48:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:35:03.175 20:48:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:35:03.175 20:48:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:35:03.175 20:48:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:35:03.175 20:48:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:35:03.175 20:48:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:35:03.175 20:48:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:35:03.175 20:48:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:35:03.175 20:48:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:35:03.175 20:48:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:35:03.433 20:48:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:35:03.433 20:48:55 blockdev_crypto_qat.bdev_nbd -- 
bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:35:03.433 20:48:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:35:03.433 20:48:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:35:03.433 20:48:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:35:03.433 20:48:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:35:03.433 20:48:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:35:03.433 20:48:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:35:03.433 20:48:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:35:03.433 20:48:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:35:03.691 20:48:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:35:03.691 20:48:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:35:03.691 20:48:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:35:03.691 20:48:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:35:03.691 20:48:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:35:03.691 20:48:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:35:03.691 20:48:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:35:03.691 20:48:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:35:03.691 20:48:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:35:03.691 20:48:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock 
nbd_stop_disk /dev/nbd11 00:35:03.949 20:48:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:35:03.949 20:48:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:35:03.949 20:48:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:35:03.949 20:48:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:35:03.949 20:48:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:35:03.949 20:48:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:35:03.949 20:48:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:35:03.949 20:48:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:35:03.949 20:48:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:35:03.949 20:48:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:35:03.949 20:48:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:35:04.207 20:48:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:35:04.207 20:48:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:35:04.207 20:48:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:35:04.207 20:48:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:35:04.207 20:48:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:35:04.207 20:48:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:35:04.207 20:48:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:35:04.207 20:48:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 
00:35:04.207 20:48:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:35:04.207 20:48:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:35:04.207 20:48:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:35:04.207 20:48:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:35:04.207 20:48:56 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:35:04.207 20:48:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:35:04.207 20:48:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:35:04.207 20:48:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:35:04.207 20:48:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:35:04.207 20:48:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:35:04.465 malloc_lvol_verify 00:35:04.465 20:48:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:35:04.723 8a3c7a0b-00f2-4d39-bc83-0764c8d4c6e1 00:35:04.723 20:48:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:35:04.980 14001301-b9d6-4d36-8591-3b67a5174e13 00:35:04.980 20:48:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:35:05.239 /dev/nbd0 
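The disk-count check traced just above (nbd_common.sh@61–66) reduces to a jq filter plus a guarded `grep -c`. The sketch below reconstructs that logic as a standalone helper; the hard-coded `'[]'` stands in for the live `nbd_get_disks` RPC reply (an assumption made so the example is runnable without a running SPDK target) — in this run the reply was in fact `[]`, so the count comes out 0.

```shell
# Sketch of the nbd_get_count logic traced above. The JSON literal is a
# stand-in for `rpc.py -s "$rpc_server" nbd_get_disks`; here it mirrors the
# empty reply seen in the trace.
nbd_get_count() {
	local nbd_disks_json='[]'
	local nbd_disks_name
	nbd_disks_name=$(echo "$nbd_disks_json" | jq -r '.[] | .nbd_device')
	# grep -c exits non-zero when nothing matches, so guard with `true`
	# exactly as the traced script does (nbd_common.sh@65); the count
	# itself (0 here) is still printed.
	echo "$nbd_disks_name" | grep -c /dev/nbd || true
}
```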
00:35:05.239 20:48:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:35:05.239 mke2fs 1.46.5 (30-Dec-2021) 00:35:05.239 Discarding device blocks: 0/4096 done 00:35:05.239 Creating filesystem with 4096 1k blocks and 1024 inodes 00:35:05.239 00:35:05.239 Allocating group tables: 0/1 done 00:35:05.239 Writing inode tables: 0/1 done 00:35:05.239 Creating journal (1024 blocks): done 00:35:05.240 Writing superblocks and filesystem accounting information: 0/1 done 00:35:05.240 00:35:05.240 20:48:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:35:05.240 20:48:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:35:05.240 20:48:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:35:05.240 20:48:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:35:05.240 20:48:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:35:05.240 20:48:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:35:05.240 20:48:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:35:05.240 20:48:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:35:05.499 20:48:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:35:05.499 20:48:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:35:05.499 20:48:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:35:05.499 20:48:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:35:05.499 20:48:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:35:05.499 20:48:57 blockdev_crypto_qat.bdev_nbd -- 
bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:35:05.499 20:48:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:35:05.499 20:48:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:35:05.499 20:48:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:35:05.499 20:48:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:35:05.499 20:48:57 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 1555116 00:35:05.499 20:48:57 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@948 -- # '[' -z 1555116 ']' 00:35:05.499 20:48:57 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@952 -- # kill -0 1555116 00:35:05.499 20:48:57 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@953 -- # uname 00:35:05.499 20:48:57 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:35:05.499 20:48:57 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1555116 00:35:05.499 20:48:57 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:35:05.499 20:48:57 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:35:05.499 20:48:57 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1555116' 00:35:05.499 killing process with pid 1555116 00:35:05.499 20:48:57 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@967 -- # kill 1555116 00:35:05.499 20:48:57 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@972 -- # wait 1555116 00:35:06.066 20:48:58 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT 00:35:06.066 00:35:06.066 real 0m10.864s 00:35:06.066 user 0m14.321s 00:35:06.066 sys 0m4.383s 00:35:06.066 20:48:58 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@1124 -- # xtrace_disable 00:35:06.066 
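The waitfornbd_exit pattern traced repeatedly above (nbd_common.sh@35–45) is a bounded poll of /proc/partitions: after `nbd_stop_disk`, spin until the device name disappears, giving up after 20 checks. This is a sketch reconstructed from the trace, with the 20-iteration bound and the `grep -q -w` probe taken directly from it; the exact control flow inside the loop is an assumption.

```shell
# Sketch of waitfornbd_exit as traced above: poll /proc/partitions until the
# nbd device vanishes; the break fires as soon as grep no longer finds it.
# Always returns 0, matching the `return 0` in the trace.
waitfornbd_exit() {
	local nbd_name=$1
	local i
	for ((i = 1; i <= 20; i++)); do
		if ! grep -q -w "$nbd_name" /proc/partitions 2>/dev/null; then
			break
		fi
		sleep 0.1
	done
	return 0
}
```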
20:48:58 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:35:06.066 ************************************ 00:35:06.066 END TEST bdev_nbd 00:35:06.066 ************************************ 00:35:06.066 20:48:58 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:35:06.066 20:48:58 blockdev_crypto_qat -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:35:06.066 20:48:58 blockdev_crypto_qat -- bdev/blockdev.sh@764 -- # '[' crypto_qat = nvme ']' 00:35:06.066 20:48:58 blockdev_crypto_qat -- bdev/blockdev.sh@764 -- # '[' crypto_qat = gpt ']' 00:35:06.066 20:48:58 blockdev_crypto_qat -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:35:06.066 20:48:58 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:35:06.066 20:48:58 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:35:06.066 20:48:58 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:35:06.066 ************************************ 00:35:06.066 START TEST bdev_fio 00:35:06.066 ************************************ 00:35:06.066 20:48:58 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1123 -- # fio_test_suite '' 00:35:06.066 20:48:58 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:35:06.066 20:48:58 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:35:06.066 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:35:06.066 20:48:58 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:35:06.066 20:48:58 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:35:06.066 20:48:58 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:35:06.066 20:48:58 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 
00:35:06.066 20:48:58 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:35:06.066 20:48:58 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:35:06.067 20:48:58 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=verify 00:35:06.067 20:48:58 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:35:06.067 20:48:58 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:35:06.067 20:48:58 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:35:06.067 20:48:58 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:35:06.067 20:48:58 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:35:06.067 20:48:58 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:35:06.067 20:48:58 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:35:06.067 20:48:58 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:35:06.067 20:48:58 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:35:06.067 20:48:58 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:35:06.067 20:48:58 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:35:06.067 20:48:58 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:35:06.067 20:48:58 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 
00:35:06.067 20:48:58 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:35:06.067 20:48:58 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:35:06.067 20:48:58 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram]' 00:35:06.067 20:48:58 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram 00:35:06.067 20:48:58 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:35:06.067 20:48:58 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram1]' 00:35:06.067 20:48:58 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram1 00:35:06.067 20:48:58 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:35:06.067 20:48:58 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram2]' 00:35:06.067 20:48:58 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram2 00:35:06.067 20:48:58 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:35:06.067 20:48:58 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram3]' 00:35:06.067 20:48:58 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram3 00:35:06.067 20:48:58 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:35:06.067 20:48:58 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 
--spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:35:06.067 20:48:58 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:35:06.067 20:48:58 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:35:06.067 20:48:58 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:35:06.067 ************************************ 00:35:06.067 START TEST bdev_fio_rw_verify 00:35:06.067 ************************************ 00:35:06.067 20:48:58 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:35:06.067 20:48:58 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:35:06.067 20:48:58 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:35:06.067 20:48:58 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:35:06.067 20:48:58 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:35:06.067 20:48:58 
blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:35:06.067 20:48:58 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:35:06.067 20:48:58 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 00:35:06.067 20:48:58 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:35:06.067 20:48:58 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:35:06.067 20:48:58 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 00:35:06.067 20:48:58 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:35:06.067 20:48:58 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:35:06.067 20:48:58 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:35:06.067 20:48:58 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:35:06.067 20:48:58 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:35:06.067 20:48:58 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:35:06.067 20:48:58 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:35:06.331 20:48:58 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:35:06.331 20:48:58 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- 
common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:35:06.331 20:48:58 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:35:06.331 20:48:58 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:35:06.588 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:35:06.588 job_crypto_ram1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:35:06.588 job_crypto_ram2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:35:06.588 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:35:06.588 fio-3.35 00:35:06.588 Starting 4 threads 00:35:21.446 00:35:21.446 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=1557158: Mon Jul 15 20:49:11 2024 00:35:21.446 read: IOPS=17.0k, BW=66.6MiB/s (69.8MB/s)(666MiB/10001msec) 00:35:21.446 slat (usec): min=17, max=364, avg=80.74, stdev=47.78 00:35:21.446 clat (usec): min=24, max=2354, avg=434.56, stdev=288.72 00:35:21.446 lat (usec): min=61, max=2476, avg=515.30, stdev=315.68 00:35:21.446 clat percentiles (usec): 00:35:21.446 | 50.000th=[ 355], 99.000th=[ 1401], 99.900th=[ 1565], 99.990th=[ 1762], 00:35:21.446 | 99.999th=[ 2245] 00:35:21.446 write: IOPS=18.8k, BW=73.5MiB/s (77.1MB/s)(718MiB/9763msec); 0 zone resets 00:35:21.446 slat (usec): min=27, max=1881, avg=95.78, stdev=50.45 00:35:21.446 clat (usec): min=27, 
max=3474, avg=488.90, stdev=318.43 00:35:21.446 lat (usec): min=83, max=3627, avg=584.68, stdev=347.40 00:35:21.446 clat percentiles (usec): 00:35:21.446 | 50.000th=[ 416], 99.000th=[ 1598], 99.900th=[ 1762], 99.990th=[ 1876], 00:35:21.446 | 99.999th=[ 3326] 00:35:21.446 bw ( KiB/s): min=60264, max=94986, per=97.35%, avg=73293.16, stdev=2280.23, samples=76 00:35:21.446 iops : min=15066, max=23746, avg=18323.26, stdev=570.03, samples=76 00:35:21.446 lat (usec) : 50=0.01%, 100=1.29%, 250=25.69%, 500=38.43%, 750=19.60% 00:35:21.446 lat (usec) : 1000=8.30% 00:35:21.446 lat (msec) : 2=6.68%, 4=0.01% 00:35:21.446 cpu : usr=99.51%, sys=0.01%, ctx=67, majf=0, minf=277 00:35:21.446 IO depths : 1=6.7%, 2=26.7%, 4=53.3%, 8=13.3%, 16=0.0%, 32=0.0%, >=64=0.0% 00:35:21.446 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:35:21.446 complete : 0=0.0%, 4=88.2%, 8=11.8%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:35:21.446 issued rwts: total=170450,183766,0,0 short=0,0,0,0 dropped=0,0,0,0 00:35:21.446 latency : target=0, window=0, percentile=100.00%, depth=8 00:35:21.446 00:35:21.446 Run status group 0 (all jobs): 00:35:21.446 READ: bw=66.6MiB/s (69.8MB/s), 66.6MiB/s-66.6MiB/s (69.8MB/s-69.8MB/s), io=666MiB (698MB), run=10001-10001msec 00:35:21.446 WRITE: bw=73.5MiB/s (77.1MB/s), 73.5MiB/s-73.5MiB/s (77.1MB/s-77.1MB/s), io=718MiB (753MB), run=9763-9763msec 00:35:21.446 00:35:21.446 real 0m13.662s 00:35:21.446 user 0m46.173s 00:35:21.446 sys 0m0.507s 00:35:21.446 20:49:12 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:35:21.446 20:49:12 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:35:21.446 ************************************ 00:35:21.446 END TEST bdev_fio_rw_verify 00:35:21.446 ************************************ 00:35:21.446 20:49:12 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:35:21.446 20:49:12 
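The fio_bdev launcher traced before the run above (autotest_common.sh@1337–1352) derives its LD_PRELOAD from an `ldd | grep | awk` probe of the SPDK fio plugin for each sanitizer runtime. The helper below sketches that assembly; the function name is ours, while the sanitizer list and the `awk '{print $3}'` extraction are copied from the trace. Both probes came up empty in this run, which is why the trace shows `LD_PRELOAD=' .../build/fio/spdk_bdev'` with a leading space.

```shell
# Sketch of the LD_PRELOAD assembly traced above: scan the fio plugin's
# shared-library dependencies for a sanitizer runtime, and emit the preload
# string (sanitizer lib, if any, ahead of the plugin itself).
build_fio_preload() {
	local plugin=$1
	local sanitizers=('libasan' 'libclang_rt.asan')
	local preload='' sanitizer asan_lib
	for sanitizer in "${sanitizers[@]}"; do
		asan_lib=$(ldd "$plugin" 2>/dev/null | grep "$sanitizer" | awk '{print $3}')
		[ -n "$asan_lib" ] && preload="$asan_lib"
	done
	echo "$preload $plugin"
}
```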
blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:35:21.446 20:49:12 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:35:21.446 20:49:12 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:35:21.446 20:49:12 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:35:21.446 20:49:12 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:35:21.446 20:49:12 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:35:21.446 20:49:12 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:35:21.446 20:49:12 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:35:21.446 20:49:12 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:35:21.446 20:49:12 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:35:21.446 20:49:12 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:35:21.446 20:49:12 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:35:21.446 20:49:12 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:35:21.446 20:49:12 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:35:21.446 20:49:12 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:35:21.446 20:49:12 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:35:21.446 20:49:12 
blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:35:21.446 20:49:12 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "de9fc9d0-7b57-561a-8902-a67803b920fd"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "de9fc9d0-7b57-561a-8902-a67803b920fd",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' "name": "crypto_ram1",' ' "aliases": [' ' "1efc1b38-69f7-55a3-a894-817e962a3265"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "1efc1b38-69f7-55a3-a894-817e962a3265",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' 
"nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "bb6d3ec8-8672-5575-9b5f-54fc01e0d7a9"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "bb6d3ec8-8672-5575-9b5f-54fc01e0d7a9",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram2",' ' "key_name": 
"test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "29dc53a1-e973-55d5-bb90-89ea620765b8"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "29dc53a1-e973-55d5-bb90-89ea620765b8",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:35:21.446 20:49:12 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n crypto_ram 00:35:21.446 crypto_ram1 00:35:21.446 crypto_ram2 00:35:21.446 crypto_ram3 ]] 00:35:21.447 20:49:12 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "de9fc9d0-7b57-561a-8902-a67803b920fd"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "de9fc9d0-7b57-561a-8902-a67803b920fd",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' 
"read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' "name": "crypto_ram1",' ' "aliases": [' ' "1efc1b38-69f7-55a3-a894-817e962a3265"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "1efc1b38-69f7-55a3-a894-817e962a3265",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' 
"crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "bb6d3ec8-8672-5575-9b5f-54fc01e0d7a9"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "bb6d3ec8-8672-5575-9b5f-54fc01e0d7a9",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "29dc53a1-e973-55d5-bb90-89ea620765b8"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "29dc53a1-e973-55d5-bb90-89ea620765b8",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' 
"zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:35:21.447 20:49:12 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:35:21.447 20:49:12 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:35:21.447 20:49:12 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram]' 00:35:21.447 20:49:12 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram 00:35:21.447 20:49:12 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:35:21.447 20:49:12 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram1]' 00:35:21.447 20:49:12 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram1 00:35:21.447 20:49:12 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:35:21.447 20:49:12 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram2]' 00:35:21.447 20:49:12 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram2 00:35:21.447 20:49:12 
blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:35:21.447 20:49:12 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram3]' 00:35:21.447 20:49:12 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram3 00:35:21.447 20:49:12 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:35:21.447 20:49:12 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:35:21.447 20:49:12 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:35:21.447 20:49:12 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:35:21.447 ************************************ 00:35:21.447 START TEST bdev_fio_trim 00:35:21.447 ************************************ 00:35:21.447 20:49:12 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:35:21.447 20:49:12 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 
--verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:35:21.447 20:49:12 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:35:21.447 20:49:12 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:35:21.447 20:49:12 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:35:21.447 20:49:12 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:35:21.447 20:49:12 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:35:21.447 20:49:12 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:35:21.447 20:49:12 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:35:21.447 20:49:12 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:35:21.447 20:49:12 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:35:21.447 20:49:12 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan 00:35:21.447 20:49:12 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:35:21.447 20:49:12 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:35:21.447 20:49:12 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:35:21.447 20:49:12 blockdev_crypto_qat.bdev_fio.bdev_fio_trim 
-- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:35:21.447 20:49:12 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:35:21.447 20:49:12 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:35:21.447 20:49:12 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:35:21.447 20:49:12 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:35:21.447 20:49:12 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:35:21.447 20:49:12 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:35:21.447 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:35:21.447 job_crypto_ram1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:35:21.447 job_crypto_ram2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:35:21.447 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:35:21.447 fio-3.35 00:35:21.447 Starting 4 threads 00:35:33.630 00:35:33.630 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=1559017: Mon Jul 15 20:49:25 2024 00:35:33.630 write: IOPS=28.2k, BW=110MiB/s (116MB/s)(1103MiB/10001msec); 0 zone 
resets 00:35:33.630 slat (usec): min=11, max=1558, avg=83.71, stdev=48.95 00:35:33.630 clat (usec): min=28, max=2135, avg=298.99, stdev=184.56 00:35:33.630 lat (usec): min=48, max=2238, avg=382.71, stdev=215.66 00:35:33.630 clat percentiles (usec): 00:35:33.630 | 50.000th=[ 255], 99.000th=[ 873], 99.900th=[ 1254], 99.990th=[ 1450], 00:35:33.630 | 99.999th=[ 1663] 00:35:33.630 bw ( KiB/s): min=92192, max=137408, per=100.00%, avg=113460.21, stdev=2736.59, samples=76 00:35:33.630 iops : min=23048, max=34352, avg=28365.05, stdev=684.15, samples=76 00:35:33.630 trim: IOPS=28.2k, BW=110MiB/s (116MB/s)(1103MiB/10001msec); 0 zone resets 00:35:33.630 slat (usec): min=4, max=411, avg=21.55, stdev= 9.35 00:35:33.630 clat (usec): min=48, max=2238, avg=382.96, stdev=215.72 00:35:33.630 lat (usec): min=54, max=2254, avg=404.51, stdev=219.70 00:35:33.630 clat percentiles (usec): 00:35:33.630 | 50.000th=[ 334], 99.000th=[ 1057], 99.900th=[ 1516], 99.990th=[ 1729], 00:35:33.630 | 99.999th=[ 2024] 00:35:33.630 bw ( KiB/s): min=92192, max=137408, per=100.00%, avg=113460.21, stdev=2736.59, samples=76 00:35:33.630 iops : min=23048, max=34352, avg=28365.05, stdev=684.15, samples=76 00:35:33.630 lat (usec) : 50=0.41%, 100=4.40%, 250=35.01%, 500=41.54%, 750=13.75% 00:35:33.630 lat (usec) : 1000=3.89% 00:35:33.630 lat (msec) : 2=1.01%, 4=0.01% 00:35:33.630 cpu : usr=99.42%, sys=0.00%, ctx=52, majf=0, minf=95 00:35:33.630 IO depths : 1=12.5%, 2=25.0%, 4=50.0%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:35:33.630 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:35:33.630 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:35:33.630 issued rwts: total=0,282377,282378,0 short=0,0,0,0 dropped=0,0,0,0 00:35:33.630 latency : target=0, window=0, percentile=100.00%, depth=8 00:35:33.630 00:35:33.630 Run status group 0 (all jobs): 00:35:33.630 WRITE: bw=110MiB/s (116MB/s), 110MiB/s-110MiB/s (116MB/s-116MB/s), io=1103MiB (1157MB), run=10001-10001msec 
00:35:33.630 TRIM: bw=110MiB/s (116MB/s), 110MiB/s-110MiB/s (116MB/s-116MB/s), io=1103MiB (1157MB), run=10001-10001msec 00:35:33.630 00:35:33.630 real 0m13.614s 00:35:33.630 user 0m46.260s 00:35:33.630 sys 0m0.518s 00:35:33.630 20:49:25 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # xtrace_disable 00:35:33.630 20:49:25 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:35:33.630 ************************************ 00:35:33.630 END TEST bdev_fio_trim 00:35:33.630 ************************************ 00:35:33.630 20:49:25 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:35:33.630 20:49:25 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f 00:35:33.630 20:49:25 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:35:33.630 20:49:25 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@370 -- # popd 00:35:33.630 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:35:33.630 20:49:25 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT 00:35:33.630 00:35:33.630 real 0m27.635s 00:35:33.630 user 1m32.611s 00:35:33.630 sys 0m1.227s 00:35:33.630 20:49:25 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1124 -- # xtrace_disable 00:35:33.630 20:49:25 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:35:33.630 ************************************ 00:35:33.630 END TEST bdev_fio 00:35:33.630 ************************************ 00:35:33.630 20:49:25 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:35:33.630 20:49:25 blockdev_crypto_qat -- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT 00:35:33.630 20:49:25 blockdev_crypto_qat -- bdev/blockdev.sh@777 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:35:33.630 20:49:25 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:35:33.630 20:49:25 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:35:33.630 20:49:25 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:35:33.630 ************************************ 00:35:33.630 START TEST bdev_verify 00:35:33.630 ************************************ 00:35:33.630 20:49:25 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:35:33.888 [2024-07-15 20:49:26.040886] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 00:35:33.888 [2024-07-15 20:49:26.040959] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1560336 ] 00:35:33.888 [2024-07-15 20:49:26.168977] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:35:34.147 [2024-07-15 20:49:26.270577] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:35:34.147 [2024-07-15 20:49:26.270583] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:34.147 [2024-07-15 20:49:26.291957] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:35:34.147 [2024-07-15 20:49:26.299983] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:35:34.147 [2024-07-15 20:49:26.308017] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:35:34.147 
[2024-07-15 20:49:26.407282] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:35:36.743 [2024-07-15 20:49:28.604176] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:35:36.743 [2024-07-15 20:49:28.604261] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:35:36.743 [2024-07-15 20:49:28.604277] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:36.743 [2024-07-15 20:49:28.612191] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:35:36.743 [2024-07-15 20:49:28.612210] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:35:36.743 [2024-07-15 20:49:28.612222] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:36.743 [2024-07-15 20:49:28.620212] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:35:36.743 [2024-07-15 20:49:28.620235] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:35:36.743 [2024-07-15 20:49:28.620247] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:36.743 [2024-07-15 20:49:28.628234] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:35:36.743 [2024-07-15 20:49:28.628253] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:35:36.743 [2024-07-15 20:49:28.628265] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:36.743 Running I/O for 5 seconds... 
00:35:42.010 00:35:42.010 Latency(us) 00:35:42.010 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:35:42.010 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:35:42.010 Verification LBA range: start 0x0 length 0x1000 00:35:42.010 crypto_ram : 5.07 468.73 1.83 0.00 0.00 271676.03 4274.09 165036.74 00:35:42.010 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:35:42.010 Verification LBA range: start 0x1000 length 0x1000 00:35:42.010 crypto_ram : 5.08 378.31 1.48 0.00 0.00 337160.52 17210.32 205156.17 00:35:42.010 Job: crypto_ram1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:35:42.010 Verification LBA range: start 0x0 length 0x1000 00:35:42.010 crypto_ram1 : 5.07 473.07 1.85 0.00 0.00 268824.71 5784.26 153183.28 00:35:42.010 Job: crypto_ram1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:35:42.010 Verification LBA range: start 0x1000 length 0x1000 00:35:42.010 crypto_ram1 : 5.08 378.21 1.48 0.00 0.00 335948.10 18236.10 193302.71 00:35:42.010 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:35:42.010 Verification LBA range: start 0x0 length 0x1000 00:35:42.010 crypto_ram2 : 5.05 3650.68 14.26 0.00 0.00 34747.02 6667.58 28038.01 00:35:42.010 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:35:42.010 Verification LBA range: start 0x1000 length 0x1000 00:35:42.010 crypto_ram2 : 5.06 2958.36 11.56 0.00 0.00 42800.46 9289.02 35332.45 00:35:42.010 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:35:42.010 Verification LBA range: start 0x0 length 0x1000 00:35:42.010 crypto_ram3 : 5.06 3657.93 14.29 0.00 0.00 34588.91 1994.57 28151.99 00:35:42.010 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:35:42.010 Verification LBA range: start 0x1000 length 0x1000 00:35:42.010 crypto_ram3 : 5.07 2967.42 11.59 0.00 0.00 42563.47 1246.61 
34420.65 00:35:42.010 =================================================================================================================== 00:35:42.010 Total : 14932.72 58.33 0.00 0.00 68083.61 1246.61 205156.17 00:35:42.010 00:35:42.010 real 0m8.221s 00:35:42.010 user 0m15.585s 00:35:42.010 sys 0m0.373s 00:35:42.010 20:49:34 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:35:42.010 20:49:34 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:35:42.010 ************************************ 00:35:42.010 END TEST bdev_verify 00:35:42.010 ************************************ 00:35:42.010 20:49:34 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:35:42.010 20:49:34 blockdev_crypto_qat -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:35:42.010 20:49:34 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:35:42.010 20:49:34 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:35:42.010 20:49:34 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:35:42.010 ************************************ 00:35:42.010 START TEST bdev_verify_big_io 00:35:42.010 ************************************ 00:35:42.010 20:49:34 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:35:42.010 [2024-07-15 20:49:34.344998] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
00:35:42.010 [2024-07-15 20:49:34.345060] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1561352 ] 00:35:42.269 [2024-07-15 20:49:34.475533] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:35:42.269 [2024-07-15 20:49:34.577997] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:35:42.269 [2024-07-15 20:49:34.578003] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:42.269 [2024-07-15 20:49:34.599401] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:35:42.269 [2024-07-15 20:49:34.607431] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:35:42.269 [2024-07-15 20:49:34.615459] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:35:42.528 [2024-07-15 20:49:34.720979] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:35:45.059 [2024-07-15 20:49:36.925949] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:35:45.059 [2024-07-15 20:49:36.926032] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:35:45.059 [2024-07-15 20:49:36.926047] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:45.059 [2024-07-15 20:49:36.933967] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:35:45.059 [2024-07-15 20:49:36.933987] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:35:45.059 [2024-07-15 20:49:36.933999] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:45.059 
[2024-07-15 20:49:36.941988] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:35:45.059 [2024-07-15 20:49:36.942006] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:35:45.059 [2024-07-15 20:49:36.942018] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:45.059 [2024-07-15 20:49:36.950008] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:35:45.059 [2024-07-15 20:49:36.950025] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:35:45.059 [2024-07-15 20:49:36.950037] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:45.059 Running I/O for 5 seconds... 00:35:45.627 [2024-07-15 20:49:37.866799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.627 [2024-07-15 20:49:37.867383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.627 [2024-07-15 20:49:37.867502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.627 [2024-07-15 20:49:37.867576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.627 [2024-07-15 20:49:37.867647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.627 [2024-07-15 20:49:37.867715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.627 [2024-07-15 20:49:37.868310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:45.627 [2024-07-15 20:49:37.868337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.627 [2024-07-15 20:49:37.872646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.627 [2024-07-15 20:49:37.872727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.627 [2024-07-15 20:49:37.872795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.627 [2024-07-15 20:49:37.872862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.627 [2024-07-15 20:49:37.873399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.627 [2024-07-15 20:49:37.873496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.627 [2024-07-15 20:49:37.873563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.627 [2024-07-15 20:49:37.873651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.627 [2024-07-15 20:49:37.874222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.627 [2024-07-15 20:49:37.874248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.627 [2024-07-15 20:49:37.878346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.627 [2024-07-15 20:49:37.878416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:45.627 [2024-07-15 20:49:37.878484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.627 [2024-07-15 20:49:37.878550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.627 [2024-07-15 20:49:37.879152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.627 [2024-07-15 20:49:37.879219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.627 [2024-07-15 20:49:37.879287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.627 [2024-07-15 20:49:37.879376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.627 [2024-07-15 20:49:37.880004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.627 [2024-07-15 20:49:37.880029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.627 [2024-07-15 20:49:37.884228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.627 [2024-07-15 20:49:37.884298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.627 [2024-07-15 20:49:37.884364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.627 [2024-07-15 20:49:37.884431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.627 [2024-07-15 20:49:37.884986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:45.627 [2024-07-15 20:49:37.885053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:35:45.889 [2024-07-15 20:49:38.016596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.889 [2024-07-15 20:49:38.016621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.889 [2024-07-15 20:49:38.020029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.889 [2024-07-15 20:49:38.020097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.889 [2024-07-15 20:49:38.020164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.889 [2024-07-15 20:49:38.020215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.889 [2024-07-15 20:49:38.020676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.889 [2024-07-15 20:49:38.020740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.889 [2024-07-15 20:49:38.020838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.890 [2024-07-15 20:49:38.020905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.890 [2024-07-15 20:49:38.021279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.890 [2024-07-15 20:49:38.021304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.890 [2024-07-15 20:49:38.024230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:45.890 [2024-07-15 20:49:38.024748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.890 [2024-07-15 20:49:38.025263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.890 [2024-07-15 20:49:38.027044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.890 [2024-07-15 20:49:38.029152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.890 [2024-07-15 20:49:38.030863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.890 [2024-07-15 20:49:38.032432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.890 [2024-07-15 20:49:38.034072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.890 [2024-07-15 20:49:38.034441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.890 [2024-07-15 20:49:38.034466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.890 [2024-07-15 20:49:38.039416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.890 [2024-07-15 20:49:38.041226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.890 [2024-07-15 20:49:38.042990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.890 [2024-07-15 20:49:38.044318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:45.890 [2024-07-15 20:49:38.046286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.890 [2024-07-15 20:49:38.048106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.890 [2024-07-15 20:49:38.049631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.890 [2024-07-15 20:49:38.050149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.890 [2024-07-15 20:49:38.050696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.890 [2024-07-15 20:49:38.050722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.890 [2024-07-15 20:49:38.054382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.890 [2024-07-15 20:49:38.055940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.890 [2024-07-15 20:49:38.057648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.890 [2024-07-15 20:49:38.059409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.890 [2024-07-15 20:49:38.060457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.890 [2024-07-15 20:49:38.060977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.890 [2024-07-15 20:49:38.061481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:45.890 [2024-07-15 20:49:38.063187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.890 [2024-07-15 20:49:38.063554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.890 [2024-07-15 20:49:38.063579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.890 [2024-07-15 20:49:38.067546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.890 [2024-07-15 20:49:38.068067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.890 [2024-07-15 20:49:38.068573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.890 [2024-07-15 20:49:38.069091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.890 [2024-07-15 20:49:38.071014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.890 [2024-07-15 20:49:38.072763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.890 [2024-07-15 20:49:38.074586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.890 [2024-07-15 20:49:38.076024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.890 [2024-07-15 20:49:38.076441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.890 [2024-07-15 20:49:38.076465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:45.890 [2024-07-15 20:49:38.079877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.890 [2024-07-15 20:49:38.081608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.890 [2024-07-15 20:49:38.083366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.890 [2024-07-15 20:49:38.085119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.890 [2024-07-15 20:49:38.087287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.890 [2024-07-15 20:49:38.089083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.890 [2024-07-15 20:49:38.090836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.890 [2024-07-15 20:49:38.092182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.890 [2024-07-15 20:49:38.092764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.890 [2024-07-15 20:49:38.092794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.890 [2024-07-15 20:49:38.097341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.890 [2024-07-15 20:49:38.098416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.890 [2024-07-15 20:49:38.099979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:45.890 [2024-07-15 20:49:38.101719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.890 [2024-07-15 20:49:38.103142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.890 [2024-07-15 20:49:38.103656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.890 [2024-07-15 20:49:38.104177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.890 [2024-07-15 20:49:38.104875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.890 [2024-07-15 20:49:38.105288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.890 [2024-07-15 20:49:38.105313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.890 [2024-07-15 20:49:38.109268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.890 [2024-07-15 20:49:38.110559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.890 [2024-07-15 20:49:38.111091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.890 [2024-07-15 20:49:38.111602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.890 [2024-07-15 20:49:38.113642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.890 [2024-07-15 20:49:38.115247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:45.890 [2024-07-15 20:49:38.117068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.890 [2024-07-15 20:49:38.118608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.890 [2024-07-15 20:49:38.118990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.890 [2024-07-15 20:49:38.119015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.890 [2024-07-15 20:49:38.122191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.890 [2024-07-15 20:49:38.123016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.890 [2024-07-15 20:49:38.124576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.890 [2024-07-15 20:49:38.126331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.890 [2024-07-15 20:49:38.127757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.890 [2024-07-15 20:49:38.129308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.890 [2024-07-15 20:49:38.131056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.890 [2024-07-15 20:49:38.132810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.890 [2024-07-15 20:49:38.133352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:45.890 [2024-07-15 20:49:38.133378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.890 [2024-07-15 20:49:38.138024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.890 [2024-07-15 20:49:38.139766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.890 [2024-07-15 20:49:38.141002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.890 [2024-07-15 20:49:38.142552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.890 [2024-07-15 20:49:38.144648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.890 [2024-07-15 20:49:38.145174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.890 [2024-07-15 20:49:38.145685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.891 [2024-07-15 20:49:38.146198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.891 [2024-07-15 20:49:38.146668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.891 [2024-07-15 20:49:38.146693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.891 [2024-07-15 20:49:38.150682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.891 [2024-07-15 20:49:38.152442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:45.891 [2024-07-15 20:49:38.153388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.891 [2024-07-15 20:49:38.153898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.891 [2024-07-15 20:49:38.154974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.891 [2024-07-15 20:49:38.156695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.891 [2024-07-15 20:49:38.158429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.891 [2024-07-15 20:49:38.160185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.891 [2024-07-15 20:49:38.160707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.891 [2024-07-15 20:49:38.160733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.891 [2024-07-15 20:49:38.163639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.891 [2024-07-15 20:49:38.164158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.891 [2024-07-15 20:49:38.165629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.891 [2024-07-15 20:49:38.167197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.891 [2024-07-15 20:49:38.169306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:45.891 [2024-07-15 20:49:38.170658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.891 [2024-07-15 20:49:38.172207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.891 [2024-07-15 20:49:38.173966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.891 [2024-07-15 20:49:38.174335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.891 [2024-07-15 20:49:38.174360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.891 [2024-07-15 20:49:38.179029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.891 [2024-07-15 20:49:38.180787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.891 [2024-07-15 20:49:38.182222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.891 [2024-07-15 20:49:38.184038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.891 [2024-07-15 20:49:38.186210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.891 [2024-07-15 20:49:38.187821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.891 [2024-07-15 20:49:38.188341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.891 [2024-07-15 20:49:38.188856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:45.891 [2024-07-15 20:49:38.189427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.891 [2024-07-15 20:49:38.189459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.891 [2024-07-15 20:49:38.193154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.891 [2024-07-15 20:49:38.194908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.891 [2024-07-15 20:49:38.196660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.891 [2024-07-15 20:49:38.197252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.891 [2024-07-15 20:49:38.198288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.891 [2024-07-15 20:49:38.199060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.891 [2024-07-15 20:49:38.200619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.891 [2024-07-15 20:49:38.202367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.891 [2024-07-15 20:49:38.202740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.891 [2024-07-15 20:49:38.202765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.891 [2024-07-15 20:49:38.205585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:45.891 [2024-07-15 20:49:38.206105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.891 [2024-07-15 20:49:38.206615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.891 [2024-07-15 20:49:38.208424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.891 [2024-07-15 20:49:38.210588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.891 [2024-07-15 20:49:38.212198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.891 [2024-07-15 20:49:38.213842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.891 [2024-07-15 20:49:38.215557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.891 [2024-07-15 20:49:38.215933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.891 [2024-07-15 20:49:38.215959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.891 [2024-07-15 20:49:38.220858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.891 [2024-07-15 20:49:38.222628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.891 [2024-07-15 20:49:38.224180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.891 [2024-07-15 20:49:38.225651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:45.891 [2024-07-15 20:49:38.227752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.891 [2024-07-15 20:49:38.228378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.891 [2024-07-15 20:49:38.228892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.891 [2024-07-15 20:49:38.229411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.891 [2024-07-15 20:49:38.229943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.891 [2024-07-15 20:49:38.229974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.891 [2024-07-15 20:49:38.233592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.891 [2024-07-15 20:49:38.234122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.891 [2024-07-15 20:49:38.234639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.891 [2024-07-15 20:49:38.235153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.891 [2024-07-15 20:49:38.236144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.891 [2024-07-15 20:49:38.236653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:45.891 [2024-07-15 20:49:38.237165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:45.891 [2024-07-15 20:49:38.237688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:46.153 [... identical "Failed to get src_mbufs!" error from accel_dpdk_cryptodev.c:468 (accel_dpdk_cryptodev_task_alloc_resources) repeated continuously from 20:49:38.237688 through 20:49:38.446612; repeats omitted ...] 
00:35:46.153 [2024-07-15 20:49:38.447204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.447231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.449349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.449415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.449482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.449547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.449950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.450038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.450103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.450193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.450257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.450618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.450643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:46.153 [2024-07-15 20:49:38.453610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.453682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.453754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.453822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.454237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.454322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.454386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.454456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.454533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.454903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.454939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.457046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.457119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:46.153 [2024-07-15 20:49:38.457195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.457267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.457811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.457898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.457974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.458045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.458115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.458651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.458676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.460944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.461012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.461083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.461161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:46.153 [2024-07-15 20:49:38.461598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.461686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.461763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.461841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.461914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.462295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.462324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.465049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.465120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.465188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.465260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.465652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.465735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:46.153 [2024-07-15 20:49:38.465807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.465898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.465973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.466339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.466370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.468502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.468581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.468664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.468732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.469250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.469349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.469413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.469483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:46.153 [2024-07-15 20:49:38.469552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.470138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.470165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.472833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.472902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.472993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.473061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.473473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.473564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.473636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.473705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.473773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.474184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:46.153 [2024-07-15 20:49:38.474210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.476688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.476760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.476828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.476899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.477513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.477604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.477677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.477758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.477827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.478246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.478272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.480394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:46.153 [2024-07-15 20:49:38.480465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.480533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.480602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.480971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.481062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.481130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.481200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.481269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.481811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.481836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.484664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.484733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.484805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:46.153 [2024-07-15 20:49:38.484874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.485245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.485334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.485410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.485479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.485547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.486055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.486081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.488260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.153 [2024-07-15 20:49:38.488327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.154 [2024-07-15 20:49:38.488396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.154 [2024-07-15 20:49:38.488465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.154 [2024-07-15 20:49:38.489005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:46.154 [2024-07-15 20:49:38.489103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.154 [2024-07-15 20:49:38.489167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.154 [2024-07-15 20:49:38.489235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.154 [2024-07-15 20:49:38.489307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.154 [2024-07-15 20:49:38.489863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.154 [2024-07-15 20:49:38.489887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.154 [2024-07-15 20:49:38.491992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.154 [2024-07-15 20:49:38.492061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.154 [2024-07-15 20:49:38.492128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.154 [2024-07-15 20:49:38.492201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.154 [2024-07-15 20:49:38.492567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.154 [2024-07-15 20:49:38.492661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.154 [2024-07-15 20:49:38.492727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:46.154 [2024-07-15 20:49:38.492795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.154 [2024-07-15 20:49:38.492864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.154 [2024-07-15 20:49:38.493241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.154 [2024-07-15 20:49:38.493267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.154 [2024-07-15 20:49:38.496398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.154 [2024-07-15 20:49:38.496467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.154 [2024-07-15 20:49:38.496533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.154 [2024-07-15 20:49:38.496599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.154 [2024-07-15 20:49:38.496981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.154 [2024-07-15 20:49:38.497069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.154 [2024-07-15 20:49:38.497143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.154 [2024-07-15 20:49:38.497211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.154 [2024-07-15 20:49:38.497278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:46.154 [2024-07-15 20:49:38.497643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.154 [2024-07-15 20:49:38.497670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.154 [2024-07-15 20:49:38.499826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.154 [2024-07-15 20:49:38.499896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.154 [2024-07-15 20:49:38.499993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.154 [2024-07-15 20:49:38.500080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.154 [2024-07-15 20:49:38.500687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.154 [2024-07-15 20:49:38.500769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.154 [2024-07-15 20:49:38.500839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.154 [2024-07-15 20:49:38.500909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.154 [2024-07-15 20:49:38.500987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.154 [2024-07-15 20:49:38.501574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.154 [2024-07-15 20:49:38.501603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:46.154 [2024-07-15 20:49:38.503773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.154 [2024-07-15 20:49:38.503843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.154 [2024-07-15 20:49:38.503911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.154 [2024-07-15 20:49:38.503998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.154 [2024-07-15 20:49:38.504380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.154 [2024-07-15 20:49:38.504476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.154 [2024-07-15 20:49:38.504539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.154 [2024-07-15 20:49:38.504606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.154 [2024-07-15 20:49:38.504676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.154 [2024-07-15 20:49:38.505047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.154 [2024-07-15 20:49:38.505072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.154 [2024-07-15 20:49:38.507918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.154 [2024-07-15 20:49:38.507995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:46.154 [2024-07-15 20:49:38.508063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[message above repeated for subsequent allocation attempts, timestamps 20:49:38.508133 through 20:49:38.726920]
00:35:46.415 [2024-07-15 20:49:38.726920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:35:46.415 [2024-07-15 20:49:38.730732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.415 [2024-07-15 20:49:38.731262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.415 [2024-07-15 20:49:38.731775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.415 [2024-07-15 20:49:38.732299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.415 [2024-07-15 20:49:38.732750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.415 [2024-07-15 20:49:38.734321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.415 [2024-07-15 20:49:38.736079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.415 [2024-07-15 20:49:38.737830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.415 [2024-07-15 20:49:38.739055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.415 [2024-07-15 20:49:38.739494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.415 [2024-07-15 20:49:38.739519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.415 [2024-07-15 20:49:38.742508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.415 [2024-07-15 20:49:38.743819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:46.415 [2024-07-15 20:49:38.745368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.415 [2024-07-15 20:49:38.747118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.415 [2024-07-15 20:49:38.747482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.415 [2024-07-15 20:49:38.748785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.415 [2024-07-15 20:49:38.750340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.415 [2024-07-15 20:49:38.752093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.415 [2024-07-15 20:49:38.753853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.415 [2024-07-15 20:49:38.754377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.415 [2024-07-15 20:49:38.754402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.415 [2024-07-15 20:49:38.758757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.415 [2024-07-15 20:49:38.760495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.415 [2024-07-15 20:49:38.761579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.415 [2024-07-15 20:49:38.763122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:46.415 [2024-07-15 20:49:38.763490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.415 [2024-07-15 20:49:38.765265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.415 [2024-07-15 20:49:38.765781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.415 [2024-07-15 20:49:38.766297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.415 [2024-07-15 20:49:38.766806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.415 [2024-07-15 20:49:38.767245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.415 [2024-07-15 20:49:38.767270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.415 [2024-07-15 20:49:38.770896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.415 [2024-07-15 20:49:38.772725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.415 [2024-07-15 20:49:38.774280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.415 [2024-07-15 20:49:38.774794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.415 [2024-07-15 20:49:38.775332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.415 [2024-07-15 20:49:38.775865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:46.415 [2024-07-15 20:49:38.777359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.416 [2024-07-15 20:49:38.778962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.416 [2024-07-15 20:49:38.780786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.416 [2024-07-15 20:49:38.781164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.416 [2024-07-15 20:49:38.781190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.416 [2024-07-15 20:49:38.783723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.416 [2024-07-15 20:49:38.784247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.416 [2024-07-15 20:49:38.784833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.416 [2024-07-15 20:49:38.786395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.416 [2024-07-15 20:49:38.786762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.416 [2024-07-15 20:49:38.788546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.416 [2024-07-15 20:49:38.789637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.677 [2024-07-15 20:49:38.791190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:46.677 [2024-07-15 20:49:38.792956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.677 [2024-07-15 20:49:38.793323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.677 [2024-07-15 20:49:38.793347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.677 [2024-07-15 20:49:38.798171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.677 [2024-07-15 20:49:38.799943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.677 [2024-07-15 20:49:38.801690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.677 [2024-07-15 20:49:38.802814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.677 [2024-07-15 20:49:38.803197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.677 [2024-07-15 20:49:38.804980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.677 [2024-07-15 20:49:38.806735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.677 [2024-07-15 20:49:38.807855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.677 [2024-07-15 20:49:38.808386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.677 [2024-07-15 20:49:38.808960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:46.677 [2024-07-15 20:49:38.808987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.677 [2024-07-15 20:49:38.812213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.677 [2024-07-15 20:49:38.813763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.677 [2024-07-15 20:49:38.815509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.677 [2024-07-15 20:49:38.817257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.677 [2024-07-15 20:49:38.817774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.677 [2024-07-15 20:49:38.818314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.677 [2024-07-15 20:49:38.818830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.677 [2024-07-15 20:49:38.819588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.677 [2024-07-15 20:49:38.821140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.677 [2024-07-15 20:49:38.821505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.677 [2024-07-15 20:49:38.821530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.677 [2024-07-15 20:49:38.825332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:46.677 [2024-07-15 20:49:38.825849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.677 [2024-07-15 20:49:38.826369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.677 [2024-07-15 20:49:38.826880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.677 [2024-07-15 20:49:38.827304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.677 [2024-07-15 20:49:38.828873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.677 [2024-07-15 20:49:38.830630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.677 [2024-07-15 20:49:38.832437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.677 [2024-07-15 20:49:38.833938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.677 [2024-07-15 20:49:38.834357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.677 [2024-07-15 20:49:38.834382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.677 [2024-07-15 20:49:38.837385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.677 [2024-07-15 20:49:38.838774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.677 [2024-07-15 20:49:38.840284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:46.677 [2024-07-15 20:49:38.842040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.677 [2024-07-15 20:49:38.842404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.677 [2024-07-15 20:49:38.843754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.677 [2024-07-15 20:49:38.845316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.677 [2024-07-15 20:49:38.847081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.677 [2024-07-15 20:49:38.848768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.677 [2024-07-15 20:49:38.849374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.677 [2024-07-15 20:49:38.849401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.677 [2024-07-15 20:49:38.853710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.677 [2024-07-15 20:49:38.855471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.677 [2024-07-15 20:49:38.856716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.677 [2024-07-15 20:49:38.858239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.677 [2024-07-15 20:49:38.858613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:46.677 [2024-07-15 20:49:38.860397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.677 [2024-07-15 20:49:38.860915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.677 [2024-07-15 20:49:38.861436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.677 [2024-07-15 20:49:38.861961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.678 [2024-07-15 20:49:38.862388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.678 [2024-07-15 20:49:38.862412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.678 [2024-07-15 20:49:38.866073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.678 [2024-07-15 20:49:38.867886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.678 [2024-07-15 20:49:38.869382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.678 [2024-07-15 20:49:38.869900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.678 [2024-07-15 20:49:38.870413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.678 [2024-07-15 20:49:38.870945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.678 [2024-07-15 20:49:38.872604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:46.678 [2024-07-15 20:49:38.874362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.678 [2024-07-15 20:49:38.876125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.678 [2024-07-15 20:49:38.876531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.678 [2024-07-15 20:49:38.876557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.678 [2024-07-15 20:49:38.879143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.678 [2024-07-15 20:49:38.879662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.678 [2024-07-15 20:49:38.880317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.678 [2024-07-15 20:49:38.881881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.678 [2024-07-15 20:49:38.882254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.678 [2024-07-15 20:49:38.884040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.678 [2024-07-15 20:49:38.885105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.678 [2024-07-15 20:49:38.886650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.678 [2024-07-15 20:49:38.888412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:46.678 [2024-07-15 20:49:38.888796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.678 [2024-07-15 20:49:38.888821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.678 [2024-07-15 20:49:38.893462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.678 [2024-07-15 20:49:38.895221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.678 [2024-07-15 20:49:38.895293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.678 [2024-07-15 20:49:38.897037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.678 [2024-07-15 20:49:38.897517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.678 [2024-07-15 20:49:38.899091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.678 [2024-07-15 20:49:38.900834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.678 [2024-07-15 20:49:38.902588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.678 [2024-07-15 20:49:38.903312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.678 [2024-07-15 20:49:38.903889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.678 [2024-07-15 20:49:38.903915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:46.678 [2024-07-15 20:49:38.908151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.678 [2024-07-15 20:49:38.909232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.678 [2024-07-15 20:49:38.910790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.678 [2024-07-15 20:49:38.910862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.678 [2024-07-15 20:49:38.911237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.678 [2024-07-15 20:49:38.913015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.678 [2024-07-15 20:49:38.913699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.678 [2024-07-15 20:49:38.914214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.678 [2024-07-15 20:49:38.914739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.678 [2024-07-15 20:49:38.915209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.678 [2024-07-15 20:49:38.915234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.678 [2024-07-15 20:49:38.917332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.678 [2024-07-15 20:49:38.917407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:46.678 [2024-07-15 20:49:38.917479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.678 [2024-07-15 20:49:38.917553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.678 [2024-07-15 20:49:38.917919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.678 [2024-07-15 20:49:38.918019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.678 [2024-07-15 20:49:38.918086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.678 [2024-07-15 20:49:38.918155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.678 [2024-07-15 20:49:38.918227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.678 [2024-07-15 20:49:38.918593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.678 [2024-07-15 20:49:38.918625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.678 [2024-07-15 20:49:38.921589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.678 [2024-07-15 20:49:38.921657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.678 [2024-07-15 20:49:38.921729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.678 [2024-07-15 20:49:38.921804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:46.678 [2024-07-15 20:49:38.922179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.678 [... previous message repeated for each subsequent allocation attempt, ~270 occurrences, timestamps 2024-07-15 20:49:38.922268 through 20:49:39.030391 ...] 00:35:46.681 [2024-07-15 20:49:39.030458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:46.681 [2024-07-15 20:49:39.030526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.681 [2024-07-15 20:49:39.030595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.681 [2024-07-15 20:49:39.031176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.681 [2024-07-15 20:49:39.031261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.681 [2024-07-15 20:49:39.031327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.681 [2024-07-15 20:49:39.031399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.681 [2024-07-15 20:49:39.031471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.681 [2024-07-15 20:49:39.031936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.681 [2024-07-15 20:49:39.031961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.681 [2024-07-15 20:49:39.034923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.681 [2024-07-15 20:49:39.034996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.681 [2024-07-15 20:49:39.035083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.681 [2024-07-15 20:49:39.035177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:46.681 [2024-07-15 20:49:39.035751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.681 [2024-07-15 20:49:39.035842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.681 [2024-07-15 20:49:39.035910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.681 [2024-07-15 20:49:39.035988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.681 [2024-07-15 20:49:39.036072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.681 [2024-07-15 20:49:39.036674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.681 [2024-07-15 20:49:39.036698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.681 [2024-07-15 20:49:39.039582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.681 [2024-07-15 20:49:39.039649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.681 [2024-07-15 20:49:39.039715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.681 [2024-07-15 20:49:39.039784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.681 [2024-07-15 20:49:39.040361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.681 [2024-07-15 20:49:39.040466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:46.681 [2024-07-15 20:49:39.040541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.681 [2024-07-15 20:49:39.040614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.681 [2024-07-15 20:49:39.040680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.681 [2024-07-15 20:49:39.041261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.681 [2024-07-15 20:49:39.041290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.681 [2024-07-15 20:49:39.044149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.681 [2024-07-15 20:49:39.044221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.681 [2024-07-15 20:49:39.044287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.681 [2024-07-15 20:49:39.044357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.681 [2024-07-15 20:49:39.044814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.681 [2024-07-15 20:49:39.044903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.681 [2024-07-15 20:49:39.044976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.681 [2024-07-15 20:49:39.045057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:46.681 [2024-07-15 20:49:39.045141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.681 [2024-07-15 20:49:39.045690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.681 [2024-07-15 20:49:39.045714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.681 [2024-07-15 20:49:39.048655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.681 [2024-07-15 20:49:39.048732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.681 [2024-07-15 20:49:39.049261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.681 [2024-07-15 20:49:39.049329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.681 [2024-07-15 20:49:39.049838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.681 [2024-07-15 20:49:39.049923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.681 [2024-07-15 20:49:39.049999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.682 [2024-07-15 20:49:39.050068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.682 [2024-07-15 20:49:39.050137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.682 [2024-07-15 20:49:39.050670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:46.682 [2024-07-15 20:49:39.050695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.941 [2024-07-15 20:49:39.053675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.941 [2024-07-15 20:49:39.053744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.941 [2024-07-15 20:49:39.053812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.941 [2024-07-15 20:49:39.055117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.941 [2024-07-15 20:49:39.055601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.941 [2024-07-15 20:49:39.055700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.941 [2024-07-15 20:49:39.055768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.941 [2024-07-15 20:49:39.055837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.941 [2024-07-15 20:49:39.055903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.941 [2024-07-15 20:49:39.056348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.941 [2024-07-15 20:49:39.056374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.941 [2024-07-15 20:49:39.060030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:46.942 [2024-07-15 20:49:39.060547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.061063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.062013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.062468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.064248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.065996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.067086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.068632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.069004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.069029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.072418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.073974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.075717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:46.942 [2024-07-15 20:49:39.077469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.078000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.079576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.081332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.083086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.083647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.084201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.084227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.088526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.089628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.091176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.092937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.093305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:46.942 [2024-07-15 20:49:39.093861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.094382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.094898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.096182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.096594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.096619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.100376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.101919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.102447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.102972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.103536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.105051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.106663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:46.942 [2024-07-15 20:49:39.108468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.109975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.110342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.110367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.113233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.113748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.115413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.117169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.117537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.118780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.120388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.122134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.123884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:46.942 [2024-07-15 20:49:39.124369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.124396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.128890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.130658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.132052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.133859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.134243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.136019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.137369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.137888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.138406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.138972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.138999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:46.942 [2024-07-15 20:49:39.142640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.144391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.146143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.146931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.147495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.148037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.148608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.150166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.151913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.152286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.152310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.154999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.155521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:46.942 [2024-07-15 20:49:39.156059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.157296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.157735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.159515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.161268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.162480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.164041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.164418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.164443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.168315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.169870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.171631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.173443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:46.942 [2024-07-15 20:49:39.173876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.175443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.177197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.179008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.179518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.180086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.180113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.942 [2024-07-15 20:49:39.184439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.943 [2024-07-15 20:49:39.185542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.943 [2024-07-15 20:49:39.187096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.943 [2024-07-15 20:49:39.188832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.943 [2024-07-15 20:49:39.189207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:46.943 [2024-07-15 20:49:39.189777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:46.943 [2024-07-15 20:49:39.190291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:47.209 [... previous message repeated with varying timestamps through 2024-07-15 20:49:39.468888 ...]
00:35:47.209 [2024-07-15 20:49:39.469410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.209 [2024-07-15 20:49:39.469504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.209 [2024-07-15 20:49:39.469570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.209 [2024-07-15 20:49:39.469642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.209 [2024-07-15 20:49:39.469713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.209 [2024-07-15 20:49:39.470186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.209 [2024-07-15 20:49:39.470211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.209 [2024-07-15 20:49:39.472993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.209 [2024-07-15 20:49:39.473062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.209 [2024-07-15 20:49:39.473130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.209 [2024-07-15 20:49:39.473204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.209 [2024-07-15 20:49:39.473830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.209 [2024-07-15 20:49:39.473922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:47.209 [2024-07-15 20:49:39.473999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.209 [2024-07-15 20:49:39.474069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.209 [2024-07-15 20:49:39.474138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.209 [2024-07-15 20:49:39.474503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.209 [2024-07-15 20:49:39.474528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.209 [2024-07-15 20:49:39.477392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.209 [2024-07-15 20:49:39.477481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.209 [2024-07-15 20:49:39.477557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.209 [2024-07-15 20:49:39.477626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.209 [2024-07-15 20:49:39.478198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.209 [2024-07-15 20:49:39.478291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.209 [2024-07-15 20:49:39.478360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.209 [2024-07-15 20:49:39.478441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:47.209 [2024-07-15 20:49:39.478509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.209 [2024-07-15 20:49:39.478890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.209 [2024-07-15 20:49:39.478916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.209 [2024-07-15 20:49:39.481579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.209 [2024-07-15 20:49:39.481650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.209 [2024-07-15 20:49:39.481718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.209 [2024-07-15 20:49:39.481792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.209 [2024-07-15 20:49:39.482326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.209 [2024-07-15 20:49:39.482414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.209 [2024-07-15 20:49:39.482484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.209 [2024-07-15 20:49:39.482550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.209 [2024-07-15 20:49:39.482620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.209 [2024-07-15 20:49:39.483201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:47.209 [2024-07-15 20:49:39.483228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.209 [2024-07-15 20:49:39.485871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.209 [2024-07-15 20:49:39.485948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.209 [2024-07-15 20:49:39.486019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.209 [2024-07-15 20:49:39.486086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.209 [2024-07-15 20:49:39.486467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.209 [2024-07-15 20:49:39.486554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.209 [2024-07-15 20:49:39.486627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.209 [2024-07-15 20:49:39.486703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.209 [2024-07-15 20:49:39.486773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.209 [2024-07-15 20:49:39.487336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.209 [2024-07-15 20:49:39.487365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.209 [2024-07-15 20:49:39.490050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:47.209 [2024-07-15 20:49:39.490122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.209 [2024-07-15 20:49:39.490195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.209 [2024-07-15 20:49:39.490270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.209 [2024-07-15 20:49:39.490647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.209 [2024-07-15 20:49:39.490739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.209 [2024-07-15 20:49:39.490803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.209 [2024-07-15 20:49:39.490874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.209 [2024-07-15 20:49:39.490956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.209 [2024-07-15 20:49:39.491513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.209 [2024-07-15 20:49:39.491539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.209 [2024-07-15 20:49:39.494470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.209 [2024-07-15 20:49:39.494544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.209 [2024-07-15 20:49:39.494613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:47.209 [2024-07-15 20:49:39.494685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.209 [2024-07-15 20:49:39.495217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.209 [2024-07-15 20:49:39.495318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.209 [2024-07-15 20:49:39.495384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.209 [2024-07-15 20:49:39.495453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.209 [2024-07-15 20:49:39.495524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.209 [2024-07-15 20:49:39.496030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.209 [2024-07-15 20:49:39.496057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.209 [2024-07-15 20:49:39.498713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.209 [2024-07-15 20:49:39.498788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.210 [2024-07-15 20:49:39.498855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.210 [2024-07-15 20:49:39.498939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.210 [2024-07-15 20:49:39.499561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:47.210 [2024-07-15 20:49:39.499651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.210 [2024-07-15 20:49:39.499718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.210 [2024-07-15 20:49:39.499789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.210 [2024-07-15 20:49:39.499860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.210 [2024-07-15 20:49:39.500274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.210 [2024-07-15 20:49:39.500298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.210 [2024-07-15 20:49:39.502807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.210 [2024-07-15 20:49:39.502879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.210 [2024-07-15 20:49:39.502955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.210 [2024-07-15 20:49:39.503029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.210 [2024-07-15 20:49:39.503395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.210 [2024-07-15 20:49:39.503482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.210 [2024-07-15 20:49:39.503561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:47.210 [2024-07-15 20:49:39.503628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.210 [2024-07-15 20:49:39.503695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.210 [2024-07-15 20:49:39.504262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.210 [2024-07-15 20:49:39.504287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.210 [2024-07-15 20:49:39.507122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.210 [2024-07-15 20:49:39.507192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.210 [2024-07-15 20:49:39.507266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.210 [2024-07-15 20:49:39.507338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.210 [2024-07-15 20:49:39.507873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.210 [2024-07-15 20:49:39.507982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.210 [2024-07-15 20:49:39.508049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.210 [2024-07-15 20:49:39.508117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.210 [2024-07-15 20:49:39.508186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:47.210 [2024-07-15 20:49:39.508764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.210 [2024-07-15 20:49:39.508795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.210 [2024-07-15 20:49:39.511608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.210 [2024-07-15 20:49:39.511677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.210 [2024-07-15 20:49:39.511744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.210 [2024-07-15 20:49:39.511809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.210 [2024-07-15 20:49:39.512181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.210 [2024-07-15 20:49:39.512291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.210 [2024-07-15 20:49:39.512364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.210 [2024-07-15 20:49:39.512434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.210 [2024-07-15 20:49:39.512504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.210 [2024-07-15 20:49:39.513007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.210 [2024-07-15 20:49:39.513033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:47.210 [2024-07-15 20:49:39.515140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.210 [2024-07-15 20:49:39.515207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.210 [2024-07-15 20:49:39.515275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.210 [2024-07-15 20:49:39.515342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.210 [2024-07-15 20:49:39.515891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.210 [2024-07-15 20:49:39.515994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.210 [2024-07-15 20:49:39.516062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.210 [2024-07-15 20:49:39.516130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.210 [2024-07-15 20:49:39.516195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.210 [2024-07-15 20:49:39.516742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.210 [2024-07-15 20:49:39.516769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.210 [2024-07-15 20:49:39.518949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.210 [2024-07-15 20:49:39.519018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:47.210 [2024-07-15 20:49:39.519085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.210 [2024-07-15 20:49:39.519150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.210 [2024-07-15 20:49:39.519548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.210 [2024-07-15 20:49:39.519636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.210 [2024-07-15 20:49:39.519726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.210 [2024-07-15 20:49:39.519790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.210 [2024-07-15 20:49:39.519862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.210 [2024-07-15 20:49:39.520239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.210 [2024-07-15 20:49:39.520264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.210 [2024-07-15 20:49:39.523275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.210 [2024-07-15 20:49:39.523347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.210 [2024-07-15 20:49:39.523419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.210 [2024-07-15 20:49:39.523487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:47.210 [2024-07-15 20:49:39.523882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.210 [2024-07-15 20:49:39.523976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.210 [2024-07-15 20:49:39.524042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.210 [2024-07-15 20:49:39.524109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.210 [2024-07-15 20:49:39.524191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.210 [2024-07-15 20:49:39.524552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.210 [2024-07-15 20:49:39.524577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.210 [2024-07-15 20:49:39.526722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.210 [2024-07-15 20:49:39.526790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.210 [2024-07-15 20:49:39.526867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.210 [2024-07-15 20:49:39.526945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.210 [2024-07-15 20:49:39.527464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.210 [2024-07-15 20:49:39.527560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:47.210 [2024-07-15 20:49:39.527624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.210 [2024-07-15 20:49:39.527695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.210 [2024-07-15 20:49:39.527766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.210 [2024-07-15 20:49:39.528294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.210 [2024-07-15 20:49:39.528319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.210 [2024-07-15 20:49:39.530560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.210 [2024-07-15 20:49:39.530628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.210 [2024-07-15 20:49:39.530696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.210 [2024-07-15 20:49:39.530761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.210 [2024-07-15 20:49:39.531279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.210 [2024-07-15 20:49:39.531374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.210 [2024-07-15 20:49:39.531437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.210 [2024-07-15 20:49:39.531514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:47.210 [2024-07-15 20:49:39.531587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:35:47.474 (previous message repeated continuously, identical except for timestamps, from 20:49:39.531587 through 20:49:39.735495 — approximately 270 occurrences elided)
00:35:47.474 [2024-07-15 20:49:39.737049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.474 [2024-07-15 20:49:39.738797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.474 [2024-07-15 20:49:39.740571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.474 [2024-07-15 20:49:39.741129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.474 [2024-07-15 20:49:39.741154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.474 [2024-07-15 20:49:39.745485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.474 [2024-07-15 20:49:39.747235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.474 [2024-07-15 20:49:39.748447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.474 [2024-07-15 20:49:39.750007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.474 [2024-07-15 20:49:39.750369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.474 [2024-07-15 20:49:39.752148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.474 [2024-07-15 20:49:39.752662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.474 [2024-07-15 20:49:39.753180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:47.474 [2024-07-15 20:49:39.753698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.474 [2024-07-15 20:49:39.754112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.474 [2024-07-15 20:49:39.754137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.474 [2024-07-15 20:49:39.758025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.474 [2024-07-15 20:49:39.759780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.474 [2024-07-15 20:49:39.761040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.474 [2024-07-15 20:49:39.761553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.474 [2024-07-15 20:49:39.762105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.474 [2024-07-15 20:49:39.762633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.474 [2024-07-15 20:49:39.764455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.474 [2024-07-15 20:49:39.766248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.474 [2024-07-15 20:49:39.767831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.474 [2024-07-15 20:49:39.768215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:47.474 [2024-07-15 20:49:39.768241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.474 [2024-07-15 20:49:39.771264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.474 [2024-07-15 20:49:39.772447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.474 [2024-07-15 20:49:39.773004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.474 [2024-07-15 20:49:39.774571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.474 [2024-07-15 20:49:39.774941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.474 [2024-07-15 20:49:39.776726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.474 [2024-07-15 20:49:39.777821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.474 [2024-07-15 20:49:39.779392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.474 [2024-07-15 20:49:39.781148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.474 [2024-07-15 20:49:39.781516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.474 [2024-07-15 20:49:39.781541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.474 [2024-07-15 20:49:39.786352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:47.474 [2024-07-15 20:49:39.788174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.474 [2024-07-15 20:49:39.789934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.474 [2024-07-15 20:49:39.791104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.474 [2024-07-15 20:49:39.791471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.474 [2024-07-15 20:49:39.793111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.474 [2024-07-15 20:49:39.794910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.474 [2024-07-15 20:49:39.796418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.474 [2024-07-15 20:49:39.796937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.474 [2024-07-15 20:49:39.797482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.474 [2024-07-15 20:49:39.797509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.474 [2024-07-15 20:49:39.800935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.474 [2024-07-15 20:49:39.801458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.474 [2024-07-15 20:49:39.801986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:47.474 [2024-07-15 20:49:39.802498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.474 [2024-07-15 20:49:39.803057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.474 [2024-07-15 20:49:39.803583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.474 [2024-07-15 20:49:39.804109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.474 [2024-07-15 20:49:39.804634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.474 [2024-07-15 20:49:39.805163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.474 [2024-07-15 20:49:39.805727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.474 [2024-07-15 20:49:39.805755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.474 [2024-07-15 20:49:39.809126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.474 [2024-07-15 20:49:39.809646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.474 [2024-07-15 20:49:39.810174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.474 [2024-07-15 20:49:39.810688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.474 [2024-07-15 20:49:39.811232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:47.474 [2024-07-15 20:49:39.811763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.474 [2024-07-15 20:49:39.812293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.474 [2024-07-15 20:49:39.812809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.474 [2024-07-15 20:49:39.813324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.475 [2024-07-15 20:49:39.813904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.475 [2024-07-15 20:49:39.813939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.475 [2024-07-15 20:49:39.817414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.475 [2024-07-15 20:49:39.817939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.475 [2024-07-15 20:49:39.818455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.475 [2024-07-15 20:49:39.818976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.475 [2024-07-15 20:49:39.819571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.475 [2024-07-15 20:49:39.820107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.475 [2024-07-15 20:49:39.820634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:47.475 [2024-07-15 20:49:39.821155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.475 [2024-07-15 20:49:39.821665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.475 [2024-07-15 20:49:39.822212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.475 [2024-07-15 20:49:39.822239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.475 [2024-07-15 20:49:39.825557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.475 [2024-07-15 20:49:39.826106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.475 [2024-07-15 20:49:39.826627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.475 [2024-07-15 20:49:39.827150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.475 [2024-07-15 20:49:39.827749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.475 [2024-07-15 20:49:39.828282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.475 [2024-07-15 20:49:39.828796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.475 [2024-07-15 20:49:39.829320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.475 [2024-07-15 20:49:39.829830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:47.475 [2024-07-15 20:49:39.830393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.475 [2024-07-15 20:49:39.830421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.475 [2024-07-15 20:49:39.833805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.475 [2024-07-15 20:49:39.834343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.475 [2024-07-15 20:49:39.834851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.475 [2024-07-15 20:49:39.835371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.475 [2024-07-15 20:49:39.835937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.475 [2024-07-15 20:49:39.836466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.475 [2024-07-15 20:49:39.836991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.475 [2024-07-15 20:49:39.837502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.475 [2024-07-15 20:49:39.838041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.475 [2024-07-15 20:49:39.838571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.475 [2024-07-15 20:49:39.838597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:47.475 [2024-07-15 20:49:39.842011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.475 [2024-07-15 20:49:39.842529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.475 [2024-07-15 20:49:39.843054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.475 [2024-07-15 20:49:39.843563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.475 [2024-07-15 20:49:39.844101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.475 [2024-07-15 20:49:39.844631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.475 [2024-07-15 20:49:39.845156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.475 [2024-07-15 20:49:39.845665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.475 [2024-07-15 20:49:39.846185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.475 [2024-07-15 20:49:39.846743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.475 [2024-07-15 20:49:39.846773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.737 [2024-07-15 20:49:39.850372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.737 [2024-07-15 20:49:39.850898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:47.737 [2024-07-15 20:49:39.851418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.737 [2024-07-15 20:49:39.851938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.737 [2024-07-15 20:49:39.852425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.737 [2024-07-15 20:49:39.852957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.737 [2024-07-15 20:49:39.853472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.737 [2024-07-15 20:49:39.854002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.737 [2024-07-15 20:49:39.854517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.737 [2024-07-15 20:49:39.855019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.737 [2024-07-15 20:49:39.855045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.737 [2024-07-15 20:49:39.858556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.737 [2024-07-15 20:49:39.859102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.737 [2024-07-15 20:49:39.859614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.737 [2024-07-15 20:49:39.860141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:47.737 [2024-07-15 20:49:39.860678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.737 [2024-07-15 20:49:39.861227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.737 [2024-07-15 20:49:39.861739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.737 [2024-07-15 20:49:39.862260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.737 [2024-07-15 20:49:39.862775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.737 [2024-07-15 20:49:39.863321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.737 [2024-07-15 20:49:39.863350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.737 [2024-07-15 20:49:39.866848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.737 [2024-07-15 20:49:39.867384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.737 [2024-07-15 20:49:39.867896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.737 [2024-07-15 20:49:39.868415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.737 [2024-07-15 20:49:39.868923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.737 [2024-07-15 20:49:39.869456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:47.737 [2024-07-15 20:49:39.869977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.737 [2024-07-15 20:49:39.870487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.737 [2024-07-15 20:49:39.871016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.737 [2024-07-15 20:49:39.871499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.737 [2024-07-15 20:49:39.871524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.737 [2024-07-15 20:49:39.874863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.737 [2024-07-15 20:49:39.875390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.737 [2024-07-15 20:49:39.875898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.737 [2024-07-15 20:49:39.877330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.737 [2024-07-15 20:49:39.877912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.737 [2024-07-15 20:49:39.879758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.737 [2024-07-15 20:49:39.880295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.737 [2024-07-15 20:49:39.880807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:47.737 [2024-07-15 20:49:39.881326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.737 [2024-07-15 20:49:39.881830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.737 [2024-07-15 20:49:39.881856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.737 [2024-07-15 20:49:39.885376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.737 [2024-07-15 20:49:39.885907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.737 [2024-07-15 20:49:39.886425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.737 [2024-07-15 20:49:39.886943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.737 [2024-07-15 20:49:39.887350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.737 [2024-07-15 20:49:39.888890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.737 [2024-07-15 20:49:39.890646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.737 [2024-07-15 20:49:39.892421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.737 [2024-07-15 20:49:39.893932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.737 [2024-07-15 20:49:39.894355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:47.738 [2024-07-15 20:49:39.894380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.738 [2024-07-15 20:49:39.897548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.738 [2024-07-15 20:49:39.899135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.738 [2024-07-15 20:49:39.900796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.738 [2024-07-15 20:49:39.902615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.738 [2024-07-15 20:49:39.902986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.738 [2024-07-15 20:49:39.904500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.738 [2024-07-15 20:49:39.906111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.738 [2024-07-15 20:49:39.907936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.738 [2024-07-15 20:49:39.909478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.738 [2024-07-15 20:49:39.910046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.738 [2024-07-15 20:49:39.910076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.738 [2024-07-15 20:49:39.914383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:47.740 [2024-07-15 20:49:40.057650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.740 [2024-07-15 20:49:40.058097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.740 [2024-07-15 20:49:40.058119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.740 [2024-07-15 20:49:40.062908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.740 [2024-07-15 20:49:40.062977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.740 [2024-07-15 20:49:40.063034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.740 [2024-07-15 20:49:40.063086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.740 [2024-07-15 20:49:40.063552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.740 [2024-07-15 20:49:40.063623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.740 [2024-07-15 20:49:40.063675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.740 [2024-07-15 20:49:40.063727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.740 [2024-07-15 20:49:40.063778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.740 [2024-07-15 20:49:40.064156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:47.740 [2024-07-15 20:49:40.064177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.740 [2024-07-15 20:49:40.069110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.740 [2024-07-15 20:49:40.069172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.740 [2024-07-15 20:49:40.069238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.740 [2024-07-15 20:49:40.069290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.740 [2024-07-15 20:49:40.069635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.740 [2024-07-15 20:49:40.069716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.740 [2024-07-15 20:49:40.069772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.740 [2024-07-15 20:49:40.069825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.740 [2024-07-15 20:49:40.069876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.740 [2024-07-15 20:49:40.070219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.740 [2024-07-15 20:49:40.070241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.740 [2024-07-15 20:49:40.075619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:47.740 [2024-07-15 20:49:40.075683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.741 [2024-07-15 20:49:40.075737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.741 [2024-07-15 20:49:40.075801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.741 [2024-07-15 20:49:40.076387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.741 [2024-07-15 20:49:40.076454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.741 [2024-07-15 20:49:40.076508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.741 [2024-07-15 20:49:40.076560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.741 [2024-07-15 20:49:40.076612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.741 [2024-07-15 20:49:40.077004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.741 [2024-07-15 20:49:40.077026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.741 [2024-07-15 20:49:40.082137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.741 [2024-07-15 20:49:40.082202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.741 [2024-07-15 20:49:40.082254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:47.741 [2024-07-15 20:49:40.082306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.741 [2024-07-15 20:49:40.082835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.741 [2024-07-15 20:49:40.082909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.741 [2024-07-15 20:49:40.082968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.741 [2024-07-15 20:49:40.083021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.741 [2024-07-15 20:49:40.083072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.741 [2024-07-15 20:49:40.083595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.741 [2024-07-15 20:49:40.083625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.741 [2024-07-15 20:49:40.088130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.741 [2024-07-15 20:49:40.088194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.741 [2024-07-15 20:49:40.088249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.741 [2024-07-15 20:49:40.088308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.741 [2024-07-15 20:49:40.088652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:47.741 [2024-07-15 20:49:40.088720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.741 [2024-07-15 20:49:40.088783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.741 [2024-07-15 20:49:40.088836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.741 [2024-07-15 20:49:40.088887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.741 [2024-07-15 20:49:40.089231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.741 [2024-07-15 20:49:40.089253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.741 [2024-07-15 20:49:40.093277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.741 [2024-07-15 20:49:40.093339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.741 [2024-07-15 20:49:40.093391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.741 [2024-07-15 20:49:40.093441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.741 [2024-07-15 20:49:40.093779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.741 [2024-07-15 20:49:40.093860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.741 [2024-07-15 20:49:40.093917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:47.741 [2024-07-15 20:49:40.093977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.741 [2024-07-15 20:49:40.094030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.741 [2024-07-15 20:49:40.094371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.741 [2024-07-15 20:49:40.094391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.741 [2024-07-15 20:49:40.099192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.741 [2024-07-15 20:49:40.099255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.741 [2024-07-15 20:49:40.099311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.741 [2024-07-15 20:49:40.099363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.741 [2024-07-15 20:49:40.099772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.741 [2024-07-15 20:49:40.099845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.741 [2024-07-15 20:49:40.099898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.741 [2024-07-15 20:49:40.099957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.741 [2024-07-15 20:49:40.100014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:47.741 [2024-07-15 20:49:40.100353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.741 [2024-07-15 20:49:40.100374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.741 [2024-07-15 20:49:40.106297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.741 [2024-07-15 20:49:40.106361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.741 [2024-07-15 20:49:40.106414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.741 [2024-07-15 20:49:40.106467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.741 [2024-07-15 20:49:40.106965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.741 [2024-07-15 20:49:40.107031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.741 [2024-07-15 20:49:40.107083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.741 [2024-07-15 20:49:40.107135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.741 [2024-07-15 20:49:40.107187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.741 [2024-07-15 20:49:40.107730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.741 [2024-07-15 20:49:40.107752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:47.741 [2024-07-15 20:49:40.112757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.741 [2024-07-15 20:49:40.112820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.741 [2024-07-15 20:49:40.112875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:47.741 [2024-07-15 20:49:40.112934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:48.002 [2024-07-15 20:49:40.113276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:48.002 [2024-07-15 20:49:40.113348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:48.002 [2024-07-15 20:49:40.113406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:48.002 [2024-07-15 20:49:40.113460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:48.002 [2024-07-15 20:49:40.113512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:48.002 [2024-07-15 20:49:40.114045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:48.002 [2024-07-15 20:49:40.114066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:48.002 [2024-07-15 20:49:40.118468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:48.002 [2024-07-15 20:49:40.118533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:48.002 [2024-07-15 20:49:40.118585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:48.002 [2024-07-15 20:49:40.118640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:48.002 [2024-07-15 20:49:40.118997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:48.002 [2024-07-15 20:49:40.119092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:48.002 [2024-07-15 20:49:40.119152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:48.002 [2024-07-15 20:49:40.119203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:48.002 [2024-07-15 20:49:40.119255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:48.002 [2024-07-15 20:49:40.119595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:48.002 [2024-07-15 20:49:40.119616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:48.002 [2024-07-15 20:49:40.124626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:48.002 [2024-07-15 20:49:40.124688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:48.002 [2024-07-15 20:49:40.124740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:48.002 [2024-07-15 20:49:40.124793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:48.002 [2024-07-15 20:49:40.125138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:48.002 [2024-07-15 20:49:40.125211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:48.002 [2024-07-15 20:49:40.125270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:48.002 [2024-07-15 20:49:40.125324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:48.002 [2024-07-15 20:49:40.125376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:48.002 [2024-07-15 20:49:40.125800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:48.002 [2024-07-15 20:49:40.125821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:48.002 [2024-07-15 20:49:40.130772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:48.002 [2024-07-15 20:49:40.130835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:48.002 [2024-07-15 20:49:40.132397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:48.002 [2024-07-15 20:49:40.132463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:48.002 [2024-07-15 20:49:40.132808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:48.002 [2024-07-15 20:49:40.132882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:48.002 [2024-07-15 20:49:40.132941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:48.002 [2024-07-15 20:49:40.132993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:48.002 [2024-07-15 20:49:40.133044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:48.002 [2024-07-15 20:49:40.133517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:48.002 [2024-07-15 20:49:40.133538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:48.002 [2024-07-15 20:49:40.135713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:48.002 [2024-07-15 20:49:40.135771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:48.002 [2024-07-15 20:49:40.135827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:48.002 [2024-07-15 20:49:40.136331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:48.002 [2024-07-15 20:49:40.136904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:48.002 [2024-07-15 20:49:40.136981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:48.002 [2024-07-15 20:49:40.137036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:48.002 [2024-07-15 20:49:40.137088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:48.002 [2024-07-15 20:49:40.137140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:48.002 [2024-07-15 20:49:40.137518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:48.002 [2024-07-15 20:49:40.137539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:48.002 [2024-07-15 20:49:40.141220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:48.002 [2024-07-15 20:49:40.143043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:48.002 [2024-07-15 20:49:40.144663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:48.002 [2024-07-15 20:49:40.145171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:48.002 [2024-07-15 20:49:40.145703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:48.002 [2024-07-15 20:49:40.146216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:48.002 [2024-07-15 20:49:40.146711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:48.002 [2024-07-15 20:49:40.147216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:48.002 [2024-07-15 20:49:40.147711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:48.002 [2024-07-15 20:49:40.148228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:48.002 [2024-07-15 20:49:40.148250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:48.002 [2024-07-15 20:49:40.151766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:48.002 [2024-07-15 20:49:40.152277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:48.002 [2024-07-15 20:49:40.152777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:48.002 [2024-07-15 20:49:40.153279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:48.002 [2024-07-15 20:49:40.153773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:48.002 [2024-07-15 20:49:40.154284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:48.002 [2024-07-15 20:49:40.154780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:48.002 [2024-07-15 20:49:40.155288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:48.002 [2024-07-15 20:49:40.155785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:48.002 [2024-07-15 20:49:40.156319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:48.002 [2024-07-15 20:49:40.156342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:48.002 [2024-07-15 20:49:40.159772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:48.002 [2024-07-15 20:49:40.160286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:48.002 [previous message repeated for timestamps 20:49:40.160785 through 20:49:40.196965] 
00:35:48.003 [2024-07-15 20:49:40.310139] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:48.267 [previous message repeated for timestamps 20:49:40.311697 through 20:49:40.523285] 
00:35:48.267 [2024-07-15 20:49:40.523311] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.267 [2024-07-15 20:49:40.523720] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.267 [2024-07-15 20:49:40.523884] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.267 [2024-07-15 20:49:40.524288] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.267 [2024-07-15 20:49:40.524340] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.267 [2024-07-15 20:49:40.524799] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.267 [2024-07-15 20:49:40.524823] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.267 [2024-07-15 20:49:40.524849] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.267 [2024-07-15 20:49:40.524870] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.267 [2024-07-15 20:49:40.527530] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.267 [2024-07-15 20:49:40.527589] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.267 [2024-07-15 20:49:40.528018] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.267 [2024-07-15 20:49:40.528071] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:48.267 [2024-07-15 20:49:40.528522] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.267 [2024-07-15 20:49:40.528550] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.267 [2024-07-15 20:49:40.528968] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.267 [2024-07-15 20:49:40.529043] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.267 [2024-07-15 20:49:40.529442] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.267 [2024-07-15 20:49:40.529507] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.267 [2024-07-15 20:49:40.529943] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.267 [2024-07-15 20:49:40.529980] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.267 [2024-07-15 20:49:40.530013] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.267 [2024-07-15 20:49:40.530036] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.267 [2024-07-15 20:49:40.533411] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.267 [2024-07-15 20:49:40.533487] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.267 [2024-07-15 20:49:40.533884] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:48.267 [2024-07-15 20:49:40.533945] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.267 [2024-07-15 20:49:40.534399] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.267 [2024-07-15 20:49:40.534423] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.267 [2024-07-15 20:49:40.534827] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.267 [2024-07-15 20:49:40.534879] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.267 [2024-07-15 20:49:40.535279] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.267 [2024-07-15 20:49:40.535339] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.267 [2024-07-15 20:49:40.535702] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.267 [2024-07-15 20:49:40.535725] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.267 [2024-07-15 20:49:40.535755] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.267 [2024-07-15 20:49:40.535777] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.267 [2024-07-15 20:49:40.538598] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.267 [2024-07-15 20:49:40.538663] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:48.267 [2024-07-15 20:49:40.539066] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.267 [2024-07-15 20:49:40.539120] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.267 [2024-07-15 20:49:40.539474] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.267 [2024-07-15 20:49:40.539507] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.267 [2024-07-15 20:49:40.539917] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.267 [2024-07-15 20:49:40.539985] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.267 [2024-07-15 20:49:40.540381] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.267 [2024-07-15 20:49:40.540432] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.267 [2024-07-15 20:49:40.540886] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.267 [2024-07-15 20:49:40.540910] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.267 [2024-07-15 20:49:40.540943] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.267 [2024-07-15 20:49:40.540966] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.267 [2024-07-15 20:49:40.544537] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:48.267 [2024-07-15 20:49:40.544601] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.267 [2024-07-15 20:49:40.545005] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.267 [2024-07-15 20:49:40.545055] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.267 [2024-07-15 20:49:40.545556] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.267 [2024-07-15 20:49:40.545580] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.267 [2024-07-15 20:49:40.545993] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.267 [2024-07-15 20:49:40.546048] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.267 [2024-07-15 20:49:40.546445] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.267 [2024-07-15 20:49:40.546508] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.267 [2024-07-15 20:49:40.546901] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.267 [2024-07-15 20:49:40.546924] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.546956] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.546976] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:48.268 [2024-07-15 20:49:40.549821] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.549882] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.550285] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.550344] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.550761] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.550784] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.551215] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.551273] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.551663] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.551717] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.552174] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.552211] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.552231] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:48.268 [2024-07-15 20:49:40.552253] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.555708] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.555769] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.556169] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.556225] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.556690] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.556714] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.557134] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.557191] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.557598] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.557654] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.558077] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.558101] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:48.268 [2024-07-15 20:49:40.558123] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.558144] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.561236] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.561307] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.561704] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.561758] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.562148] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.562172] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.562580] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.562634] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.563042] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.563093] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.563595] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:48.268 [2024-07-15 20:49:40.563620] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.563647] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.563668] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.567109] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.567177] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.567575] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.567628] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.568032] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.568056] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.568473] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.568530] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.568924] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.568981] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:48.268 [2024-07-15 20:49:40.569426] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.569456] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.569482] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.569508] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.572239] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.572304] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.572697] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.572750] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.573212] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.573241] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.573647] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.573708] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.574106] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:48.268 [2024-07-15 20:49:40.574158] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.574609] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.574638] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.574663] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.574685] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.578084] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.578150] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.578548] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.578604] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.579003] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.579027] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.579435] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.579489] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:48.268 [2024-07-15 20:49:40.579882] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.579942] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.580402] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.580425] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.580447] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.580468] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.583208] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.583266] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.583660] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.583714] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.584191] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.584215] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.584630] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:48.268 [2024-07-15 20:49:40.584686] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.585085] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.585134] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.585512] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.268 [2024-07-15 20:49:40.585535] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.269 [2024-07-15 20:49:40.585557] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.269 [2024-07-15 20:49:40.585579] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.269 [2024-07-15 20:49:40.589315] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.269 [2024-07-15 20:49:40.589379] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.269 [2024-07-15 20:49:40.589772] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.269 [2024-07-15 20:49:40.589827] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.269 [2024-07-15 20:49:40.590276] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.269 [2024-07-15 20:49:40.590300] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:48.269 [2024-07-15 20:49:40.590704] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:48.530 [identical "Failed to get dst_mbufs!" errors from accel_dpdk_cryptodev.c:476 repeated through 2024-07-15 20:49:40.783086; duplicate lines omitted] 
00:35:48.530 [2024-07-15 20:49:40.783106] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.530 [2024-07-15 20:49:40.783167] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.530 [2024-07-15 20:49:40.784733] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.530 [2024-07-15 20:49:40.784785] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.530 [2024-07-15 20:49:40.785200] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.530 [2024-07-15 20:49:40.785717] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.530 [2024-07-15 20:49:40.785745] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.530 [2024-07-15 20:49:40.788965] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.530 [2024-07-15 20:49:40.790506] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.530 [2024-07-15 20:49:40.790558] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.530 [2024-07-15 20:49:40.792104] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.530 [2024-07-15 20:49:40.792525] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.792549] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:48.531 [2024-07-15 20:49:40.792570] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.792592] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.792679] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.794340] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.794389] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.796167] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.796450] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.796472] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.800666] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.801070] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.801125] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.802679] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.802965] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:48.531 [2024-07-15 20:49:40.802988] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.803008] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.803029] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.803089] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.804630] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.804680] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.806225] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.806624] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.806647] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.811688] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.812102] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.812499] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.812909] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:48.531 [2024-07-15 20:49:40.813360] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.813385] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.813411] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.813433] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.813493] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.814906] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.816140] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.817687] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.817972] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.817995] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.824019] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.824426] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.824823] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:48.531 [2024-07-15 20:49:40.825223] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.825714] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.825738] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.825765] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.825787] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.826198] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.827969] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.829423] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.830978] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.831258] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.831280] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.836462] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.836868] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:48.531 [2024-07-15 20:49:40.837274] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.837669] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.838139] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.838167] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.838190] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.838215] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.838695] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.840236] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.841971] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.843641] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.843932] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.843954] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.848573] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:48.531 [2024-07-15 20:49:40.849006] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.849412] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.849805] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.850260] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.850286] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.850314] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.850336] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.851042] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.852355] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.854039] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.855764] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.856052] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.856075] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:48.531 [2024-07-15 20:49:40.860317] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.860725] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.861125] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.861523] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.861990] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.862019] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.862043] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.862069] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.862923] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.864166] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.865719] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.867272] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.867557] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:48.531 [2024-07-15 20:49:40.867580] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.871579] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.871992] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.872392] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.872791] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.873253] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.873286] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.531 [2024-07-15 20:49:40.873310] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.532 [2024-07-15 20:49:40.873335] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.532 [2024-07-15 20:49:40.874371] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.532 [2024-07-15 20:49:40.875606] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.532 [2024-07-15 20:49:40.877153] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.532 [2024-07-15 20:49:40.878706] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:48.532 [2024-07-15 20:49:40.878993] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.532 [2024-07-15 20:49:40.879016] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.532 [2024-07-15 20:49:40.883052] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.532 [2024-07-15 20:49:40.883454] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.532 [2024-07-15 20:49:40.883849] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.532 [2024-07-15 20:49:40.884269] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.532 [2024-07-15 20:49:40.884732] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.532 [2024-07-15 20:49:40.884757] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.532 [2024-07-15 20:49:40.884784] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.532 [2024-07-15 20:49:40.884806] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.532 [2024-07-15 20:49:40.885923] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.532 [2024-07-15 20:49:40.887136] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.532 [2024-07-15 20:49:40.888658] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:48.532 [2024-07-15 20:49:40.890210] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.532 [2024-07-15 20:49:40.890493] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.532 [2024-07-15 20:49:40.890516] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.532 [2024-07-15 20:49:40.894651] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.532 [2024-07-15 20:49:40.895066] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.532 [2024-07-15 20:49:40.895460] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.532 [2024-07-15 20:49:40.895859] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.532 [2024-07-15 20:49:40.896322] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.532 [2024-07-15 20:49:40.896350] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.532 [2024-07-15 20:49:40.896376] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.532 [2024-07-15 20:49:40.896398] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.532 [2024-07-15 20:49:40.897711] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.532 [2024-07-15 20:49:40.898946] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:48.532 [2024-07-15 20:49:40.900511] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.532 [2024-07-15 20:49:40.902062] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.532 [2024-07-15 20:49:40.902350] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.532 [2024-07-15 20:49:40.902387] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.532 [2024-07-15 20:49:40.906960] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.795 [2024-07-15 20:49:40.907374] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.795 [2024-07-15 20:49:40.907771] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.795 [2024-07-15 20:49:40.908176] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.795 [2024-07-15 20:49:40.908636] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.795 [2024-07-15 20:49:40.908664] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.795 [2024-07-15 20:49:40.908687] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.795 [2024-07-15 20:49:40.908711] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.795 [2024-07-15 20:49:40.910492] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:48.795 [2024-07-15 20:49:40.911972] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.795 [2024-07-15 20:49:40.913524] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.795 [2024-07-15 20:49:40.915081] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.795 [2024-07-15 20:49:40.915502] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.795 [2024-07-15 20:49:40.915526] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.795 [2024-07-15 20:49:40.920921] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.795 [2024-07-15 20:49:40.921334] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.795 [2024-07-15 20:49:40.921731] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.795 [2024-07-15 20:49:40.922132] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.795 [2024-07-15 20:49:40.922496] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.795 [2024-07-15 20:49:40.922518] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.795 [2024-07-15 20:49:40.922539] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.795 [2024-07-15 20:49:40.922559] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:48.795 [2024-07-15 20:49:40.923791] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:35:48.798 [2024-07-15 20:49:41.076015] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.798 [2024-07-15 20:49:41.076410] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.798 [2024-07-15 20:49:41.076464] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.798 [2024-07-15 20:49:41.076806] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.798 [2024-07-15 20:49:41.076834] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.798 [2024-07-15 20:49:41.076855] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.798 [2024-07-15 20:49:41.076877] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.798 [2024-07-15 20:49:41.077294] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.798 [2024-07-15 20:49:41.077352] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.798 [2024-07-15 20:49:41.077744] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.798 [2024-07-15 20:49:41.077793] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.798 [2024-07-15 20:49:41.078255] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.798 [2024-07-15 20:49:41.078283] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:48.798 [2024-07-15 20:49:41.081826] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.798 [2024-07-15 20:49:41.081886] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.798 [2024-07-15 20:49:41.082298] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.798 [2024-07-15 20:49:41.082348] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.798 [2024-07-15 20:49:41.082842] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.798 [2024-07-15 20:49:41.082868] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.798 [2024-07-15 20:49:41.082894] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.798 [2024-07-15 20:49:41.082916] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.798 [2024-07-15 20:49:41.083323] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.798 [2024-07-15 20:49:41.083375] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.798 [2024-07-15 20:49:41.083778] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.798 [2024-07-15 20:49:41.083834] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.798 [2024-07-15 20:49:41.084278] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:48.798 [2024-07-15 20:49:41.084317] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.798 [2024-07-15 20:49:41.087748] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.798 [2024-07-15 20:49:41.087813] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.798 [2024-07-15 20:49:41.088212] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.798 [2024-07-15 20:49:41.088264] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.798 [2024-07-15 20:49:41.088714] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.798 [2024-07-15 20:49:41.088738] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.798 [2024-07-15 20:49:41.088764] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.798 [2024-07-15 20:49:41.088790] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.798 [2024-07-15 20:49:41.089212] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.798 [2024-07-15 20:49:41.089265] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.798 [2024-07-15 20:49:41.089658] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.798 [2024-07-15 20:49:41.089710] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:48.798 [2024-07-15 20:49:41.090176] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.798 [2024-07-15 20:49:41.090205] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.798 [2024-07-15 20:49:41.094362] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.798 [2024-07-15 20:49:41.094428] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.798 [2024-07-15 20:49:41.094481] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.798 [2024-07-15 20:49:41.094531] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.798 [2024-07-15 20:49:41.094987] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.798 [2024-07-15 20:49:41.095015] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.798 [2024-07-15 20:49:41.095037] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.798 [2024-07-15 20:49:41.095058] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.798 [2024-07-15 20:49:41.095466] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.798 [2024-07-15 20:49:41.095521] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.798 [2024-07-15 20:49:41.095590] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:48.798 [2024-07-15 20:49:41.095637] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.798 [2024-07-15 20:49:41.096070] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.798 [2024-07-15 20:49:41.096094] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.798 [2024-07-15 20:49:41.099047] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.798 [2024-07-15 20:49:41.099112] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.798 [2024-07-15 20:49:41.099159] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.798 [2024-07-15 20:49:41.099212] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.798 [2024-07-15 20:49:41.099491] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.798 [2024-07-15 20:49:41.099513] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.798 [2024-07-15 20:49:41.099534] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.798 [2024-07-15 20:49:41.099553] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.798 [2024-07-15 20:49:41.099608] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.798 [2024-07-15 20:49:41.099662] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:48.798 [2024-07-15 20:49:41.099721] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.798 [2024-07-15 20:49:41.099768] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.798 [2024-07-15 20:49:41.100051] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.798 [2024-07-15 20:49:41.100074] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.798 [2024-07-15 20:49:41.105180] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.798 [2024-07-15 20:49:41.105239] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.798 [2024-07-15 20:49:41.105289] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.798 [2024-07-15 20:49:41.105336] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.798 [2024-07-15 20:49:41.105614] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.798 [2024-07-15 20:49:41.105636] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.798 [2024-07-15 20:49:41.105657] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.798 [2024-07-15 20:49:41.105677] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.798 [2024-07-15 20:49:41.105737] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:48.798 [2024-07-15 20:49:41.105784] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.798 [2024-07-15 20:49:41.105831] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.798 [2024-07-15 20:49:41.105885] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.798 [2024-07-15 20:49:41.106405] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.798 [2024-07-15 20:49:41.106431] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.798 [2024-07-15 20:49:41.109056] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.798 [2024-07-15 20:49:41.109111] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.798 [2024-07-15 20:49:41.109158] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.798 [2024-07-15 20:49:41.109208] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.798 [2024-07-15 20:49:41.109526] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.798 [2024-07-15 20:49:41.109548] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.798 [2024-07-15 20:49:41.109568] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.798 [2024-07-15 20:49:41.109588] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:48.798 [2024-07-15 20:49:41.109651] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.798 [2024-07-15 20:49:41.109698] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.798 [2024-07-15 20:49:41.109745] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.799 [2024-07-15 20:49:41.109791] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.799 [2024-07-15 20:49:41.110076] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.799 [2024-07-15 20:49:41.110103] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.799 [2024-07-15 20:49:41.114557] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.799 [2024-07-15 20:49:41.114619] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.799 [2024-07-15 20:49:41.114674] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.799 [2024-07-15 20:49:41.114721] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.799 [2024-07-15 20:49:41.115003] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.799 [2024-07-15 20:49:41.115026] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.799 [2024-07-15 20:49:41.115046] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:48.799 [2024-07-15 20:49:41.115066] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.799 [2024-07-15 20:49:41.115130] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.799 [2024-07-15 20:49:41.115190] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.799 [2024-07-15 20:49:41.115243] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.799 [2024-07-15 20:49:41.115292] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.799 [2024-07-15 20:49:41.115710] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.799 [2024-07-15 20:49:41.115734] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.799 [2024-07-15 20:49:41.118549] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.799 [2024-07-15 20:49:41.118606] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.799 [2024-07-15 20:49:41.118652] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.799 [2024-07-15 20:49:41.118699] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.799 [2024-07-15 20:49:41.118982] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.799 [2024-07-15 20:49:41.119006] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:48.799 [2024-07-15 20:49:41.119026] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.799 [2024-07-15 20:49:41.119046] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.799 [2024-07-15 20:49:41.119109] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.799 [2024-07-15 20:49:41.119156] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.799 [2024-07-15 20:49:41.119202] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.799 [2024-07-15 20:49:41.119249] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.799 [2024-07-15 20:49:41.119521] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.799 [2024-07-15 20:49:41.119543] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.799 [2024-07-15 20:49:41.123517] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.799 [2024-07-15 20:49:41.123578] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.799 [2024-07-15 20:49:41.123625] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.799 [2024-07-15 20:49:41.123681] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.799 [2024-07-15 20:49:41.124083] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:48.799 [2024-07-15 20:49:41.124105] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.799 [2024-07-15 20:49:41.124126] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.799 [2024-07-15 20:49:41.124145] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.799 [2024-07-15 20:49:41.124205] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.799 [2024-07-15 20:49:41.124253] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.799 [2024-07-15 20:49:41.124299] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.799 [2024-07-15 20:49:41.124346] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.799 [2024-07-15 20:49:41.124663] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.799 [2024-07-15 20:49:41.124687] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.799 [2024-07-15 20:49:41.127798] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.799 [2024-07-15 20:49:41.127854] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.799 [2024-07-15 20:49:41.127901] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.799 [2024-07-15 20:49:41.127953] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:48.799 [2024-07-15 20:49:41.128286] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.799 [2024-07-15 20:49:41.128309] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.799 [2024-07-15 20:49:41.128329] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.799 [2024-07-15 20:49:41.128349] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.799 [2024-07-15 20:49:41.128412] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.799 [2024-07-15 20:49:41.128459] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.799 [2024-07-15 20:49:41.128506] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.799 [2024-07-15 20:49:41.128552] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.799 [2024-07-15 20:49:41.128829] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.799 [2024-07-15 20:49:41.128851] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.799 [2024-07-15 20:49:41.133405] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.799 [2024-07-15 20:49:41.133461] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.799 [2024-07-15 20:49:41.133512] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:48.799 [2024-07-15 20:49:41.133559] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.799 [2024-07-15 20:49:41.133840] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.799 [2024-07-15 20:49:41.133862] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.799 [2024-07-15 20:49:41.133882] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.799 [2024-07-15 20:49:41.133902] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.799 [2024-07-15 20:49:41.133970] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.799 [2024-07-15 20:49:41.134018] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.799 [2024-07-15 20:49:41.134068] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.799 [2024-07-15 20:49:41.134136] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.799 [2024-07-15 20:49:41.134553] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.799 [2024-07-15 20:49:41.134589] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.799 [2024-07-15 20:49:41.137470] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:48.799 [2024-07-15 20:49:41.137529] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:48.799 [2024-07-15 20:49:41.137576] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! [identical message repeated through 2024-07-15 20:49:41.283933]
00:35:49.064 [2024-07-15 20:49:41.284332] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.064 [2024-07-15 20:49:41.284383] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.064 [2024-07-15 20:49:41.285586] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.064 [2024-07-15 20:49:41.285888] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.064 [2024-07-15 20:49:41.285913] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.064 [2024-07-15 20:49:41.289251] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.064 [2024-07-15 20:49:41.290910] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.064 [2024-07-15 20:49:41.290972] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.064 [2024-07-15 20:49:41.292061] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.064 [2024-07-15 20:49:41.292400] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.064 [2024-07-15 20:49:41.292423] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.064 [2024-07-15 20:49:41.292444] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.064 [2024-07-15 20:49:41.292464] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:49.064 [2024-07-15 20:49:41.292528] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.064 [2024-07-15 20:49:41.294091] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.064 [2024-07-15 20:49:41.294143] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.064 [2024-07-15 20:49:41.295688] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.064 [2024-07-15 20:49:41.295974] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.064 [2024-07-15 20:49:41.295996] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.064 [2024-07-15 20:49:41.299270] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.064 [2024-07-15 20:49:41.300499] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.064 [2024-07-15 20:49:41.300558] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.064 [2024-07-15 20:49:41.302107] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.064 [2024-07-15 20:49:41.302388] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.064 [2024-07-15 20:49:41.302410] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.064 [2024-07-15 20:49:41.302431] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:49.064 [2024-07-15 20:49:41.302451] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.064 [2024-07-15 20:49:41.302516] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.064 [2024-07-15 20:49:41.304014] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.064 [2024-07-15 20:49:41.304066] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.064 [2024-07-15 20:49:41.305298] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.305632] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.305655] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.309593] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.310013] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.310067] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.310943] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.311223] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.311246] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:49.065 [2024-07-15 20:49:41.311268] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.311298] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.311365] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.311762] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.311817] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.312318] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.312597] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.312619] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.317123] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.318880] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.320587] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.322276] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.322748] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:49.065 [2024-07-15 20:49:41.322772] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.322792] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.322815] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.322893] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.323293] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.323685] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.324105] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.324562] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.324586] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.330012] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.331254] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.332803] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.334344] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:49.065 [2024-07-15 20:49:41.334624] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.334646] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.334667] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.334689] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.335472] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.336730] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.337134] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.337530] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.337811] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.337833] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.343125] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.344156] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.345699] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:49.065 [2024-07-15 20:49:41.346951] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.347233] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.347256] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.347276] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.347304] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.348873] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.350128] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.350527] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.350922] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.351337] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.351360] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.356274] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.356829] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:49.065 [2024-07-15 20:49:41.358492] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.360081] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.360361] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.360384] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.360404] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.360424] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.361996] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.362897] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.364411] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.364949] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.365412] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.365439] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.369988] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:49.065 [2024-07-15 20:49:41.371638] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.373327] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.374202] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.374484] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.374509] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.374530] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.374551] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.376348] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.378143] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.379924] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.380328] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.380836] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.380863] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:49.065 [2024-07-15 20:49:41.385799] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.387364] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.388075] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.389852] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.390169] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.390192] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.390212] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.390233] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.391803] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.393338] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.394343] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.395756] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.396137] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:49.065 [2024-07-15 20:49:41.396161] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.400193] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.401793] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.403437] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.405122] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.405519] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.405541] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.405561] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.405582] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.406952] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.408660] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.410414] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.412181] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:49.065 [2024-07-15 20:49:41.412659] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.412683] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.416968] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.418525] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.420084] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.420666] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.420952] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.420975] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.420996] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.421016] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.422253] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.423806] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.425394] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:49.065 [2024-07-15 20:49:41.426262] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.426545] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.426567] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.429223] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.429629] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.430665] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.431895] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.432198] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.432222] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.432242] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.432262] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.433992] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.065 [2024-07-15 20:49:41.435720] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:49.065 [2024-07-15 20:49:41.436556] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:49.330 [2024-07-15 20:49:41.577064] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.330 [2024-07-15 20:49:41.577449] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.330 [2024-07-15 20:49:41.577471] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.330 [2024-07-15 20:49:41.577492] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.330 [2024-07-15 20:49:41.577514] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.330 [2024-07-15 20:49:41.577923] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.330 [2024-07-15 20:49:41.577994] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.330 [2024-07-15 20:49:41.578791] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.330 [2024-07-15 20:49:41.578843] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.330 [2024-07-15 20:49:41.579131] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.330 [2024-07-15 20:49:41.579167] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.330 [2024-07-15 20:49:41.581769] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.330 [2024-07-15 20:49:41.581827] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:49.330 [2024-07-15 20:49:41.582232] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.330 [2024-07-15 20:49:41.582305] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.330 [2024-07-15 20:49:41.582800] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.330 [2024-07-15 20:49:41.582824] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.330 [2024-07-15 20:49:41.582848] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.330 [2024-07-15 20:49:41.582869] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.330 [2024-07-15 20:49:41.583288] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.330 [2024-07-15 20:49:41.583357] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.330 [2024-07-15 20:49:41.583753] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.330 [2024-07-15 20:49:41.583810] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.330 [2024-07-15 20:49:41.584267] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.330 [2024-07-15 20:49:41.584293] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.330 [2024-07-15 20:49:41.587197] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:49.330 [2024-07-15 20:49:41.587256] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.330 [2024-07-15 20:49:41.588659] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.330 [2024-07-15 20:49:41.588710] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.330 [2024-07-15 20:49:41.589200] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.330 [2024-07-15 20:49:41.589229] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.330 [2024-07-15 20:49:41.589251] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.330 [2024-07-15 20:49:41.589276] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.330 [2024-07-15 20:49:41.589680] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.330 [2024-07-15 20:49:41.589734] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.330 [2024-07-15 20:49:41.591412] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.330 [2024-07-15 20:49:41.591475] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.330 [2024-07-15 20:49:41.591987] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.330 [2024-07-15 20:49:41.592023] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:49.330 [2024-07-15 20:49:41.594619] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.330 [2024-07-15 20:49:41.594681] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.330 [2024-07-15 20:49:41.595090] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.330 [2024-07-15 20:49:41.595142] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.330 [2024-07-15 20:49:41.595619] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.330 [2024-07-15 20:49:41.595644] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.330 [2024-07-15 20:49:41.595670] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.330 [2024-07-15 20:49:41.595692] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.330 [2024-07-15 20:49:41.596102] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.330 [2024-07-15 20:49:41.596167] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.330 [2024-07-15 20:49:41.596562] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.330 [2024-07-15 20:49:41.596625] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.330 [2024-07-15 20:49:41.596989] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:49.330 [2024-07-15 20:49:41.597014] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.330 [2024-07-15 20:49:41.600802] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.330 [2024-07-15 20:49:41.600867] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.330 [2024-07-15 20:49:41.600935] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.330 [2024-07-15 20:49:41.600984] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.330 [2024-07-15 20:49:41.601463] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.330 [2024-07-15 20:49:41.601489] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.330 [2024-07-15 20:49:41.601514] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.330 [2024-07-15 20:49:41.601536] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.330 [2024-07-15 20:49:41.601944] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.330 [2024-07-15 20:49:41.602008] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.330 [2024-07-15 20:49:41.602057] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.330 [2024-07-15 20:49:41.602120] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:49.330 [2024-07-15 20:49:41.602595] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.330 [2024-07-15 20:49:41.602617] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.330 [2024-07-15 20:49:41.605023] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.330 [2024-07-15 20:49:41.605080] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.330 [2024-07-15 20:49:41.605132] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.330 [2024-07-15 20:49:41.605184] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.330 [2024-07-15 20:49:41.605635] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.330 [2024-07-15 20:49:41.605661] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.330 [2024-07-15 20:49:41.605689] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.330 [2024-07-15 20:49:41.605713] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.330 [2024-07-15 20:49:41.605773] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.330 [2024-07-15 20:49:41.605825] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.330 [2024-07-15 20:49:41.605873] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:49.330 [2024-07-15 20:49:41.605940] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.330 [2024-07-15 20:49:41.606294] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.330 [2024-07-15 20:49:41.606318] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.330 [2024-07-15 20:49:41.608364] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.330 [2024-07-15 20:49:41.608417] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.330 [2024-07-15 20:49:41.608469] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.330 [2024-07-15 20:49:41.608520] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.330 [2024-07-15 20:49:41.608938] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.330 [2024-07-15 20:49:41.608961] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.330 [2024-07-15 20:49:41.608982] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.331 [2024-07-15 20:49:41.609002] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.331 [2024-07-15 20:49:41.609062] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.331 [2024-07-15 20:49:41.609109] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:49.331 [2024-07-15 20:49:41.609156] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.331 [2024-07-15 20:49:41.609202] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.331 [2024-07-15 20:49:41.609499] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.331 [2024-07-15 20:49:41.609522] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.331 [2024-07-15 20:49:41.611935] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.331 [2024-07-15 20:49:41.611989] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.331 [2024-07-15 20:49:41.612036] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.331 [2024-07-15 20:49:41.612103] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.331 [2024-07-15 20:49:41.612631] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.331 [2024-07-15 20:49:41.612658] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.331 [2024-07-15 20:49:41.612685] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.331 [2024-07-15 20:49:41.612710] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.331 [2024-07-15 20:49:41.612770] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:49.331 [2024-07-15 20:49:41.612825] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.331 [2024-07-15 20:49:41.612876] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.331 [2024-07-15 20:49:41.612924] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.331 [2024-07-15 20:49:41.613384] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.331 [2024-07-15 20:49:41.613407] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.331 [2024-07-15 20:49:41.615906] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.331 [2024-07-15 20:49:41.615978] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.331 [2024-07-15 20:49:41.616030] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.331 [2024-07-15 20:49:41.616079] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.331 [2024-07-15 20:49:41.616363] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.331 [2024-07-15 20:49:41.616386] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.331 [2024-07-15 20:49:41.616406] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.331 [2024-07-15 20:49:41.616426] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:49.331 [2024-07-15 20:49:41.616485] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.331 [2024-07-15 20:49:41.616533] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.331 [2024-07-15 20:49:41.616583] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.331 [2024-07-15 20:49:41.616629] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.331 [2024-07-15 20:49:41.617123] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.331 [2024-07-15 20:49:41.617148] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.331 [2024-07-15 20:49:41.619327] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.331 [2024-07-15 20:49:41.619383] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.331 [2024-07-15 20:49:41.619434] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.331 [2024-07-15 20:49:41.619486] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.331 [2024-07-15 20:49:41.619901] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.331 [2024-07-15 20:49:41.619942] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.331 [2024-07-15 20:49:41.619968] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:49.331 [2024-07-15 20:49:41.619990] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.331 [2024-07-15 20:49:41.620063] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.331 [2024-07-15 20:49:41.620129] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.331 [2024-07-15 20:49:41.620188] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.331 [2024-07-15 20:49:41.620252] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.331 [2024-07-15 20:49:41.620645] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.331 [2024-07-15 20:49:41.620668] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.331 [2024-07-15 20:49:41.623245] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.331 [2024-07-15 20:49:41.623303] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.331 [2024-07-15 20:49:41.623354] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.331 [2024-07-15 20:49:41.623404] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.331 [2024-07-15 20:49:41.623864] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.331 [2024-07-15 20:49:41.623887] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:49.331 [2024-07-15 20:49:41.623912] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.331 [2024-07-15 20:49:41.623944] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.331 [2024-07-15 20:49:41.624036] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.331 [2024-07-15 20:49:41.624085] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.331 [2024-07-15 20:49:41.624167] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.331 [2024-07-15 20:49:41.624228] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.331 [2024-07-15 20:49:41.624687] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.331 [2024-07-15 20:49:41.624722] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.331 [2024-07-15 20:49:41.626855] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.331 [2024-07-15 20:49:41.626908] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.331 [2024-07-15 20:49:41.626967] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.331 [2024-07-15 20:49:41.627018] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.331 [2024-07-15 20:49:41.627338] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:49.331 [2024-07-15 20:49:41.627360] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.331 [2024-07-15 20:49:41.627380] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.331 [2024-07-15 20:49:41.627400] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.331 [2024-07-15 20:49:41.627463] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.331 [2024-07-15 20:49:41.627511] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.331 [2024-07-15 20:49:41.627557] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.331 [2024-07-15 20:49:41.627603] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.331 [2024-07-15 20:49:41.628040] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.331 [2024-07-15 20:49:41.628065] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.331 [2024-07-15 20:49:41.630483] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.331 [2024-07-15 20:49:41.630547] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.331 [2024-07-15 20:49:41.630599] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.331 [2024-07-15 20:49:41.630647] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:49.331 [2024-07-15 20:49:41.631112] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.596 [identical "Failed to get dst_mbufs!" error repeated continuously from 20:49:41.631112 through 20:49:41.746408; duplicate log lines omitted]
00:35:49.596 [2024-07-15 20:49:41.746804] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.596 [2024-07-15 20:49:41.747090] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.596 [2024-07-15 20:49:41.747113] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.596 [2024-07-15 20:49:41.749116] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.596 [2024-07-15 20:49:41.750798] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.596 [2024-07-15 20:49:41.750864] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.596 [2024-07-15 20:49:41.751261] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.596 [2024-07-15 20:49:41.751700] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.596 [2024-07-15 20:49:41.751729] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.596 [2024-07-15 20:49:41.751752] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.596 [2024-07-15 20:49:41.751775] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.596 [2024-07-15 20:49:41.751834] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.596 [2024-07-15 20:49:41.753457] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:49.596 [2024-07-15 20:49:41.753510] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.596 [2024-07-15 20:49:41.753913] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.596 [2024-07-15 20:49:41.754369] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.596 [2024-07-15 20:49:41.754394] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.596 [2024-07-15 20:49:41.755959] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.596 [2024-07-15 20:49:41.757515] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.596 [2024-07-15 20:49:41.757568] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.596 [2024-07-15 20:49:41.758046] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.596 [2024-07-15 20:49:41.758324] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.596 [2024-07-15 20:49:41.758346] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.596 [2024-07-15 20:49:41.758371] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.596 [2024-07-15 20:49:41.758391] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.596 [2024-07-15 20:49:41.758450] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:49.596 [2024-07-15 20:49:41.760169] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.596 [2024-07-15 20:49:41.760227] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.596 [2024-07-15 20:49:41.761972] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.596 [2024-07-15 20:49:41.762250] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.596 [2024-07-15 20:49:41.762273] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.596 [2024-07-15 20:49:41.764343] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.596 [2024-07-15 20:49:41.764962] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.596 [2024-07-15 20:49:41.765013] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.596 [2024-07-15 20:49:41.766270] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.596 [2024-07-15 20:49:41.766753] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.596 [2024-07-15 20:49:41.766778] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.596 [2024-07-15 20:49:41.766804] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.596 [2024-07-15 20:49:41.766829] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:49.596 [2024-07-15 20:49:41.766893] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.596 [2024-07-15 20:49:41.767314] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.596 [2024-07-15 20:49:41.767365] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.596 [2024-07-15 20:49:41.768796] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.596 [2024-07-15 20:49:41.769080] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.596 [2024-07-15 20:49:41.769103] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.596 [2024-07-15 20:49:41.770766] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.596 [2024-07-15 20:49:41.772062] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.596 [2024-07-15 20:49:41.772116] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.596 [2024-07-15 20:49:41.773664] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.596 [2024-07-15 20:49:41.773944] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.596 [2024-07-15 20:49:41.773966] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.596 [2024-07-15 20:49:41.773987] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:49.596 [2024-07-15 20:49:41.774007] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.774064] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.775842] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.775900] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.776690] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.776974] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.776996] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.779222] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.779624] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.780096] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.781640] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.781919] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.781946] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:49.597 [2024-07-15 20:49:41.781967] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.781987] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.782050] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.783593] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.785140] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.785769] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.786056] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.786079] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.788469] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.789975] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.790510] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.790905] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.791286] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:49.597 [2024-07-15 20:49:41.791309] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.791329] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.791349] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.792708] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.793113] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.793505] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.795247] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.795535] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.795557] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.798921] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.800398] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.801960] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.803489] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:49.597 [2024-07-15 20:49:41.803853] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.803876] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.803906] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.803932] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.805432] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.805976] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.806373] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.807297] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.807576] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.807600] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.811308] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.812853] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.814413] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:49.597 [2024-07-15 20:49:41.815355] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.815644] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.815668] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.815690] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.815710] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.817481] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.819256] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.821030] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.821583] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.821861] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.821884] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.824411] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.824817] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:49.597 [2024-07-15 20:49:41.826417] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.827648] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.827932] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.827955] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.827975] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.827995] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.829560] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.830841] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.832102] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.833330] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.833609] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.833632] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.836306] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:49.597 [2024-07-15 20:49:41.836706] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.837105] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.838754] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.839290] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.839318] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.839341] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.839366] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.839775] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.841276] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.842493] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.844035] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.844312] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.844335] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:49.597 [2024-07-15 20:49:41.847559] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.849112] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.850167] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.851448] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.851763] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.851787] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.851808] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.851830] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.597 [2024-07-15 20:49:41.852250] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.598 [2024-07-15 20:49:41.852702] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.598 [2024-07-15 20:49:41.854299] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.598 [2024-07-15 20:49:41.854697] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.598 [2024-07-15 20:49:41.855156] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:49.598 [2024-07-15 20:49:41.855182] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.598 [2024-07-15 20:49:41.858210] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.598 [2024-07-15 20:49:41.859127] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.598 [2024-07-15 20:49:41.860709] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.598 [2024-07-15 20:49:41.861940] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.598 [2024-07-15 20:49:41.862218] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.598 [2024-07-15 20:49:41.862240] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.598 [2024-07-15 20:49:41.862260] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.598 [2024-07-15 20:49:41.862280] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.598 [2024-07-15 20:49:41.863828] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.598 [2024-07-15 20:49:41.865118] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.598 [2024-07-15 20:49:41.866202] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.598 [2024-07-15 20:49:41.867157] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:49.598 [2024-07-15 20:49:41.867615] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:49.863 (previous message repeated through [2024-07-15 20:49:42.053199]; duplicate log lines elided) 
00:35:49.863 [2024-07-15 20:49:42.053232] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.863 [2024-07-15 20:49:42.053265] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.863 [2024-07-15 20:49:42.053286] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.863 [2024-07-15 20:49:42.053693] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.863 [2024-07-15 20:49:42.053747] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.863 [2024-07-15 20:49:42.054146] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.863 [2024-07-15 20:49:42.054198] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.863 [2024-07-15 20:49:42.054476] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.863 [2024-07-15 20:49:42.054498] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.863 [2024-07-15 20:49:42.058167] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.863 [2024-07-15 20:49:42.058225] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.863 [2024-07-15 20:49:42.059842] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.863 [2024-07-15 20:49:42.059893] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:49.863 [2024-07-15 20:49:42.060361] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.863 [2024-07-15 20:49:42.060384] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.863 [2024-07-15 20:49:42.060404] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.863 [2024-07-15 20:49:42.060424] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.863 [2024-07-15 20:49:42.061947] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.863 [2024-07-15 20:49:42.062004] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.063667] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.063718] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.064000] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.064023] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.066042] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.066098] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.066489] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:49.864 [2024-07-15 20:49:42.066540] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.066817] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.066840] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.066860] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.066880] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.067297] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.067354] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.067750] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.067802] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.068088] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.068111] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.070757] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.070818] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:49.864 [2024-07-15 20:49:42.071217] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.071279] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.071727] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.071752] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.071778] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.071803] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.072704] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.072756] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.073730] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.073781] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.074235] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.074263] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.076772] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:49.864 [2024-07-15 20:49:42.076830] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.077365] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.077416] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.077696] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.077718] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.077739] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.077759] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.078174] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.078232] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.078626] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.078682] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.079095] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.079120] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:49.864 [2024-07-15 20:49:42.082985] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.083044] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.083111] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.083164] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.083679] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.083712] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.083734] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.083760] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.084173] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.084225] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.084286] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.084333] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.084733] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:49.864 [2024-07-15 20:49:42.084757] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.087016] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.087066] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.087112] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.087159] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.087435] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.087458] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.087491] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.087522] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.087585] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.087633] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.087682] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.087730] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:49.864 [2024-07-15 20:49:42.088184] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.088211] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.090401] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.090453] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.090505] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.090553] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.091022] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.091050] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.091072] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.091098] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.091163] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.091214] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.091272] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:49.864 [2024-07-15 20:49:42.091324] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.091604] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.091626] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.093945] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.093996] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.094064] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.094125] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.094593] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.094615] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.094636] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.094656] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.094733] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.864 [2024-07-15 20:49:42.094781] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:49.864 [2024-07-15 20:49:42.094828] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.865 [2024-07-15 20:49:42.094876] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.865 [2024-07-15 20:49:42.095345] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.865 [2024-07-15 20:49:42.095375] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.865 [2024-07-15 20:49:42.097530] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.865 [2024-07-15 20:49:42.097583] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.865 [2024-07-15 20:49:42.097636] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.865 [2024-07-15 20:49:42.097688] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.865 [2024-07-15 20:49:42.098060] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.865 [2024-07-15 20:49:42.098085] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.865 [2024-07-15 20:49:42.098108] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.865 [2024-07-15 20:49:42.098129] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.865 [2024-07-15 20:49:42.098199] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:49.865 [2024-07-15 20:49:42.098248] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.865 [2024-07-15 20:49:42.098307] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.865 [2024-07-15 20:49:42.098377] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.865 [2024-07-15 20:49:42.098722] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.865 [2024-07-15 20:49:42.098746] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.865 [2024-07-15 20:49:42.100759] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.865 [2024-07-15 20:49:42.100824] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.865 [2024-07-15 20:49:42.100872] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.865 [2024-07-15 20:49:42.100920] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.865 [2024-07-15 20:49:42.101410] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.865 [2024-07-15 20:49:42.101435] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.865 [2024-07-15 20:49:42.101461] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.865 [2024-07-15 20:49:42.101483] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:49.865 [2024-07-15 20:49:42.101544] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.865 [2024-07-15 20:49:42.101594] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.865 [2024-07-15 20:49:42.101647] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.865 [2024-07-15 20:49:42.101696] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.865 [2024-07-15 20:49:42.102115] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.865 [2024-07-15 20:49:42.102139] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.865 [2024-07-15 20:49:42.104464] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.865 [2024-07-15 20:49:42.104521] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.865 [2024-07-15 20:49:42.104571] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.865 [2024-07-15 20:49:42.104625] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.865 [2024-07-15 20:49:42.104910] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.865 [2024-07-15 20:49:42.104937] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.865 [2024-07-15 20:49:42.104958] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:49.865 [2024-07-15 20:49:42.104978] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.865 [2024-07-15 20:49:42.105037] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.865 [2024-07-15 20:49:42.105084] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.865 [2024-07-15 20:49:42.105131] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.865 [2024-07-15 20:49:42.105186] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.865 [2024-07-15 20:49:42.105690] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.865 [2024-07-15 20:49:42.105716] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.865 [2024-07-15 20:49:42.107592] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.865 [2024-07-15 20:49:42.107643] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.865 [2024-07-15 20:49:42.107690] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.865 [2024-07-15 20:49:42.107744] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.865 [2024-07-15 20:49:42.108025] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.865 [2024-07-15 20:49:42.108047] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:49.865 [2024-07-15 20:49:42.108067] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:35:49.868 [... same error repeated through 2024-07-15 20:49:42.196806 ...]
00:35:49.868 [2024-07-15 20:49:42.196828] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.868 [2024-07-15 20:49:42.200027] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.868 [2024-07-15 20:49:42.200087] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.868 [2024-07-15 20:49:42.200137] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.868 [2024-07-15 20:49:42.200775] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.868 [2024-07-15 20:49:42.201228] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.868 [2024-07-15 20:49:42.201258] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.868 [2024-07-15 20:49:42.201283] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.868 [2024-07-15 20:49:42.201308] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.868 [2024-07-15 20:49:42.201372] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.868 [2024-07-15 20:49:42.201768] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.868 [2024-07-15 20:49:42.201828] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.868 [2024-07-15 20:49:42.202229] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:49.868 [2024-07-15 20:49:42.202679] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.868 [2024-07-15 20:49:42.202701] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.868 [2024-07-15 20:49:42.204899] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.868 [2024-07-15 20:49:42.205308] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.868 [2024-07-15 20:49:42.205363] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.868 [2024-07-15 20:49:42.205760] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.868 [2024-07-15 20:49:42.206172] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.868 [2024-07-15 20:49:42.206196] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.868 [2024-07-15 20:49:42.206217] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.868 [2024-07-15 20:49:42.206237] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.868 [2024-07-15 20:49:42.206307] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.868 [2024-07-15 20:49:42.206721] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.868 [2024-07-15 20:49:42.206776] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:49.868 [2024-07-15 20:49:42.207183] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.868 [2024-07-15 20:49:42.207523] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.868 [2024-07-15 20:49:42.207546] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.868 [2024-07-15 20:49:42.209730] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.868 [2024-07-15 20:49:42.211063] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.868 [2024-07-15 20:49:42.211117] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.868 [2024-07-15 20:49:42.211514] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.868 [2024-07-15 20:49:42.211902] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.868 [2024-07-15 20:49:42.211943] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.868 [2024-07-15 20:49:42.211976] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.868 [2024-07-15 20:49:42.211996] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.868 [2024-07-15 20:49:42.212065] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.868 [2024-07-15 20:49:42.212465] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:49.868 [2024-07-15 20:49:42.212514] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.868 [2024-07-15 20:49:42.213993] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.868 [2024-07-15 20:49:42.214376] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.868 [2024-07-15 20:49:42.214399] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.868 [2024-07-15 20:49:42.216727] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.868 [2024-07-15 20:49:42.217147] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.868 [2024-07-15 20:49:42.217204] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.868 [2024-07-15 20:49:42.217601] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.868 [2024-07-15 20:49:42.217879] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.868 [2024-07-15 20:49:42.217901] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.868 [2024-07-15 20:49:42.217922] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.868 [2024-07-15 20:49:42.217946] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.868 [2024-07-15 20:49:42.218007] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:49.868 [2024-07-15 20:49:42.218407] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.869 [2024-07-15 20:49:42.218462] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.869 [2024-07-15 20:49:42.218859] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.869 [2024-07-15 20:49:42.219181] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.869 [2024-07-15 20:49:42.219208] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.869 [2024-07-15 20:49:42.220812] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.869 [2024-07-15 20:49:42.221743] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.869 [2024-07-15 20:49:42.221796] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.869 [2024-07-15 20:49:42.223154] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.869 [2024-07-15 20:49:42.223435] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.869 [2024-07-15 20:49:42.223457] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.869 [2024-07-15 20:49:42.223477] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.869 [2024-07-15 20:49:42.223497] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:49.869 [2024-07-15 20:49:42.223565] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.869 [2024-07-15 20:49:42.225120] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.869 [2024-07-15 20:49:42.225172] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.869 [2024-07-15 20:49:42.225939] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.869 [2024-07-15 20:49:42.226370] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.869 [2024-07-15 20:49:42.226394] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.869 [2024-07-15 20:49:42.228487] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.869 [2024-07-15 20:49:42.229243] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.869 [2024-07-15 20:49:42.230502] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.869 [2024-07-15 20:49:42.232144] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.869 [2024-07-15 20:49:42.232425] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.869 [2024-07-15 20:49:42.232448] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.869 [2024-07-15 20:49:42.232468] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:49.869 [2024-07-15 20:49:42.232488] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.869 [2024-07-15 20:49:42.232548] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.869 [2024-07-15 20:49:42.234202] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.869 [2024-07-15 20:49:42.234722] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.869 [2024-07-15 20:49:42.236222] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.869 [2024-07-15 20:49:42.236508] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:49.869 [2024-07-15 20:49:42.236530] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.130 [2024-07-15 20:49:42.238619] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.130 [2024-07-15 20:49:42.239693] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.130 [2024-07-15 20:49:42.240644] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.130 [2024-07-15 20:49:42.241051] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.130 [2024-07-15 20:49:42.241499] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.130 [2024-07-15 20:49:42.241522] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:50.130 [2024-07-15 20:49:42.241542] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.130 [2024-07-15 20:49:42.241562] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.130 [2024-07-15 20:49:42.243131] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.130 [2024-07-15 20:49:42.244882] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.130 [2024-07-15 20:49:42.246573] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.130 [2024-07-15 20:49:42.248211] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.130 [2024-07-15 20:49:42.248687] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.130 [2024-07-15 20:49:42.248709] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.130 [2024-07-15 20:49:42.251758] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.130 [2024-07-15 20:49:42.252205] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.130 [2024-07-15 20:49:42.252611] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.130 [2024-07-15 20:49:42.253499] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.130 [2024-07-15 20:49:42.253781] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:50.130 [2024-07-15 20:49:42.253804] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.130 [2024-07-15 20:49:42.253839] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.130 [2024-07-15 20:49:42.253871] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.130 [2024-07-15 20:49:42.254291] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.130 [2024-07-15 20:49:42.254684] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.130 [2024-07-15 20:49:42.256459] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.130 [2024-07-15 20:49:42.257884] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.130 [2024-07-15 20:49:42.258172] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.130 [2024-07-15 20:49:42.258195] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.130 [2024-07-15 20:49:42.261354] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.130 [2024-07-15 20:49:42.262905] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.130 [2024-07-15 20:49:42.264462] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.130 [2024-07-15 20:49:42.265060] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:50.130 [2024-07-15 20:49:42.265557] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.130 [2024-07-15 20:49:42.265580] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.130 [2024-07-15 20:49:42.265601] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.130 [2024-07-15 20:49:42.265625] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.130 [2024-07-15 20:49:42.266290] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.130 [2024-07-15 20:49:42.267670] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.130 [2024-07-15 20:49:42.268070] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.130 [2024-07-15 20:49:42.268465] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.130 [2024-07-15 20:49:42.268749] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.130 [2024-07-15 20:49:42.268771] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.130 [2024-07-15 20:49:42.271203] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.130 [2024-07-15 20:49:42.272882] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.130 [2024-07-15 20:49:42.274166] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:50.130 [2024-07-15 20:49:42.275718] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.130 [2024-07-15 20:49:42.276005] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.130 [2024-07-15 20:49:42.276028] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.130 [2024-07-15 20:49:42.276048] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.130 [2024-07-15 20:49:42.276068] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.130 [2024-07-15 20:49:42.277279] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.130 [2024-07-15 20:49:42.277682] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.130 [2024-07-15 20:49:42.278086] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.130 [2024-07-15 20:49:42.279830] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.130 [2024-07-15 20:49:42.280358] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.130 [2024-07-15 20:49:42.280386] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.130 [2024-07-15 20:49:42.283785] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.130 [2024-07-15 20:49:42.285332] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:50.130 [2024-07-15 20:49:42.285923] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.130 [2024-07-15 20:49:42.287621] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.130 [2024-07-15 20:49:42.287906] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.130 [2024-07-15 20:49:42.287932] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.130 [2024-07-15 20:49:42.287953] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.287980] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.289548] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.291095] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.291973] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.292367] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.292806] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.292833] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.296348] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:50.131 [2024-07-15 20:49:42.298129] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.299904] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.301680] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.302161] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.302183] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.302203] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.302224] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.303749] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.305519] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.307210] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.308864] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.309347] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.309370] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:50.131 [2024-07-15 20:49:42.313696] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.315079] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.316635] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.318185] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.318527] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.318551] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.318573] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.318605] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.320136] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.321359] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.322907] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.324457] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.324741] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:50.131 [2024-07-15 20:49:42.324765] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.327497] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.327907] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.328314] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.328730] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.329116] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.329151] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.329182] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.329203] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.329613] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.330033] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.331222] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.332056] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:50.131 [2024-07-15 20:49:42.332456] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.332479] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.334980] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.335380] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.337156] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.338932] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.339214] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.339237] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.339257] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.339277] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.340833] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.341544] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.343317] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:50.131 [2024-07-15 20:49:42.345101] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.345390] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.345412] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.347734] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.348144] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.349395] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.350631] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.350913] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.350941] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.350961] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.350981] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.352554] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.354156] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:50.131 [2024-07-15 20:49:42.355061] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.356294] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.356579] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.356602] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.358754] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.359170] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.359571] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.361176] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.361504] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.361527] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.361547] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.361568] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.363133] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:50.131 [2024-07-15 20:49:42.364683] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.365886] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.367208] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.367580] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.367603] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.369505] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.369916] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.370322] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.370380] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.370829] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.370856] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.370883] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.131 [2024-07-15 20:49:42.370904] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:50.132 [2024-07-15 20:49:42.372681] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.374177] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.375709] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.375762] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.376045] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.376068] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.379587] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.379660] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.381363] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.381415] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.381851] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.381885] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.381916] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:50.132 [2024-07-15 20:49:42.381942] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.382346] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.382413] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.382818] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.382872] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.383319] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.383343] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.386397] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.386454] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.386996] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.387054] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.387332] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.387354] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:50.132 [2024-07-15 20:49:42.387375] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.387395] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.388768] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.390319] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.390372] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.391910] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.392351] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.392387] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.396373] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.396431] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.398161] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.398216] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.398495] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:50.132 [2024-07-15 20:49:42.398517] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.398537] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.398557] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.400117] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.400171] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.400783] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.402190] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.402472] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.402494] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.404497] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.404897] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.404963] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.405360] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:50.132 [2024-07-15 20:49:42.405648] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.405671] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.405695] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.405715] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.405776] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.407007] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.408556] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.408608] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.408888] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.408910] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.412189] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.412247] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.413791] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:50.132 [2024-07-15 20:49:42.414399] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.414905] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.414935] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.414959] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.414985] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.415391] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.415787] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.415843] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.417203] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.417547] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.417570] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.419127] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.420530] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:50.132 [2024-07-15 20:49:42.421757] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.421810] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.422095] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.422118] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.422138] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.422158] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.423712] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.423766] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.424881] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.425294] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.425707] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.425744] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.429321] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:50.132 [2024-07-15 20:49:42.430876] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.430932] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.431901] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.432187] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.432210] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.432230] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.132 [2024-07-15 20:49:42.432250] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.133 [2024-07-15 20:49:42.432311] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.133 [2024-07-15 20:49:42.433547] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.133 [2024-07-15 20:49:42.435088] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.133 [2024-07-15 20:49:42.435140] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.133 [2024-07-15 20:49:42.435420] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.133 [2024-07-15 20:49:42.435442] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:50.133 [2024-07-15 20:49:42.437941] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.133 [2024-07-15 20:49:42.438001] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.133 [2024-07-15 20:49:42.439485] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.133 [2024-07-15 20:49:42.440711] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.133 [2024-07-15 20:49:42.440995] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.133 [2024-07-15 20:49:42.441018] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.133 [2024-07-15 20:49:42.441038] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.133 [2024-07-15 20:49:42.441058] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.133 [2024-07-15 20:49:42.442617] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.133 [2024-07-15 20:49:42.444017] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.133 [2024-07-15 20:49:42.444071] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.133 [2024-07-15 20:49:42.445378] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.133 [2024-07-15 20:49:42.445756] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:50.133 [2024-07-15 20:49:42.445778] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.133 [2024-07-15 20:49:42.447436] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.133 [2024-07-15 20:49:42.447836] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.133 [2024-07-15 20:49:42.448282] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.133 [2024-07-15 20:49:42.448337] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.133 [2024-07-15 20:49:42.448794] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.133 [2024-07-15 20:49:42.448821] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.133 [2024-07-15 20:49:42.448843] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.133 [2024-07-15 20:49:42.448867] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.133 [2024-07-15 20:49:42.450033] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.133 [2024-07-15 20:49:42.450087] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.133 [2024-07-15 20:49:42.451321] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.133 [2024-07-15 20:49:42.452869] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:50.133 [2024-07-15 20:49:42.453153] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.133 [2024-07-15 20:49:42.453177] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.133 [2024-07-15 20:49:42.456366] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.133 [2024-07-15 20:49:42.457920] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.133 [2024-07-15 20:49:42.457978] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.133 [2024-07-15 20:49:42.459030] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.133 [2024-07-15 20:49:42.459434] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.133 [2024-07-15 20:49:42.459459] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.133 [2024-07-15 20:49:42.459481] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.133 [2024-07-15 20:49:42.459503] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.133 [2024-07-15 20:49:42.459562] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.133 [2024-07-15 20:49:42.459962] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:50.133 [2024-07-15 20:49:42.460361] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:35:50.133 [2024-07-15 20:49:42.460412] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
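The "Failed to get dst_mbufs!" errors above are the crypto task-allocation path reporting that no destination mbufs were available: the verify workload (queue depth 128, 64 KiB IOs) can temporarily drain a fixed-size buffer pool faster than completions return buffers to it. A minimal sketch of that failure mode, using a hypothetical bounded pool rather than the real DPDK mempool API:

```python
# Illustration only: a bounded buffer pool whose acquire() fails when the
# pool is empty, mimicking why allocation errors appear in bursts when
# queued IOs outpace buffer recycling. Pool size and counts are made up.
class BoundedPool:
    def __init__(self, size):
        self.free = size

    def acquire(self):
        if self.free == 0:
            return None  # caller must queue the task and retry later
        self.free -= 1
        return object()

    def release(self):
        self.free += 1

pool = BoundedPool(size=4)
attempts = [pool.acquire() for _ in range(6)]  # burst of 6 against 4 buffers
failures = sum(1 for a in attempts if a is None)
```

In the real module such failures are transient: the task is re-queued and retried once in-flight IOs complete and release their mbufs, which is why the run above still finishes with valid latency numbers.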
00:35:51.076
00:35:51.076 Latency(us)
00:35:51.076 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:35:51.076 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:35:51.076 Verification LBA range: start 0x0 length 0x100
00:35:51.076 crypto_ram : 6.06 42.22 2.64 0.00 0.00 2946904.60 317308.22 2348810.24
00:35:51.076 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:35:51.076 Verification LBA range: start 0x100 length 0x100
00:35:51.076 crypto_ram : 6.07 34.29 2.14 0.00 0.00 3422686.64 166860.35 3005310.00
00:35:51.076 Job: crypto_ram1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:35:51.076 Verification LBA range: start 0x0 length 0x100
00:35:51.076 crypto_ram1 : 6.07 42.21 2.64 0.00 0.00 2848443.88 317308.22 2159154.75
00:35:51.076 Job: crypto_ram1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:35:51.076 Verification LBA range: start 0x100 length 0x100
00:35:51.076 crypto_ram1 : 6.15 40.03 2.50 0.00 0.00 2970944.95 123093.70 2771887.86
00:35:51.076 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:35:51.076 Verification LBA range: start 0x0 length 0x100
00:35:51.076 crypto_ram2 : 5.62 269.97 16.87 0.00 0.00 424993.86 85709.69 583555.34
00:35:51.076 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:35:51.076 Verification LBA range: start 0x100 length 0x100
00:35:51.076 crypto_ram2 : 5.68 221.25 13.83 0.00 0.00 512410.05 72488.51 682030.30
00:35:51.076 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:35:51.076 Verification LBA range: start 0x0 length 0x100
00:35:51.076 crypto_ram3 : 5.72 280.13 17.51 0.00 0.00 397001.97 35788.35 443137.34
00:35:51.076 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:35:51.076 Verification LBA range: start 0x100 length 0x100
00:35:51.076 crypto_ram3 : 5.79 232.81 14.55 0.00 0.00 471458.76 20401.64 583555.34
00:35:51.076 ===================================================================================================================
00:35:51.076 Total : 1162.90 72.68 0.00 0.00 820259.75 20401.64 3005310.00
00:35:51.336
00:35:51.336 real 0m9.326s
00:35:51.336 user 0m17.664s
00:35:51.336 sys 0m0.486s
00:35:51.336 20:49:43 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # xtrace_disable
00:35:51.336 20:49:43 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x
00:35:51.336 ************************************
00:35:51.336 END TEST bdev_verify_big_io
00:35:51.336 ************************************
00:35:51.336 20:49:43 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0
00:35:51.336 20:49:43 blockdev_crypto_qat -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:35:51.336 20:49:43 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:35:51.336 20:49:43 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable
00:35:51.336 20:49:43 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x
00:35:51.336 ************************************
00:35:51.336 START TEST bdev_write_zeroes
00:35:51.336 ************************************
00:35:51.336 20:49:43 blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:35:51.594 [2024-07-15 20:49:43.758699] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization...
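In the bdev_verify_big_io latency table above, the MiB/s column is simply IOPS multiplied by the fixed per-IO size: crypto_ram at 42.22 IOPS with 64 KiB IOs is 42.22 × 65536 / 2²⁰ ≈ 2.64 MiB/s. A quick sanity check of that relationship, using values copied from the table:

```python
def mib_per_s(iops, io_size_bytes):
    """Convert an IOPS figure and a fixed IO size to MiB/s (1 MiB = 2**20 bytes)."""
    return iops * io_size_bytes / (1 << 20)

# Rows taken from the verify table above: (IOPS, IO size, reported MiB/s)
rows = [
    (42.22, 65536, 2.64),    # crypto_ram,  core mask 0x1
    (269.97, 65536, 16.87),  # crypto_ram2, core mask 0x1
    (280.13, 65536, 17.51),  # crypto_ram3, core mask 0x1
]
checks = [round(mib_per_s(iops, size), 2) == mibs for iops, size, mibs in rows]
```

The same relation explains the write_zeroes results later in the log, where 4 KiB IOs mean each MiB/s corresponds to 256 IOPS.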
00:35:51.594 [2024-07-15 20:49:43.758759] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1562568 ]
00:35:51.594 [2024-07-15 20:49:43.887089] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:35:51.852 [2024-07-15 20:49:43.984273] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:35:51.852 [2024-07-15 20:49:44.005556] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat
00:35:51.852 [2024-07-15 20:49:44.013585] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev
00:35:51.852 [2024-07-15 20:49:44.021603] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev
00:35:51.852 [2024-07-15 20:49:44.132066] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96
00:35:54.383 [2024-07-15 20:49:46.337440] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc"
00:35:54.383 [2024-07-15 20:49:46.337504] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0
00:35:54.383 [2024-07-15 20:49:46.337519] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:35:54.383 [2024-07-15 20:49:46.345458] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts"
00:35:54.383 [2024-07-15 20:49:46.345476] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:35:54.383 [2024-07-15 20:49:46.345488] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:35:54.383 [2024-07-15 20:49:46.353478] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2"
00:35:54.383 [2024-07-15 20:49:46.353495] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:35:54.383 [2024-07-15 20:49:46.353506] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:35:54.383 [2024-07-15 20:49:46.361498] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2"
00:35:54.383 [2024-07-15 20:49:46.361515] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:35:54.383 [2024-07-15 20:49:46.361526] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:35:54.383 Running I/O for 1 seconds...
00:35:55.320
00:35:55.320 Latency(us)
00:35:55.320 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:35:55.320 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:35:55.320 crypto_ram : 1.02 2013.67 7.87 0.00 0.00 63113.93 5641.79 76591.64
00:35:55.320 Job: crypto_ram1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:35:55.320 crypto_ram1 : 1.03 2026.78 7.92 0.00 0.00 62417.75 5613.30 70664.90
00:35:55.320 Job: crypto_ram2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:35:55.320 crypto_ram2 : 1.02 15558.48 60.78 0.00 0.00 8109.47 2436.23 10713.71
00:35:55.320 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:35:55.320 crypto_ram3 : 1.02 15537.05 60.69 0.00 0.00 8084.86 2450.48 8434.20
00:35:55.320 ===================================================================================================================
00:35:55.320 Total : 35135.98 137.25 0.00 0.00 14409.62 2436.23 76591.64
00:35:55.579
00:35:55.580
00:35:55.580 real 0m4.163s
00:35:55.580 user 0m3.746s
00:35:55.580 sys 0m0.373s
00:35:55.580 20:49:47 blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # xtrace_disable
00:35:55.580 20:49:47 blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x
00:35:55.580 ************************************
00:35:55.580 END TEST bdev_write_zeroes
00:35:55.580 ************************************
00:35:55.580 20:49:47 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0
00:35:55.580 20:49:47 blockdev_crypto_qat -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:35:55.580 20:49:47 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:35:55.580 20:49:47 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable
00:35:55.580 20:49:47 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x
00:35:55.580 ************************************
00:35:55.580 START TEST bdev_json_nonenclosed
00:35:55.580 ************************************
00:35:55.580 20:49:47 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:35:55.839 [2024-07-15 20:49:48.005915] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization...
00:35:55.839 [2024-07-15 20:49:48.006000] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1563110 ]
00:35:55.839 [2024-07-15 20:49:48.147802] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:35:56.098 [2024-07-15 20:49:48.249720] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:35:56.098 [2024-07-15 20:49:48.249786] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}.
00:35:56.098 [2024-07-15 20:49:48.249807] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address:
00:35:56.098 [2024-07-15 20:49:48.249820] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero
00:35:56.098
00:35:56.098 real 0m0.410s
00:35:56.098 user 0m0.249s
00:35:56.098 sys 0m0.157s
00:35:56.098 20:49:48 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # es=234
00:35:56.098 20:49:48 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # xtrace_disable
00:35:56.098 20:49:48 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x
00:35:56.098 ************************************
00:35:56.098 END TEST bdev_json_nonenclosed
00:35:56.098 ************************************
00:35:56.098 20:49:48 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 234
00:35:56.098 20:49:48 blockdev_crypto_qat -- bdev/blockdev.sh@782 -- # true
00:35:56.098 20:49:48 blockdev_crypto_qat -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:35:56.098 20:49:48 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:35:56.098 20:49:48 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable
00:35:56.098 20:49:48 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x
00:35:56.098 ************************************
00:35:56.098 START TEST bdev_json_nonarray
00:35:56.098 ************************************
00:35:56.098 20:49:48 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:35:56.356 [2024-07-15 20:49:48.494705] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization...
00:35:56.356 [2024-07-15 20:49:48.494770] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1563216 ]
00:35:56.356 [2024-07-15 20:49:48.621413] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:35:56.356 [2024-07-15 20:49:48.717848] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:35:56.356 [2024-07-15 20:49:48.717932] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array.
00:35:56.356 [2024-07-15 20:49:48.717953] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address:
00:35:56.356 [2024-07-15 20:49:48.717966] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero
00:35:56.615
00:35:56.615 real 0m0.389s
00:35:56.615 user 0m0.221s
00:35:56.615 sys 0m0.166s
00:35:56.615 20:49:48 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # es=234
00:35:56.615 20:49:48 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # xtrace_disable
00:35:56.615 20:49:48 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x
00:35:56.615 ************************************
00:35:56.615 END TEST bdev_json_nonarray
00:35:56.615 ************************************
00:35:56.615 20:49:48 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 234
00:35:56.615 20:49:48 blockdev_crypto_qat -- bdev/blockdev.sh@785 -- # true
00:35:56.615 20:49:48 blockdev_crypto_qat -- bdev/blockdev.sh@787 -- # [[ crypto_qat == bdev ]]
00:35:56.615 20:49:48 blockdev_crypto_qat -- bdev/blockdev.sh@794 -- # [[ crypto_qat == gpt ]]
00:35:56.615 20:49:48 blockdev_crypto_qat -- bdev/blockdev.sh@798 -- # [[ crypto_qat == crypto_sw ]]
00:35:56.615 20:49:48 blockdev_crypto_qat -- bdev/blockdev.sh@810 -- # trap - SIGINT SIGTERM EXIT
00:35:56.615 20:49:48 blockdev_crypto_qat -- bdev/blockdev.sh@811 -- # cleanup
00:35:56.615 20:49:48 blockdev_crypto_qat -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile
00:35:56.615 20:49:48 blockdev_crypto_qat -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json
00:35:56.615 20:49:48 blockdev_crypto_qat -- bdev/blockdev.sh@26 -- # [[ crypto_qat == rbd ]]
00:35:56.615 20:49:48 blockdev_crypto_qat -- bdev/blockdev.sh@30 -- # [[ crypto_qat == daos ]]
00:35:56.615 20:49:48 blockdev_crypto_qat -- bdev/blockdev.sh@34 -- # [[ crypto_qat = \g\p\t ]]
00:35:56.615 20:49:48 blockdev_crypto_qat -- bdev/blockdev.sh@40 -- # [[ crypto_qat == xnvme ]]
00:35:56.615
00:35:56.615 real 1m13.629s
00:35:56.615 user 2m43.171s
00:35:56.615 sys 0m9.431s
00:35:56.615 20:49:48 blockdev_crypto_qat -- common/autotest_common.sh@1124 -- # xtrace_disable
00:35:56.615 20:49:48 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x
00:35:56.615 ************************************
00:35:56.615 END TEST blockdev_crypto_qat
00:35:56.615 ************************************
00:35:56.615 20:49:48 -- common/autotest_common.sh@1142 -- # return 0
00:35:56.615 20:49:48 -- spdk/autotest.sh@360 -- # run_test chaining /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/chaining.sh
00:35:56.615 20:49:48 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:35:56.615 20:49:48 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:35:56.615 20:49:48 -- common/autotest_common.sh@10 -- # set +x
00:35:56.615 ************************************
00:35:56.615 START TEST chaining
00:35:56.615 ************************************
00:35:56.615 20:49:48 chaining -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/chaining.sh
00:35:56.874 * Looking for test storage...
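The bdev_json_nonenclosed and bdev_json_nonarray tests above feed bdevperf deliberately malformed configs and expect the es=234 failure path. The two structural rules the json_config errors describe (the top-level value must be a JSON object enclosed in {}, and its 'subsystems' member must be an array) can be sketched as follows; this is an illustrative checker with made-up inputs, not SPDK's actual parser:

```python
import json

def check_spdk_config(text):
    """Return None if the config passes the two structural checks the
    log's json_config errors describe, else the matching error string."""
    try:
        cfg = json.loads(text)
    except ValueError:
        return "Invalid JSON configuration: not enclosed in {}."
    if not isinstance(cfg, dict):
        return "Invalid JSON configuration: not enclosed in {}."
    if not isinstance(cfg.get("subsystems"), list):
        return "Invalid JSON configuration: 'subsystems' should be an array."
    return None

# Hypothetical stand-ins for the nonenclosed.json / nonarray.json fixtures:
nonenclosed = check_spdk_config('[1, 2, 3]')        # a JSON array, not an object
nonarray = check_spdk_config('{"subsystems": {}}')  # object, but subsystems is not an array
ok = check_spdk_config('{"subsystems": []}')
```

Either violation makes the app stop with a non-zero status, which the test wrapper records as es=234 and then swallows with `true` so the suite can continue.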
00:35:56.874 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:35:56.874 20:49:49 chaining -- bdev/chaining.sh@14 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:35:56.874 20:49:49 chaining -- nvmf/common.sh@7 -- # uname -s 00:35:56.874 20:49:49 chaining -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:35:56.874 20:49:49 chaining -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:35:56.874 20:49:49 chaining -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:35:56.874 20:49:49 chaining -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:35:56.874 20:49:49 chaining -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:35:56.874 20:49:49 chaining -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:35:56.874 20:49:49 chaining -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:35:56.874 20:49:49 chaining -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:35:56.874 20:49:49 chaining -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:35:56.874 20:49:49 chaining -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:35:56.874 20:49:49 chaining -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562 00:35:56.874 20:49:49 chaining -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562 00:35:56.874 20:49:49 chaining -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:35:56.874 20:49:49 chaining -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:35:56.874 20:49:49 chaining -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:35:56.874 20:49:49 chaining -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:35:56.874 20:49:49 chaining -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:35:56.874 20:49:49 chaining -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:35:56.874 20:49:49 chaining -- scripts/common.sh@516 -- # 
[[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:35:56.874 20:49:49 chaining -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:35:56.874 20:49:49 chaining -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:35:56.874 20:49:49 chaining -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:35:56.874 20:49:49 chaining -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:35:56.874 20:49:49 chaining -- paths/export.sh@5 -- # export PATH 00:35:56.875 20:49:49 chaining -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:35:56.875 20:49:49 chaining -- nvmf/common.sh@47 -- # : 0 00:35:56.875 20:49:49 chaining -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:35:56.875 20:49:49 chaining -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:35:56.875 20:49:49 chaining -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:35:56.875 20:49:49 chaining -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:35:56.875 20:49:49 chaining -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:35:56.875 20:49:49 chaining -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:35:56.875 20:49:49 chaining -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:35:56.875 20:49:49 chaining -- nvmf/common.sh@51 -- # have_pci_nics=0 00:35:56.875 20:49:49 chaining -- bdev/chaining.sh@16 -- # nqn=nqn.2016-06.io.spdk:cnode0 00:35:56.875 20:49:49 chaining -- bdev/chaining.sh@17 -- # key0=(00112233445566778899001122334455 11223344556677889900112233445500) 00:35:56.875 20:49:49 chaining -- bdev/chaining.sh@18 -- # key1=(22334455667788990011223344550011 33445566778899001122334455001122) 00:35:56.875 20:49:49 chaining -- bdev/chaining.sh@19 -- # bperfsock=/var/tmp/bperf.sock 00:35:56.875 20:49:49 chaining -- bdev/chaining.sh@20 -- # declare -A stats 00:35:56.875 20:49:49 chaining -- bdev/chaining.sh@66 -- # nvmftestinit 00:35:56.875 20:49:49 chaining -- nvmf/common.sh@441 -- # '[' -z tcp ']' 00:35:56.875 20:49:49 chaining -- nvmf/common.sh@446 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:35:56.875 20:49:49 chaining -- nvmf/common.sh@448 -- # prepare_net_devs 00:35:56.875 20:49:49 chaining -- nvmf/common.sh@410 -- # local 
-g is_hw=no 00:35:56.875 20:49:49 chaining -- nvmf/common.sh@412 -- # remove_spdk_ns 00:35:56.875 20:49:49 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:35:56.875 20:49:49 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:35:56.875 20:49:49 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:35:56.875 20:49:49 chaining -- nvmf/common.sh@414 -- # [[ phy-fallback != virt ]] 00:35:56.875 20:49:49 chaining -- nvmf/common.sh@414 -- # gather_supported_nvmf_pci_devs 00:35:56.875 20:49:49 chaining -- nvmf/common.sh@285 -- # xtrace_disable 00:35:56.875 20:49:49 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:04.993 20:49:56 chaining -- nvmf/common.sh@289 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:36:04.993 20:49:56 chaining -- nvmf/common.sh@291 -- # pci_devs=() 00:36:04.993 20:49:56 chaining -- nvmf/common.sh@291 -- # local -a pci_devs 00:36:04.993 20:49:56 chaining -- nvmf/common.sh@292 -- # pci_net_devs=() 00:36:04.993 20:49:56 chaining -- nvmf/common.sh@292 -- # local -a pci_net_devs 00:36:04.993 20:49:56 chaining -- nvmf/common.sh@293 -- # pci_drivers=() 00:36:04.993 20:49:56 chaining -- nvmf/common.sh@293 -- # local -A pci_drivers 00:36:04.993 20:49:56 chaining -- nvmf/common.sh@295 -- # net_devs=() 00:36:04.993 20:49:56 chaining -- nvmf/common.sh@295 -- # local -ga net_devs 00:36:04.993 20:49:56 chaining -- nvmf/common.sh@296 -- # e810=() 00:36:04.993 20:49:56 chaining -- nvmf/common.sh@296 -- # local -ga e810 00:36:04.993 20:49:56 chaining -- nvmf/common.sh@297 -- # x722=() 00:36:04.993 20:49:56 chaining -- nvmf/common.sh@297 -- # local -ga x722 00:36:04.993 20:49:56 chaining -- nvmf/common.sh@298 -- # mlx=() 00:36:04.993 20:49:56 chaining -- nvmf/common.sh@298 -- # local -ga mlx 00:36:04.993 20:49:56 chaining -- nvmf/common.sh@301 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:36:04.993 20:49:56 chaining -- nvmf/common.sh@302 -- # 
e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:36:04.993 20:49:56 chaining -- nvmf/common.sh@304 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:36:04.993 20:49:56 chaining -- nvmf/common.sh@306 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:36:04.993 20:49:56 chaining -- nvmf/common.sh@308 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:36:04.993 20:49:56 chaining -- nvmf/common.sh@310 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:36:04.993 20:49:56 chaining -- nvmf/common.sh@312 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:36:04.993 20:49:56 chaining -- nvmf/common.sh@314 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:36:04.993 20:49:56 chaining -- nvmf/common.sh@315 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:36:04.993 20:49:56 chaining -- nvmf/common.sh@317 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:36:04.993 20:49:56 chaining -- nvmf/common.sh@318 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:36:04.994 20:49:56 chaining -- nvmf/common.sh@320 -- # pci_devs+=("${e810[@]}") 00:36:04.994 20:49:56 chaining -- nvmf/common.sh@321 -- # [[ tcp == rdma ]] 00:36:04.994 20:49:56 chaining -- nvmf/common.sh@327 -- # [[ '' == mlx5 ]] 00:36:04.994 20:49:56 chaining -- nvmf/common.sh@329 -- # [[ '' == e810 ]] 00:36:04.994 20:49:56 chaining -- nvmf/common.sh@331 -- # [[ '' == x722 ]] 00:36:04.994 20:49:56 chaining -- nvmf/common.sh@335 -- # (( 0 == 0 )) 00:36:04.994 20:49:56 chaining -- nvmf/common.sh@336 -- # return 1 00:36:04.994 20:49:56 chaining -- nvmf/common.sh@416 -- # [[ no == yes ]] 00:36:04.994 20:49:56 chaining -- nvmf/common.sh@423 -- # [[ phy-fallback == phy ]] 00:36:04.994 20:49:56 chaining -- nvmf/common.sh@426 -- # [[ phy-fallback == phy-fallback ]] 00:36:04.994 20:49:56 chaining -- nvmf/common.sh@427 -- # echo 'WARNING: No supported devices were found, fallback requested for tcp test' 00:36:04.994 WARNING: No supported devices were found, fallback requested for tcp test 00:36:04.994 20:49:56 chaining -- 
nvmf/common.sh@431 -- # [[ tcp == tcp ]] 00:36:04.994 20:49:56 chaining -- nvmf/common.sh@432 -- # nvmf_veth_init 00:36:04.994 20:49:56 chaining -- nvmf/common.sh@141 -- # NVMF_INITIATOR_IP=10.0.0.1 00:36:04.994 20:49:56 chaining -- nvmf/common.sh@142 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:36:04.994 20:49:56 chaining -- nvmf/common.sh@143 -- # NVMF_SECOND_TARGET_IP=10.0.0.3 00:36:04.994 20:49:56 chaining -- nvmf/common.sh@144 -- # NVMF_BRIDGE=nvmf_br 00:36:04.994 20:49:56 chaining -- nvmf/common.sh@145 -- # NVMF_INITIATOR_INTERFACE=nvmf_init_if 00:36:04.994 20:49:56 chaining -- nvmf/common.sh@146 -- # NVMF_INITIATOR_BRIDGE=nvmf_init_br 00:36:04.994 20:49:56 chaining -- nvmf/common.sh@147 -- # NVMF_TARGET_NAMESPACE=nvmf_tgt_ns_spdk 00:36:04.994 20:49:56 chaining -- nvmf/common.sh@148 -- # NVMF_TARGET_NS_CMD=(ip netns exec "$NVMF_TARGET_NAMESPACE") 00:36:04.994 20:49:56 chaining -- nvmf/common.sh@149 -- # NVMF_TARGET_INTERFACE=nvmf_tgt_if 00:36:04.994 20:49:56 chaining -- nvmf/common.sh@150 -- # NVMF_TARGET_INTERFACE2=nvmf_tgt_if2 00:36:04.994 20:49:56 chaining -- nvmf/common.sh@151 -- # NVMF_TARGET_BRIDGE=nvmf_tgt_br 00:36:04.994 20:49:56 chaining -- nvmf/common.sh@152 -- # NVMF_TARGET_BRIDGE2=nvmf_tgt_br2 00:36:04.994 20:49:56 chaining -- nvmf/common.sh@154 -- # ip link set nvmf_init_br nomaster 00:36:04.994 20:49:56 chaining -- nvmf/common.sh@155 -- # ip link set nvmf_tgt_br nomaster 00:36:04.994 Cannot find device "nvmf_tgt_br" 00:36:04.994 20:49:56 chaining -- nvmf/common.sh@155 -- # true 00:36:04.994 20:49:56 chaining -- nvmf/common.sh@156 -- # ip link set nvmf_tgt_br2 nomaster 00:36:04.994 Cannot find device "nvmf_tgt_br2" 00:36:04.994 20:49:56 chaining -- nvmf/common.sh@156 -- # true 00:36:04.994 20:49:56 chaining -- nvmf/common.sh@157 -- # ip link set nvmf_init_br down 00:36:04.994 20:49:56 chaining -- nvmf/common.sh@158 -- # ip link set nvmf_tgt_br down 00:36:04.994 Cannot find device "nvmf_tgt_br" 00:36:04.994 20:49:56 chaining -- nvmf/common.sh@158 -- # 
true 00:36:04.994 20:49:56 chaining -- nvmf/common.sh@159 -- # ip link set nvmf_tgt_br2 down 00:36:04.994 Cannot find device "nvmf_tgt_br2" 00:36:04.994 20:49:56 chaining -- nvmf/common.sh@159 -- # true 00:36:04.994 20:49:56 chaining -- nvmf/common.sh@160 -- # ip link delete nvmf_br type bridge 00:36:04.994 20:49:56 chaining -- nvmf/common.sh@161 -- # ip link delete nvmf_init_if 00:36:04.994 20:49:56 chaining -- nvmf/common.sh@162 -- # ip netns exec nvmf_tgt_ns_spdk ip link delete nvmf_tgt_if 00:36:04.994 Cannot open network namespace "nvmf_tgt_ns_spdk": No such file or directory 00:36:04.994 20:49:56 chaining -- nvmf/common.sh@162 -- # true 00:36:04.994 20:49:56 chaining -- nvmf/common.sh@163 -- # ip netns exec nvmf_tgt_ns_spdk ip link delete nvmf_tgt_if2 00:36:04.994 Cannot open network namespace "nvmf_tgt_ns_spdk": No such file or directory 00:36:04.994 20:49:56 chaining -- nvmf/common.sh@163 -- # true 00:36:04.994 20:49:56 chaining -- nvmf/common.sh@166 -- # ip netns add nvmf_tgt_ns_spdk 00:36:04.994 20:49:56 chaining -- nvmf/common.sh@169 -- # ip link add nvmf_init_if type veth peer name nvmf_init_br 00:36:04.994 20:49:57 chaining -- nvmf/common.sh@170 -- # ip link add nvmf_tgt_if type veth peer name nvmf_tgt_br 00:36:04.994 20:49:57 chaining -- nvmf/common.sh@171 -- # ip link add nvmf_tgt_if2 type veth peer name nvmf_tgt_br2 00:36:04.994 20:49:57 chaining -- nvmf/common.sh@174 -- # ip link set nvmf_tgt_if netns nvmf_tgt_ns_spdk 00:36:04.994 20:49:57 chaining -- nvmf/common.sh@175 -- # ip link set nvmf_tgt_if2 netns nvmf_tgt_ns_spdk 00:36:04.994 20:49:57 chaining -- nvmf/common.sh@178 -- # ip addr add 10.0.0.1/24 dev nvmf_init_if 00:36:04.994 20:49:57 chaining -- nvmf/common.sh@179 -- # ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.2/24 dev nvmf_tgt_if 00:36:04.994 20:49:57 chaining -- nvmf/common.sh@180 -- # ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.3/24 dev nvmf_tgt_if2 00:36:04.994 20:49:57 chaining -- nvmf/common.sh@183 -- # ip link set 
nvmf_init_if up 00:36:04.994 20:49:57 chaining -- nvmf/common.sh@184 -- # ip link set nvmf_init_br up 00:36:04.994 20:49:57 chaining -- nvmf/common.sh@185 -- # ip link set nvmf_tgt_br up 00:36:04.994 20:49:57 chaining -- nvmf/common.sh@186 -- # ip link set nvmf_tgt_br2 up 00:36:04.994 20:49:57 chaining -- nvmf/common.sh@187 -- # ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if up 00:36:04.994 20:49:57 chaining -- nvmf/common.sh@188 -- # ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if2 up 00:36:04.994 20:49:57 chaining -- nvmf/common.sh@189 -- # ip netns exec nvmf_tgt_ns_spdk ip link set lo up 00:36:04.994 20:49:57 chaining -- nvmf/common.sh@192 -- # ip link add nvmf_br type bridge 00:36:04.994 20:49:57 chaining -- nvmf/common.sh@193 -- # ip link set nvmf_br up 00:36:04.994 20:49:57 chaining -- nvmf/common.sh@196 -- # ip link set nvmf_init_br master nvmf_br 00:36:04.994 20:49:57 chaining -- nvmf/common.sh@197 -- # ip link set nvmf_tgt_br master nvmf_br 00:36:05.253 20:49:57 chaining -- nvmf/common.sh@198 -- # ip link set nvmf_tgt_br2 master nvmf_br 00:36:05.253 20:49:57 chaining -- nvmf/common.sh@201 -- # iptables -I INPUT 1 -i nvmf_init_if -p tcp --dport 4420 -j ACCEPT 00:36:05.253 20:49:57 chaining -- nvmf/common.sh@202 -- # iptables -A FORWARD -i nvmf_br -o nvmf_br -j ACCEPT 00:36:05.253 20:49:57 chaining -- nvmf/common.sh@205 -- # ping -c 1 10.0.0.2 00:36:05.253 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:36:05.253 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.107 ms 00:36:05.253 00:36:05.253 --- 10.0.0.2 ping statistics --- 00:36:05.253 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:36:05.253 rtt min/avg/max/mdev = 0.107/0.107/0.107/0.000 ms 00:36:05.253 20:49:57 chaining -- nvmf/common.sh@206 -- # ping -c 1 10.0.0.3 00:36:05.253 PING 10.0.0.3 (10.0.0.3) 56(84) bytes of data. 
00:36:05.253 64 bytes from 10.0.0.3: icmp_seq=1 ttl=64 time=0.083 ms 00:36:05.253 00:36:05.253 --- 10.0.0.3 ping statistics --- 00:36:05.253 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:36:05.253 rtt min/avg/max/mdev = 0.083/0.083/0.083/0.000 ms 00:36:05.253 20:49:57 chaining -- nvmf/common.sh@207 -- # ip netns exec nvmf_tgt_ns_spdk ping -c 1 10.0.0.1 00:36:05.253 PING 10.0.0.1 (10.0.0.1) 56(84) bytes of data. 00:36:05.253 64 bytes from 10.0.0.1: icmp_seq=1 ttl=64 time=0.038 ms 00:36:05.253 00:36:05.253 --- 10.0.0.1 ping statistics --- 00:36:05.253 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:36:05.253 rtt min/avg/max/mdev = 0.038/0.038/0.038/0.000 ms 00:36:05.253 20:49:57 chaining -- nvmf/common.sh@209 -- # NVMF_APP=("${NVMF_TARGET_NS_CMD[@]}" "${NVMF_APP[@]}") 00:36:05.253 20:49:57 chaining -- nvmf/common.sh@433 -- # return 0 00:36:05.253 20:49:57 chaining -- nvmf/common.sh@450 -- # '[' '' == iso ']' 00:36:05.253 20:49:57 chaining -- nvmf/common.sh@454 -- # NVMF_TRANSPORT_OPTS='-t tcp' 00:36:05.253 20:49:57 chaining -- nvmf/common.sh@455 -- # [[ tcp == \r\d\m\a ]] 00:36:05.253 20:49:57 chaining -- nvmf/common.sh@464 -- # [[ tcp == \t\c\p ]] 00:36:05.253 20:49:57 chaining -- nvmf/common.sh@465 -- # NVMF_TRANSPORT_OPTS='-t tcp -o' 00:36:05.253 20:49:57 chaining -- nvmf/common.sh@468 -- # '[' tcp == tcp ']' 00:36:05.253 20:49:57 chaining -- nvmf/common.sh@474 -- # modprobe nvme-tcp 00:36:05.253 20:49:57 chaining -- bdev/chaining.sh@67 -- # nvmfappstart -m 0x2 00:36:05.253 20:49:57 chaining -- nvmf/common.sh@479 -- # timing_enter start_nvmf_tgt 00:36:05.253 20:49:57 chaining -- common/autotest_common.sh@722 -- # xtrace_disable 00:36:05.253 20:49:57 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:05.253 20:49:57 chaining -- nvmf/common.sh@481 -- # nvmfpid=1566978 00:36:05.253 20:49:57 chaining -- nvmf/common.sh@480 -- # ip netns exec nvmf_tgt_ns_spdk /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 
-e 0xFFFF -m 0x2 00:36:05.253 20:49:57 chaining -- nvmf/common.sh@482 -- # waitforlisten 1566978 00:36:05.253 20:49:57 chaining -- common/autotest_common.sh@829 -- # '[' -z 1566978 ']' 00:36:05.253 20:49:57 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:36:05.253 20:49:57 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:36:05.253 20:49:57 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:36:05.253 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:36:05.511 20:49:57 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:36:05.511 20:49:57 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:05.511 [2024-07-15 20:49:57.692528] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 00:36:05.511 [2024-07-15 20:49:57.692597] [ DPDK EAL parameters: nvmf -c 0x2 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:36:05.511 [2024-07-15 20:49:57.835487] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:36:05.769 [2024-07-15 20:49:57.955179] app.c: 603:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:36:05.769 [2024-07-15 20:49:57.955232] app.c: 604:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:36:05.769 [2024-07-15 20:49:57.955250] app.c: 609:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:36:05.769 [2024-07-15 20:49:57.955266] app.c: 610:app_setup_trace: *NOTICE*: SPDK application currently running. 00:36:05.769 [2024-07-15 20:49:57.955280] app.c: 611:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 
00:36:05.769 [2024-07-15 20:49:57.955314] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:36:06.335 20:49:58 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:36:06.335 20:49:58 chaining -- common/autotest_common.sh@862 -- # return 0 00:36:06.335 20:49:58 chaining -- nvmf/common.sh@483 -- # timing_exit start_nvmf_tgt 00:36:06.335 20:49:58 chaining -- common/autotest_common.sh@728 -- # xtrace_disable 00:36:06.335 20:49:58 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:06.335 20:49:58 chaining -- nvmf/common.sh@484 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:36:06.335 20:49:58 chaining -- bdev/chaining.sh@69 -- # mktemp 00:36:06.335 20:49:58 chaining -- bdev/chaining.sh@69 -- # input=/tmp/tmp.QtWbz1jho9 00:36:06.335 20:49:58 chaining -- bdev/chaining.sh@69 -- # mktemp 00:36:06.335 20:49:58 chaining -- bdev/chaining.sh@69 -- # output=/tmp/tmp.zWApcpibKF 00:36:06.335 20:49:58 chaining -- bdev/chaining.sh@70 -- # trap 'tgtcleanup; exit 1' SIGINT SIGTERM EXIT 00:36:06.335 20:49:58 chaining -- bdev/chaining.sh@72 -- # rpc_cmd 00:36:06.335 20:49:58 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:06.335 20:49:58 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:06.615 malloc0 00:36:06.615 true 00:36:06.615 true 00:36:06.615 [2024-07-15 20:49:58.746598] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:36:06.615 crypto0 00:36:06.615 [2024-07-15 20:49:58.754629] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:36:06.615 crypto1 00:36:06.615 [2024-07-15 20:49:58.762788] tcp.c: 672:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:36:06.615 [2024-07-15 20:49:58.779082] tcp.c: 967:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 10.0.0.2 port 4420 *** 00:36:06.615 20:49:58 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:06.615 20:49:58 chaining -- bdev/chaining.sh@85 -- # 
update_stats 00:36:06.616 20:49:58 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:36:06.616 20:49:58 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:06.616 20:49:58 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:36:06.616 20:49:58 chaining -- bdev/chaining.sh@39 -- # opcode= 00:36:06.616 20:49:58 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:06.616 20:49:58 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:36:06.616 20:49:58 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:36:06.616 20:49:58 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:06.616 20:49:58 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:36:06.616 20:49:58 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:06.616 20:49:58 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:06.616 20:49:58 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=12 00:36:06.616 20:49:58 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:36:06.616 20:49:58 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:06.616 20:49:58 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:06.616 20:49:58 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:36:06.616 20:49:58 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:06.616 20:49:58 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:36:06.616 20:49:58 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:36:06.616 20:49:58 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:36:06.616 20:49:58 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:06.616 20:49:58 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:06.616 20:49:58 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:06.616 20:49:58 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]= 00:36:06.616 20:49:58 chaining -- 
bdev/chaining.sh@53 -- # get_stat executed decrypt 00:36:06.616 20:49:58 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:06.616 20:49:58 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:06.616 20:49:58 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:36:06.616 20:49:58 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:06.616 20:49:58 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:36:06.616 20:49:58 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:36:06.616 20:49:58 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:36:06.616 20:49:58 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:06.616 20:49:58 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:06.616 20:49:58 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:06.616 20:49:58 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=12 00:36:06.616 20:49:58 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:36:06.616 20:49:58 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:06.616 20:49:58 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:06.616 20:49:58 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:36:06.616 20:49:58 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:06.616 20:49:58 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:36:06.616 20:49:58 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:36:06.616 20:49:58 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:36:06.616 20:49:58 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:06.616 20:49:58 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:06.616 20:49:58 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:06.616 20:49:58 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:36:06.616 20:49:58 chaining -- bdev/chaining.sh@88 -- # 
dd if=/dev/urandom of=/tmp/tmp.QtWbz1jho9 bs=1K count=64 00:36:06.616 64+0 records in 00:36:06.616 64+0 records out 00:36:06.616 65536 bytes (66 kB, 64 KiB) copied, 0.00105854 s, 61.9 MB/s 00:36:06.616 20:49:58 chaining -- bdev/chaining.sh@89 -- # spdk_dd --if /tmp/tmp.QtWbz1jho9 --ob Nvme0n1 --bs 65536 --count 1 00:36:06.616 20:49:58 chaining -- bdev/chaining.sh@25 -- # local config 00:36:06.616 20:49:58 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:36:06.616 20:49:58 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:36:06.616 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:36:06.873 20:49:59 chaining -- bdev/chaining.sh@31 -- # config='{ 00:36:06.873 "subsystems": [ 00:36:06.873 { 00:36:06.873 "subsystem": "bdev", 00:36:06.873 "config": [ 00:36:06.873 { 00:36:06.873 "method": "bdev_nvme_attach_controller", 00:36:06.873 "params": { 00:36:06.873 "trtype": "tcp", 00:36:06.873 "adrfam": "IPv4", 00:36:06.873 "name": "Nvme0", 00:36:06.873 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:36:06.873 "traddr": "10.0.0.2", 00:36:06.873 "trsvcid": "4420" 00:36:06.873 } 00:36:06.873 }, 00:36:06.873 { 00:36:06.873 "method": "bdev_set_options", 00:36:06.873 "params": { 00:36:06.873 "bdev_auto_examine": false 00:36:06.873 } 00:36:06.873 } 00:36:06.873 ] 00:36:06.873 } 00:36:06.873 ] 00:36:06.873 }' 00:36:06.873 20:49:59 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /tmp/tmp.QtWbz1jho9 --ob Nvme0n1 --bs 65536 --count 1 00:36:06.873 20:49:59 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:36:06.873 "subsystems": [ 00:36:06.873 { 00:36:06.873 "subsystem": "bdev", 00:36:06.873 "config": [ 00:36:06.873 { 00:36:06.873 "method": "bdev_nvme_attach_controller", 00:36:06.873 "params": { 
00:36:06.873 "trtype": "tcp", 00:36:06.873 "adrfam": "IPv4", 00:36:06.873 "name": "Nvme0", 00:36:06.873 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:36:06.873 "traddr": "10.0.0.2", 00:36:06.873 "trsvcid": "4420" 00:36:06.873 } 00:36:06.873 }, 00:36:06.873 { 00:36:06.873 "method": "bdev_set_options", 00:36:06.873 "params": { 00:36:06.873 "bdev_auto_examine": false 00:36:06.873 } 00:36:06.873 } 00:36:06.873 ] 00:36:06.873 } 00:36:06.873 ] 00:36:06.873 }' 00:36:06.873 [2024-07-15 20:49:59.069664] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 00:36:06.873 [2024-07-15 20:49:59.069731] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1567202 ] 00:36:06.873 [2024-07-15 20:49:59.202635] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:36:07.131 [2024-07-15 20:49:59.307215] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:36:07.388  Copying: 64/64 [kB] (average 12 MBps) 00:36:07.388 00:36:07.388 20:49:59 chaining -- bdev/chaining.sh@90 -- # get_stat sequence_executed 00:36:07.388 20:49:59 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:07.388 20:49:59 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:36:07.388 20:49:59 chaining -- bdev/chaining.sh@39 -- # opcode= 00:36:07.389 20:49:59 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:07.389 20:49:59 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:36:07.389 20:49:59 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:36:07.389 20:49:59 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:36:07.389 20:49:59 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:07.389 20:49:59 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:07.389 20:49:59 chaining -- common/autotest_common.sh@587 -- 
# [[ 0 == 0 ]] 00:36:07.646 20:49:59 chaining -- bdev/chaining.sh@90 -- # (( 13 == stats[sequence_executed] + 1 )) 00:36:07.646 20:49:59 chaining -- bdev/chaining.sh@91 -- # get_stat executed encrypt 00:36:07.646 20:49:59 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:07.646 20:49:59 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:07.646 20:49:59 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:36:07.646 20:49:59 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:07.647 20:49:59 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:36:07.647 20:49:59 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:36:07.647 20:49:59 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:36:07.647 20:49:59 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:07.647 20:49:59 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:07.647 20:49:59 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:07.647 20:49:59 chaining -- bdev/chaining.sh@91 -- # (( 2 == stats[encrypt_executed] + 2 )) 00:36:07.647 20:49:59 chaining -- bdev/chaining.sh@92 -- # get_stat executed decrypt 00:36:07.647 20:49:59 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:07.647 20:49:59 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:07.647 20:49:59 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:36:07.647 20:49:59 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:07.647 20:49:59 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:36:07.647 20:49:59 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:36:07.647 20:49:59 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:36:07.647 20:49:59 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:07.647 20:49:59 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:07.647 20:49:59 chaining -- 
common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:07.647 20:49:59 chaining -- bdev/chaining.sh@92 -- # (( 12 == stats[decrypt_executed] )) 00:36:07.647 20:49:59 chaining -- bdev/chaining.sh@95 -- # get_stat executed copy 00:36:07.647 20:49:59 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:07.647 20:49:59 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:07.647 20:49:59 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:36:07.647 20:49:59 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:07.647 20:49:59 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:36:07.647 20:49:59 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:36:07.647 20:49:59 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:36:07.647 20:49:59 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:07.647 20:49:59 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:07.647 20:49:59 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:07.647 20:49:59 chaining -- bdev/chaining.sh@95 -- # (( 4 == stats[copy_executed] )) 00:36:07.647 20:49:59 chaining -- bdev/chaining.sh@96 -- # update_stats 00:36:07.647 20:49:59 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:36:07.647 20:49:59 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:07.647 20:49:59 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:36:07.647 20:49:59 chaining -- bdev/chaining.sh@39 -- # opcode= 00:36:07.647 20:49:59 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:07.647 20:49:59 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:36:07.647 20:49:59 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:36:07.647 20:49:59 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:36:07.647 20:49:59 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:07.647 20:49:59 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:07.647 
20:49:59 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:07.647 20:49:59 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=13 00:36:07.647 20:49:59 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:36:07.647 20:49:59 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:07.647 20:49:59 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:07.647 20:49:59 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:36:07.647 20:49:59 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:07.647 20:49:59 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:36:07.647 20:49:59 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:36:07.647 20:49:59 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:36:07.647 20:49:59 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:07.647 20:49:59 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:07.647 20:49:59 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:07.647 20:49:59 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=2 00:36:07.647 20:49:59 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:36:07.647 20:49:59 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:07.647 20:49:59 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:07.647 20:49:59 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:36:07.647 20:49:59 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:07.647 20:49:59 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:36:07.647 20:50:00 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:36:07.647 20:50:00 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:36:07.647 20:50:00 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:07.647 20:50:00 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:07.647 20:50:00 
chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:07.905 20:50:00 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=12 00:36:07.906 20:50:00 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:36:07.906 20:50:00 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:07.906 20:50:00 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:07.906 20:50:00 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:36:07.906 20:50:00 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:07.906 20:50:00 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:36:07.906 20:50:00 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:36:07.906 20:50:00 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:07.906 20:50:00 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:07.906 20:50:00 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:36:07.906 20:50:00 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:07.906 20:50:00 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:36:07.906 20:50:00 chaining -- bdev/chaining.sh@99 -- # spdk_dd --of /tmp/tmp.zWApcpibKF --ib Nvme0n1 --bs 65536 --count 1 00:36:07.906 20:50:00 chaining -- bdev/chaining.sh@25 -- # local config 00:36:07.906 20:50:00 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:36:07.906 20:50:00 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:36:07.906 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:36:07.906 20:50:00 chaining -- bdev/chaining.sh@31 -- # config='{ 00:36:07.906 "subsystems": [ 00:36:07.906 { 00:36:07.906 "subsystem": "bdev", 00:36:07.906 "config": [ 00:36:07.906 { 00:36:07.906 "method": "bdev_nvme_attach_controller", 00:36:07.906 
"params": { 00:36:07.906 "trtype": "tcp", 00:36:07.906 "adrfam": "IPv4", 00:36:07.906 "name": "Nvme0", 00:36:07.906 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:36:07.906 "traddr": "10.0.0.2", 00:36:07.906 "trsvcid": "4420" 00:36:07.906 } 00:36:07.906 }, 00:36:07.906 { 00:36:07.906 "method": "bdev_set_options", 00:36:07.906 "params": { 00:36:07.906 "bdev_auto_examine": false 00:36:07.906 } 00:36:07.906 } 00:36:07.906 ] 00:36:07.906 } 00:36:07.906 ] 00:36:07.906 }' 00:36:07.906 20:50:00 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --of /tmp/tmp.zWApcpibKF --ib Nvme0n1 --bs 65536 --count 1 00:36:07.906 20:50:00 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:36:07.906 "subsystems": [ 00:36:07.906 { 00:36:07.906 "subsystem": "bdev", 00:36:07.906 "config": [ 00:36:07.906 { 00:36:07.906 "method": "bdev_nvme_attach_controller", 00:36:07.906 "params": { 00:36:07.906 "trtype": "tcp", 00:36:07.906 "adrfam": "IPv4", 00:36:07.906 "name": "Nvme0", 00:36:07.906 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:36:07.906 "traddr": "10.0.0.2", 00:36:07.906 "trsvcid": "4420" 00:36:07.906 } 00:36:07.906 }, 00:36:07.906 { 00:36:07.906 "method": "bdev_set_options", 00:36:07.906 "params": { 00:36:07.906 "bdev_auto_examine": false 00:36:07.906 } 00:36:07.906 } 00:36:07.906 ] 00:36:07.906 } 00:36:07.906 ] 00:36:07.906 }' 00:36:07.906 [2024-07-15 20:50:00.194048] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
00:36:07.906 [2024-07-15 20:50:00.194113] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1567412 ] 00:36:08.165 [2024-07-15 20:50:00.323020] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:36:08.165 [2024-07-15 20:50:00.419994] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:36:08.683  Copying: 64/64 [kB] (average 20 MBps) 00:36:08.684 00:36:08.684 20:50:00 chaining -- bdev/chaining.sh@100 -- # get_stat sequence_executed 00:36:08.684 20:50:00 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:08.684 20:50:00 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:36:08.684 20:50:00 chaining -- bdev/chaining.sh@39 -- # opcode= 00:36:08.684 20:50:00 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:08.684 20:50:00 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:36:08.684 20:50:00 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:36:08.684 20:50:00 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:08.684 20:50:00 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:08.684 20:50:00 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:36:08.684 20:50:00 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:08.684 20:50:00 chaining -- bdev/chaining.sh@100 -- # (( 14 == stats[sequence_executed] + 1 )) 00:36:08.684 20:50:00 chaining -- bdev/chaining.sh@101 -- # get_stat executed encrypt 00:36:08.684 20:50:00 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:08.684 20:50:00 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:08.684 20:50:00 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:36:08.684 20:50:00 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:08.684 20:50:00 chaining -- bdev/chaining.sh@40 -- # [[ -z 
encrypt ]] 00:36:08.684 20:50:00 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:36:08.684 20:50:00 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:36:08.684 20:50:00 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:08.684 20:50:00 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:08.684 20:50:00 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:08.684 20:50:00 chaining -- bdev/chaining.sh@101 -- # (( 2 == stats[encrypt_executed] )) 00:36:08.684 20:50:00 chaining -- bdev/chaining.sh@102 -- # get_stat executed decrypt 00:36:08.684 20:50:00 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:08.684 20:50:00 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:08.684 20:50:00 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:36:08.684 20:50:00 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:08.684 20:50:00 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:36:08.684 20:50:00 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:36:08.684 20:50:00 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:36:08.684 20:50:00 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:08.684 20:50:00 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:08.684 20:50:00 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:08.684 20:50:00 chaining -- bdev/chaining.sh@102 -- # (( 14 == stats[decrypt_executed] + 2 )) 00:36:08.684 20:50:00 chaining -- bdev/chaining.sh@103 -- # get_stat executed copy 00:36:08.684 20:50:00 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:08.684 20:50:00 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:08.684 20:50:00 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:36:08.684 20:50:00 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:08.684 20:50:00 chaining -- bdev/chaining.sh@40 -- # 
[[ -z copy ]] 00:36:08.684 20:50:00 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:36:08.684 20:50:00 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:08.684 20:50:00 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:08.684 20:50:00 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:36:08.684 20:50:00 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:08.684 20:50:01 chaining -- bdev/chaining.sh@103 -- # (( 4 == stats[copy_executed] )) 00:36:08.684 20:50:01 chaining -- bdev/chaining.sh@104 -- # cmp /tmp/tmp.QtWbz1jho9 /tmp/tmp.zWApcpibKF 00:36:08.684 20:50:01 chaining -- bdev/chaining.sh@105 -- # spdk_dd --if /dev/zero --ob Nvme0n1 --bs 65536 --count 1 00:36:08.684 20:50:01 chaining -- bdev/chaining.sh@25 -- # local config 00:36:08.684 20:50:01 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:36:08.684 20:50:01 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:36:08.684 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:36:08.943 20:50:01 chaining -- bdev/chaining.sh@31 -- # config='{ 00:36:08.943 "subsystems": [ 00:36:08.943 { 00:36:08.943 "subsystem": "bdev", 00:36:08.943 "config": [ 00:36:08.943 { 00:36:08.943 "method": "bdev_nvme_attach_controller", 00:36:08.943 "params": { 00:36:08.943 "trtype": "tcp", 00:36:08.943 "adrfam": "IPv4", 00:36:08.943 "name": "Nvme0", 00:36:08.943 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:36:08.943 "traddr": "10.0.0.2", 00:36:08.943 "trsvcid": "4420" 00:36:08.943 } 00:36:08.943 }, 00:36:08.943 { 00:36:08.943 "method": "bdev_set_options", 00:36:08.943 "params": { 00:36:08.943 "bdev_auto_examine": false 00:36:08.943 } 00:36:08.943 } 00:36:08.943 ] 00:36:08.943 } 00:36:08.943 ] 00:36:08.943 }' 00:36:08.943 
20:50:01 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /dev/zero --ob Nvme0n1 --bs 65536 --count 1 00:36:08.943 20:50:01 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:36:08.943 "subsystems": [ 00:36:08.943 { 00:36:08.943 "subsystem": "bdev", 00:36:08.943 "config": [ 00:36:08.943 { 00:36:08.943 "method": "bdev_nvme_attach_controller", 00:36:08.943 "params": { 00:36:08.943 "trtype": "tcp", 00:36:08.943 "adrfam": "IPv4", 00:36:08.943 "name": "Nvme0", 00:36:08.943 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:36:08.943 "traddr": "10.0.0.2", 00:36:08.943 "trsvcid": "4420" 00:36:08.943 } 00:36:08.943 }, 00:36:08.943 { 00:36:08.943 "method": "bdev_set_options", 00:36:08.943 "params": { 00:36:08.943 "bdev_auto_examine": false 00:36:08.943 } 00:36:08.943 } 00:36:08.943 ] 00:36:08.943 } 00:36:08.943 ] 00:36:08.943 }' 00:36:08.943 [2024-07-15 20:50:01.131204] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
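The repeated `get_stat` calls in the log all follow one pattern: fetch `accel_get_stats` over RPC, then pull out either the top-level `sequence_executed` counter or a per-opcode `executed` counter with a jq `select`. A small sketch of those two filters (editor's illustration; the stats JSON below is a hypothetical sample shaped like the fields the log queries, not real RPC output):

```shell
#!/bin/sh
# Hypothetical accel_get_stats payload with the fields chaining.sh reads.
stats='{"sequence_executed":15,"operations":[
  {"opcode":"encrypt","executed":4},
  {"opcode":"decrypt","executed":14},
  {"opcode":"copy","executed":4}]}'

# get_stat with no opcode: top-level counter (bdev/chaining.sh@41).
seq_exec=$(echo "$stats" | jq -r .sequence_executed)

# get_stat with an opcode: select the matching operations[] entry
# and read its executed count (bdev/chaining.sh@44).
dec_exec=$(echo "$stats" | jq -r '.operations[] | select(.opcode == "decrypt").executed')

echo "$seq_exec $dec_exec"
```

The surrounding `(( 14 == stats[decrypt_executed] + 2 ))` checks then compare these counters before and after each `spdk_dd` run to verify that the expected encrypt/decrypt/copy operations actually executed.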
00:36:08.943 [2024-07-15 20:50:01.131280] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1567516 ] 00:36:08.943 [2024-07-15 20:50:01.264010] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:36:09.201 [2024-07-15 20:50:01.368338] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:36:09.459  Copying: 64/64 [kB] (average 31 MBps) 00:36:09.459 00:36:09.459 20:50:01 chaining -- bdev/chaining.sh@106 -- # update_stats 00:36:09.459 20:50:01 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:36:09.459 20:50:01 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:09.459 20:50:01 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:36:09.459 20:50:01 chaining -- bdev/chaining.sh@39 -- # opcode= 00:36:09.459 20:50:01 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:09.459 20:50:01 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:36:09.459 20:50:01 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:36:09.459 20:50:01 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:36:09.459 20:50:01 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:09.459 20:50:01 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:09.459 20:50:01 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:09.717 20:50:01 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=15 00:36:09.717 20:50:01 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:36:09.717 20:50:01 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:09.717 20:50:01 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:09.717 20:50:01 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:36:09.717 20:50:01 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 
00:36:09.717 20:50:01 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:36:09.717 20:50:01 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:36:09.717 20:50:01 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:36:09.717 20:50:01 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:09.717 20:50:01 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:09.717 20:50:01 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:09.717 20:50:01 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=4 00:36:09.717 20:50:01 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:36:09.717 20:50:01 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:09.717 20:50:01 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:09.717 20:50:01 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:36:09.717 20:50:01 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:09.717 20:50:01 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:36:09.717 20:50:01 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:36:09.717 20:50:01 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:36:09.717 20:50:01 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:09.717 20:50:01 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:09.717 20:50:01 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:09.718 20:50:01 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=14 00:36:09.718 20:50:01 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:36:09.718 20:50:01 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:09.718 20:50:01 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:09.718 20:50:01 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:36:09.718 20:50:01 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:09.718 
20:50:01 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:36:09.718 20:50:01 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:36:09.718 20:50:01 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:09.718 20:50:01 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:09.718 20:50:01 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:36:09.718 20:50:01 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:09.718 20:50:01 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:36:09.718 20:50:01 chaining -- bdev/chaining.sh@109 -- # spdk_dd --if /tmp/tmp.QtWbz1jho9 --ob Nvme0n1 --bs 4096 --count 16 00:36:09.718 20:50:01 chaining -- bdev/chaining.sh@25 -- # local config 00:36:09.718 20:50:01 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:36:09.718 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:36:09.718 20:50:01 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:36:09.718 20:50:02 chaining -- bdev/chaining.sh@31 -- # config='{ 00:36:09.718 "subsystems": [ 00:36:09.718 { 00:36:09.718 "subsystem": "bdev", 00:36:09.718 "config": [ 00:36:09.718 { 00:36:09.718 "method": "bdev_nvme_attach_controller", 00:36:09.718 "params": { 00:36:09.718 "trtype": "tcp", 00:36:09.718 "adrfam": "IPv4", 00:36:09.718 "name": "Nvme0", 00:36:09.718 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:36:09.718 "traddr": "10.0.0.2", 00:36:09.718 "trsvcid": "4420" 00:36:09.718 } 00:36:09.718 }, 00:36:09.718 { 00:36:09.718 "method": "bdev_set_options", 00:36:09.718 "params": { 00:36:09.718 "bdev_auto_examine": false 00:36:09.718 } 00:36:09.718 } 00:36:09.718 ] 00:36:09.718 } 00:36:09.718 ] 00:36:09.718 }' 00:36:09.718 20:50:02 chaining -- bdev/chaining.sh@33 -- # echo '{ 
00:36:09.718 "subsystems": [ 00:36:09.718 { 00:36:09.718 "subsystem": "bdev", 00:36:09.718 "config": [ 00:36:09.718 { 00:36:09.718 "method": "bdev_nvme_attach_controller", 00:36:09.718 "params": { 00:36:09.718 "trtype": "tcp", 00:36:09.718 "adrfam": "IPv4", 00:36:09.718 "name": "Nvme0", 00:36:09.718 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:36:09.718 "traddr": "10.0.0.2", 00:36:09.718 "trsvcid": "4420" 00:36:09.718 } 00:36:09.718 }, 00:36:09.718 { 00:36:09.718 "method": "bdev_set_options", 00:36:09.718 "params": { 00:36:09.718 "bdev_auto_examine": false 00:36:09.718 } 00:36:09.718 } 00:36:09.718 ] 00:36:09.718 } 00:36:09.718 ] 00:36:09.718 }' 00:36:09.718 20:50:02 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /tmp/tmp.QtWbz1jho9 --ob Nvme0n1 --bs 4096 --count 16 00:36:09.718 [2024-07-15 20:50:02.080659] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 00:36:09.718 [2024-07-15 20:50:02.080727] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1567645 ] 00:36:09.976 [2024-07-15 20:50:02.210673] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:36:09.976 [2024-07-15 20:50:02.317101] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:36:10.492  Copying: 64/64 [kB] (average 12 MBps) 00:36:10.492 00:36:10.492 20:50:02 chaining -- bdev/chaining.sh@110 -- # get_stat sequence_executed 00:36:10.492 20:50:02 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:10.492 20:50:02 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:36:10.492 20:50:02 chaining -- bdev/chaining.sh@39 -- # opcode= 00:36:10.492 20:50:02 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:10.492 20:50:02 chaining -- bdev/chaining.sh@40 -- # [[ -z '' 
]] 00:36:10.493 20:50:02 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:36:10.493 20:50:02 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:36:10.493 20:50:02 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:10.493 20:50:02 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:10.493 20:50:02 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:10.493 20:50:02 chaining -- bdev/chaining.sh@110 -- # (( 31 == stats[sequence_executed] + 16 )) 00:36:10.493 20:50:02 chaining -- bdev/chaining.sh@111 -- # get_stat executed encrypt 00:36:10.493 20:50:02 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:10.493 20:50:02 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:10.493 20:50:02 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:36:10.493 20:50:02 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:10.493 20:50:02 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:36:10.493 20:50:02 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:36:10.493 20:50:02 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:36:10.493 20:50:02 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:10.493 20:50:02 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:10.493 20:50:02 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:10.493 20:50:02 chaining -- bdev/chaining.sh@111 -- # (( 36 == stats[encrypt_executed] + 32 )) 00:36:10.493 20:50:02 chaining -- bdev/chaining.sh@112 -- # get_stat executed decrypt 00:36:10.493 20:50:02 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:10.493 20:50:02 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:10.493 20:50:02 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:36:10.493 20:50:02 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:10.493 20:50:02 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:36:10.493 
20:50:02 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:36:10.493 20:50:02 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:10.493 20:50:02 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:10.493 20:50:02 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:36:10.493 20:50:02 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:10.750 20:50:02 chaining -- bdev/chaining.sh@112 -- # (( 14 == stats[decrypt_executed] )) 00:36:10.750 20:50:02 chaining -- bdev/chaining.sh@113 -- # get_stat executed copy 00:36:10.750 20:50:02 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:10.750 20:50:02 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:10.750 20:50:02 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:36:10.750 20:50:02 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:10.750 20:50:02 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:36:10.750 20:50:02 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:36:10.750 20:50:02 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:10.750 20:50:02 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:10.750 20:50:02 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:36:10.750 20:50:02 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:10.750 20:50:02 chaining -- bdev/chaining.sh@113 -- # (( 4 == stats[copy_executed] )) 00:36:10.750 20:50:02 chaining -- bdev/chaining.sh@114 -- # update_stats 00:36:10.750 20:50:02 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:36:10.750 20:50:02 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:10.750 20:50:02 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:36:10.750 20:50:02 chaining -- bdev/chaining.sh@39 -- # opcode= 00:36:10.750 20:50:02 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:10.750 20:50:02 
chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:36:10.750 20:50:02 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:36:10.750 20:50:02 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:36:10.750 20:50:02 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:10.750 20:50:02 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:10.750 20:50:02 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:10.750 20:50:02 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=31 00:36:10.750 20:50:02 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:36:10.750 20:50:02 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:10.750 20:50:02 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:10.750 20:50:02 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:36:10.750 20:50:02 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:10.750 20:50:02 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:36:10.750 20:50:02 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:36:10.750 20:50:02 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:36:10.750 20:50:02 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:10.750 20:50:02 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:10.750 20:50:03 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:10.750 20:50:03 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=36 00:36:10.750 20:50:03 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:36:10.750 20:50:03 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:10.750 20:50:03 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:10.750 20:50:03 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:36:10.750 20:50:03 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:10.750 20:50:03 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt 
]] 00:36:10.750 20:50:03 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:36:10.750 20:50:03 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:10.750 20:50:03 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:10.750 20:50:03 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:36:10.751 20:50:03 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:10.751 20:50:03 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=14 00:36:10.751 20:50:03 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:36:10.751 20:50:03 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:10.751 20:50:03 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:10.751 20:50:03 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:36:10.751 20:50:03 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:10.751 20:50:03 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:36:10.751 20:50:03 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:36:10.751 20:50:03 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:36:10.751 20:50:03 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:10.751 20:50:03 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:10.751 20:50:03 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:10.751 20:50:03 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:36:10.751 20:50:03 chaining -- bdev/chaining.sh@117 -- # : 00:36:10.751 20:50:03 chaining -- bdev/chaining.sh@118 -- # spdk_dd --of /tmp/tmp.zWApcpibKF --ib Nvme0n1 --bs 4096 --count 16 00:36:10.751 20:50:03 chaining -- bdev/chaining.sh@25 -- # local config 00:36:10.751 20:50:03 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:36:10.751 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:36:10.751 
20:50:03 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:36:11.008 20:50:03 chaining -- bdev/chaining.sh@31 -- # config='{ 00:36:11.008 "subsystems": [ 00:36:11.008 { 00:36:11.008 "subsystem": "bdev", 00:36:11.008 "config": [ 00:36:11.008 { 00:36:11.008 "method": "bdev_nvme_attach_controller", 00:36:11.008 "params": { 00:36:11.008 "trtype": "tcp", 00:36:11.008 "adrfam": "IPv4", 00:36:11.008 "name": "Nvme0", 00:36:11.008 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:36:11.008 "traddr": "10.0.0.2", 00:36:11.008 "trsvcid": "4420" 00:36:11.008 } 00:36:11.009 }, 00:36:11.009 { 00:36:11.009 "method": "bdev_set_options", 00:36:11.009 "params": { 00:36:11.009 "bdev_auto_examine": false 00:36:11.009 } 00:36:11.009 } 00:36:11.009 ] 00:36:11.009 } 00:36:11.009 ] 00:36:11.009 }' 00:36:11.009 20:50:03 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --of /tmp/tmp.zWApcpibKF --ib Nvme0n1 --bs 4096 --count 16 00:36:11.009 20:50:03 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:36:11.009 "subsystems": [ 00:36:11.009 { 00:36:11.009 "subsystem": "bdev", 00:36:11.009 "config": [ 00:36:11.009 { 00:36:11.009 "method": "bdev_nvme_attach_controller", 00:36:11.009 "params": { 00:36:11.009 "trtype": "tcp", 00:36:11.009 "adrfam": "IPv4", 00:36:11.009 "name": "Nvme0", 00:36:11.009 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:36:11.009 "traddr": "10.0.0.2", 00:36:11.009 "trsvcid": "4420" 00:36:11.009 } 00:36:11.009 }, 00:36:11.009 { 00:36:11.009 "method": "bdev_set_options", 00:36:11.009 "params": { 00:36:11.009 "bdev_auto_examine": false 00:36:11.009 } 00:36:11.009 } 00:36:11.009 ] 00:36:11.009 } 00:36:11.009 ] 00:36:11.009 }' 00:36:11.009 [2024-07-15 20:50:03.202988] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
00:36:11.009 [2024-07-15 20:50:03.203056] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1567851 ] 00:36:11.009 [2024-07-15 20:50:03.332179] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:36:11.266 [2024-07-15 20:50:03.433716] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:36:11.780  Copying: 64/64 [kB] (average 1306 kBps) 00:36:11.780 00:36:11.780 20:50:03 chaining -- bdev/chaining.sh@119 -- # get_stat sequence_executed 00:36:11.780 20:50:03 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:11.780 20:50:03 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:36:11.780 20:50:03 chaining -- bdev/chaining.sh@39 -- # opcode= 00:36:11.780 20:50:03 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:11.780 20:50:03 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:36:11.780 20:50:03 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:36:11.780 20:50:03 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:36:11.780 20:50:03 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:11.780 20:50:03 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:11.780 20:50:03 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:11.780 20:50:03 chaining -- bdev/chaining.sh@119 -- # (( 47 == stats[sequence_executed] + 16 )) 00:36:11.780 20:50:03 chaining -- bdev/chaining.sh@120 -- # get_stat executed encrypt 00:36:11.780 20:50:03 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:11.780 20:50:03 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:11.780 20:50:03 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:36:11.780 20:50:03 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:11.780 20:50:03 chaining -- bdev/chaining.sh@40 -- # [[ -z 
encrypt ]] 00:36:11.780 20:50:03 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:36:11.780 20:50:03 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:11.780 20:50:03 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:11.780 20:50:03 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:36:11.780 20:50:03 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:11.780 20:50:04 chaining -- bdev/chaining.sh@120 -- # (( 36 == stats[encrypt_executed] )) 00:36:11.780 20:50:04 chaining -- bdev/chaining.sh@121 -- # get_stat executed decrypt 00:36:11.780 20:50:04 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:11.780 20:50:04 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:11.780 20:50:04 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:36:11.780 20:50:04 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:11.780 20:50:04 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:36:11.780 20:50:04 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:36:11.780 20:50:04 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:36:11.780 20:50:04 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:11.780 20:50:04 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:11.780 20:50:04 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:11.780 20:50:04 chaining -- bdev/chaining.sh@121 -- # (( 46 == stats[decrypt_executed] + 32 )) 00:36:11.780 20:50:04 chaining -- bdev/chaining.sh@122 -- # get_stat executed copy 00:36:11.780 20:50:04 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:11.780 20:50:04 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:11.780 20:50:04 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:36:11.780 20:50:04 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:11.780 20:50:04 chaining -- bdev/chaining.sh@40 -- 
# [[ -z copy ]] 00:36:11.780 20:50:04 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:36:11.780 20:50:04 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:11.780 20:50:04 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:36:11.780 20:50:04 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:11.781 20:50:04 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:11.781 20:50:04 chaining -- bdev/chaining.sh@122 -- # (( 4 == stats[copy_executed] )) 00:36:11.781 20:50:04 chaining -- bdev/chaining.sh@123 -- # cmp /tmp/tmp.QtWbz1jho9 /tmp/tmp.zWApcpibKF 00:36:11.781 20:50:04 chaining -- bdev/chaining.sh@125 -- # trap - SIGINT SIGTERM EXIT 00:36:11.781 20:50:04 chaining -- bdev/chaining.sh@126 -- # tgtcleanup 00:36:11.781 20:50:04 chaining -- bdev/chaining.sh@58 -- # rm -f /tmp/tmp.QtWbz1jho9 /tmp/tmp.zWApcpibKF 00:36:11.781 20:50:04 chaining -- bdev/chaining.sh@59 -- # nvmftestfini 00:36:11.781 20:50:04 chaining -- nvmf/common.sh@488 -- # nvmfcleanup 00:36:11.781 20:50:04 chaining -- nvmf/common.sh@117 -- # sync 00:36:11.781 20:50:04 chaining -- nvmf/common.sh@119 -- # '[' tcp == tcp ']' 00:36:11.781 20:50:04 chaining -- nvmf/common.sh@120 -- # set +e 00:36:11.781 20:50:04 chaining -- nvmf/common.sh@121 -- # for i in {1..20} 00:36:11.781 20:50:04 chaining -- nvmf/common.sh@122 -- # modprobe -v -r nvme-tcp 00:36:11.781 rmmod nvme_tcp 00:36:11.781 rmmod nvme_fabrics 00:36:12.039 rmmod nvme_keyring 00:36:12.039 20:50:04 chaining -- nvmf/common.sh@123 -- # modprobe -v -r nvme-fabrics 00:36:12.039 20:50:04 chaining -- nvmf/common.sh@124 -- # set -e 00:36:12.039 20:50:04 chaining -- nvmf/common.sh@125 -- # return 0 00:36:12.039 20:50:04 chaining -- nvmf/common.sh@489 -- # '[' -n 1566978 ']' 00:36:12.039 20:50:04 chaining -- nvmf/common.sh@490 -- # killprocess 1566978 00:36:12.039 20:50:04 chaining -- common/autotest_common.sh@948 -- # '[' -z 1566978 ']' 00:36:12.039 20:50:04 
chaining -- common/autotest_common.sh@952 -- # kill -0 1566978 00:36:12.039 20:50:04 chaining -- common/autotest_common.sh@953 -- # uname 00:36:12.039 20:50:04 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:36:12.039 20:50:04 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1566978 00:36:12.039 20:50:04 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:36:12.039 20:50:04 chaining -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:36:12.039 20:50:04 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1566978' 00:36:12.039 killing process with pid 1566978 00:36:12.039 20:50:04 chaining -- common/autotest_common.sh@967 -- # kill 1566978 00:36:12.039 20:50:04 chaining -- common/autotest_common.sh@972 -- # wait 1566978 00:36:12.331 20:50:04 chaining -- nvmf/common.sh@492 -- # '[' '' == iso ']' 00:36:12.331 20:50:04 chaining -- nvmf/common.sh@495 -- # [[ tcp == \t\c\p ]] 00:36:12.331 20:50:04 chaining -- nvmf/common.sh@496 -- # nvmf_tcp_fini 00:36:12.331 20:50:04 chaining -- nvmf/common.sh@274 -- # [[ nvmf_tgt_ns_spdk == \n\v\m\f\_\t\g\t\_\n\s ]] 00:36:12.331 20:50:04 chaining -- nvmf/common.sh@278 -- # remove_spdk_ns 00:36:12.331 20:50:04 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:36:12.331 20:50:04 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:36:12.331 20:50:04 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:36:12.331 20:50:04 chaining -- nvmf/common.sh@279 -- # ip -4 addr flush nvmf_init_if 00:36:12.331 20:50:04 chaining -- bdev/chaining.sh@129 -- # trap 'bperfcleanup; exit 1' SIGINT SIGTERM EXIT 00:36:12.331 20:50:04 chaining -- bdev/chaining.sh@132 -- # bperfpid=1568062 00:36:12.331 20:50:04 chaining -- bdev/chaining.sh@131 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z 00:36:12.331 
20:50:04 chaining -- bdev/chaining.sh@134 -- # waitforlisten 1568062 00:36:12.331 20:50:04 chaining -- common/autotest_common.sh@829 -- # '[' -z 1568062 ']' 00:36:12.331 20:50:04 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:36:12.331 20:50:04 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:36:12.331 20:50:04 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:36:12.331 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:36:12.331 20:50:04 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:36:12.331 20:50:04 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:12.331 [2024-07-15 20:50:04.675103] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 00:36:12.331 [2024-07-15 20:50:04.675171] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1568062 ] 00:36:12.588 [2024-07-15 20:50:04.804591] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:36:12.588 [2024-07-15 20:50:04.916596] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:36:13.522 20:50:05 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:36:13.522 20:50:05 chaining -- common/autotest_common.sh@862 -- # return 0 00:36:13.522 20:50:05 chaining -- bdev/chaining.sh@135 -- # rpc_cmd 00:36:13.522 20:50:05 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:13.522 20:50:05 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:13.522 malloc0 00:36:13.522 true 00:36:13.522 true 00:36:13.522 [2024-07-15 20:50:05.780899] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:36:13.522 crypto0 00:36:13.522 
[2024-07-15 20:50:05.788932] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:36:13.522 crypto1 00:36:13.522 20:50:05 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:13.522 20:50:05 chaining -- bdev/chaining.sh@145 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:36:13.780 Running I/O for 5 seconds... 00:36:19.045 00:36:19.045 Latency(us) 00:36:19.045 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:36:19.045 Job: crypto1 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:36:19.045 Verification LBA range: start 0x0 length 0x2000 00:36:19.045 crypto1 : 5.01 11442.95 44.70 0.00 0.00 22310.64 6468.12 14303.94 00:36:19.045 =================================================================================================================== 00:36:19.045 Total : 11442.95 44.70 0.00 0.00 22310.64 6468.12 14303.94 00:36:19.045 0 00:36:19.045 20:50:10 chaining -- bdev/chaining.sh@146 -- # killprocess 1568062 00:36:19.045 20:50:10 chaining -- common/autotest_common.sh@948 -- # '[' -z 1568062 ']' 00:36:19.045 20:50:10 chaining -- common/autotest_common.sh@952 -- # kill -0 1568062 00:36:19.045 20:50:10 chaining -- common/autotest_common.sh@953 -- # uname 00:36:19.045 20:50:10 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:36:19.045 20:50:10 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1568062 00:36:19.045 20:50:11 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:36:19.045 20:50:11 chaining -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:36:19.045 20:50:11 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1568062' 00:36:19.045 killing process with pid 1568062 00:36:19.045 20:50:11 chaining -- common/autotest_common.sh@967 -- # kill 1568062 00:36:19.045 Received shutdown signal, test time was about 5.000000 seconds 
00:36:19.045 00:36:19.045 Latency(us) 00:36:19.045 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:36:19.045 =================================================================================================================== 00:36:19.045 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:36:19.045 20:50:11 chaining -- common/autotest_common.sh@972 -- # wait 1568062 00:36:19.045 20:50:11 chaining -- bdev/chaining.sh@152 -- # bperfpid=1568943 00:36:19.045 20:50:11 chaining -- bdev/chaining.sh@151 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z 00:36:19.045 20:50:11 chaining -- bdev/chaining.sh@154 -- # waitforlisten 1568943 00:36:19.045 20:50:11 chaining -- common/autotest_common.sh@829 -- # '[' -z 1568943 ']' 00:36:19.045 20:50:11 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:36:19.045 20:50:11 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:36:19.045 20:50:11 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:36:19.045 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:36:19.045 20:50:11 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:36:19.045 20:50:11 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:19.045 [2024-07-15 20:50:11.298565] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
00:36:19.045 [2024-07-15 20:50:11.298636] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1568943 ] 00:36:19.303 [2024-07-15 20:50:11.428878] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:36:19.303 [2024-07-15 20:50:11.534897] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:36:19.868 20:50:12 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:36:19.868 20:50:12 chaining -- common/autotest_common.sh@862 -- # return 0 00:36:19.868 20:50:12 chaining -- bdev/chaining.sh@155 -- # rpc_cmd 00:36:19.868 20:50:12 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:19.868 20:50:12 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:20.126 malloc0 00:36:20.126 true 00:36:20.126 true 00:36:20.126 [2024-07-15 20:50:12.362982] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc0 00:36:20.126 [2024-07-15 20:50:12.363029] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:36:20.126 [2024-07-15 20:50:12.363050] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x196a730 00:36:20.126 [2024-07-15 20:50:12.363063] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:36:20.126 [2024-07-15 20:50:12.364139] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:36:20.126 [2024-07-15 20:50:12.364163] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt0 00:36:20.126 pt0 00:36:20.126 [2024-07-15 20:50:12.371013] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:36:20.126 crypto0 00:36:20.126 [2024-07-15 20:50:12.379033] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:36:20.126 crypto1 00:36:20.126 20:50:12 chaining -- 
common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:20.126 20:50:12 chaining -- bdev/chaining.sh@166 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:36:20.126 Running I/O for 5 seconds... 00:36:25.388 00:36:25.388 Latency(us) 00:36:25.388 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:36:25.388 Job: crypto1 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:36:25.388 Verification LBA range: start 0x0 length 0x2000 00:36:25.388 crypto1 : 5.01 9086.93 35.50 0.00 0.00 28096.33 6553.60 16868.40 00:36:25.388 =================================================================================================================== 00:36:25.388 Total : 9086.93 35.50 0.00 0.00 28096.33 6553.60 16868.40 00:36:25.388 0 00:36:25.388 20:50:17 chaining -- bdev/chaining.sh@167 -- # killprocess 1568943 00:36:25.388 20:50:17 chaining -- common/autotest_common.sh@948 -- # '[' -z 1568943 ']' 00:36:25.388 20:50:17 chaining -- common/autotest_common.sh@952 -- # kill -0 1568943 00:36:25.388 20:50:17 chaining -- common/autotest_common.sh@953 -- # uname 00:36:25.388 20:50:17 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:36:25.388 20:50:17 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1568943 00:36:25.388 20:50:17 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:36:25.388 20:50:17 chaining -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:36:25.388 20:50:17 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1568943' 00:36:25.388 killing process with pid 1568943 00:36:25.388 20:50:17 chaining -- common/autotest_common.sh@967 -- # kill 1568943 00:36:25.388 Received shutdown signal, test time was about 5.000000 seconds 00:36:25.388 00:36:25.388 Latency(us) 00:36:25.388 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:36:25.388 
=================================================================================================================== 00:36:25.388 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:36:25.388 20:50:17 chaining -- common/autotest_common.sh@972 -- # wait 1568943 00:36:25.668 20:50:17 chaining -- bdev/chaining.sh@169 -- # trap - SIGINT SIGTERM EXIT 00:36:25.668 20:50:17 chaining -- bdev/chaining.sh@170 -- # killprocess 1568943 00:36:25.668 20:50:17 chaining -- common/autotest_common.sh@948 -- # '[' -z 1568943 ']' 00:36:25.668 20:50:17 chaining -- common/autotest_common.sh@952 -- # kill -0 1568943 00:36:25.668 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh: line 952: kill: (1568943) - No such process 00:36:25.668 20:50:17 chaining -- common/autotest_common.sh@975 -- # echo 'Process with pid 1568943 is not found' 00:36:25.668 Process with pid 1568943 is not found 00:36:25.668 20:50:17 chaining -- bdev/chaining.sh@171 -- # wait 1568943 00:36:25.668 20:50:17 chaining -- bdev/chaining.sh@175 -- # nvmftestinit 00:36:25.668 20:50:17 chaining -- nvmf/common.sh@441 -- # '[' -z tcp ']' 00:36:25.668 20:50:17 chaining -- nvmf/common.sh@446 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:36:25.668 20:50:17 chaining -- nvmf/common.sh@448 -- # prepare_net_devs 00:36:25.668 20:50:17 chaining -- nvmf/common.sh@410 -- # local -g is_hw=no 00:36:25.668 20:50:17 chaining -- nvmf/common.sh@412 -- # remove_spdk_ns 00:36:25.668 20:50:17 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:36:25.668 20:50:17 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:36:25.668 20:50:17 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:36:25.668 20:50:17 chaining -- nvmf/common.sh@414 -- # [[ phy-fallback != virt ]] 00:36:25.668 20:50:17 chaining -- nvmf/common.sh@414 -- # gather_supported_nvmf_pci_devs 00:36:25.668 20:50:17 chaining -- nvmf/common.sh@285 -- # xtrace_disable 00:36:25.668 20:50:17 chaining 
-- common/autotest_common.sh@10 -- # set +x 00:36:25.668 20:50:17 chaining -- nvmf/common.sh@289 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:36:25.668 20:50:17 chaining -- nvmf/common.sh@291 -- # pci_devs=() 00:36:25.668 20:50:17 chaining -- nvmf/common.sh@291 -- # local -a pci_devs 00:36:25.668 20:50:17 chaining -- nvmf/common.sh@292 -- # pci_net_devs=() 00:36:25.668 20:50:17 chaining -- nvmf/common.sh@292 -- # local -a pci_net_devs 00:36:25.668 20:50:17 chaining -- nvmf/common.sh@293 -- # pci_drivers=() 00:36:25.668 20:50:17 chaining -- nvmf/common.sh@293 -- # local -A pci_drivers 00:36:25.668 20:50:17 chaining -- nvmf/common.sh@295 -- # net_devs=() 00:36:25.668 20:50:17 chaining -- nvmf/common.sh@295 -- # local -ga net_devs 00:36:25.668 20:50:17 chaining -- nvmf/common.sh@296 -- # e810=() 00:36:25.668 20:50:17 chaining -- nvmf/common.sh@296 -- # local -ga e810 00:36:25.668 20:50:17 chaining -- nvmf/common.sh@297 -- # x722=() 00:36:25.668 20:50:17 chaining -- nvmf/common.sh@297 -- # local -ga x722 00:36:25.668 20:50:17 chaining -- nvmf/common.sh@298 -- # mlx=() 00:36:25.668 20:50:17 chaining -- nvmf/common.sh@298 -- # local -ga mlx 00:36:25.668 20:50:17 chaining -- nvmf/common.sh@301 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:36:25.668 20:50:17 chaining -- nvmf/common.sh@302 -- # e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:36:25.669 20:50:17 chaining -- nvmf/common.sh@304 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:36:25.669 20:50:17 chaining -- nvmf/common.sh@306 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:36:25.669 20:50:17 chaining -- nvmf/common.sh@308 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:36:25.669 20:50:17 chaining -- nvmf/common.sh@310 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:36:25.669 20:50:17 chaining -- nvmf/common.sh@312 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:36:25.669 20:50:17 chaining -- nvmf/common.sh@314 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:36:25.669 20:50:17 
chaining -- nvmf/common.sh@315 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:36:25.669 20:50:17 chaining -- nvmf/common.sh@317 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:36:25.669 20:50:17 chaining -- nvmf/common.sh@318 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:36:25.669 20:50:17 chaining -- nvmf/common.sh@320 -- # pci_devs+=("${e810[@]}") 00:36:25.669 20:50:17 chaining -- nvmf/common.sh@321 -- # [[ tcp == rdma ]] 00:36:25.669 20:50:17 chaining -- nvmf/common.sh@327 -- # [[ '' == mlx5 ]] 00:36:25.669 20:50:17 chaining -- nvmf/common.sh@329 -- # [[ '' == e810 ]] 00:36:25.669 20:50:17 chaining -- nvmf/common.sh@331 -- # [[ '' == x722 ]] 00:36:25.669 20:50:17 chaining -- nvmf/common.sh@335 -- # (( 0 == 0 )) 00:36:25.669 20:50:17 chaining -- nvmf/common.sh@336 -- # return 1 00:36:25.669 20:50:17 chaining -- nvmf/common.sh@416 -- # [[ no == yes ]] 00:36:25.669 20:50:17 chaining -- nvmf/common.sh@423 -- # [[ phy-fallback == phy ]] 00:36:25.669 20:50:17 chaining -- nvmf/common.sh@426 -- # [[ phy-fallback == phy-fallback ]] 00:36:25.669 20:50:17 chaining -- nvmf/common.sh@427 -- # echo 'WARNING: No supported devices were found, fallback requested for tcp test' 00:36:25.669 WARNING: No supported devices were found, fallback requested for tcp test 00:36:25.669 20:50:17 chaining -- nvmf/common.sh@431 -- # [[ tcp == tcp ]] 00:36:25.669 20:50:17 chaining -- nvmf/common.sh@432 -- # nvmf_veth_init 00:36:25.669 20:50:17 chaining -- nvmf/common.sh@141 -- # NVMF_INITIATOR_IP=10.0.0.1 00:36:25.669 20:50:17 chaining -- nvmf/common.sh@142 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:36:25.669 20:50:17 chaining -- nvmf/common.sh@143 -- # NVMF_SECOND_TARGET_IP=10.0.0.3 00:36:25.669 20:50:17 chaining -- nvmf/common.sh@144 -- # NVMF_BRIDGE=nvmf_br 00:36:25.669 20:50:17 chaining -- nvmf/common.sh@145 -- # NVMF_INITIATOR_INTERFACE=nvmf_init_if 00:36:25.669 20:50:17 chaining -- nvmf/common.sh@146 -- # NVMF_INITIATOR_BRIDGE=nvmf_init_br 00:36:25.669 20:50:17 chaining -- 
nvmf/common.sh@147 -- # NVMF_TARGET_NAMESPACE=nvmf_tgt_ns_spdk 00:36:25.669 20:50:17 chaining -- nvmf/common.sh@148 -- # NVMF_TARGET_NS_CMD=(ip netns exec "$NVMF_TARGET_NAMESPACE") 00:36:25.669 20:50:17 chaining -- nvmf/common.sh@149 -- # NVMF_TARGET_INTERFACE=nvmf_tgt_if 00:36:25.669 20:50:17 chaining -- nvmf/common.sh@150 -- # NVMF_TARGET_INTERFACE2=nvmf_tgt_if2 00:36:25.669 20:50:17 chaining -- nvmf/common.sh@151 -- # NVMF_TARGET_BRIDGE=nvmf_tgt_br 00:36:25.669 20:50:17 chaining -- nvmf/common.sh@152 -- # NVMF_TARGET_BRIDGE2=nvmf_tgt_br2 00:36:25.669 20:50:17 chaining -- nvmf/common.sh@154 -- # ip link set nvmf_init_br nomaster 00:36:25.669 20:50:17 chaining -- nvmf/common.sh@155 -- # ip link set nvmf_tgt_br nomaster 00:36:25.669 Cannot find device "nvmf_tgt_br" 00:36:25.669 20:50:17 chaining -- nvmf/common.sh@155 -- # true 00:36:25.669 20:50:17 chaining -- nvmf/common.sh@156 -- # ip link set nvmf_tgt_br2 nomaster 00:36:25.669 Cannot find device "nvmf_tgt_br2" 00:36:25.669 20:50:17 chaining -- nvmf/common.sh@156 -- # true 00:36:25.669 20:50:17 chaining -- nvmf/common.sh@157 -- # ip link set nvmf_init_br down 00:36:25.669 20:50:17 chaining -- nvmf/common.sh@158 -- # ip link set nvmf_tgt_br down 00:36:25.669 Cannot find device "nvmf_tgt_br" 00:36:25.669 20:50:17 chaining -- nvmf/common.sh@158 -- # true 00:36:25.669 20:50:17 chaining -- nvmf/common.sh@159 -- # ip link set nvmf_tgt_br2 down 00:36:25.669 Cannot find device "nvmf_tgt_br2" 00:36:25.669 20:50:17 chaining -- nvmf/common.sh@159 -- # true 00:36:25.669 20:50:17 chaining -- nvmf/common.sh@160 -- # ip link delete nvmf_br type bridge 00:36:25.669 20:50:17 chaining -- nvmf/common.sh@161 -- # ip link delete nvmf_init_if 00:36:25.669 20:50:18 chaining -- nvmf/common.sh@162 -- # ip netns exec nvmf_tgt_ns_spdk ip link delete nvmf_tgt_if 00:36:25.669 Cannot open network namespace "nvmf_tgt_ns_spdk": No such file or directory 00:36:25.669 20:50:18 chaining -- nvmf/common.sh@162 -- # true 00:36:25.669 20:50:18 
chaining -- nvmf/common.sh@163 -- # ip netns exec nvmf_tgt_ns_spdk ip link delete nvmf_tgt_if2 00:36:25.669 Cannot open network namespace "nvmf_tgt_ns_spdk": No such file or directory 00:36:25.669 20:50:18 chaining -- nvmf/common.sh@163 -- # true 00:36:25.669 20:50:18 chaining -- nvmf/common.sh@166 -- # ip netns add nvmf_tgt_ns_spdk 00:36:25.669 20:50:18 chaining -- nvmf/common.sh@169 -- # ip link add nvmf_init_if type veth peer name nvmf_init_br 00:36:25.928 20:50:18 chaining -- nvmf/common.sh@170 -- # ip link add nvmf_tgt_if type veth peer name nvmf_tgt_br 00:36:25.928 20:50:18 chaining -- nvmf/common.sh@171 -- # ip link add nvmf_tgt_if2 type veth peer name nvmf_tgt_br2 00:36:25.928 20:50:18 chaining -- nvmf/common.sh@174 -- # ip link set nvmf_tgt_if netns nvmf_tgt_ns_spdk 00:36:25.928 20:50:18 chaining -- nvmf/common.sh@175 -- # ip link set nvmf_tgt_if2 netns nvmf_tgt_ns_spdk 00:36:25.928 20:50:18 chaining -- nvmf/common.sh@178 -- # ip addr add 10.0.0.1/24 dev nvmf_init_if 00:36:25.928 20:50:18 chaining -- nvmf/common.sh@179 -- # ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.2/24 dev nvmf_tgt_if 00:36:25.928 20:50:18 chaining -- nvmf/common.sh@180 -- # ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.3/24 dev nvmf_tgt_if2 00:36:25.928 20:50:18 chaining -- nvmf/common.sh@183 -- # ip link set nvmf_init_if up 00:36:25.928 20:50:18 chaining -- nvmf/common.sh@184 -- # ip link set nvmf_init_br up 00:36:25.928 20:50:18 chaining -- nvmf/common.sh@185 -- # ip link set nvmf_tgt_br up 00:36:25.928 20:50:18 chaining -- nvmf/common.sh@186 -- # ip link set nvmf_tgt_br2 up 00:36:25.928 20:50:18 chaining -- nvmf/common.sh@187 -- # ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if up 00:36:25.928 20:50:18 chaining -- nvmf/common.sh@188 -- # ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if2 up 00:36:25.928 20:50:18 chaining -- nvmf/common.sh@189 -- # ip netns exec nvmf_tgt_ns_spdk ip link set lo up 00:36:25.928 20:50:18 chaining -- nvmf/common.sh@192 -- # ip link 
add nvmf_br type bridge 00:36:25.928 20:50:18 chaining -- nvmf/common.sh@193 -- # ip link set nvmf_br up 00:36:25.928 20:50:18 chaining -- nvmf/common.sh@196 -- # ip link set nvmf_init_br master nvmf_br 00:36:26.186 20:50:18 chaining -- nvmf/common.sh@197 -- # ip link set nvmf_tgt_br master nvmf_br 00:36:26.186 20:50:18 chaining -- nvmf/common.sh@198 -- # ip link set nvmf_tgt_br2 master nvmf_br 00:36:26.186 20:50:18 chaining -- nvmf/common.sh@201 -- # iptables -I INPUT 1 -i nvmf_init_if -p tcp --dport 4420 -j ACCEPT 00:36:26.446 20:50:18 chaining -- nvmf/common.sh@202 -- # iptables -A FORWARD -i nvmf_br -o nvmf_br -j ACCEPT 00:36:26.446 20:50:18 chaining -- nvmf/common.sh@205 -- # ping -c 1 10.0.0.2 00:36:26.446 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:36:26.446 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.112 ms 00:36:26.446 00:36:26.446 --- 10.0.0.2 ping statistics --- 00:36:26.446 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:36:26.446 rtt min/avg/max/mdev = 0.112/0.112/0.112/0.000 ms 00:36:26.446 20:50:18 chaining -- nvmf/common.sh@206 -- # ping -c 1 10.0.0.3 00:36:26.446 PING 10.0.0.3 (10.0.0.3) 56(84) bytes of data. 00:36:26.446 64 bytes from 10.0.0.3: icmp_seq=1 ttl=64 time=0.080 ms 00:36:26.446 00:36:26.446 --- 10.0.0.3 ping statistics --- 00:36:26.446 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:36:26.446 rtt min/avg/max/mdev = 0.080/0.080/0.080/0.000 ms 00:36:26.446 20:50:18 chaining -- nvmf/common.sh@207 -- # ip netns exec nvmf_tgt_ns_spdk ping -c 1 10.0.0.1 00:36:26.446 PING 10.0.0.1 (10.0.0.1) 56(84) bytes of data. 
00:36:26.446 64 bytes from 10.0.0.1: icmp_seq=1 ttl=64 time=0.048 ms 00:36:26.446 00:36:26.446 --- 10.0.0.1 ping statistics --- 00:36:26.446 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:36:26.446 rtt min/avg/max/mdev = 0.048/0.048/0.048/0.000 ms 00:36:26.446 20:50:18 chaining -- nvmf/common.sh@209 -- # NVMF_APP=("${NVMF_TARGET_NS_CMD[@]}" "${NVMF_APP[@]}") 00:36:26.446 20:50:18 chaining -- nvmf/common.sh@433 -- # return 0 00:36:26.446 20:50:18 chaining -- nvmf/common.sh@450 -- # '[' '' == iso ']' 00:36:26.446 20:50:18 chaining -- nvmf/common.sh@454 -- # NVMF_TRANSPORT_OPTS='-t tcp' 00:36:26.446 20:50:18 chaining -- nvmf/common.sh@455 -- # [[ tcp == \r\d\m\a ]] 00:36:26.446 20:50:18 chaining -- nvmf/common.sh@464 -- # [[ tcp == \t\c\p ]] 00:36:26.446 20:50:18 chaining -- nvmf/common.sh@465 -- # NVMF_TRANSPORT_OPTS='-t tcp -o' 00:36:26.446 20:50:18 chaining -- nvmf/common.sh@468 -- # '[' tcp == tcp ']' 00:36:26.446 20:50:18 chaining -- nvmf/common.sh@474 -- # modprobe nvme-tcp 00:36:26.446 20:50:18 chaining -- bdev/chaining.sh@176 -- # nvmfappstart -m 0x2 00:36:26.446 20:50:18 chaining -- nvmf/common.sh@479 -- # timing_enter start_nvmf_tgt 00:36:26.446 20:50:18 chaining -- common/autotest_common.sh@722 -- # xtrace_disable 00:36:26.446 20:50:18 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:26.446 20:50:18 chaining -- nvmf/common.sh@481 -- # nvmfpid=1570084 00:36:26.446 20:50:18 chaining -- nvmf/common.sh@480 -- # ip netns exec nvmf_tgt_ns_spdk ip netns exec nvmf_tgt_ns_spdk /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0x2 00:36:26.446 20:50:18 chaining -- nvmf/common.sh@482 -- # waitforlisten 1570084 00:36:26.446 20:50:18 chaining -- common/autotest_common.sh@829 -- # '[' -z 1570084 ']' 00:36:26.446 20:50:18 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:36:26.446 20:50:18 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:36:26.446 20:50:18 
chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:36:26.446 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:36:26.446 20:50:18 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:36:26.446 20:50:18 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:26.446 [2024-07-15 20:50:18.730935] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 00:36:26.446 [2024-07-15 20:50:18.731007] [ DPDK EAL parameters: nvmf -c 0x2 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:36:26.705 [2024-07-15 20:50:18.873399] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:36:26.705 [2024-07-15 20:50:18.991269] app.c: 603:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:36:26.705 [2024-07-15 20:50:18.991329] app.c: 604:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:36:26.705 [2024-07-15 20:50:18.991347] app.c: 609:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:36:26.705 [2024-07-15 20:50:18.991364] app.c: 610:app_setup_trace: *NOTICE*: SPDK application currently running. 00:36:26.705 [2024-07-15 20:50:18.991378] app.c: 611:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 
00:36:26.705 [2024-07-15 20:50:18.991413] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:36:27.640 20:50:19 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:36:27.641 20:50:19 chaining -- common/autotest_common.sh@862 -- # return 0 00:36:27.641 20:50:19 chaining -- nvmf/common.sh@483 -- # timing_exit start_nvmf_tgt 00:36:27.641 20:50:19 chaining -- common/autotest_common.sh@728 -- # xtrace_disable 00:36:27.641 20:50:19 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:27.641 20:50:19 chaining -- nvmf/common.sh@484 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:36:27.641 20:50:19 chaining -- bdev/chaining.sh@178 -- # rpc_cmd 00:36:27.641 20:50:19 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:27.641 20:50:19 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:27.641 malloc0 00:36:27.641 [2024-07-15 20:50:19.724464] tcp.c: 672:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:36:27.641 [2024-07-15 20:50:19.740698] tcp.c: 967:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 10.0.0.2 port 4420 *** 00:36:27.641 20:50:19 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:27.641 20:50:19 chaining -- bdev/chaining.sh@186 -- # trap 'bperfcleanup || :; nvmftestfini || :; exit 1' SIGINT SIGTERM EXIT 00:36:27.641 20:50:19 chaining -- bdev/chaining.sh@189 -- # bperfpid=1570275 00:36:27.641 20:50:19 chaining -- bdev/chaining.sh@191 -- # waitforlisten 1570275 /var/tmp/bperf.sock 00:36:27.641 20:50:19 chaining -- bdev/chaining.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/bperf.sock -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z 00:36:27.641 20:50:19 chaining -- common/autotest_common.sh@829 -- # '[' -z 1570275 ']' 00:36:27.641 20:50:19 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/bperf.sock 00:36:27.641 20:50:19 chaining -- common/autotest_common.sh@834 -- # 
local max_retries=100 00:36:27.641 20:50:19 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock...' 00:36:27.641 Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock... 00:36:27.641 20:50:19 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:36:27.641 20:50:19 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:27.641 [2024-07-15 20:50:19.813047] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 00:36:27.641 [2024-07-15 20:50:19.813114] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1570275 ] 00:36:27.641 [2024-07-15 20:50:19.943248] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:36:27.898 [2024-07-15 20:50:20.050698] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:36:28.465 20:50:20 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:36:28.465 20:50:20 chaining -- common/autotest_common.sh@862 -- # return 0 00:36:28.465 20:50:20 chaining -- bdev/chaining.sh@192 -- # rpc_bperf 00:36:28.465 20:50:20 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock 00:36:29.032 [2024-07-15 20:50:21.105759] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:36:29.032 nvme0n1 00:36:29.032 true 00:36:29.032 crypto0 00:36:29.032 20:50:21 chaining -- bdev/chaining.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/bperf.sock perform_tests 00:36:29.032 Running I/O for 5 seconds... 
00:36:34.299 00:36:34.299 Latency(us) 00:36:34.299 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:36:34.299 Job: crypto0 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:36:34.299 Verification LBA range: start 0x0 length 0x2000 00:36:34.299 crypto0 : 5.03 6897.22 26.94 0.00 0.00 36982.08 4843.97 27126.21 00:36:34.299 =================================================================================================================== 00:36:34.299 Total : 6897.22 26.94 0.00 0.00 36982.08 4843.97 27126.21 00:36:34.299 0 00:36:34.299 20:50:26 chaining -- bdev/chaining.sh@205 -- # get_stat_bperf sequence_executed 00:36:34.299 20:50:26 chaining -- bdev/chaining.sh@48 -- # get_stat sequence_executed '' rpc_bperf 00:36:34.299 20:50:26 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:34.299 20:50:26 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:36:34.299 20:50:26 chaining -- bdev/chaining.sh@39 -- # opcode= 00:36:34.299 20:50:26 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:36:34.299 20:50:26 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:36:34.299 20:50:26 chaining -- bdev/chaining.sh@41 -- # rpc_bperf accel_get_stats 00:36:34.299 20:50:26 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:36:34.299 20:50:26 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:36:34.299 20:50:26 chaining -- bdev/chaining.sh@205 -- # sequence=69336 00:36:34.299 20:50:26 chaining -- bdev/chaining.sh@206 -- # get_stat_bperf executed encrypt 00:36:34.299 20:50:26 chaining -- bdev/chaining.sh@48 -- # get_stat executed encrypt rpc_bperf 00:36:34.299 20:50:26 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:34.299 20:50:26 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:34.299 20:50:26 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:36:34.299 20:50:26 chaining -- 
bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:36:34.299 20:50:26 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:36:34.299 20:50:26 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:36:34.299 20:50:26 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:36:34.299 20:50:26 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:36:34.558 20:50:26 chaining -- bdev/chaining.sh@206 -- # encrypt=34668 00:36:34.558 20:50:26 chaining -- bdev/chaining.sh@207 -- # get_stat_bperf executed decrypt 00:36:34.558 20:50:26 chaining -- bdev/chaining.sh@48 -- # get_stat executed decrypt rpc_bperf 00:36:34.558 20:50:26 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:34.558 20:50:26 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:34.558 20:50:26 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:36:34.558 20:50:26 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:36:34.558 20:50:26 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:36:34.558 20:50:26 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:36:34.558 20:50:26 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:36:34.558 20:50:26 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:36:34.816 20:50:27 chaining -- bdev/chaining.sh@207 -- # decrypt=34668 00:36:34.816 20:50:27 chaining -- bdev/chaining.sh@208 -- # get_stat_bperf executed crc32c 00:36:34.816 20:50:27 chaining -- bdev/chaining.sh@48 -- # get_stat executed crc32c rpc_bperf 00:36:34.816 20:50:27 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:34.816 20:50:27 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:34.816 20:50:27 chaining -- bdev/chaining.sh@39 -- # opcode=crc32c 00:36:34.816 20:50:27 
chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:36:34.816 20:50:27 chaining -- bdev/chaining.sh@40 -- # [[ -z crc32c ]] 00:36:34.816 20:50:27 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:36:34.816 20:50:27 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:36:34.816 20:50:27 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "crc32c").executed' 00:36:35.078 20:50:27 chaining -- bdev/chaining.sh@208 -- # crc32c=69336 00:36:35.078 20:50:27 chaining -- bdev/chaining.sh@210 -- # (( sequence > 0 )) 00:36:35.078 20:50:27 chaining -- bdev/chaining.sh@211 -- # (( encrypt + decrypt == sequence )) 00:36:35.078 20:50:27 chaining -- bdev/chaining.sh@212 -- # (( encrypt + decrypt == crc32c )) 00:36:35.078 20:50:27 chaining -- bdev/chaining.sh@214 -- # killprocess 1570275 00:36:35.078 20:50:27 chaining -- common/autotest_common.sh@948 -- # '[' -z 1570275 ']' 00:36:35.078 20:50:27 chaining -- common/autotest_common.sh@952 -- # kill -0 1570275 00:36:35.078 20:50:27 chaining -- common/autotest_common.sh@953 -- # uname 00:36:35.078 20:50:27 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:36:35.078 20:50:27 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1570275 00:36:35.078 20:50:27 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:36:35.078 20:50:27 chaining -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:36:35.078 20:50:27 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1570275' 00:36:35.078 killing process with pid 1570275 00:36:35.078 20:50:27 chaining -- common/autotest_common.sh@967 -- # kill 1570275 00:36:35.078 Received shutdown signal, test time was about 5.000000 seconds 00:36:35.078 00:36:35.078 Latency(us) 00:36:35.078 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:36:35.078 
=================================================================================================================== 00:36:35.078 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:36:35.078 20:50:27 chaining -- common/autotest_common.sh@972 -- # wait 1570275 00:36:35.352 20:50:27 chaining -- bdev/chaining.sh@219 -- # bperfpid=1571179 00:36:35.352 20:50:27 chaining -- bdev/chaining.sh@217 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/bperf.sock -t 5 -w verify -o 65536 -q 32 --wait-for-rpc -z 00:36:35.352 20:50:27 chaining -- bdev/chaining.sh@221 -- # waitforlisten 1571179 /var/tmp/bperf.sock 00:36:35.352 20:50:27 chaining -- common/autotest_common.sh@829 -- # '[' -z 1571179 ']' 00:36:35.352 20:50:27 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/bperf.sock 00:36:35.352 20:50:27 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:36:35.352 20:50:27 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock...' 00:36:35.352 Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock... 00:36:35.352 20:50:27 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:36:35.352 20:50:27 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:35.352 [2024-07-15 20:50:27.616537] Starting SPDK v24.09-pre git sha1 6c0846996 / DPDK 24.03.0 initialization... 
00:36:35.352 [2024-07-15 20:50:27.616607] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1571179 ] 00:36:35.610 [2024-07-15 20:50:27.756438] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:36:35.610 [2024-07-15 20:50:27.860713] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:36:36.177 20:50:28 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:36:36.177 20:50:28 chaining -- common/autotest_common.sh@862 -- # return 0 00:36:36.177 20:50:28 chaining -- bdev/chaining.sh@222 -- # rpc_bperf 00:36:36.177 20:50:28 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock 00:36:36.742 [2024-07-15 20:50:28.904077] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:36:36.742 nvme0n1 00:36:36.742 true 00:36:36.742 crypto0 00:36:36.742 20:50:28 chaining -- bdev/chaining.sh@231 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/bperf.sock perform_tests 00:36:36.742 Running I/O for 5 seconds... 
00:36:42.003 00:36:42.003 Latency(us) 00:36:42.003 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:36:42.003 Job: crypto0 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:36:42.003 Verification LBA range: start 0x0 length 0x200 00:36:42.003 crypto0 : 5.01 1632.54 102.03 0.00 0.00 19220.89 1132.63 20515.62 00:36:42.003 =================================================================================================================== 00:36:42.003 Total : 1632.54 102.03 0.00 0.00 19220.89 1132.63 20515.62 00:36:42.003 0 00:36:42.003 20:50:34 chaining -- bdev/chaining.sh@233 -- # get_stat_bperf sequence_executed 00:36:42.003 20:50:34 chaining -- bdev/chaining.sh@48 -- # get_stat sequence_executed '' rpc_bperf 00:36:42.003 20:50:34 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:42.003 20:50:34 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:36:42.003 20:50:34 chaining -- bdev/chaining.sh@39 -- # opcode= 00:36:42.003 20:50:34 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:36:42.003 20:50:34 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:36:42.003 20:50:34 chaining -- bdev/chaining.sh@41 -- # rpc_bperf accel_get_stats 00:36:42.003 20:50:34 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:36:42.003 20:50:34 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:36:42.003 20:50:34 chaining -- bdev/chaining.sh@233 -- # sequence=16348 00:36:42.003 20:50:34 chaining -- bdev/chaining.sh@234 -- # get_stat_bperf executed encrypt 00:36:42.003 20:50:34 chaining -- bdev/chaining.sh@48 -- # get_stat executed encrypt rpc_bperf 00:36:42.003 20:50:34 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:42.003 20:50:34 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:42.003 20:50:34 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:36:42.003 20:50:34 chaining -- 
bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:36:42.003 20:50:34 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:36:42.003 20:50:34 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:36:42.003 20:50:34 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:36:42.003 20:50:34 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:36:42.261 20:50:34 chaining -- bdev/chaining.sh@234 -- # encrypt=8174 00:36:42.261 20:50:34 chaining -- bdev/chaining.sh@235 -- # get_stat_bperf executed decrypt 00:36:42.261 20:50:34 chaining -- bdev/chaining.sh@48 -- # get_stat executed decrypt rpc_bperf 00:36:42.261 20:50:34 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:42.261 20:50:34 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:42.261 20:50:34 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:36:42.261 20:50:34 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:36:42.261 20:50:34 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:36:42.261 20:50:34 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:36:42.261 20:50:34 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:36:42.261 20:50:34 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:36:42.519 20:50:34 chaining -- bdev/chaining.sh@235 -- # decrypt=8174 00:36:42.519 20:50:34 chaining -- bdev/chaining.sh@236 -- # get_stat_bperf executed crc32c 00:36:42.519 20:50:34 chaining -- bdev/chaining.sh@48 -- # get_stat executed crc32c rpc_bperf 00:36:42.519 20:50:34 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:42.519 20:50:34 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:42.519 20:50:34 chaining -- bdev/chaining.sh@39 -- # opcode=crc32c 00:36:42.520 20:50:34 
chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:36:42.520 20:50:34 chaining -- bdev/chaining.sh@40 -- # [[ -z crc32c ]] 00:36:42.520 20:50:34 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:36:42.520 20:50:34 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "crc32c").executed' 00:36:42.520 20:50:34 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:36:42.778 20:50:35 chaining -- bdev/chaining.sh@236 -- # crc32c=16348 00:36:42.778 20:50:35 chaining -- bdev/chaining.sh@238 -- # (( sequence > 0 )) 00:36:42.778 20:50:35 chaining -- bdev/chaining.sh@239 -- # (( encrypt + decrypt == sequence )) 00:36:42.778 20:50:35 chaining -- bdev/chaining.sh@240 -- # (( encrypt + decrypt == crc32c )) 00:36:42.778 20:50:35 chaining -- bdev/chaining.sh@242 -- # killprocess 1571179 00:36:42.778 20:50:35 chaining -- common/autotest_common.sh@948 -- # '[' -z 1571179 ']' 00:36:42.778 20:50:35 chaining -- common/autotest_common.sh@952 -- # kill -0 1571179 00:36:42.778 20:50:35 chaining -- common/autotest_common.sh@953 -- # uname 00:36:42.778 20:50:35 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:36:42.778 20:50:35 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1571179 00:36:43.037 20:50:35 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:36:43.037 20:50:35 chaining -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:36:43.037 20:50:35 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1571179' 00:36:43.037 killing process with pid 1571179 00:36:43.037 20:50:35 chaining -- common/autotest_common.sh@967 -- # kill 1571179 00:36:43.037 Received shutdown signal, test time was about 5.000000 seconds 00:36:43.037 00:36:43.037 Latency(us) 00:36:43.037 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:36:43.037 
=================================================================================================================== 00:36:43.037 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:36:43.037 20:50:35 chaining -- common/autotest_common.sh@972 -- # wait 1571179 00:36:43.037 20:50:35 chaining -- bdev/chaining.sh@243 -- # nvmftestfini 00:36:43.037 20:50:35 chaining -- nvmf/common.sh@488 -- # nvmfcleanup 00:36:43.037 20:50:35 chaining -- nvmf/common.sh@117 -- # sync 00:36:43.037 20:50:35 chaining -- nvmf/common.sh@119 -- # '[' tcp == tcp ']' 00:36:43.037 20:50:35 chaining -- nvmf/common.sh@120 -- # set +e 00:36:43.037 20:50:35 chaining -- nvmf/common.sh@121 -- # for i in {1..20} 00:36:43.037 20:50:35 chaining -- nvmf/common.sh@122 -- # modprobe -v -r nvme-tcp 00:36:43.037 rmmod nvme_tcp 00:36:43.295 rmmod nvme_fabrics 00:36:43.295 rmmod nvme_keyring 00:36:43.295 20:50:35 chaining -- nvmf/common.sh@123 -- # modprobe -v -r nvme-fabrics 00:36:43.295 20:50:35 chaining -- nvmf/common.sh@124 -- # set -e 00:36:43.295 20:50:35 chaining -- nvmf/common.sh@125 -- # return 0 00:36:43.295 20:50:35 chaining -- nvmf/common.sh@489 -- # '[' -n 1570084 ']' 00:36:43.295 20:50:35 chaining -- nvmf/common.sh@490 -- # killprocess 1570084 00:36:43.295 20:50:35 chaining -- common/autotest_common.sh@948 -- # '[' -z 1570084 ']' 00:36:43.295 20:50:35 chaining -- common/autotest_common.sh@952 -- # kill -0 1570084 00:36:43.295 20:50:35 chaining -- common/autotest_common.sh@953 -- # uname 00:36:43.295 20:50:35 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:36:43.295 20:50:35 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1570084 00:36:43.295 20:50:35 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:36:43.295 20:50:35 chaining -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:36:43.295 20:50:35 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1570084' 00:36:43.295 killing process with pid 
1570084 00:36:43.295 20:50:35 chaining -- common/autotest_common.sh@967 -- # kill 1570084 00:36:43.295 20:50:35 chaining -- common/autotest_common.sh@972 -- # wait 1570084 00:36:43.554 20:50:35 chaining -- nvmf/common.sh@492 -- # '[' '' == iso ']' 00:36:43.554 20:50:35 chaining -- nvmf/common.sh@495 -- # [[ tcp == \t\c\p ]] 00:36:43.554 20:50:35 chaining -- nvmf/common.sh@496 -- # nvmf_tcp_fini 00:36:43.554 20:50:35 chaining -- nvmf/common.sh@274 -- # [[ nvmf_tgt_ns_spdk == \n\v\m\f\_\t\g\t\_\n\s ]] 00:36:43.554 20:50:35 chaining -- nvmf/common.sh@278 -- # remove_spdk_ns 00:36:43.554 20:50:35 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:36:43.554 20:50:35 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:36:43.554 20:50:35 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:36:43.554 20:50:35 chaining -- nvmf/common.sh@279 -- # ip -4 addr flush nvmf_init_if 00:36:43.554 20:50:35 chaining -- bdev/chaining.sh@245 -- # trap - SIGINT SIGTERM EXIT 00:36:43.554 00:36:43.554 real 0m46.925s 00:36:43.554 user 0m59.625s 00:36:43.554 sys 0m14.071s 00:36:43.554 20:50:35 chaining -- common/autotest_common.sh@1124 -- # xtrace_disable 00:36:43.554 20:50:35 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:43.554 ************************************ 00:36:43.554 END TEST chaining 00:36:43.554 ************************************ 00:36:43.813 20:50:35 -- common/autotest_common.sh@1142 -- # return 0 00:36:43.813 20:50:35 -- spdk/autotest.sh@363 -- # [[ 0 -eq 1 ]] 00:36:43.813 20:50:35 -- spdk/autotest.sh@367 -- # [[ 0 -eq 1 ]] 00:36:43.813 20:50:35 -- spdk/autotest.sh@371 -- # [[ 0 -eq 1 ]] 00:36:43.813 20:50:35 -- spdk/autotest.sh@375 -- # [[ 0 -eq 1 ]] 00:36:43.813 20:50:35 -- spdk/autotest.sh@380 -- # trap - SIGINT SIGTERM EXIT 00:36:43.813 20:50:35 -- spdk/autotest.sh@382 -- # timing_enter post_cleanup 00:36:43.813 20:50:35 -- common/autotest_common.sh@722 -- # xtrace_disable 00:36:43.813 
20:50:35 -- common/autotest_common.sh@10 -- # set +x 00:36:43.813 20:50:35 -- spdk/autotest.sh@383 -- # autotest_cleanup 00:36:43.813 20:50:35 -- common/autotest_common.sh@1392 -- # local autotest_es=0 00:36:43.813 20:50:35 -- common/autotest_common.sh@1393 -- # xtrace_disable 00:36:43.813 20:50:35 -- common/autotest_common.sh@10 -- # set +x 00:36:49.087 INFO: APP EXITING 00:36:49.087 INFO: killing all VMs 00:36:49.087 INFO: killing vhost app 00:36:49.087 INFO: EXIT DONE 00:36:52.373 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:36:52.373 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:36:52.373 Waiting for block devices as requested 00:36:52.373 0000:5e:00.0 (8086 0b60): vfio-pci -> nvme 00:36:52.373 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:36:52.373 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:36:52.373 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:36:52.632 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:36:52.632 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:36:52.632 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:36:52.890 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:36:52.890 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:36:52.890 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:36:53.149 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:36:53.149 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:36:53.149 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:36:53.409 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:36:53.409 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:36:53.409 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:36:53.668 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:36:57.858 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:36:57.858 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:36:57.858 Cleaning 00:36:57.858 Removing: /var/run/dpdk/spdk0/config 00:36:57.858 Removing: 
/var/run/dpdk/spdk0/fbarray_memseg-2048k-0-0 00:36:57.858 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-1 00:36:57.858 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-2 00:36:57.858 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-3 00:36:57.858 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-0 00:36:57.858 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-1 00:36:57.858 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-2 00:36:57.858 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-3 00:36:57.858 Removing: /var/run/dpdk/spdk0/fbarray_memzone 00:36:57.858 Removing: /var/run/dpdk/spdk0/hugepage_info 00:36:57.858 Removing: /dev/shm/nvmf_trace.0 00:36:57.858 Removing: /dev/shm/spdk_tgt_trace.pid1306680 00:36:57.858 Removing: /var/run/dpdk/spdk0 00:36:57.858 Removing: /var/run/dpdk/spdk_pid1305785 00:36:57.858 Removing: /var/run/dpdk/spdk_pid1306680 00:36:57.858 Removing: /var/run/dpdk/spdk_pid1307328 00:36:57.858 Removing: /var/run/dpdk/spdk_pid1308058 00:36:57.858 Removing: /var/run/dpdk/spdk_pid1308252 00:36:57.858 Removing: /var/run/dpdk/spdk_pid1309001 00:36:57.858 Removing: /var/run/dpdk/spdk_pid1309182 00:36:57.858 Removing: /var/run/dpdk/spdk_pid1309464 00:36:57.858 Removing: /var/run/dpdk/spdk_pid1312380 00:36:57.858 Removing: /var/run/dpdk/spdk_pid1314131 00:36:57.858 Removing: /var/run/dpdk/spdk_pid1314513 00:36:57.858 Removing: /var/run/dpdk/spdk_pid1314752 00:36:57.858 Removing: /var/run/dpdk/spdk_pid1315065 00:36:57.858 Removing: /var/run/dpdk/spdk_pid1315406 00:36:57.858 Removing: /var/run/dpdk/spdk_pid1315601 00:36:57.858 Removing: /var/run/dpdk/spdk_pid1315795 00:36:57.858 Removing: /var/run/dpdk/spdk_pid1316031 00:36:57.858 Removing: /var/run/dpdk/spdk_pid1316782 00:36:57.858 Removing: /var/run/dpdk/spdk_pid1319480 00:36:57.858 Removing: /var/run/dpdk/spdk_pid1319685 00:36:57.858 Removing: /var/run/dpdk/spdk_pid1319927 00:36:57.858 Removing: /var/run/dpdk/spdk_pid1320221 00:36:57.858 Removing: 
/var/run/dpdk/spdk_pid1320319 00:36:57.858 Removing: /var/run/dpdk/spdk_pid1320513 00:36:57.858 Removing: /var/run/dpdk/spdk_pid1320748 00:36:57.858 Removing: /var/run/dpdk/spdk_pid1320943 00:36:57.858 Removing: /var/run/dpdk/spdk_pid1321148 00:36:57.858 Removing: /var/run/dpdk/spdk_pid1321342 00:36:57.858 Removing: /var/run/dpdk/spdk_pid1321587 00:36:57.858 Removing: /var/run/dpdk/spdk_pid1321903 00:36:57.858 Removing: /var/run/dpdk/spdk_pid1322098 00:36:57.858 Removing: /var/run/dpdk/spdk_pid1322297 00:36:57.858 Removing: /var/run/dpdk/spdk_pid1322495 00:36:57.858 Removing: /var/run/dpdk/spdk_pid1322696 00:36:57.858 Removing: /var/run/dpdk/spdk_pid1322998 00:36:57.858 Removing: /var/run/dpdk/spdk_pid1323246 00:36:57.858 Removing: /var/run/dpdk/spdk_pid1323454 00:36:57.858 Removing: /var/run/dpdk/spdk_pid1323647 00:36:57.858 Removing: /var/run/dpdk/spdk_pid1323847 00:36:57.858 Removing: /var/run/dpdk/spdk_pid1324121 00:36:57.858 Removing: /var/run/dpdk/spdk_pid1324405 00:36:57.858 Removing: /var/run/dpdk/spdk_pid1324607 00:36:57.858 Removing: /var/run/dpdk/spdk_pid1324805 00:36:58.116 Removing: /var/run/dpdk/spdk_pid1325002 00:36:58.116 Removing: /var/run/dpdk/spdk_pid1325362 00:36:58.116 Removing: /var/run/dpdk/spdk_pid1325602 00:36:58.116 Removing: /var/run/dpdk/spdk_pid1325933 00:36:58.116 Removing: /var/run/dpdk/spdk_pid1326296 00:36:58.116 Removing: /var/run/dpdk/spdk_pid1326611 00:36:58.117 Removing: /var/run/dpdk/spdk_pid1326878 00:36:58.117 Removing: /var/run/dpdk/spdk_pid1327240 00:36:58.117 Removing: /var/run/dpdk/spdk_pid1327611 00:36:58.117 Removing: /var/run/dpdk/spdk_pid1327676 00:36:58.117 Removing: /var/run/dpdk/spdk_pid1328097 00:36:58.117 Removing: /var/run/dpdk/spdk_pid1328574 00:36:58.117 Removing: /var/run/dpdk/spdk_pid1328949 00:36:58.117 Removing: /var/run/dpdk/spdk_pid1329138 00:36:58.117 Removing: /var/run/dpdk/spdk_pid1333454 00:36:58.117 Removing: /var/run/dpdk/spdk_pid1335156 00:36:58.117 Removing: /var/run/dpdk/spdk_pid1336851 
00:36:58.117 Removing: /var/run/dpdk/spdk_pid1337736 00:36:58.117 Removing: /var/run/dpdk/spdk_pid1338940 00:36:58.117 Removing: /var/run/dpdk/spdk_pid1339578 00:36:58.117 Removing: /var/run/dpdk/spdk_pid1339698 00:36:58.117 Removing: /var/run/dpdk/spdk_pid1339894 00:36:58.117 Removing: /var/run/dpdk/spdk_pid1343692 00:36:58.117 Removing: /var/run/dpdk/spdk_pid1344237 00:36:58.117 Removing: /var/run/dpdk/spdk_pid1345136 00:36:58.117 Removing: /var/run/dpdk/spdk_pid1345493 00:36:58.117 Removing: /var/run/dpdk/spdk_pid1351068 00:36:58.117 Removing: /var/run/dpdk/spdk_pid1353151 00:36:58.117 Removing: /var/run/dpdk/spdk_pid1353962 00:36:58.117 Removing: /var/run/dpdk/spdk_pid1358381 00:36:58.117 Removing: /var/run/dpdk/spdk_pid1360007 00:36:58.117 Removing: /var/run/dpdk/spdk_pid1360933 00:36:58.117 Removing: /var/run/dpdk/spdk_pid1365536 00:36:58.117 Removing: /var/run/dpdk/spdk_pid1368696 00:36:58.117 Removing: /var/run/dpdk/spdk_pid1369666 00:36:58.117 Removing: /var/run/dpdk/spdk_pid1379489 00:36:58.117 Removing: /var/run/dpdk/spdk_pid1381806 00:36:58.117 Removing: /var/run/dpdk/spdk_pid1382952 00:36:58.117 Removing: /var/run/dpdk/spdk_pid1393708 00:36:58.117 Removing: /var/run/dpdk/spdk_pid1395935 00:36:58.117 Removing: /var/run/dpdk/spdk_pid1396898 00:36:58.117 Removing: /var/run/dpdk/spdk_pid1407318 00:36:58.117 Removing: /var/run/dpdk/spdk_pid1410751 00:36:58.117 Removing: /var/run/dpdk/spdk_pid1411731 00:36:58.117 Removing: /var/run/dpdk/spdk_pid1423311 00:36:58.117 Removing: /var/run/dpdk/spdk_pid1426273 00:36:58.117 Removing: /var/run/dpdk/spdk_pid1427418 00:36:58.117 Removing: /var/run/dpdk/spdk_pid1439364 00:36:58.117 Removing: /var/run/dpdk/spdk_pid1442143 00:36:58.117 Removing: /var/run/dpdk/spdk_pid1443284 00:36:58.117 Removing: /var/run/dpdk/spdk_pid1454717 00:36:58.117 Removing: /var/run/dpdk/spdk_pid1458568 00:36:58.117 Removing: /var/run/dpdk/spdk_pid1459638 00:36:58.117 Removing: /var/run/dpdk/spdk_pid1460698 00:36:58.117 Removing: 
/var/run/dpdk/spdk_pid1463748 00:36:58.117 Removing: /var/run/dpdk/spdk_pid1468930 00:36:58.375 Removing: /var/run/dpdk/spdk_pid1471578 00:36:58.375 Removing: /var/run/dpdk/spdk_pid1476630 00:36:58.375 Removing: /var/run/dpdk/spdk_pid1480190 00:36:58.375 Removing: /var/run/dpdk/spdk_pid1485598 00:36:58.375 Removing: /var/run/dpdk/spdk_pid1488256 00:36:58.375 Removing: /var/run/dpdk/spdk_pid1494395 00:36:58.375 Removing: /var/run/dpdk/spdk_pid1496641 00:36:58.375 Removing: /var/run/dpdk/spdk_pid1503082 00:36:58.375 Removing: /var/run/dpdk/spdk_pid1505496 00:36:58.375 Removing: /var/run/dpdk/spdk_pid1511783 00:36:58.375 Removing: /var/run/dpdk/spdk_pid1514032 00:36:58.375 Removing: /var/run/dpdk/spdk_pid1518544 00:36:58.375 Removing: /var/run/dpdk/spdk_pid1518893 00:36:58.375 Removing: /var/run/dpdk/spdk_pid1519252 00:36:58.375 Removing: /var/run/dpdk/spdk_pid1519612 00:36:58.375 Removing: /var/run/dpdk/spdk_pid1520049 00:36:58.375 Removing: /var/run/dpdk/spdk_pid1520813 00:36:58.375 Removing: /var/run/dpdk/spdk_pid1521487 00:36:58.375 Removing: /var/run/dpdk/spdk_pid1521928 00:36:58.375 Removing: /var/run/dpdk/spdk_pid1523696 00:36:58.375 Removing: /var/run/dpdk/spdk_pid1525977 00:36:58.375 Removing: /var/run/dpdk/spdk_pid1527753 00:36:58.375 Removing: /var/run/dpdk/spdk_pid1529105 00:36:58.375 Removing: /var/run/dpdk/spdk_pid1530818 00:36:58.375 Removing: /var/run/dpdk/spdk_pid1532595 00:36:58.375 Removing: /var/run/dpdk/spdk_pid1534201 00:36:58.375 Removing: /var/run/dpdk/spdk_pid1535527 00:36:58.375 Removing: /var/run/dpdk/spdk_pid1536203 00:36:58.375 Removing: /var/run/dpdk/spdk_pid1536610 00:36:58.375 Removing: /var/run/dpdk/spdk_pid1538779 00:36:58.375 Removing: /var/run/dpdk/spdk_pid1540634 00:36:58.375 Removing: /var/run/dpdk/spdk_pid1542479 00:36:58.375 Removing: /var/run/dpdk/spdk_pid1543540 00:36:58.375 Removing: /var/run/dpdk/spdk_pid1544613 00:36:58.375 Removing: /var/run/dpdk/spdk_pid1545170 00:36:58.375 Removing: /var/run/dpdk/spdk_pid1545340 
00:36:58.375 Removing: /var/run/dpdk/spdk_pid1545406 00:36:58.375 Removing: /var/run/dpdk/spdk_pid1545608 00:36:58.375 Removing: /var/run/dpdk/spdk_pid1545795 00:36:58.375 Removing: /var/run/dpdk/spdk_pid1547034 00:36:58.375 Removing: /var/run/dpdk/spdk_pid1548681 00:36:58.375 Removing: /var/run/dpdk/spdk_pid1550626 00:36:58.375 Removing: /var/run/dpdk/spdk_pid1551422 00:36:58.375 Removing: /var/run/dpdk/spdk_pid1552141 00:36:58.375 Removing: /var/run/dpdk/spdk_pid1552439 00:36:58.375 Removing: /var/run/dpdk/spdk_pid1552528 00:36:58.375 Removing: /var/run/dpdk/spdk_pid1552551 00:36:58.375 Removing: /var/run/dpdk/spdk_pid1553492 00:36:58.375 Removing: /var/run/dpdk/spdk_pid1554036 00:36:58.375 Removing: /var/run/dpdk/spdk_pid1554543 00:36:58.375 Removing: /var/run/dpdk/spdk_pid1556733 00:36:58.375 Removing: /var/run/dpdk/spdk_pid1558592 00:36:58.375 Removing: /var/run/dpdk/spdk_pid1560336 00:36:58.375 Removing: /var/run/dpdk/spdk_pid1561352 00:36:58.375 Removing: /var/run/dpdk/spdk_pid1562568 00:36:58.375 Removing: /var/run/dpdk/spdk_pid1563110 00:36:58.375 Removing: /var/run/dpdk/spdk_pid1563216 00:36:58.670 Removing: /var/run/dpdk/spdk_pid1567202 00:36:58.670 Removing: /var/run/dpdk/spdk_pid1567412 00:36:58.670 Removing: /var/run/dpdk/spdk_pid1567516 00:36:58.670 Removing: /var/run/dpdk/spdk_pid1567645 00:36:58.670 Removing: /var/run/dpdk/spdk_pid1567851 00:36:58.670 Removing: /var/run/dpdk/spdk_pid1568062 00:36:58.670 Removing: /var/run/dpdk/spdk_pid1568943 00:36:58.670 Removing: /var/run/dpdk/spdk_pid1570275 00:36:58.670 Removing: /var/run/dpdk/spdk_pid1571179 00:36:58.670 Clean 00:36:58.670 20:50:50 -- common/autotest_common.sh@1451 -- # return 0 00:36:58.670 20:50:50 -- spdk/autotest.sh@384 -- # timing_exit post_cleanup 00:36:58.670 20:50:50 -- common/autotest_common.sh@728 -- # xtrace_disable 00:36:58.670 20:50:50 -- common/autotest_common.sh@10 -- # set +x 00:36:58.670 20:50:50 -- spdk/autotest.sh@386 -- # timing_exit autotest 00:36:58.670 20:50:50 -- 
common/autotest_common.sh@728 -- # xtrace_disable 00:36:58.670 20:50:50 -- common/autotest_common.sh@10 -- # set +x 00:36:58.670 20:50:51 -- spdk/autotest.sh@387 -- # chmod a+r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/timing.txt 00:36:58.670 20:50:51 -- spdk/autotest.sh@389 -- # [[ -f /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/udev.log ]] 00:36:58.670 20:50:51 -- spdk/autotest.sh@389 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/udev.log 00:36:58.670 20:50:51 -- spdk/autotest.sh@391 -- # hash lcov 00:36:58.670 20:50:51 -- spdk/autotest.sh@391 -- # [[ CC_TYPE=gcc == *\c\l\a\n\g* ]] 00:36:58.670 20:50:51 -- spdk/autotest.sh@393 -- # hostname 00:36:58.670 20:50:51 -- spdk/autotest.sh@393 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -c -d /var/jenkins/workspace/crypto-phy-autotest/spdk -t spdk-wfp-50 -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_test.info 00:36:58.928 geninfo: WARNING: invalid characters removed from testname! 
00:37:30.998 20:51:18 -- spdk/autotest.sh@394 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -a /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_base.info -a /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_test.info -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:37:30.998 20:51:22 -- spdk/autotest.sh@395 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/dpdk/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:37:32.900 20:51:24 -- spdk/autotest.sh@396 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '/usr/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:37:35.436 20:51:27 -- spdk/autotest.sh@397 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/examples/vmd/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:37:37.970 20:51:30 -- spdk/autotest.sh@398 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r 
/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_lspci/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:37:40.507 20:51:32 -- spdk/autotest.sh@399 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_top/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:37:43.795 20:51:35 -- spdk/autotest.sh@400 -- # rm -f cov_base.info cov_test.info OLD_STDOUT OLD_STDERR 00:37:43.795 20:51:35 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:37:43.795 20:51:35 -- scripts/common.sh@508 -- $ [[ -e /bin/wpdk_common.sh ]] 00:37:43.795 20:51:35 -- scripts/common.sh@516 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:37:43.795 20:51:35 -- scripts/common.sh@517 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh 00:37:43.795 20:51:35 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:37:43.795 20:51:35 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:37:43.795 20:51:35 -- paths/export.sh@4 -- $ 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:37:43.795 20:51:35 -- paths/export.sh@5 -- $ export PATH 00:37:43.795 20:51:35 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:37:43.795 20:51:35 -- common/autobuild_common.sh@443 -- $ out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:37:43.795 20:51:35 -- common/autobuild_common.sh@444 -- $ date +%s 00:37:43.795 20:51:35 -- common/autobuild_common.sh@444 -- $ mktemp -dt spdk_1721069495.XXXXXX 00:37:43.795 20:51:35 -- common/autobuild_common.sh@444 -- $ SPDK_WORKSPACE=/tmp/spdk_1721069495.JN4TKJ 00:37:43.795 20:51:35 -- common/autobuild_common.sh@446 -- $ [[ -n '' ]] 00:37:43.795 20:51:35 -- common/autobuild_common.sh@450 -- $ '[' -n '' ']' 00:37:43.795 20:51:35 -- common/autobuild_common.sh@453 -- $ scanbuild_exclude='--exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/' 00:37:43.795 20:51:35 -- common/autobuild_common.sh@457 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp' 00:37:43.795 20:51:35 -- common/autobuild_common.sh@459 -- $ scanbuild='scan-build -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/ --exclude 
/var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs' 00:37:43.795 20:51:35 -- common/autobuild_common.sh@460 -- $ get_config_params 00:37:43.795 20:51:35 -- common/autotest_common.sh@396 -- $ xtrace_disable 00:37:43.795 20:51:35 -- common/autotest_common.sh@10 -- $ set +x 00:37:43.795 20:51:35 -- common/autobuild_common.sh@460 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk' 00:37:43.795 20:51:35 -- common/autobuild_common.sh@462 -- $ start_monitor_resources 00:37:43.795 20:51:35 -- pm/common@17 -- $ local monitor 00:37:43.795 20:51:35 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:37:43.795 20:51:35 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:37:43.795 20:51:35 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:37:43.795 20:51:35 -- pm/common@21 -- $ date +%s 00:37:43.795 20:51:35 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:37:43.795 20:51:35 -- pm/common@21 -- $ date +%s 00:37:43.795 20:51:35 -- pm/common@25 -- $ sleep 1 00:37:43.795 20:51:35 -- pm/common@21 -- $ date +%s 00:37:43.795 20:51:35 -- pm/common@21 -- $ date +%s 00:37:43.795 20:51:35 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1721069495 00:37:43.795 20:51:35 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1721069495 00:37:43.795 20:51:35 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l 
-p monitor.autopackage.sh.1721069495 00:37:43.795 20:51:35 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1721069495 00:37:43.795 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1721069495_collect-vmstat.pm.log 00:37:43.795 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1721069495_collect-cpu-load.pm.log 00:37:43.795 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1721069495_collect-cpu-temp.pm.log 00:37:43.795 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1721069495_collect-bmc-pm.bmc.pm.log 00:37:44.363 20:51:36 -- common/autobuild_common.sh@463 -- $ trap stop_monitor_resources EXIT 00:37:44.363 20:51:36 -- spdk/autopackage.sh@10 -- $ MAKEFLAGS=-j72 00:37:44.363 20:51:36 -- spdk/autopackage.sh@11 -- $ cd /var/jenkins/workspace/crypto-phy-autotest/spdk 00:37:44.363 20:51:36 -- spdk/autopackage.sh@13 -- $ [[ 0 -eq 1 ]] 00:37:44.363 20:51:36 -- spdk/autopackage.sh@18 -- $ [[ 0 -eq 0 ]] 00:37:44.363 20:51:36 -- spdk/autopackage.sh@19 -- $ timing_finish 00:37:44.363 20:51:36 -- common/autotest_common.sh@734 -- $ flamegraph=/usr/local/FlameGraph/flamegraph.pl 00:37:44.363 20:51:36 -- common/autotest_common.sh@735 -- $ '[' -x /usr/local/FlameGraph/flamegraph.pl ']' 00:37:44.364 20:51:36 -- common/autotest_common.sh@737 -- $ /usr/local/FlameGraph/flamegraph.pl --title 'Build Timing' --nametype Step: --countname seconds /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/timing.txt 00:37:44.364 20:51:36 -- spdk/autopackage.sh@20 -- $ exit 0 00:37:44.364 20:51:36 -- spdk/autopackage.sh@1 -- $ stop_monitor_resources 00:37:44.364 20:51:36 -- pm/common@29 -- $ signal_monitor_resources TERM 
00:37:44.364 20:51:36 -- pm/common@40 -- $ local monitor pid pids signal=TERM 00:37:44.364 20:51:36 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:37:44.364 20:51:36 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]] 00:37:44.364 20:51:36 -- pm/common@44 -- $ pid=1582648 00:37:44.364 20:51:36 -- pm/common@50 -- $ kill -TERM 1582648 00:37:44.364 20:51:36 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:37:44.364 20:51:36 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-vmstat.pid ]] 00:37:44.364 20:51:36 -- pm/common@44 -- $ pid=1582650 00:37:44.364 20:51:36 -- pm/common@50 -- $ kill -TERM 1582650 00:37:44.364 20:51:36 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:37:44.364 20:51:36 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]] 00:37:44.364 20:51:36 -- pm/common@44 -- $ pid=1582652 00:37:44.364 20:51:36 -- pm/common@50 -- $ kill -TERM 1582652 00:37:44.364 20:51:36 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:37:44.364 20:51:36 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]] 00:37:44.364 20:51:36 -- pm/common@44 -- $ pid=1582674 00:37:44.364 20:51:36 -- pm/common@50 -- $ sudo -E kill -TERM 1582674 00:37:44.364 + [[ -n 1187903 ]] 00:37:44.364 + sudo kill 1187903 00:37:44.378 [Pipeline] } 00:37:44.396 [Pipeline] // stage 00:37:44.401 [Pipeline] } 00:37:44.417 [Pipeline] // timeout 00:37:44.422 [Pipeline] } 00:37:44.438 [Pipeline] // catchError 00:37:44.443 [Pipeline] } 00:37:44.459 [Pipeline] // wrap 00:37:44.465 [Pipeline] } 00:37:44.480 [Pipeline] // catchError 00:37:44.488 [Pipeline] stage 00:37:44.490 [Pipeline] { (Epilogue) 00:37:44.504 [Pipeline] catchError 00:37:44.505 [Pipeline] { 00:37:44.520 [Pipeline] echo 00:37:44.522 Cleanup processes 
00:37:44.528 [Pipeline] sh 00:37:44.840 + sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:37:44.840 1582751 /usr/bin/ipmitool sdr dump /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/sdr.cache 00:37:44.840 1582970 sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:37:44.860 [Pipeline] sh 00:37:45.144 ++ sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:37:45.145 ++ grep -v 'sudo pgrep' 00:37:45.145 ++ awk '{print $1}' 00:37:45.145 + sudo kill -9 1582751 00:37:45.157 [Pipeline] sh 00:37:45.445 + jbp/jenkins/jjb-config/jobs/scripts/compress_artifacts.sh 00:37:57.660 [Pipeline] sh 00:37:57.943 + jbp/jenkins/jjb-config/jobs/scripts/check_artifacts_size.sh 00:37:57.943 Artifacts sizes are good 00:37:57.956 [Pipeline] archiveArtifacts 00:37:57.963 Archiving artifacts 00:37:58.133 [Pipeline] sh 00:37:58.415 + sudo chown -R sys_sgci /var/jenkins/workspace/crypto-phy-autotest 00:37:58.435 [Pipeline] cleanWs 00:37:58.444 [WS-CLEANUP] Deleting project workspace... 00:37:58.444 [WS-CLEANUP] Deferred wipeout is used... 00:37:58.451 [WS-CLEANUP] done 00:37:58.453 [Pipeline] } 00:37:58.472 [Pipeline] // catchError 00:37:58.483 [Pipeline] sh 00:37:58.763 + logger -p user.info -t JENKINS-CI 00:37:58.772 [Pipeline] } 00:37:58.789 [Pipeline] // stage 00:37:58.794 [Pipeline] } 00:37:58.810 [Pipeline] // node 00:37:58.814 [Pipeline] End of Pipeline 00:37:58.831 Finished: SUCCESS